Status: Open
Labels: bug
Describe the bug
In a Docker container it is not possible to save a dataset to a NetCDF file via Trollflow2. In the same container, saving directly with Satpy does not cause any problems.
To Reproduce
touch /mnt/input/*202309190100*
trollflow.yaml:
product_list:
  output_dir: /mnt/output/
  subscribe_topics:
    - /dataset/seviri_hrit
  publish_topic: /image/seviri_hrit
  reader: seviri_l1b_hrit
  fname_pattern: "{start_time:%Y%m%d_%H%M}_{platform_name}_{areaname}_{productname}.{format}"
  sunzen_check_lon: 5.0
  sunzen_check_lat: 56.0
  # Delay composite generation until resampling is done. This is
  # faster when many products share the same channels
  delay_composites: True
  areas:
    Etna:
      # Add a priority. Not relevant for one area, but just to show the option
      priority: 1
      areaname: Etna
      # Search radius for resampling
      radius_of_influence: 20000
      # Resampling method
      resampler: nearest  # or bilinear
      # Cache directory
      cache_dir: /mnt/config/cache/
      products:
        IR_087:
          productname: 'ir_087'
          formats:
            - format: nc
              writer: cf

workers:
  - fun: !!python/name:trollflow2.plugins.create_scene
  - fun: !!python/name:trollflow2.plugins.sza_check
  - fun: !!python/name:trollflow2.plugins.load_composites
  - fun: !!python/name:trollflow2.plugins.resample
  - fun: !!python/name:trollflow2.plugins.save_datasets
  - fun: !!python/object:trollflow2.plugins.FilePublisher {}
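As an aside, `fname_pattern` above is a trollsift-style pattern whose `{format}` field is filled from the `format: nc` entry. A minimal stdlib-only sketch of the expansion (plain `str.format` behaves the same for these simple fields; trollflow2 actually uses trollsift):

```python
from datetime import datetime

# Hypothetical expansion of fname_pattern with plain str.format;
# the field values mirror the product list above.
fname_pattern = "{start_time:%Y%m%d_%H%M}_{platform_name}_{areaname}_{productname}.{format}"
fname = fname_pattern.format(
    start_time=datetime(2023, 9, 19, 1, 0),
    platform_name="Meteosat-11",
    areaname="Etna",
    productname="ir_087",
    format="nc",
)
print(fname)  # 20230919_0100_Meteosat-11_Etna_ir_087.nc
```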
areas.yaml
Etna:
  description: Etna Area
  projection:
    proj: tmerc
    ellps: WGS84
    lat_0: 36.5
    lon_0: 15.0
    lat_ts: 36.5
  shape:
    height: 259
    width: 374
  area_extent:
    lower_left_xy: [-561138.8507973854, -372238.69255371124]
    upper_right_xy: [561138.8507973858, 405788.8732805471]
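As a sanity check on the area definition, the pixel size implied by the extent and shape can be computed with plain Python (no pyresample needed):

```python
# Pixel size implied by the Etna area definition above.
lower_left_xy = (-561138.8507973854, -372238.69255371124)
upper_right_xy = (561138.8507973858, 405788.8732805471)
width, height = 374, 259

x_res = (upper_right_xy[0] - lower_left_xy[0]) / width
y_res = (upper_right_xy[1] - lower_left_xy[1]) / height
# Roughly 3001 m x 3004 m per pixel, i.e. close to SEVIRI's nominal 3 km.
```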
Expected behavior
A NetCDF file is written to the output directory.
Actual results
[DEBUG: 2023-09-19 18:24:55 : h5py._conv] Creating converter from 7 to 5
[DEBUG: 2023-09-19 18:24:55 : h5py._conv] Creating converter from 5 to 7
[INFO: 2023-09-19 18:24:55 : satpy.writers.cf_writer] Saving datasets to NetCDF4/CF.
[ERROR: 2023-09-19 18:24:55 : trollflow2.launcher] Process crashed
Traceback (most recent call last):
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/file_manager.py", line 211, in _acquire_with_cache_info
file = self._cache[self._key]
~~~~~~~~~~~^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/lru_cache.py", line 56, in __getitem__
value = self._cache[key]
~~~~~~~~~~~^^^^^
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/mnt/output/20230919_0100_Meteosat-11_Etna_ct.nc',), 'a', (('clobber', True), ('diskless', False), ('format', 'NC'), ('persist', False)), 'a7d16e51-cb92-44d9-b0d1-5d817d1db8d0']
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/conda/lib/python3.11/site-packages/trollflow2/launcher.py", line 375, in process
cwrk.pop('fun')(job, **cwrk)
File "/opt/conda/lib/python3.11/site-packages/trollflow2/plugins/__init__.py", line 292, in save_datasets
obj = save_dataset(scns, fmat, fmat_config, renames, compute=eager_writing)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/trollflow2/plugins/__init__.py", line 235, in save_dataset
obj = scns[fmat['area']].save_dataset(dsid,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/satpy/scene.py", line 1221, in save_dataset
return writer.save_dataset(self[dataset_id],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/satpy/writers/cf_writer.py", line 1093, in save_dataset
return self.save_datasets([dataset], filename, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/satpy/writers/cf_writer.py", line 1176, in save_datasets
res = ds.to_netcdf(filename,
^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/core/dataset.py", line 2252, in to_netcdf
return to_netcdf( # type: ignore # mypy cannot resolve the overloads:(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/api.py", line 1229, in to_netcdf
store = store_open(target, mode, format, group, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/netCDF4_.py", line 400, in open
return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/netCDF4_.py", line 347, in __init__
self.format = self.ds.data_model
^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/netCDF4_.py", line 409, in ds
return self._acquire()
^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/netCDF4_.py", line 403, in _acquire
with self._manager.acquire_context(needs_lock) as root:
File "/opt/conda/lib/python3.11/contextlib.py", line 137, in __enter__
return next(self.gen)
^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/file_manager.py", line 199, in acquire_context
file, cached = self._acquire_with_cache_info(needs_lock)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xarray/backends/file_manager.py", line 217, in _acquire_with_cache_info
file = self._opener(*self._args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "src/netCDF4/_netCDF4.pyx", line 2278, in netCDF4._netCDF4.Dataset.__init__
File "src/netCDF4/_netCDF4.pyx", line 1617, in netCDF4._netCDF4._set_default_format
ValueError: unrecognized format requested
[DEBUG: 2023-09-19 18:24:55 : trollflow2.launcher] Cleaning up
[DEBUG: 2023-09-19 18:24:55 : posttroll.publisher] exiting publish
Process SpawnProcess-62:
[... KeyError and "During handling of the above exception" section identical to the traceback above ...]
Traceback (most recent call last):
File "/opt/conda/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/opt/conda/lib/python3.11/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/opt/conda/lib/python3.11/site-packages/trollflow2/launcher.py", line 331, in queue_logged_process
process(msg, prod_list, produced_files)
[... remaining frames identical to the traceback above ...]
ValueError: unrecognized format requested
[DEBUG: 2023-09-19 18:24:55 : posttroll.publisher] exiting publish
[CRITICAL: 2023-09-19 18:24:55 : trollflow2.launcher] Process crashed with exit code 1
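The KeyError line in the traceback shows the NetCDF store being opened with `('format', 'NC')`, i.e. the `format: nc` value from the product list appears to reach xarray's `to_netcdf()` as the NetCDF file-format argument, which netCDF4 rejects. A stdlib-only sketch of that check (the set of accepted names is taken from the netCDF4 documentation; this is an illustration, not netCDF4's actual code, which lives in `netCDF4._netCDF4._set_default_format`):

```python
# Format names accepted by netCDF4.Dataset, per the netCDF4 documentation.
VALID_NETCDF_FORMATS = {
    "NETCDF4", "NETCDF4_CLASSIC", "NETCDF3_CLASSIC",
    "NETCDF3_64BIT_OFFSET", "NETCDF3_64BIT_DATA",
}

def check_format(fmt):
    """Raise the same error netCDF4 raises for an unknown format string."""
    if fmt not in VALID_NETCDF_FORMATS:
        raise ValueError("unrecognized format requested")

check_format("NETCDF4")   # accepted
try:
    check_format("NC")    # the value visible in the traceback's cache key
except ValueError as err:
    print(err)            # unrecognized format requested
```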
Environment Info:
- OS: Linux (Docker container)
- Trollflow2 Version: v0.14.0
- Satpy Version: 0.43.0
- Pyresample Version: 1.27.1
Additional context
In the same container this code works:
from satpy import Scene
import glob

area = 'Etna'
filenames = glob.glob('/mnt/input/*202309190100*')
scn = Scene(filenames=filenames, reader='seviri_l1b_hrit')
scn.load(['IR_087'])
scn2 = scn.resample(area, cache_dir='cache', resampler='nearest')
scn2.save_datasets(writer='cf', datasets=['IR_087'], filename='/mnt/output/seviri_test.nc')
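Note that this direct Satpy call never passes a `format` keyword to the CF writer, so xarray falls back to its default NetCDF format. A hypothetical mitigation sketch, separating filename-composition keys from the kwargs that reach the writer (the names below are invented for illustration and are not trollflow2's actual code):

```python
# Keys in a trollflow2 per-format config that are plausibly meant only for
# filename composition, not for Scene.save_dataset(). Hypothetical list.
FILENAME_ONLY_KEYS = {"format", "writer", "fname_pattern"}

def split_writer_kwargs(fmat_config):
    """Return only the kwargs that should be forwarded to the writer."""
    return {k: v for k, v in fmat_config.items() if k not in FILENAME_ONLY_KEYS}

cfg = {"format": "nc", "writer": "cf", "engine": "netcdf4"}
print(split_writer_kwargs(cfg))  # {'engine': 'netcdf4'}
```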