climix issues
https://git.smhi.se/climix/climix/-/issues

Issue 310: Percentile and ThresholdedPercentile index functions do not pass all unit tests
https://git.smhi.se/climix/climix/-/issues/310 · 2023-06-19T14:40:11Z · Carolina Nilsson

The Percentile and ThresholdedPercentile index functions do not pass all unit tests.
1. The percentile call_function does not run, since the aux coordinate for the percentile is passed as an input to the numpy function instead of the point value of the aux coordinate.
2. The numpy percentile function does not take the mask into consideration when computing the percentiles. A workaround is to set the thresholded values to NaN and then use np.nanpercentile to exclude the masked values from the computation.

Milestone: 0.19 (Poco Mas) · Assignee: Carolina Nilsson

Issue 307: RunningStatistics and ThresholdedRunningStatistics do not pass all unit tests
https://git.smhi.se/climix/climix/-/issues/307 · 2023-07-03T15:50:06Z · Carolina Nilsson

The RunningStatistics and ThresholdedRunningStatistics index functions do not pass all the unit tests. Some tests fail because the mask is not preserved in the process, e.g. when using np.concatenate and np.where. Other tests fail if a statistic other than "max" is used, and some fail because of the padding with zeros at the start and end, which can give lower aggregated values.

Milestone: 0.19 (Poco Mas) · Assignee: Carolina Nilsson

Issue 306: Update clix-meta files to v0.5.2 when ready
https://git.smhi.se/climix/climix/-/issues/306 · 2023-04-21T15:56:13Z · Lars Bärring

The new maintenance version of clix-meta will soon be ready, and it would be good to have it in Climix v0.18.

Milestone: 0.18 (Gull Olle) · Assignee: Joakim Löw

Issue 305: CountJointOccurrences index functions do not pass all unit tests
https://git.smhi.se/climix/climix/-/issues/305 · 2023-05-24T14:39:03Z · Carolina Nilsson
Running count_joint_occurrences_precipitation_temperature where one of the inputs has masked data will return the other condition as True or False. Here the mask needs to be preserved: if a grid cell contains masked data, then the output grid cell should probably be masked as well.

Milestone: 0.19 (Poco Mas) · Assignee: Carolina Nilsson

Issue 304: Modernize build infrastructure
https://git.smhi.se/climix/climix/-/issues/304 · 2023-04-19T15:05:32Z · Klaus Zimmermann

We rely on traditional `setup.py`, `setup.cfg`, and various tool-specific config files. We should follow modern Python packaging standards and unify things in `pyproject.toml`.

Milestone: 0.18 (Gull Olle) · Assignee: Klaus Zimmermann

Issue 301: Running climix 0.17 with new env throws hdf5 error messages
https://git.smhi.se/climix/climix/-/issues/301 · 2023-04-19T13:24:26Z · Carolina Nilsson

Creating a new env using the climix 0.17 environment.yml and running climix throws hdf5 error messages:
```
HDF5-DIAG: Error detected in HDF5 (1.14.0) thread 1:
#000: H5A.c line 679 in H5Aopen_by_name(): unable to synchronously open attribute
major: Attribute
minor: Can't open object
#001: H5A.c line 641 in H5A__open_by_name_api_common(): unable to open attribute: '_QuantizeBitRoundNumberOfSignificantBits'
major: Attribute
minor: Can't open object
#002: H5A.c line 464 in H5A__open_common(): unable to open attribute: '_QuantizeBitRoundNumberOfSignificantBits'
major: Attribute
minor: Can't open object
#003: H5VLcallback.c line 1138 in H5VL_attr_open(): attribute open failed
major: Virtual Object Layer
minor: Can't open object
#004: H5VLcallback.c line 1105 in H5VL__attr_open(): attribute open failed
major: Virtual Object Layer
minor: Can't open object
#005: H5VLnative_attr.c line 161 in H5VL__native_attr_open(): can't open attribute
major: Attribute
minor: Can't open object
#006: H5Aint.c line 658 in H5A__open_by_name(): unable to load attribute info from object header
major: Attribute
minor: Unable to initialize object
#007: H5Oattribute.c line 502 in H5O__attr_open_by_name(): can't locate attribute: '_QuantizeBitRoundNumberOfSignificantBits'
major: Attribute
minor: Object not found
.
.
.
```
This can be solved by adding the constraint `libnetcdf<4.9.1` in the environment.yml, which makes the following changes:
```
──────────────────────────────────────────────────────────────────────────────────────
Install:
──────────────────────────────────────────────────────────────────────────────────────
+ jpeg 9e h166bdaf_2 conda-forge/linux-64 Cached
Change:
──────────────────────────────────────────────────────────────────────────────────────
- hdf4 4.2.15 h501b40f_6 conda-forge
+ hdf4 4.2.15 h9772cbc_5 conda-forge/linux-64 Cached
- lcms2 2.15 haa2dc70_1 conda-forge
+ lcms2 2.15 hfd0df8a_0 conda-forge/linux-64 Cached
- libtiff 4.5.0 ha587672_6 conda-forge
+ libtiff 4.5.0 h6adf6a1_2 conda-forge/linux-64 Cached
- pillow 9.4.0 py310h065c6d2_2 conda-forge
+ pillow 9.4.0 py310h023d228_1 conda-forge/linux-64 Cached
Downgrade:
──────────────────────────────────────────────────────────────────────────────────────
- hdf5 1.14.0 nompi_hb72d44e_103 conda-forge
+ hdf5 1.12.2 nompi_h4df4325_101 conda-forge/linux-64 Cached
- libdeflate 1.18 h0b41bf4_0 conda-forge
+ libdeflate 1.17 h0b41bf4_0 conda-forge/linux-64 Cached
- libjpeg-turbo 2.1.5.1 h0b41bf4_0 conda-forge
+ libjpeg-turbo 2.1.4 h166bdaf_0 conda-forge/linux-64 Cached
- libnetcdf 4.9.2 nompi_hf3f8848_103 conda-forge
+ libnetcdf 4.8.1 nompi_h261ec11_106 conda-forge/linux-64 Cached
- netcdf4 1.6.3 nompi_py310h2d0b64f_102 conda-forge
+ netcdf4 1.6.2 nompi_py310h55e1e36_100 conda-forge/linux-64 Cached
Summary:
Install: 1 packages
Change: 4 packages
Downgrade: 5 packages
```
However, constraining libnetcdf seems to create a new error with dask:
```
INFO:distributed.batched:Batched Comm Closed <TCP (closed) Scheduler connection to worker local=tcp://127.0.0.1:49048 remote=tcp://127.0.0.1:53328>
Traceback (most recent call last):
File "/home/sm_carni/.conda/envs/climix_error_test/lib/python3.10/site-packages/distributed/batched.py", line 115, in _background_send
nbytes = yield coro
File "/home/sm_carni/.conda/envs/climix_error_test/lib/python3.10/site-packages/tornado/gen.py", line 769, in run
value = future.result()
File "/home/sm_carni/.conda/envs/climix_error_test/lib/python3.10/site-packages/distributed/comm/tcp.py", line 269, in write
raise CommClosedError()
distributed.comm.core.CommClosedError
```
This can be solved by constraining `netCDF4==1.6.0`, but I am not sure if this is the right approach.

Milestone: 0.18 (Gull Olle) · Assignee: Joakim Löw

Issue 299: Add unit tests for index functions
https://git.smhi.se/climix/climix/-/issues/299 · 2023-04-21T15:56:13Z · Carolina Nilsson

Implemented tests:
- [x] CountLevelCrossings
- [x] CountOccurrences
- [x] CountJointOccurrencesPrecipitationTemperature
- [x] CountJointOccurrencesTemperature
- [x] DiurnalTemperatureRange
- [x] ExtremeTemperatureRange
- [ ] FirstOccurrence (moved to later release)
- [x] InterdayDiurnalTemperatureRange
- [ ] LastOccurrence (moved to later release)
- [x] Percentile
- [x] Statistics
- [x] ThresholdedPercentile
- [x] ThresholdedStatistics
- [x] RunningStatistics
- [x] ThresholdedRunningStatistics
- [x] TemperatureSum
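Many of the count-style functions in this list compare data against a threshold while preserving masks. As a minimal sketch in that spirit (a hypothetical helper, not the climix implementation), a masked-aware counting kernel could look like:

```python
import numpy as np

def count_occurrences(data, threshold, axis=0):
    """Count values above a threshold along ``axis``.

    Masked elements never contribute to the count, mirroring how the
    unit tests expect masks to be handled rather than compared blindly.
    """
    condition = data > threshold
    if np.ma.isMaskedArray(condition):
        # Treat masked comparisons as "no occurrence".
        condition = condition.filled(False)
    return condition.sum(axis=axis)

data = np.ma.masked_array(
    [[1.0, 5.0], [4.0, 2.0]],
    mask=[[False, True], [False, False]],
)
counts = count_occurrences(data, 3.0)  # per-column counts; the masked 5.0 is ignored
```

Whether the output should additionally be masked where all inputs are masked is exactly the kind of question these tests need to settle.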
The following unit tests need to be reviewed from a scientific point of view:
* [x] CountLevelCrossings - Gustav
* [ ] CountOccurrences - Gustav
* [ ] CountJointOccurrencesPrecipitationTemperature
* [ ] CountJointOccurrencesTemperature
* [x] DiurnalTemperatureRange - Renate
* [x] ExtremeTemperatureRange - Renate
* [x] InterdayDiurnalTemperatureRange - Renate
* [ ] Statistics
* [x] ThresholdedStatistics - Renate
* [ ] RunningStatistics (complex output with post-processing included)
* [ ] ThresholdedRunningStatistics (complex output with post-processing included)
* [ ] TemperatureSum
E.g.,
```
TEST_COUNT_LEVEL_CROSSINGS_PARAMETERS = [
(
{"data": (-1) * np.arange(12).reshape(2, 2, 3), "units": "degree_Celsius"},
{"data": np.arange(12).reshape(2, 2, 3), "units": "degree_Celsius"},
{"data": 0, "units": "degree_Celsius", "standard_name": "air_temperature"},
np.array([[1, 2, 2], [2, 2, 2]]),
), # ordinary np
]
parameter_names = "f_cube_tasmin, f_cube_tasmax, f_first_threshold, expected"
```
The index function is count_level_crossings; below the parameter list with the data we can see the parameter names, such that:
```
f_cube_tasmin = {"data": (-1) * np.arange(12).reshape(2, 2, 3), "units": "degree_Celsius"}
f_cube_tasmax = {"data": np.arange(12).reshape(2, 2, 3), "units": "degree_Celsius"}
f_first_threshold = {"data": 0, "units": "degree_Celsius", "standard_name": "air_temperature"}
expected = np.array([[1, 2, 2], [2, 2, 2]])
```

Milestone: 0.18 (Gull Olle) · Assignee: Carolina Nilsson

Issue 298: Update clix-meta files to v0.5.1
https://git.smhi.se/climix/climix/-/issues/298 · 2023-03-14T09:52:37Z · Joakim Löw

Replace `index_definitions.yml` and `variables.yml` with the files released with clix-meta v0.5.1 and update Climix to handle it.

Milestone: 0.18 (Gull Olle) · Assignee: Joakim Löw

Issue 296: Index requests of combined threshold indices (2 variables)
https://git.smhi.se/climix/climix/-/issues/296 · 2023-09-12T09:20:54Z · Renate Wilcke

New indices:
I need three new combined threshold indices.
1. Number of days(Temperature between two values (-2, 2) and precipitation above 0.1 mm/d)
2. ~~Number of days(Temperature below 0 and precipitation above 0.1 mm/d)~~
3. Accumulated precip with temperature below 0
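For plain NumPy arrays, the first of these indices amounts to counting days that satisfy a joint condition. A rough sketch (thresholds hard-coded for illustration; in climix they would come from an index definition):

```python
import numpy as np

# Illustrative daily values for one grid cell: mean temperature (degC)
# and precipitation (mm/day).
tas = np.array([-3.0, -1.0, 0.5, 1.9, 2.5])
pr = np.array([0.0, 0.5, 0.2, 0.05, 1.0])

# Index 1: number of days with temperature between -2 and 2 degC
# and precipitation above 0.1 mm/day.
joint = (tas > -2.0) & (tas < 2.0) & (pr > 0.1)
n_days = int(joint.sum())  # here, only days 2 and 3 qualify
```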
I don't have names for them. If you would like suggestions, let me know. ~~There is no rush, though it would be nice to work with them after summer.~~ It is now after summer, and I need them.

Issue 294: Function to parse date and timerange in ISO 8601 format
https://git.smhi.se/climix/climix/-/issues/294 · 2023-04-19T15:05:32Z · Joakim Löw

Dates and time ranges will be required to be given in ISO 8601 format in the index definition yaml (reference period, see #273) and for command line arguments (reference period and computational period, see #289).
The end date of a time range should be interpreted as the upper bound of the date, i.e. `1961/1990` should mean `1961-01-01T00:00:00` to `1991-01-01T00:00:00`, and `1961-01/1990-01` should mean `1961-01-01T00:00:00` to `1990-02-01T00:00:00` (see https://git.smhi.se/climix/climix/-/issues/273#note_34689).
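The upper-bound interpretation can be sketched for the simple year-only case with the standard library (a simplification; a full ISO 8601 parser would also handle months, days, and durations):

```python
from datetime import datetime

def parse_year_range(time_range):
    """Parse 'YYYY/YYYY' so the end year acts as an upper bound:
    '1961/1990' covers all of 1990, i.e. up to (but excluding) 1991-01-01."""
    start_year, end_year = (int(part) for part in time_range.split("/"))
    return datetime(start_year, 1, 1), datetime(end_year + 1, 1, 1)

start, end = parse_year_range("1961/1990")
# start == 1961-01-01T00:00:00, end == 1991-01-01T00:00:00
```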
The time range parser should also support a duration as start or end, i.e. `P20Y/2100` should produce the same result as `2081/2100`.

Milestone: 0.17 (Black Lion) · Assignee: Joakim Löw

Issue 291: Make use of guvectorize in spell kernels
https://git.smhi.se/climix/climix/-/issues/291 · 2023-02-24T07:32:52Z · Erik Holmgren

Tracking the progress of the investigation, and possible implementation, of using `@guvectorize` for `chunk_column` in spell kernels.
Essentially, this means that we could remove the loop over the cube dimensions in `chunk` and let numba vectorize it.

Assignee: Erik Holmgren

Issue 290: Handling of non-CMORized input data
https://git.smhi.se/climix/climix/-/issues/290 · 2023-11-13T12:07:05Z · Carolina Nilsson

In issue #15 there was a discussion on how climix would handle input data that is not CMORized, i.e., that does not follow the CF conventions and CMIP standards. This is moved to a separate issue, since a decision needs to be taken about whether we should rely on running some other tool first (ESMValTool) or implement some solution of our own, possibly using ESMValTool's API or similar.

Issue 289: Command line option to specify computational period
https://git.smhi.se/climix/climix/-/issues/289 · 2023-12-12T09:57:28Z · Joakim Löw

Add a command line option to specify the period to perform the index calculation on. Reasons:
1. The user may only want to compute the index for a sub-period of the data.
2. The beginning and end of the input data may be incomplete with respect to the time period. E.g. if the frequency is 'annual' and the first and/or last year of data does not contain entries for all days of the year, climix may raise an error for some indices.
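Reason 1 boils down to keeping only timestamps inside a half-open computational period. As a minimal stdlib sketch (hypothetical helper, not the proposed climix option):

```python
from datetime import datetime

def in_period(timestamps, start, end):
    """Keep timestamps inside the half-open computational period [start, end)."""
    return [t for t in timestamps if start <= t < end]

times = [datetime(1960, 12, 31), datetime(1961, 6, 1), datetime(1991, 1, 1)]
subset = in_period(times, datetime(1961, 1, 1), datetime(1991, 1, 1))
# only 1961-06-01 falls inside the period
```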
See also #273 and #257.

Milestone: 0.20 (Urbane Goat) · Assignee: Carolina Nilsson

Issue 287: Produce a nice summary of which indices are available in climix
https://git.smhi.se/climix/climix/-/issues/287 · 2023-04-23T13:41:47Z · Lars Bärring

Users have requested access, possibly published via SMHI internal web pages, to a summary of which indices climix can produce.
A starting point may be the output from `climix -x list`:
``` bash
~ >climix -x list
INFO:root:Activating sentry (automatic error reporting)
INFO:root:Looking for metadata in directory /home/sm_lbarr/CODE/climix/climix/etc
INFO:root:Looking for metadata in directory /etc/climix
INFO:root:Looking for metadata in directory /home/sm_lbarr/.config/climix
INFO:root:Reading index definitions from file /home/sm_lbarr/CODE/climix/climix/etc/metadata.yml
INFO:root:Reading index definitions from file /home/sm_lbarr/CODE/climix/climix/etc/variables.yml
INFO:root:Reading index definitions from file /home/sm_lbarr/CODE/climix/climix/etc/index_definitions.yml
Available indices are:
['fd', 'tnlt2', 'tnltm2', 'tnltm20', 'id', 'su', 'txge30', 'txge35', 'tr', 'tmge5', 'tmlt5', 'tmge10', 'tmlt10',
'tngt{TT}', 'tnlt{TT}', 'tnge{TT}', 'tnle{TT}', 'txgt{TT}', 'txlt{TT}', 'txge{TT}', 'txle{TT}', 'tmgt{TT}',
'tmlt{TT}', 'tmge{TT}', 'tmle{TT}', 'ctngt{TT}', 'cfd', 'csu', 'ctnlt{TT}', 'ctnge{TT}', 'ctnle{TT}', 'ctxgt{TT}',
'ctxlt{TT}', 'ctxge{TT}', 'ctxle{TT}', 'ctmgt{TT}', 'ctmlt{TT}', 'ctmge{TT}', 'ctmle{TT}', 'txx', 'tnx', 'txn',
'tnn', 'txm', 'tnm', 'tmx', 'tmn', 'tmm', 'txmax', 'tnmax', 'txmin', 'tnmin', 'txmean', 'tnmean', 'tmmax', 'tmmin',
'tmmean', 'tn10p', 'tx10p', 'tn90p', 'tx90p', 'tg10p', 'tg90p', 'txgt50p', 'txgt{PRC}p', 'tngt{PRC}p', 'tmgt{PRC}p',
'txlt{PRC}p', 'tnlt{PRC}p', 'tmlt{PRC}p', 'dtr', 'vdtr', 'etr', 'tx{PRC}pctl', 'tn{PRC}pctl', 'tm{PRC}pctl',
'hd17', 'hddheat{TT}', 'ddgt{TT}', 'cddcold{TT}', 'ddlt{TT}', 'gddgrow{TT}', 'gd4', 'r10mm', 'r20mm', 'r{RT}mm',
'wetdays', 'rr1', 'cdd', 'cwd', 'prcptot', 'sdii', 'r{PRC}pctl', 'r{PRC}pDAYS', 'rx1day', 'rx5day', 'rx{ND}day',
'rh', 'rr', 'pp', 'tg', 'tn', 'tx', 'sd', 'sd1', 'sd5cm', 'sd50cm', 'sd{D}cm', 'ss', 'fxx', 'fg6bft', 'fgcalm',
'fg', 'nzero', 'maxdtr']
```
This can of course be expanded in various directions, e.g. to
* list the indices per file that is scanned,
* also give the `long_name`, `OUTPUT_unit`, input variable(s), ...
* see #57 for another and much more complex idea

Milestone: 0.18 (Gull Olle) · Assignee: Klaus Zimmermann

Issue 284: Amend the yaml reader function to be aware of the CLIX-META version number
https://git.smhi.se/climix/climix/-/issues/284 · 2023-04-19T15:05:32Z · Lars Bärring

Milestone: 0.18 (Gull Olle) · Assignee: Klaus Zimmermann

Issue 275: Add integration tests using NGCD dataset
https://git.smhi.se/climix/climix/-/issues/275 · 2023-04-21T15:56:13Z · Joakim Löw

Add integration tests running climix on the NGCD dataset, and compare the results with NGCD reference indicators.

Milestone: 0.18 (Gull Olle) · Assignee: Klaus Zimmermann

Issue 273: Reference period definition in index_definitions.yml
https://git.smhi.se/climix/climix/-/issues/273 · 2023-04-19T15:05:32Z · Carolina Nilsson

To conclude after an offline conversation: some index definitions utilise a reference period in the definition for the calculation, and there is no obvious way in which this should be handled. A suggestion would be to include the reference period in the clix-meta index definition, given the following example:
```
index_function:
name: count_thresholded_percentile_occurrences
parameters:
data_threshold:
kind: quantity
standard_name: lwe_precipitation_rate
long_name: "Wet day threshold"
data: 1
units: "mm day-1"
data_condition:
kind: operator
operator: ">="
percentile:
kind: quantity
standard_name:
proposed_standard_name: quantile
long_name: "Percentile value"
data: "{PRC}"
units: "%"
percentile_condition:
kind: operator
operator: ">"
reference_period:
kind: time_period
data: ['1961-01-01 00:00:00', '1991-01-01 00:00:00']
```
The reference period could then be used in the index calculation, and the output should contain a global attribute giving information about which reference period was used:
```
global attributes:
:reference_period = "1961-01-01 00:00:00 to 1991-01-01 00:00:00"
```
or something similar.
Note: there was also a discussion about having an additional auxiliary coordinate variable with the following information:
```
reference_period:(var_name: "reference_period", standard_name: "reference_epoch", value: mean(n1,n2), bounds: [n1, n2], unit: "days since 1961-01-01 00:00:00")
```
where n1 and n2 are the point values for the start and end of the time period defined by the unit.

Milestone: 0.18 (Gull Olle) · Assignee: Carolina Nilsson

Issue 272: tn10p, tn90p, tx10p, tx90p with NGCD data results in ZeroDivisionError
https://git.smhi.se/climix/climix/-/issues/272 · 2023-01-30T14:51:02Z · Joakim Löw

Calculating `tn10p`, `tn90p`, `tx10p` and `tx90p` with input from the NGCD dataset results in a `ZeroDivisionError`. E.g.:
```
climix -v -e -x tn10p /home/rossby/joint_exp/climix/1/testdata/NGCD/input_data/SverigeScaled/NGCD_TN_type2_*.nc
```
generates the following traceback:
```
Traceback (most recent call last):
File "/home/sm_joalo/.conda/envs/climix-devel/bin/climix", line 8, in <module>
sys.exit(main())
File "/home/sm_joalo/dev/repos/climix/climix/main.py", line 293, in main
do_main(
File "/home/sm_joalo/dev/repos/climix/climix/main.py", line 271, in do_main
result = index(input_data, client=scheduler.client, sliced_mode=sliced_mode)
File "/home/sm_joalo/dev/repos/climix/climix/index.py", line 24, in __call__
self.index_function.preprocess(cubes, client)
File "/home/sm_joalo/dev/repos/climix/climix/index_functions/percentile_functions.py", line 232, in preprocess
all_data = all_data.rechunk(("auto",) * (all_data.ndim - 1) + (-1,))
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/core.py", line 2745, in rechunk
return rechunk(self, chunks, threshold, block_size_limit, balance)
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/rechunk.py", line 297, in rechunk
chunks = normalize_chunks(
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/core.py", line 3073, in normalize_chunks
chunks = auto_chunks(chunks, shape, limit, dtype, previous_chunks)
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/core.py", line 3206, in auto_chunks
multiplier = _compute_multiplier(limit, dtype, largest_block, result)
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/core.py", line 3119, in _compute_multiplier
limit
ZeroDivisionError: float division by zero
```

Assignee: Joakim Löw

Issue 271: rx5day index fails with NGCD input
https://git.smhi.se/climix/climix/-/issues/271 · 2023-01-30T14:51:14Z · Joakim Löw

Running the following command line:
```
climix -v -e -x rx5day /home/rossby/joint_exp/climix/1/testdata/NGCD/input_data/SverigeScaled/NGCD_RR_type2_*.nc
```
results in the following traceback:
```
Traceback (most recent call last):
File "/home/sm_joalo/.conda/envs/climix-devel/bin/climix", line 8, in <module>
sys.exit(main())
File "/home/sm_joalo/dev/repos/climix/climix/main.py", line 293, in main
do_main(
File "/home/sm_joalo/dev/repos/climix/climix/main.py", line 271, in do_main
result = index(input_data, client=scheduler.client, sliced_mode=sliced_mode)
File "/home/sm_joalo/dev/repos/climix/climix/index.py", line 52, in __call__
aggregated = multicube_aggregated_by(
File "/home/sm_joalo/dev/repos/climix/climix/iris.py", line 126, in multicube_aggregated_by
result = list(map(agg, groupby_subcubes))
File "/home/sm_joalo/dev/repos/climix/climix/iris.py", line 109, in agg
result = aggregate(data, axis=dimension_to_groupby, cube=cube, **kwargs)
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/iris/analysis/__init__.py", line 549, in lazy_aggregate
return self.lazy_func(data, axis=axis, **kwargs)
File "/home/sm_joalo/dev/repos/climix/climix/index_functions/index_functions.py", line 449, in lazy_func
return super().call_func(thresholded_data, axis, **kwargs)
File "/home/sm_joalo/dev/repos/climix/climix/index_functions/index_functions.py", line 354, in call_func
rolling_view = np.lib.stride_tricks.sliding_window_view(
File "<__array_function__ internals>", line 180, in sliding_window_view
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/core.py", line 1760, in __array_function__
return da_func(*args, **kwargs)
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/overlap.py", line 815, in sliding_window_view
safe_chunks = tuple(
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/overlap.py", line 816, in <genexpr>
ensure_minimum_chunksize(d + 1, c) for d, c in zip(depths, x.chunks)
File "/home/sm_joalo/.conda/envs/climix-devel/lib/python3.10/site-packages/dask/array/overlap.py", line 353, in ensure_minimum_chunksize
raise ValueError(
ValueError: The overlapping depth 5 is larger than your array 1.
```

Assignee: Joakim Löw

Issue 269: Update clix-meta files to clix-meta release 0.4.1
https://git.smhi.se/climix/climix/-/issues/269 · 2022-12-09T13:30:11Z · Joakim Löw

The YAML files from Clix-meta need to be updated to the ones in the latest Clix-meta release (0.4.1), which contains a fix for a bug that caused some indices to get the wrong condition operator.

Milestone: 0.16