Grid alignment, ASF PC¶
The goal is to put the ASF and PC data on the same grid in order to compare the two datasets, with the layover-shadow mask of the ASF dataset applied to the PC dataset as well (maybe the inclusion of layover/shadow pixels in the PC dataset that are excluded from the ASF dataset is responsible for the observed offset between the two when comparing spatial mean backscatter over time).
Steps:
- downsample: the PC dataset is 10 m resolution, ASF is 30 m. Want to downsample PC to the spatial resolution of ASF (currently using interp_like())
- mask: once they are on the same grid, mask the PC dataset where ASF = nan
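The plan above can be sketched on toy grids (the coordinates and values here are made up; only the interp_like() + where() pattern is from this notebook):

```python
import numpy as np
import xarray as xr

# Toy stand-ins: a fine 10 m grid (like the PC data) and a coarse 30 m grid
# (like the ASF data) covering the same extent. Values are random.
fine = xr.DataArray(
    np.random.rand(9, 9),
    coords={"y": np.arange(0, 90, 10), "x": np.arange(0, 90, 10)},
    dims=("y", "x"),
)
coarse = xr.DataArray(
    np.random.rand(3, 3),
    coords={"y": np.arange(0, 90, 30), "x": np.arange(0, 90, 30)},
    dims=("y", "x"),
)
coarse[1, 1] = np.nan  # pretend this pixel is masked out (layover/shadow)

# Step 1: interpolate the fine data onto the coarse grid.
fine_on_coarse = fine.interp_like(coarse)

# Step 2: mask the interpolated data wherever the coarse data is nan.
masked = fine_on_coarse.where(coarse.notnull())

print(masked.shape)                   # (3, 3)
print(np.isnan(masked.values[1, 1]))  # True: the mask propagates
```

Note that interp_like() interpolates (bilinear by default) rather than aggregating, so for a true 10 m → 30 m downsample it might be worth comparing against something like coarsen().mean() or rioxarray's reproject_match.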
import xarray as xr
import numpy as np
%store -r
Initial setup¶
Read in the data and subset both datasets to only the acquisitions common to the two (drops a handful of time steps).
da_pc
<xarray.DataArray 'stackstac-6805d1700af7afbc3f34ddbf3b38393f' (time: 100, band: 2, y: 1188, x: 869)> dask.array<fetch_raster_window, shape=(100, 2, 1188, 869), dtype=float64, chunksize=(1, 1, 1024, 869), chunktype=numpy.ndarray> Coordinates: (12/40) * time (time) datetime64[ns] 2021-06-02T1... id (time) <U66 'S1A_IW_GRDH_1SDV_2021... * band (band) <U2 'vh' 'vv' * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 sat:platform_international_designator <U9 '2014-016A' ... ... start_datetime (time) <U32 '2021-06-02 12:05:44.9... description (band) <U173 'Terrain-corrected ga... title (band) <U41 'VH: vertical transmit... raster:bands object {'nodata': -32768, 'data_ty... epsg int64 32645 granule_id (time) <U62 'S1A_IW_GRDH_1SDV_2021... Attributes: spec: RasterSpec(epsg=32645, bounds=(619420.0, 3089780.0, 628110.0... crs: epsg:32645 transform: | 10.00, 0.00, 619420.00|\n| 0.00,-10.00, 3101660.00|\n| 0.0... resolution: 10.0
vrt_full
<xarray.Dataset> Dimensions: (acq_date: 103, y: 396, x: 290) Coordinates: (12/19) * acq_date (acq_date) datetime64[ns] 2021-05-02 ..... granule_id (acq_date) <U67 'S1A_IW_SLC__1SDV_20210... * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 spatial_ref int64 0 sensor (acq_date) <U3 'S1A' 'S1A' ... 'S1A' 'S1A' ... ... masked (acq_date) <U1 'u' 'u' 'u' ... 'u' 'u' 'u' filtered (acq_date) <U1 'n' 'n' 'n' ... 'n' 'n' 'n' area (acq_date) <U1 'e' 'e' 'e' ... 'e' 'e' 'e' product_id (acq_date) <U4 '748F' '0D1E' ... 'BD36' acq_hour (acq_date) int64 0 12 12 12 ... 0 12 12 0 orbital_dir (acq_date) <U4 'asc' 'desc' ... 'asc' Data variables: vv (acq_date, y, x) float32 dask.array<chunksize=(1, 396, 290), meta=np.ndarray> vh (acq_date, y, x) float32 dask.array<chunksize=(1, 396, 290), meta=np.ndarray> ls (acq_date, y, x) float64 dask.array<chunksize=(1, 396, 290), meta=np.ndarray>
data_take_asf = [str(vrt_full.isel(acq_date=t).granule_id.values)[56:62] for t in range(len(vrt_full.acq_date))]
data_take_pc = [str(da_pc.isel(time=t).granule_id.values)[56:] for t in range(len(da_pc.time))]
vrt_full.coords['data_take_id'] = ('acq_date', data_take_asf, {'ID of data take from SAR acquisition': 'ID data take'})
da_pc.coords['data_take_id'] = ('time', data_take_pc)
pc_data_take_ls = list(da_pc.data_take_id.values)
asf_data_take_ls = list(vrt_full.data_take_id.values)
common_data_takes = list(set(pc_data_take_ls) & set(asf_data_take_ls))
len(common_data_takes)
84
subset_condition_asf = vrt_full.data_take_id.isin(common_data_takes)
subset_condition_pc = da_pc.data_take_id.isin(common_data_takes)
asf_subset = vrt_full.sel(acq_date=subset_condition_asf)
pc_subset = da_pc.sel(time=subset_condition_pc)
Downsample¶
pc_downsample = pc_subset.interp_like(asf_subset)
pc_downsample
<xarray.DataArray 'stackstac-6805d1700af7afbc3f34ddbf3b38393f' (time: 84, band: 2, y: 396, x: 290)> dask.array<transpose, shape=(84, 2, 396, 290), dtype=float64, chunksize=(1, 1, 396, 290), chunktype=numpy.ndarray> Coordinates: (12/41) * time (time) datetime64[ns] 2021-06-02T1... id (time) <U66 'S1A_IW_GRDH_1SDV_2021... * band (band) <U2 'vh' 'vv' sat:platform_international_designator <U9 '2014-016A' s1:datatake_id (time) <U6 '295165' ... '338944' s1:instrument_configuration_ID (time) <U1 '6' '6' '6' ... '7' '7' ... ... raster:bands object {'nodata': -32768, 'data_ty... epsg int64 32645 granule_id (time) <U62 'S1A_IW_GRDH_1SDV_2021... data_take_id (time) <U6 '0480FD' ... '052C00' * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 Attributes: spec: RasterSpec(epsg=32645, bounds=(619420.0, 3089780.0, 628110.0... crs: epsg:32645 transform: | 10.00, 0.00, 619420.00|\n| 0.00,-10.00, 3101660.00|\n| 0.0... resolution: 10.0
Mask¶
This is where I'm stuck...
The subsetted ASF object has 90 time steps whereas PC has 84 (they should be the same). The ASF dataset has a number of duplicate time steps; it looks like they mostly contain nodata (see the data comparison notebook). Not really sure why they're there or what causes this exactly. Dropping them for now...
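One way to sanity-check that the duplicates really are mostly nodata before dropping them (toy data here; the variable names just mirror this notebook's):

```python
import numpy as np
import xarray as xr

# Toy dataset with a duplicated acquisition date, mimicking the duplicate
# acq_date entries in the ASF stack; the duplicate slice is all nodata.
times = np.array(["2021-06-02", "2021-06-02", "2021-06-14"], dtype="datetime64[ns]")
ds = xr.Dataset(
    {"vv": (("acq_date", "y", "x"), np.random.rand(3, 2, 2))},
    coords={"acq_date": times},
)
ds["vv"][1] = np.nan

# Fraction of valid pixels per time step -- the duplicate stands out as 0.
valid_frac = ds["vv"].notnull().mean(dim=("y", "x"))
print(valid_frac.values)  # [1. 0. 1.]

# drop_duplicates keeps the first occurrence of each label by default.
deduped = ds.drop_duplicates("acq_date")
print(len(deduped.acq_date))  # 2
```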
asf_subset = asf_subset.drop_duplicates('acq_date')
asf_subset
<xarray.Dataset> Dimensions: (acq_date: 84, y: 396, x: 290) Coordinates: (12/20) * acq_date (acq_date) datetime64[ns] 2021-06-02 ..... granule_id (acq_date) <U67 'S1A_IW_SLC__1SDV_20210... * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 spatial_ref int64 0 sensor (acq_date) <U3 'S1A' 'S1A' ... 'S1A' 'S1A' ... ... filtered (acq_date) <U1 'n' 'n' 'n' ... 'n' 'n' 'n' area (acq_date) <U1 'e' 'e' 'e' ... 'e' 'e' 'e' product_id (acq_date) <U4 '1424' 'ABA0' ... 'FA4F' acq_hour (acq_date) int64 12 12 0 12 0 ... 0 0 0 12 orbital_dir (acq_date) <U4 'desc' 'desc' ... 'desc' data_take_id (acq_date) <U6 '0480FD' ... '052C00' Data variables: vv (acq_date, y, x) float32 dask.array<chunksize=(1, 396, 290), meta=np.ndarray> vh (acq_date, y, x) float32 dask.array<chunksize=(1, 396, 290), meta=np.ndarray> ls (acq_date, y, x) float64 dask.array<chunksize=(1, 396, 290), meta=np.ndarray>
pc_times = [
    np.datetime64(str(pc_subset.isel(time=t).time.values).replace(
        str(pc_subset.isel(time=t).time.values)[-19:], 'T00:00:00.000000000'))
    for t in range(len(pc_subset.time))
]
time_condition = asf_subset.acq_date.isin(pc_times)
asf_short = asf_subset.sel(acq_date=time_condition)  # this is the same as asf_subset.....
# asf_short
pc_subset.time.values[0]
pc_times[0]
numpy.datetime64('2021-06-02T00:00:00.000000000')
asf_subset.acq_date.values[0]
numpy.datetime64('2021-06-02T00:00:00.000000000')
Issue:¶
pc_mask = xr.where(asf_short.notnull(), pc_downsample, np.nan)
/home/emmamarshall/miniconda3/envs/sentinel/lib/python3.10/site-packages/dask/array/core.py:4796: PerformanceWarning: Increasing number of chunks by factor of 84
  result = blockwise(
(same warning repeated for each of the three data variables)
Making an ASF object with the same dim names to use in where()... I don't think this will fix my issue, though...
asf_scratch = asf_subset.rename_dims({'acq_date': 'time'}).rename({'acq_date': 'time'})
asf_scratch
<xarray.Dataset> Dimensions: (time: 84, y: 396, x: 290) Coordinates: (12/20) * time (time) datetime64[ns] 2021-06-02 ... 20... granule_id (time) <U67 'S1A_IW_SLC__1SDV_20210602T... * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 spatial_ref int64 0 sensor (time) <U3 'S1A' 'S1A' ... 'S1A' 'S1A' ... ... filtered (time) <U1 'n' 'n' 'n' 'n' ... 'n' 'n' 'n' area (time) <U1 'e' 'e' 'e' 'e' ... 'e' 'e' 'e' product_id (time) <U4 '1424' 'ABA0' ... '5BAD' 'FA4F' acq_hour (time) int64 12 12 0 12 0 ... 12 0 0 0 12 orbital_dir (time) <U4 'desc' 'desc' ... 'asc' 'desc' data_take_id (time) <U6 '0480FD' '048318' ... '052C00' Data variables: vv (time, y, x) float32 dask.array<chunksize=(1, 396, 290), meta=np.ndarray> vh (time, y, x) float32 dask.array<chunksize=(1, 396, 290), meta=np.ndarray> ls (time, y, x) float64 dask.array<chunksize=(1, 396, 290), meta=np.ndarray>
# note: DataArray.where() takes (cond, other); the trailing np.nan lands in the drop argument
pc_mask1 = pc_downsample.where(asf_scratch.notnull(), pc_downsample, np.nan)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [30], in <cell line: 1>()
----> 1 pc_mask = pc_downsample.where(asf_scratch.notnull(), pc_downsample, np.nan)
  [... frames in xarray where_method -> apply_ufunc -> deep_align -> align ...]
ValueError: cannot align objects with join='exact' where index/labels/sizes are not equal along these coordinates (dimensions): 'time' ('time',)
^the only difference between asf_scratch.time and pc_downsample.time is that one carries the full timestamp while the other is date-only (time-of-day zeroed out). Try dropping the time-of-day component next to see if that fixes the alignment (I think I tried that in the other notebook and it didn't....)
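A sketch of that idea: flooring the full timestamps down to the day should make the labels match a date-only axis exactly, which is what the exact-join alignment needs (synthetic values; .dt.floor is the standard xarray datetime accessor):

```python
import numpy as np
import xarray as xr

# A time axis with full timestamps, like pc_downsample.time.
da = xr.DataArray(
    np.arange(3.0),
    coords={"time": np.array(
        ["2021-06-02T12:05:57", "2021-06-14T00:05:46", "2021-06-26T12:05:58"],
        dtype="datetime64[ns]",
    )},
    dims="time",
)

# Floor each label to midnight so it is comparable to a date-only axis
# (e.g. asf_scratch.time).
da = da.assign_coords(time=da["time"].dt.floor("D"))
print(da["time"].values[0])  # 2021-06-02T00:00:00.000000000
```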
When I execute the above cell using xr.where(), pc_mask comes back carrying the time dimension from the PC object as well as acq_date from the ASF object (everything broadcast against everything). When I try xr.DataArray.where() instead, I get the alignment ValueError above. I think this is some sort of indexing issue, but I'm not sure yet how to fix it or whether I'm using the wrong approach altogether.
Trying with xr.where()¶
pc_mask = xr.where(asf_short.notnull(), pc_downsample, np.nan)
/home/emmamarshall/miniconda3/envs/sentinel/lib/python3.10/site-packages/dask/array/core.py:4796: PerformanceWarning: Increasing number of chunks by factor of 84
  result = blockwise(
(same warning repeated for each of the three data variables)
pc_mask
<xarray.Dataset> Dimensions: (acq_date: 84, y: 396, x: 290, time: 84, band: 2) Coordinates: (12/55) * acq_date (acq_date) datetime64[ns] 2021-06-... * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 spatial_ref int64 0 sensor (acq_date) <U3 'S1A' 'S1A' ... 'S1A' beam_mode (acq_date) <U2 'IW' 'IW' ... 'IW' ... ... sar:instrument_mode <U2 'IW' start_datetime (time) <U32 '2021-06-02 12:05:44.9... description (band) <U173 'Terrain-corrected ga... title (band) <U41 'VH: vertical transmit... raster:bands object {'nodata': -32768, 'data_ty... epsg int64 32645 Data variables: vv (acq_date, y, x, time, band) float64 dask.array<chunksize=(1, 396, 290, 1, 1), meta=np.ndarray> vh (acq_date, y, x, time, band) float64 dask.array<chunksize=(1, 396, 290, 1, 1), meta=np.ndarray> ls (acq_date, y, x, time, band) float64 dask.array<chunksize=(1, 396, 290, 1, 1), meta=np.ndarray>
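A minimal reproduction of why the result blows up like this: xr.where() broadcasts its arguments, and because the condition carries acq_date while the data carries time, both dimensions survive in the output:

```python
import numpy as np
import xarray as xr

# cond and data have different time dimension names, as in
# xr.where(asf_short.notnull(), pc_downsample, np.nan).
cond = xr.DataArray(np.array([True, False]), dims="acq_date")
data = xr.DataArray(np.array([1.0, 2.0]), dims="time")

out = xr.where(cond, data, np.nan)
print(out.dims)   # ('acq_date', 'time') -- both kept, fully broadcast
print(out.shape)  # (2, 2)
```

So before any where()-style masking, the two objects need a single shared time dimension with identical labels; renaming alone isn't enough if the timestamps still differ.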
Tried building the object manually:
pc_new = xr.DataArray(
    coords={'x': pc_mask.x.values,
            'y': pc_mask.y.values,
            'time': pc_mask.time.values},
    dims=('x', 'y', 'time'),
)
pc_new
<xarray.DataArray (x: 290, y: 396, time: 84)> array([[[nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], ..., [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan]], [[nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], ..., [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan]], [[nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], ..., ... ..., [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan]], [[nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], ..., [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan]], [[nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], ..., [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan], [nan, nan, nan, ..., nan, nan, nan]]]) Coordinates: * x (x) float64 6.194e+05 6.195e+05 6.195e+05 ... 6.281e+05 6.281e+05 * y (y) float64 3.102e+06 3.102e+06 3.102e+06 ... 3.09e+06 3.09e+06 * time (time) datetime64[ns] 2021-06-02T12:05:57.441074 ... 2022-05-21T...
pc_downsample.sel(band='vv')
<xarray.DataArray 'stackstac-6805d1700af7afbc3f34ddbf3b38393f' (time: 84, y: 396, x: 290)> dask.array<getitem, shape=(84, 396, 290), dtype=float64, chunksize=(1, 396, 290), chunktype=numpy.ndarray> Coordinates: (12/41) * time (time) datetime64[ns] 2021-06-02T1... id (time) <U66 'S1A_IW_GRDH_1SDV_2021... band <U2 'vv' sat:platform_international_designator <U9 '2014-016A' s1:datatake_id (time) <U6 '295165' ... '338944' s1:instrument_configuration_ID (time) <U1 '6' '6' '6' ... '7' '7' ... ... raster:bands object {'nodata': -32768, 'data_ty... epsg int64 32645 granule_id (time) <U62 'S1A_IW_GRDH_1SDV_2021... data_take_id (time) <U6 '0480FD' ... '052C00' * x (x) float64 6.194e+05 ... 6.281e+05 * y (y) float64 3.102e+06 ... 3.09e+06 Attributes: spec: RasterSpec(epsg=32645, bounds=(619420.0, 3089780.0, 628110.0... crs: epsg:32645 transform: | 10.00, 0.00, 619420.00|\n| 0.00,-10.00, 3101660.00|\n| 0.0... resolution: 10.0
pc_new['vv'] = pc_downsample.sel(band='vv').data
---------------------------------------------------------------------------
MissingDimensionsError                    Traceback (most recent call last)
Input In [37], in <cell line: 1>()
----> 1 pc_new['vv'] = pc_downsample.sel(band='vv').data
  [... frames in DataArray.__setitem__ -> coords update -> merge ...]
MissingDimensionsError: cannot set variable 'vv' with 3-dimensional data without explicit dimension names. Pass a tuple of (dims, data) instead.
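The error message points at the fix: pass a (dims, data) tuple when assigning multi-dimensional data. A toy sketch (shapes and names illustrative), using a Dataset as the container, since item assignment on a DataArray goes through the coordinate path that raised above:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(coords={"time": np.arange(3), "y": np.arange(4), "x": np.arange(5)})

# Assigning bare 3-D data raises MissingDimensionsError; a (dims, data)
# tuple tells xarray how the axes line up with the coordinates.
ds["vv"] = (("time", "y", "x"), np.random.rand(3, 4, 5))
print(ds["vv"].dims)  # ('time', 'y', 'x')
```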
vv = pc_mask.vv
clip_vv = vv[0, :, :, :, :]
pc_mask = pc_mask.reset_index('acq_date', drop=True)
clip_vv.sel(band='vv').data
pc_new_vv = xr.DataArray(
    clip_vv.sel(band='vv').data,
    coords={'x': pc_downsample.x.values,
            'y': pc_downsample.y.values,
            'time': pc_downsample.time.values},
    dims=('x', 'y', 'time'),
)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [44], in <cell line: 1>()
----> 1 pc_new_vv = xr.DataArray(
      2     clip_vv.sel(band='vv').data,
  [... frames in DataArray.__init__ -> _infer_coords_and_dims ...]
ValueError: conflicting sizes for dimension 'x': length 396 on the data but length 290 on coordinate 'x'
^^ the x and y dims are flipped? (the data is ('y', 'x', 'time') but the dims tuple says ('x', 'y', 'time'))
pc_new_vv = xr.DataArray(
    clip_vv.sel(band='vv').data.transpose(['x', 'y']),
    coords={'x': pc_downsample.x.values,
            'y': pc_downsample.y.values,
            'time': pc_downsample.time.values},
    dims=('x', 'y', 'time'),
)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [47], in <cell line: 1>()
----> 2     clip_vv.sel(band='vv').data.transpose(['x','y']),
  [... frames in dask.array transpose ...]
ValueError: axes don't match array
clip_vv.sel(band='vv').transpose('x', 'y')
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [55], in <cell line: 1>()
----> 1 clip_vv.sel(band='vv').transpose('x','y')
  [... frames in DataArray.transpose -> infix_dims ...]
ValueError: ('x', 'y') must be a permuted list of ('y', 'x', 'time'), unless `...` is included
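As the error says, transpose() wants a full permutation of the existing dims (or `...` to stand in for the rest). A toy array with the same dim order as clip_vv:

```python
import numpy as np
import xarray as xr

# Same dim order as clip_vv.sel(band='vv'): ('y', 'x', 'time').
arr = xr.DataArray(np.random.rand(4, 3, 2), dims=("y", "x", "time"))

# Either spell out every dim, or use `...` to mean "everything else".
print(arr.transpose("x", "y", "time").dims)  # ('x', 'y', 'time')
print(arr.transpose("x", ...).dims)          # ('x', 'y', 'time')
```

That said, the transpose may not be needed at all here: the conflicting-sizes ValueError came from listing dims=('x', 'y', 'time') while the data is ('y', 'x', 'time'); swapping the dims tuple to match the data is the simpler fix.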