I have a Xenium image, which is the concatenation of 4 different samples. To extract each sample, I use `bounding_box_query()`.

For instance, I get sample number 3 like this:

```
from spatialdata import bounding_box_query

crop_sample_3 = lambda x: bounding_box_query(
x,
min_coordinate=[15000, 70000],
max_coordinate=[17000, 70000 + 17000],
axes=("x", "y"),
target_coordinate_system="global",
)
sdata_sample3 = crop_sample_3(sdata)
```
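(For context, the box above keeps everything whose x is in [15000, 17000] and y in [70000, 87000]. A minimal numpy sketch of that selection, just to illustrate the arithmetic, not how spatialdata actually implements it, with made-up centroids:)

```
import numpy as np

# hypothetical cell centroids in the global coordinate system
points = np.array([
    [16000.0, 75000.0],   # inside the sample-3 box
    [ 5000.0, 20000.0],   # belongs to some other sample
    [16500.0, 90000.0],   # x is in range, but y exceeds 70000 + 17000
])

min_c = np.array([15000.0, 70000.0])
max_c = np.array([17000.0, 70000.0 + 17000.0])

# keep points that fall inside the box on both axes
inside = np.all((points >= min_c) & (points <= max_c), axis=1)
print(points[inside])  # only the first point survives
```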

Then, I need to reset the coordinates: I want the point in the upper-left corner of the image to have coordinates (0, 0) instead of (15000, 70000).
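(Conceptually, the mapping I want just subtracts the crop's `min_coordinate` from every global coordinate. A quick numpy illustration of the intended result, independent of spatialdata:)

```
import numpy as np

crop_origin = np.array([15000.0, 70000.0])  # min_coordinate of the crop

def reset(xy):
    """Map a global (x, y) to the cropped sample's local frame."""
    return np.asarray(xy, dtype=float) - crop_origin

print(reset([15000, 70000]))  # upper-left corner -> [0. 0.]
print(reset([16000, 75000]))  # an interior point -> [1000. 5000.]
```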

To do so, I create the following translation:

```
from spatialdata.transformations import Translation

translation = Translation([0, 0], axes=("x", "y"))
```

Then I apply the transformation to my spatialdata object:

```
from spatialdata.transformations import set_transformation

set_transformation(sdata_sample3.images["morphology_mip"], translation, to_coordinate_system="global")
set_transformation(sdata_sample3.labels["cell_labels"], translation, to_coordinate_system="global")
set_transformation(sdata_sample3.shapes["cell_boundaries"], translation, to_coordinate_system="global")
```

The transformation works fine for images and labels, but it does something really weird to shapes: they are translated to different coordinates and seem to be 'downscaled' (they appear smaller when I plot them).

In the image, I applied the same translation to the image, the labels, and one shape. You can see that the shape got translated to the bottom-right corner. I tried many scaling/translation coordinates, and the result is always weird.
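My working guess (unconfirmed, and it does not account for the apparent downscaling) is that after the crop the raster elements are re-indexed from pixel (0, 0), while the shapes keep their global vertex coordinates, so the same transformation acts on very different intrinsic coordinates. Sketched with plain numpy, the stored-coordinate assumption being mine:

```
import numpy as np

# Assumed intrinsic coordinates after the crop:
image_corner = np.array([0.0, 0.0])          # raster data re-indexed from 0
shape_vertex = np.array([15000.0, 70000.0])  # polygon keeps its global vertex

translation = np.array([0.0, 0.0])           # the Translation([0, 0]) above

# The same transformation lands the two elements in different places:
print(image_corner + translation)  # [0. 0.] -> image sits at the origin
print(shape_vertex + translation)  # [15000. 70000.] -> shape ends up far away
```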