Control sampling of rasterized image to avoid blending of pixels while zooming

Hi,
I am trying to rasterize an image: in a toy dataset I want to keep a crisp representation of three vertical lines while zooming. So far I cannot stop the lines from blending together in the rasterized version. Sharing the code below.

import numpy as np
import holoviews as hv
from holoviews import opts
from holoviews.operation.datashader import rasterize
hv.extension('bokeh')

# Initialize array
arr = np.zeros((50,50))

# Draw lines in array
arr[10:40,23] = 1.
arr[10:40,25] = 1.
arr[10:40,27] = 1.

# Make image from array
img = hv.Image(arr, bounds=(0,0,50,50)).opts(cmap='gray')

# Make rasterized version of image with sampling set to pixel size
rast = rasterize(img, x_sampling=1).opts(cmap='gray')

# Display the image and the rasterized image in a layout
img + rast

Here is an animation demonstrating the issue while zooming.


I was thinking it might work if I could make the zoom snap to some x-axis increment. Does anyone know how to control the zoom in such a way? Or does anyone have other ideas on how to do this?

Could a possible way forward be to manipulate the ranges returned by the zoom and pan tools, or should it be tackled at a deeper level? I think the key is that the viewport always needs to start and end at, in this case, whole numbers. Alternatively, could this be controlled through datashader itself?
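To illustrate the range-snapping idea, here is a rough sketch continuing from the code above, using hv.streams.RangeXY and a DynamicMap. The flooring/ceiling of the viewport to whole pixels is just my assumption of what "snapping" would mean here, not something rasterize does out of the box:

def snapped_rasterize(x_range, y_range):
    # Before any zoom/pan event the stream supplies None; fall back to the full extent
    if x_range is None or y_range is None:
        x_range, y_range = (0, 50), (0, 50)
    # Snap the viewport outwards to whole-pixel boundaries so the
    # aggregation bins line up with the original pixel grid
    x0, x1 = np.floor(x_range[0]), np.ceil(x_range[1])
    y0, y1 = np.floor(y_range[0]), np.ceil(y_range[1])
    return rasterize(img, dynamic=False,
                     x_range=(x0, x1), y_range=(y0, y1),
                     x_sampling=1, y_sampling=1).opts(cmap='gray')

snapped = hv.DynamicMap(snapped_rasterize, streams=[hv.streams.RangeXY()])
snapped

But I am not sure whether intercepting the ranges like this is the intended approach, or whether it would fight with the tools' own range updates.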

Now, I started to wonder whether it would be possible to have rasterize work in only one dimension, i.e. the y-dimension in the example above. If so, would the problem of bleeding/smoothing the crisp representation disappear?
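In case it helps to show what I mean, here is a rough sketch of aggregating only along y with plain numpy (block-averaging rows while leaving every column untouched); the factor-based reshaping is just my own ad-hoc downsampling, not an existing rasterize option:

def downsample_rows(a, factor):
    # Aggregate only along the y-axis (rows) by block-averaging,
    # leaving the columns untouched so one-pixel-wide vertical
    # lines are never blended with their neighbours
    n = (a.shape[0] // factor) * factor
    return a[:n].reshape(-1, factor, a.shape[1]).mean(axis=1)

coarse = hv.Image(downsample_rows(arr, 2), bounds=(0, 0, 50, 50)).opts(cmap='gray')
img + coarse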