Datashader and GPU

Hi, assuming CuPy and cuDF are correctly installed on a machine with access to an NVIDIA GPU, will Datashader (and/or GeoViews) pick up the GPU automatically and use it for the data reduction and other computations, or do I need to explicitly set it to use it?

My plan is to read a dataset (NetCDF gridded data) with xarray and render it with Datashader/GeoViews. Should I convert the xarray Dataset to a cuDF DataFrame and pass that to Datashader/GeoViews, or can I use xarray directly and still get GPU acceleration for the computation?

It is automatic in the sense that when the data is already on the GPU, Datashader will call the GPU pipeline internally, i.e. if you feed in a cuDF DataFrame it will automatically use GPU acceleration. There is some subtlety, though, since not all "glyph" types are supported; you can see which combinations are supported by looking at the table at the end of this page.