Thank you for your response.
What kind of IDE do you suggest I work with to avoid this?
I don't want to use Jupyter, since IDEs like PyCharm offer tracing, …
Is this a common issue among users?
Thanks.
Both strategies worked for me.
My data is not dynamic, but I work with large datasets containing millions of points. Rendering them takes a long time, and so does zooming in.
Which solution do you recommend?
As mentioned by Philipp, Datashader is the way to go.
When using hvplot you can simply set datashade=True to visualize millions of datapoints.
Here’s a code example for a large data sample:
# import libraries
import numpy as np
import pandas as pd
import hvplot.pandas  # registers the .hvplot accessor on pandas objects
import holoviews as hv
hv.extension('bokeh')

# create a large data sample: 500,000 random (x, y) points
data = np.random.normal(size=(500000, 2))
df = pd.DataFrame(data=data, columns=['col1', 'col2'])

# datashade=True rasterizes the points with Datashader, so only a
# fixed-size image is sent to the browser regardless of point count
plot = df.hvplot.scatter(x='col1', y='col2', datashade=True)
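
One note on the zooming concern: with datashade=True the result is a HoloViews DynamicMap that is re-rendered at the current resolution whenever you pan or zoom, so it needs a live Python process behind it (a notebook kernel or a Bokeh server). If you run the snippet above as a plain script, a minimal sketch for viewing the plot, assuming a recent hvplot version, is:

# open the plot in a browser window backed by a Bokeh server;
# in a Jupyter notebook, ending a cell with `plot` displays it instead
hvplot.show(plot)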