IOPub data rate exceeded #1181
For the Showcase notebook, I was able to get the output to appear in place of those error messages if I launched the server with:
Not sure what units that value is in; 10 million somethings per something...
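The exact launch command was not preserved in this transcript. Based on the option named in the docs quoted just below and the "10 million" value mentioned above, it was presumably something along these lines (a reconstruction, not the author's verbatim command):

```shell
# Hypothetical reconstruction of the workaround: raise the iopub data
# rate limit to 10,000,000 (bytes/sec) when launching the server.
jupyter notebook --NotebookApp.iopub_data_rate_limit=10000000

# Equivalently, in jupyter_notebook_config.py:
#   c.NotebookApp.iopub_data_rate_limit = 10000000
```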
Here's what their docs say:

NotebookApp.iopub_data_rate_limit : Float
Default: 1000000
(bytes/sec) Maximum rate at which messages can be sent on iopub before they are limited.

Surely there must be a way to override this for certain output; this was just meant to prevent a lot of text being dumped into your browser at once, which is a real problem. I'll do some digging.
As far as I can tell there is no way to override this except setting a higher rate in your Jupyter config, which would be a real pain.
Should we file an issue with Jupyter?
Looks like this will be fixed in Jupyter 5.1, which means we'll have to pin 4.2.2 in our requirements or provide guidance to increase the limit.
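For reference, the pin being discussed would just be a version constraint on the notebook package, along these lines (a sketch; the actual requirements file used by the project is an assumption):

```shell
# Keep the notebook server below 5.0 to avoid the data-rate limit,
# either on the command line or as "notebook==4.2.2" in requirements.txt.
pip install "notebook==4.2.2"
```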
I can confirm that using 5.0 a bunch of examples in
Is there a dev release we can rely on? Otherwise, pin 4.2.2?
The actual fix has not even been merged yet (see jupyter/notebook#2368), so there's no dev release we can use. I suppose pinning 4.2.2 is our only option, but that's bound to annoy people if we downgrade their notebook on install.
Yep, it's very annoying. I would argue that the Jupyter 5.0 release is broken for visualization purposes with its default data-rate values, so I don't see any other option. Basically, anything users do with any viz program is likely to run into problems with 5.0, so if they are doing viz in a notebook, downgrading to 4.2.2 is actually doing them a favor.
I agree! We just need to make it clear in the release somehow that we're not the ones causing problems, and that Jupyter 5.0 is what is causing general problems for a lot of notebook users.
This may deserve a section in the release notes, not only to explain why we are pinning but also what people who've upgraded to 5.0 anyway should do if they see those warnings.
Right. |
Note that the PR has now been merged to Jupyter master, so whatever dev release comes next should be OK again.
I don't think this is really true; you'll be hard pressed to reach the limit with Bokeh plots, as Bokeh will likely fall over before it bites, and in matplotlib you have to push at least 2000x2000 pixels (I suspect a lot more) to trigger it. I'm still fairly annoyed with it, because it doesn't make sense to limit the data throughput, but I actually think that outside of HoloMaps this will be a fairly rare problem.
I ran into the issue with vastly smaller plots in mpl, I just wasn't able to reproduce it reliably. It's not the total size that matters, it's the data rate... |
That is worrying - I was about to suggest we update the tutorials so they don't trigger this warning and leave notebook unpinned (and warn the user in the release notes). Sounds like that might not be possible. |
It is related to total size though since it averages the throughput over some small period of time and anything that completes before that time is up won't be affected. As far as I can tell the effective limit is about 2MB (after 100 attempts I haven't been able to reproduce it at that size). |
A 2MB plot is not all that much... |
Agreed, for a completely random image exported to png that's only about 1200x1200. Fortunately most charts aren't random and I was able to get 3000x3000 pixel pngs of a Curve to export reliably. |
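As a back-of-the-envelope check on those numbers, here is a small sketch (my own, not from the thread) estimating the notebook payload for image output. It assumes incompressible (random) pixel data, so PNG compression gains essentially nothing, and includes the roughly 4/3 base64 overhead on displayed images; real charts compress far better, which is why non-random 3000x3000 plots can still get through:

```python
# Rough payload estimate for image output under Jupyter's rate limit.
# Assumes random (incompressible) pixels; base64 adds ~4/3 overhead.

RATE_LIMIT = 1_000_000  # Jupyter 5.0 default iopub_data_rate_limit (bytes/sec)

def payload_bytes(width, height, channels=3):
    """Approximate base64-encoded size of raw width x height pixel data."""
    raw = width * height * channels
    return raw * 4 // 3

for n in (500, 1200, 3000):
    size = payload_bytes(n, n)
    print(f"{n}x{n}: ~{size / 1e6:.1f} MB payload vs {RATE_LIMIT / 1e6:.1f} MB/s limit")
```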
Hello, I'm sorry, my question might be a little dumb, but I wonder how to resolve this for JupyterHub? I've tried passing the corresponding argument directly to jupyterhub, and tried setting it in the config file, but without any success.
@BoomBidiBuyBuy That's probably a question for the JupyterHub repo. |
There really isn't much more we can do here except hope that Jupyter 5.1 is released soon. Going to close. |
As a workaround, something that worked for me was putting a time.sleep(0.3) inside the loop; of course other time intervals may work as well.
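The sleep workaround above can be sketched like this (a minimal illustration, assuming the output is produced in a Python loop; emit_slowly is a hypothetical helper, not part of any library):

```python
import time

def emit_slowly(chunks, delay=0.3):
    """Print chunks one at a time, pausing between them so the
    average iopub data rate stays under the server's limit."""
    for chunk in chunks:
        print(chunk)
        time.sleep(delay)

# Usage: emit_slowly(str(row) for row in results)
```

This trades total cell runtime for a lower average throughput, which is why it sidesteps the rate limiter.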
With the 5.0.0b1 version of the Jupyter notebook server, I'm getting error messages instead of some of my HoloViews plots:
If this is a new thing, should HoloViews be increasing the limit to something more in line with the amount of data that HoloViews notebooks typically generate?