!jupyter notebook --NotebookApp.iopub_data_rate_limit=1e10 not working in google colab


I am running a data analysis in Google Colab, and at some point I received the following message:

IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
To change this limit, set the config variable
`--NotebookApp.iopub_data_rate_limit`.

Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)

Then I searched and found that I should run !jupyter notebook --NotebookApp.iopub_data_rate_limit=1e10 to increase the data rate limit. But every time I run this command, the output from Google Colab is:

|DEBUG|Paths used for configuration of jupyter_notebook_config: 
        /etc/jupyter/jupyter_notebook_config.json
|DEBUG|Paths used for configuration of jupyter_notebook_config: 
        /usr/local/etc/jupyter/jupyter_notebook_config.d/panel-client-jupyter.json
        /usr/local/etc/jupyter/jupyter_notebook_config.json
|DEBUG|Paths used for configuration of jupyter_notebook_config: 
        /usr/etc/jupyter/jupyter_notebook_config.json
|DEBUG|Paths used for configuration of jupyter_notebook_config: 
        /root/.local/etc/jupyter/jupyter_notebook_config.json
|DEBUG|Paths used for configuration of jupyter_notebook_config: 
        /root/.jupyter/jupyter_notebook_config.json

  _   _          _      _
 | | | |_ __  __| |__ _| |_ ___
 | |_| | '_ \/ _` / _` |  _/ -_)
  \___/| .__/\__,_\__,_|\__\___|
       |_|
                       
Read the migration plan to Notebook 7 to learn about the new features and the actions to take if you are using extensions.

https://jupyter-notebook.readthedocs.io/en/latest/migrate_to_notebook7.html

Please note that updating to Notebook 7 might break some of your extensions.

|INFO|google.colab serverextension initialized.
|INFO|Serving notebooks from local directory: /content/drive/MyDrive/coxiela_brunetti
|INFO|Jupyter Notebook 6.5.5 is running at:
|INFO|http://localhost:8888/?token=d4c79a17cf3cd5ab0c7df31986fa2c3497e76c637b313a9c
|INFO| or http://127.0.0.1:8888/?token=d4c79a17cf3cd5ab0c7df31986fa2c3497e76c637b313a9c
|INFO|Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
|CRITICAL|
    
    To access the notebook, open this file in a browser:
        file:///root/.local/share/jupyter/runtime/nbserver-4903-open.html
    Or copy and paste one of these URLs:
        http://localhost:8888/?token=d4c79a17cf3cd5ab0c7df31986fa2c3497e76c637b313a9c
     or http://127.0.0.1:8888/?token=d4c79a17cf3cd5ab0c7df31986fa2c3497e76c637b313a9c

The command then gets stuck like this until I press Ctrl+C. I also tried running the command in the console, but the result is the same.

I am very new to Google Colab and have no idea why this is not working.


1 Answer

gogasca (Best Answer):

The iopub_data_rate_limit parameter limits the rate at which data is sent from the kernel to the browser.

If the processing is mostly being done in the kernel, you shouldn't need to transfer that much data to the browser. You could try removing print statements.
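
For example, instead of printing an entire large object, you can print only a small summary and write the full data to a file. A minimal sketch (the DataFrame here is a hypothetical stand-in for whatever is being printed):

import pandas as pd

df = pd.DataFrame({"value": range(1_000_000)})  # stand-in for your data

print(df.shape)   # dimensions only
print(df.head())  # first few rows instead of the whole frame
df.to_csv("results.csv", index=False)  # keep the bulk of the output out of the notebook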

You can also try the following from a notebook cell:

# Append a higher IOPub rate limit to the server's config file, then kill the
# running notebook server so the new setting applies when it restarts.
!echo 'c.NotebookApp.iopub_data_rate_limit = 20_000_000' >>/etc/jupyter/jupyter_notebook_config.py
!kill $(pidof -x jupyter-notebook)
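
The first line appends the higher limit to the server's configuration file; the second kills the running jupyter-notebook process so that the new limit takes effect once the server is restarted. Note that this interrupts the current session, so you may need to re-run your cells afterwards.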