Scrapinghub will support eggs in the Scrapinghub Dashboard through the end of 2016. After that, project dependencies outside of the Scrapy Cloud stack must be managed via a requirements file and your project's configuration file (scrapinghub.yml).


Customers who previously added dependencies via the Scrapinghub Dashboard can migrate their eggs to the new format using the "shub migrate-eggs" command:
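A typical invocation looks like the following (assuming a recent version of shub is installed and you run it from your project's root directory; shub may prompt for confirmation before modifying any files):

```shell
# Run from the directory containing scrapinghub.yml
shub migrate-eggs
```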


This command will:

  1. Add any of your eggs that are available on PyPI to your requirements.txt file, creating the file if it does not exist.
  2. Update the project's directory and scrapinghub.yml to point to the new requirements file.
  3. Copy eggs that are not publicly available into your project's eggs directory.
  4. Update the project's scrapinghub.yml again to reference those local eggs.

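As an illustration, a migrated scrapinghub.yml might end up looking something like this (the project ID and egg filename are placeholders, and the exact keys depend on your shub version):

```yaml
# scrapinghub.yml after migration (illustrative sketch)
projects:
  default: 12345                       # placeholder project ID
requirements_file: requirements.txt    # PyPI dependencies moved here
eggs:
  - eggs/private-lib.egg               # hypothetical non-public egg kept locally
```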

After running the shub migrate-eggs command, it is important to deploy and test your project. The eggs specified in your project's requirements and configuration files take precedence over those within the Scrapinghub Dashboard.
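Deploying and verifying can also be done with shub (the spider name below is a placeholder for one of your own spiders):

```shell
shub deploy             # deploy the migrated project to Scrapy Cloud
shub schedule myspider  # run a spider to confirm the dependencies resolve
```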


Once your project has been tested, the eggs within the Scrapinghub Dashboard can be removed via the “Code and Deploys” page (this will also remove the alert that appears in the Dashboard):