There are some known issues when deploying custom Docker images to Scrapy Cloud.
We are actively working on resolving them, but until a complete fix is in place, here are some workarounds.
1. Re-authenticate with docker login.
In the recent shub v2.10.0 release we added a "--reauth" flag to re-authenticate when pushing an image to a registry.
If the --reauth flag does not help, try logging in again manually with "docker login images.scrapinghub.com", using your Scrapinghub API key as the username and a single space as the password.
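A minimal sketch of the two approaches; the SH_APIKEY environment variable is a hypothetical placeholder for your Scrapinghub API key, and the exact subcommand that accepts --reauth may differ in your workflow (check "shub --help"):

```shell
# Option 1: pass --reauth so shub re-authenticates when pushing the image
# (flag added in shub v2.10.0)
shub deploy --reauth

# Option 2: log in to the Scrapy Cloud registry manually.
# Username is your Scrapinghub API key; the password is a single space,
# supplied via stdin here to avoid the interactive prompt.
printf ' ' | docker login images.scrapinghub.com --username "$SH_APIKEY" --password-stdin
```

Both commands talk to the remote registry, so they require valid credentials and network access.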
2. "SignatureDoesNotMatch" when deploying from Dockerhub.
The suggested workaround is to use the Scrapy Cloud Docker registry instead of Dockerhub: it is free for Scrapinghub customers, secure, easier to use, and faster.
To switch, add "image: true" to your scrapinghub.yml and then run "shub deploy" without a username/password.
The command builds or updates a local image, pushes it to the Scrapy Cloud registry, and then deploys it to Scrapy Cloud.
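A minimal scrapinghub.yml sketch for this setup; the project ID 12345 is a placeholder:

```yaml
# scrapinghub.yml
project: 12345   # placeholder: your Scrapy Cloud project ID
image: true      # build the image and push it to the Scrapy Cloud registry
```

With this in place, a plain "shub deploy" performs the build, push, and deploy in one step.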
For any further assistance, please contact the Support team via Help > Contact Support.