
Shub Deploy Fails

Deploying to Scrapy Cloud project "242717"

Deploy log last 30 lines:

    sys.exit(list_spiders())
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 170, in list_spiders
    _run_usercode(None, ['scrapy', 'list'], _get_apisettings)
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 127, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 87, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 95, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 142, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 209, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 115, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 296, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py", line 30, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py", line 21, in __init__
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/app/__main__.egg/pdfbot/spiders/scraper_spider_all.py", line 9, in <module>
ImportError: No module named PyPDF2
{"message": "list-spiders exit code: 1", "details": null, "error": "list_spiders_error"}

{"status": "error", "message": "Internal error"}

Deploy log location: c:\users\manoj.k\appdata\local\temp\shub_deploy_etd28r.log

Error: Deploy failed: {"status": "error", "message": "Internal error"}

I am getting the above error when deploying my working PDF link extraction script with shub.

Kindly help me out: how can I deploy the PDF extraction spider to scrapinghub.com?


Regards

Manoj


Best Answer

Python dependencies need to be added to the requirements.txt before deploying using shub. Please see: https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud
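For reference, here is a minimal sketch of the two files involved, assuming the spider only needs PyPDF2 and the project uses the standard shub layout (the pinned version and the exact keys shown are illustrative and should match your own setup and the article linked above):

    # requirements.txt (in the project root, next to scrapy.cfg)
    PyPDF2==1.26.0

    # scrapinghub.yml
    project: 242717
    requirements:
      file: requirements.txt

After adding these, running shub deploy 242717 again should build the dependency into the image, and the "No module named PyPDF2" import error should go away.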



Hi Nestor,

Thank you, I got it working with your help.
