
How do I deploy my spider to Scrapinghub?

Here is my first spider, created on my local PC.

scrapy startproject project
cd project
scrapy genspider quotes quotes.toscrape.com
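
For reference, this is roughly the layout scrapy startproject creates (the exact file list varies a little between Scrapy versions); the scrapy.cfg at the top level is what shub later uses to find the project:

project/
    scrapy.cfg            # deploy configuration file; this is what shub looks for
    project/
        __init__.py
        items.py
        pipelines.py
        settings.py
        spiders/
            __init__.py
            quotes.py     # created by scrapy genspider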
vim project/items.py

import scrapy
class ProjectItem(scrapy.Item):
    quote = scrapy.Field()
    author = scrapy.Field()

vim project/spiders/quotes.py

# -*- coding: utf-8 -*-
import scrapy
from project.items import ProjectItem
class QuotesSpider(scrapy.Spider):
    name = 'quotes'
    allowed_domains = ['quotes.toscrape.com']
    start_urls = ['http://quotes.toscrape.com/']
    def parse(self, response):
        for quote in response.css('div.quote'):
            # build a fresh item for every quote block on the page
            item = ProjectItem()
            item['quote'] = quote.css('span.text::text').extract_first()
            item['author'] = quote.xpath('span/small/text()').extract_first()
            yield item
        # follow the pagination link, if there is one
        next_page = response.css('li.next a::attr(href)').extract_first()
        if next_page is not None:
            yield response.follow(next_page, self.parse)
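
The selectors can also be sanity-checked interactively before running a full crawl; the session below assumes the quotes.toscrape.com start URL used above:

scrapy shell 'http://quotes.toscrape.com'
>>> response.css('div.quote span.text::text').extract_first()
>>> response.css('li.next a::attr(href)').extract_first()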

I can crawl many quotes with the command

scrapy crawl quotes -o /tmp/quotes.json -t json
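
For reference, the exported file is a JSON array with one object per scraped item; the values below are placeholders, not real output:

[
    {"quote": "...", "author": "..."},
    {"quote": "...", "author": "..."}
]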

Now I want to deploy it to Scrapinghub.

On my local PC:

sudo pip install shub --upgrade

shub deploy
Error: Cannot find project: There is no scrapinghub.yml, scrapy.cfg, or Dockerfile in this directory or any of the parent directories.
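
For reference, the scrapy.cfg that scrapy startproject generates at the project root looks roughly like this (the stock template):

[settings]
default = project.settings

[deploy]
#url = http://localhost:6800/
project = project

So the file shub wants does exist inside the project folder.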

How do I deploy my spider to Scrapinghub?

Best Answer

Are you using shub deploy from the project folder?
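
In other words, run shub from the directory that contains scrapy.cfg. A minimal sketch, assuming the project created above lives at ~/project and 12345 stands in for your own Scrapy Cloud project ID:

cd ~/project          # the folder that holds scrapy.cfg
shub login            # prompts once for your Scrapinghub API key
shub deploy 12345     # shub then offers to record the ID in scrapinghub.yml

After that, later deploys from the same folder should only need shub deploy with no arguments.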


Using shub deploy from the project folder solved it.
