Answered

Queue

Can I somehow queue scraping requests for the same spider? Currently, if I try to call a spider multiple times using the Scrapinghub API, it rejects the later requests.

Best Answer

Hi Indospirit1,


Have you tried Scheduling jobs?


Best,


Pablo


Hi vaz,


Can you briefly explain "perhaps you can clone spiders and run simultaneously"? Does that mean copying the spider into multiple files, or creating different projects with the same spider?

Hi Indospirit,


If you have N containers, you can run N different spiders; perhaps you can clone your spider and run the copies simultaneously.


We have provided extensive documentation of our API here: https://doc.scrapinghub.com/scrapy-cloud.html#


If you still find it difficult to follow, please consider hiring our experts through https://scrapinghub.com/quote; it can save you a lot of time and resources.


Best regards,


Pablo

Hi,


That does not solve my problem. I want to invoke a spider via the Scrapinghub API, but it currently rejects a job for a spider that is already running. Is there a way to queue the same spider job using the API?


Does it help if I buy more containers?


If I have N containers, can I schedule/run the same spider N times, simultaneously or in a queue?
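One workaround, until the platform queues duplicates for you, is to queue on the client side and retry when a submission is rejected. A minimal sketch: `submit_job` and `DuplicateJobError` below are hypothetical stand-ins for whatever call you make against the API and the error it raises on a duplicate job, not a real Scrapinghub client API.

```python
import time

class DuplicateJobError(Exception):
    """Hypothetical: raised when the API rejects a duplicate job."""

def queue_runs(spider_name, n_runs, submit_job, wait=1.0, max_wait=60.0):
    """Submit the same spider n_runs times, one after another,
    retrying with a delay whenever the platform rejects a duplicate.
    `submit_job(spider_name)` should return a job id, or raise
    DuplicateJobError while an earlier run is still in progress."""
    job_ids = []
    pending = n_runs
    waited = 0.0
    while pending:
        try:
            job_ids.append(submit_job(spider_name))
            pending -= 1
            waited = 0.0  # reset the backoff clock after a success
        except DuplicateJobError:
            if waited >= max_wait:
                raise  # give up: the running job never finished
            time.sleep(wait)
            waited += wait
    return job_ids
```

With this loop, the N runs end up serialized: each submission waits until the previous job has finished and the API stops rejecting it.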

