Answered
Andrew Mleczko (Support Engineer) 3 years ago in Scrapy Cloud • updated by Chien Kuo 2 years ago
Is there an API for adding periodic jobs, or some way to do that from an external script?  I'd like to schedule/deschedule periodic jobs as part of an auto-QA process.
Under review
There's an undocumented API.
Replace APIKEY, PROJECTID, and SPIDER with the appropriate values; more information can be found in the existing Jobs API documentation. JOBID is the internal periodic job ID returned as "_id" when listing periodic jobs, and it is also returned when creating a new periodic job.

  • List all periodic jobs:
curl -u APIKEY: "http://dash.scrapinghub.com/api/periodic_jobs?project=PROJECTID"
  • Add a new periodic job:
curl -X POST -u APIKEY: "http://dash.scrapinghub.com/api/periodic_jobs?project=PROJECTID" -d '{"hour": "0", "minutes_shift": "0", "month": "*", "spiders": [{"priority": "2", "args": {}, "name": "SPIDER"}], "day": "*"}'
  • Delete a periodic job:
curl -X DELETE -u APIKEY: "http://dash.scrapinghub.com/api/periodic_jobs/JOBID?project=PROJECTID"
  • Replace a periodic job:
curl -X PUT -u APIKEY: "http://dash.scrapinghub.com/api/periodic_jobs/JOBID?project=PROJECTID" -d '{"hour": "0", "minutes_shift": "0", "month": "*", "spiders": [{"priority": "2", "args": {}, "name": "SPIDER"}], "day": "*"}'
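
If you prefer to call these endpoints from Python rather than curl, a minimal sketch using the requests library could look like the following. This is not an official client: the endpoint is undocumented, APIKEY, PROJECTID, and SPIDER are placeholders as in the curl examples, and it assumes the endpoint accepts a JSON request body.

import requests

API_ROOT = "http://dash.scrapinghub.com/api"   # same host as the curl examples
APIKEY = "APIKEY"            # placeholder
PROJECTID = "PROJECTID"      # placeholder
auth = (APIKEY, "")          # HTTP basic auth: API key as username, empty password

# List all periodic jobs for the project.
resp = requests.get(f"{API_ROOT}/periodic_jobs",
                    params={"project": PROJECTID}, auth=auth)
resp.raise_for_status()
jobs = resp.json()

# Create a new periodic job (same payload as the curl POST above).
payload = {
    "hour": "0",
    "minutes_shift": "0",
    "month": "*",
    "day": "*",
    "spiders": [{"priority": "2", "args": {}, "name": "SPIDER"}],
}
resp = requests.post(f"{API_ROOT}/periodic_jobs",
                     params={"project": PROJECTID}, json=payload, auth=auth)
resp.raise_for_status()
job_id = resp.json()["_id"]   # the internal periodic job ID mentioned above

# Delete the periodic job again by its internal ID.
requests.delete(f"{API_ROOT}/periodic_jobs/{job_id}",
                params={"project": PROJECTID}, auth=auth).raise_for_status()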

You can disable or enable a job by fetching it and replacing it with the "disabled" field set accordingly:
curl -X PUT -u APIKEY: "http://dash.scrapinghub.com/api/periodic_jobs/JOBID?project=PROJECTID" -d "$(curl -u APIKEY: "http://dash.scrapinghub.com/api/periodic_jobs/JOBID?project=PROJECTID" | jq ".disabled = true")"
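
The same toggle can be scripted in Python; here is a minimal sketch with the requests library that mirrors the curl + jq one-liner above (JOBID, PROJECTID, and APIKEY are placeholders, and it again assumes the endpoint accepts a JSON body):

import requests

API_ROOT = "http://dash.scrapinghub.com/api"
auth = ("APIKEY", "")                      # placeholder API key
url = f"{API_ROOT}/periodic_jobs/JOBID"    # JOBID is a placeholder
params = {"project": "PROJECTID"}

# Fetch the current job definition, flip the "disabled" flag, and PUT it back.
job = requests.get(url, params=params, auth=auth).json()
job["disabled"] = True                     # set to False to re-enable the job
requests.put(url, params=params, json=job, auth=auth).raise_for_status()
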
Does anyone have a patch for the Python library to support these API calls?

If not, I might just write one.
Hi Sebastian,

We don't know of any such patch for python-scrapinghub or python-hubstorage.
The periodic jobs endpoint is not officially documented or supported (except internally by Dash),
so any patch could become obsolete at any time.
Please keep this in mind.
An API revamp is in the works, but there's no hard deadline yet.
What's the license of python-scrapinghub? Would a pull request be accepted, considering the "unofficial" status?
I doubt it would be accepted, but it could live on GitHub as a patch.
Could we use this API? Our project requires scheduling hundreds of periodic jobs every week.