If you need to provide data to a spider within a given project, you can store that data in collections, either through the HTTP API directly (as in the curl examples below) or via the python-scrapinghub library.

You can use collections to store an arbitrary number of records, each indexed by a key. Projects often use them as a single location where multiple jobs can write data.

The example below creates a collection named form_filling in project 79855 and adds two records in a single POST:

$ curl -u APIKEY: -X POST -d '{"_key": "first_name", "value": "John"}{"_key": "last_name", "value": "Doe"}' https://storage.scrapinghub.com/collections/79855/s/form_filling
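
The same write can be done from Python with the python-scrapinghub library mentioned above. A minimal sketch, reusing the project ID 79855 and collection name from the curl example, with APIKEY standing in for your real key:

from scrapinghub import ScrapinghubClient

# Connect with the same API key used in the curl examples.
client = ScrapinghubClient('APIKEY')
project = client.get_project(79855)

# get_store() returns the named collection, creating it on first write.
store = project.collections.get_store('form_filling')

# Each record is a dict with a '_key' field plus arbitrary data.
store.set({'_key': 'first_name', 'value': 'John'})
store.set({'_key': 'last_name', 'value': 'Doe'})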

To retrieve the records, send a GET request with one key parameter per record you want back:

$ curl -u APIKEY: -X GET "https://storage.scrapinghub.com/collections/79855/s/form_filling?key=first_name&key=last_name"
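
The Python equivalent, reusing the store object from the sketch above. get() fetches a single record by key; iter() streams records and, assuming recent versions of the library, accepts the same key filter as the HTTP endpoint:

# Fetch a single record by its key; the stored value is returned
# without the '_key' field, e.g. {'value': 'John'}.
first = store.get('first_name')
print(first)

# Iterate over several records at once; the key filter mirrors the
# ?key=...&key=... query parameters of the HTTP API.
for record in store.iter(key=['first_name', 'last_name']):
    print(record)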

And finally, you can delete the collection and its data by sending a DELETE request to its URL:

$ curl -u APIKEY: -X DELETE "https://storage.scrapinghub.com/collections/79855/s/form_filling"
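
From Python, individual records can be removed with delete() on the same store object; a sketch, assuming delete() accepts a list of keys as well as a single key (deleting the whole collection, as the curl command above does, goes through the HTTP endpoint):

# Remove the two records written earlier from the collection.
store.delete(['first_name', 'last_name'])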