Answered
ihoekstra 2 weeks ago in Scrapy Cloud • updated by Pablo Vaz (Support Engineer) 1 week ago

I've got this in my settings.py:


FEED_URI = 'search_results.csv'
FEED_FORMAT = 'csv'


This works fine locally, but I wonder whether it has any effect in the Scrapinghub cloud. I know I can view the items and export them as CSV, but the order of the fields is different from what I intended.


My spider is running now and I am wondering whether there will actually be a file called search_results.csv at the end. If so, where might I find it?

Answer

Our more experienced support engineers suggest checking:

https://doc.scrapy.org/en/latest/topics/feed-exports.html#feed-export-fields

Use the FEED_EXPORT_FIELDS setting described there and run the job again.

The exported CSV columns should then come out in the order you specified.
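For example, adding something like the following to settings.py should pin the column order (a minimal sketch; the field names here are made up and should be replaced with the fields your own item defines):

```python
# settings.py -- hypothetical field names; use the fields from your own item
FEED_EXPORT_FIELDS = ['title', 'price', 'url']
```

The CSV exporter then writes the columns in exactly this list order, both locally and on Scrapy Cloud.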

Regards!


Hey Ihoekstra!

You can download the items easily through the Scrapy Cloud UI. Please check this article:

http://help.scrapinghub.com/scrapy-cloud/how-to-download-items-through-scrapy-cloud-ui

We wrote it inspired by your question, so... thank you! :)

Best regards.

Pablo

Thanks very much, I feel honored to see my question was worthy of a blogpost :-)


However... this wasn't exactly what I meant. I already found the export button, but the order of the columns is different from what I had when I was testing my script locally and creating my own output file.


Reordering the columns is not much trouble, but I was wondering whether, in general, it is possible to create my own output file exactly according to my requirements, just as you would if you ran the script locally on your own computer.
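For a self-managed output file, Scrapy's feed exports can also write directly to external storage such as Amazon S3 instead of the local filesystem, which works from Scrapy Cloud as well. A sketch, assuming you have an S3 bucket and AWS credentials (the bucket name and key values below are placeholders):

```python
# settings.py -- sketch only; 'my-bucket' and the credential values are
# placeholders, substitute your own bucket and keys
FEED_URI = 's3://my-bucket/search_results.csv'
FEED_FORMAT = 'csv'
AWS_ACCESS_KEY_ID = 'YOUR_ACCESS_KEY'
AWS_SECRET_ACCESS_KEY = 'YOUR_SECRET_KEY'
```

With this in place the job writes the feed to your own bucket, and combined with FEED_EXPORT_FIELDS you get a file with exactly the columns and order you want.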
