Answered

How to fetch the latest spider data using curl?

The article "Fetching the latest spider data" (https://support.scrapinghub.com/support/solutions/articles/22000200409-fetching-latest-spider-data) provides a convenient way to download the latest data using a static URL.


But when I try to use the curl tool to download it, it no longer seems to work.


I have noticed that you have disabled downloads from dash.scrapinghub.com, which I used to download the data. Is there any way I can still download the latest data using curl?


For example, in the past I could use:


curl -u APIKEY: "https://dash.scrapinghub.com/api/items.csv?project=PROJECTNUMBER&spider=SPIDERNAME&include_headers=1&fields=FIELDNAME1,FIELDNAME2&apikey=APIKEY" --compressed -o C:\file.csv


But it doesn't work now, even after I change the URL (only from https://dash. to https://app.) to:


curl -u APIKEY: "https://app.scrapinghub.com/api/items.csv?project=PROJECTNUMBER&spider=SPIDERNAME&include_headers=1&fields=FIELDNAME1,FIELDNAME2&apikey=APIKEY" --compressed -o C:\file.csv


Only an error occurs.


Error message:


curl: (35) schannel: next InitializeSecurityContext failed: Unknown error (0x80092012) -


Could you give me any clue?


Thank you very much


Best Answer

I can't reproduce your issue; the URL works fine on my end, and no curl support has been removed.

If you are behind a proxy, you might want to check this out: https://curl.haxx.se/mail/lib-2016-03/0202.html
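For context, schannel error 0x80092012 (CRYPT_E_NO_REVOCATION_CHECK) usually means the Windows TLS stack could not reach the certificate revocation servers, something proxies and security software often block. A possible way to diagnose, sketched with the poster's own placeholders (APIKEY, PROJECTNUMBER, SPIDERNAME):

```shell
# Verbose mode shows exactly where the TLS handshake fails
# (look for the "schannel" lines in the output):
curl -v -u APIKEY: "https://app.scrapinghub.com/api/items.csv?project=PROJECTNUMBER&spider=SPIDERNAME" -o NUL

# If the revocation check itself is what fails (common behind proxies
# that block CRL/OCSP traffic), Windows schannel builds of curl can
# skip it with --ssl-no-revoke. Note this weakens certificate
# validation, so prefer fixing the proxy configuration if possible:
curl --ssl-no-revoke -u APIKEY: "https://app.scrapinghub.com/api/items.csv?project=PROJECTNUMBER&spider=SPIDERNAME&include_headers=1&fields=FIELDNAME1,FIELDNAME2&apikey=APIKEY" --compressed -o C:\file.csv
```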




Problem solved after I disabled some proxy and anti-virus software. Thanks for the help.

You need to instruct curl to follow the redirection with -L

Thanks nestor. Even if I add -L, i.e.


curl -L -u APIKEY: "https://app.scrapinghub.com/api/items.csv?project=PROJECTNUMBER&spider=SPIDERNAME&include_headers=1&fields=FIELDNAME1,FIELDNAME2&apikey=APIKEY" --compressed -o C:\file.csv


I still get the same error; in fact, the very same syntax (with -L) worked for me previously. I wonder whether Scrapinghub has withdrawn some curl support, but I just don't know.


Thanks again
