Answered
Sergej 5 days ago in Crawlera • updated 4 hours ago

Hello,


I am experiencing issues using Crawlera on HTTPS sites with Scrapy and PhantomJS. My config is:


        service_args = [
            '--proxy=proxy.crawlera.com:8010',
            '--proxy-type=http',
            '--proxy-auth=XXXXXXXXXXXXXXXXX:',
            '--webdriver-logfile=phantom.log',
            '--webdriver-loglevel=DEBUG',
            '--ssl-protocol=any',
            '--ssl-client-certificate-file=crawlera-ca.crt',
            '--ignore-ssl-errors=true',
            ]
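
For context, here is roughly how I wire these arguments into the PhantomJS driver from my spider (a minimal sketch; the spider name and target URL below are placeholders, not my real values):

    # Minimal sketch of the Scrapy + Selenium PhantomJS setup described above.
    # Spider name and URL are placeholders.
    import scrapy
    from selenium import webdriver


    class ExampleSpider(scrapy.Spider):
        name = 'example'
        start_urls = ['https://example.com/']

        def parse(self, response):
            service_args = [
                '--proxy=proxy.crawlera.com:8010',
                '--proxy-type=http',
                '--proxy-auth=XXXXXXXXXXXXXXXXX:',
                '--ssl-protocol=any',
                '--ssl-client-certificate-file=crawlera-ca.crt',
                '--ignore-ssl-errors=true',
            ]
            # Pass the proxy/SSL arguments straight to the PhantomJS driver.
            driver = webdriver.PhantomJS(service_args=service_args)
            driver.get(response.url)
            html = driver.page_source  # comes back empty when the SSL error occurs
            driver.quit()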


However, I always get this error and the result is empty:

"errorCode":99,"errorString":"Cannot provide a certificate with no key, "


I have been stuck on this problem for hours. Any help is much appreciated.


Thank you!

Sergej

Answer

Hi Sergej,


This is a known issue, and our team is working to address it in upcoming releases of Crawlera. An upgrade is planned for the next sprint, which should fix this problem.


Best regards,

Pablo


Hi Pablo,


Alright, thanks, good to know. Will there be any notification when this issue is fixed? If so, where? And are we talking about days, weeks, or months until the next sprint?


Thanks,

Sergej