
Crawlera with PhantomJS (Poltergeist)

I'm trying to get Crawlera to work with PhantomJS. I have it working with cURL, and I also have PhantomJS working without Crawlera.

I am trying to visit an HTTPS site, and I get the error `bad_proxy_auth` with these settings:

Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app,
    phantomjs_options: [
      "--proxy=#{proxy}",
      "--proxy-auth=#{proxy_auth}",
      "--ignore-ssl-errors=true"
    ])
end
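For context, a minimal sketch of how the `proxy` and `proxy_auth` values are typically built for Crawlera (the host, port, and environment variable name below are assumptions): Crawlera authenticates with the API key as the proxy username and an empty password, so a missing trailing colon in `--proxy-auth` is a common cause of `bad_proxy_auth`.

```ruby
# Hypothetical values for illustration; substitute your own API key.
api_key    = ENV.fetch("CRAWLERA_APIKEY", "your-api-key")
proxy      = "proxy.crawlera.com:8010"
proxy_auth = "#{api_key}:"   # trailing colon = empty password

phantomjs_options = [
  "--proxy=#{proxy}",
  "--proxy-auth=#{proxy_auth}",
  "--ignore-ssl-errors=true"
]
```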

1 Comment

I have the exact same problem. My setup is:

      Capybara.register_driver :poltergeist do |app|
        Capybara::Poltergeist::Driver.new(app,
          js_errors: false,
          phantomjs_options: [
            '--load-images=no',
            '--ssl-protocol=any',
            '',
            "--proxy-auth=#{proxy_auth}",
            "--ssl-client-certificate-file=#{cert}"
          ])
      end


I get the error: `failed to reach server, check DNS and/or server status`.
