Welcome to the Scrapinghub feedback & support site! We discuss all things related to Scrapy Cloud, Portia and Crawlera. You can participate by reading posts, asking questions, providing feedback, helping others, and voting on the best questions & answers. Scrapinghub employees regularly pop in to answer questions, share tips, and post announcements.
0
Answered
bobsaget 1 year ago • updated by Pablo Hoffman (Director) 10 months ago 1

Hello, I am new to Scrapinghub and wondering if this product will be able to help me. Basically, I am looking for a tool that searches through all the links and sub-links on a website to find every place a certain word appears on the site.


Normally I would use the Google search "site:website.com 'word'" and that would do the trick, but this website requires a paid login and is therefore not indexed by Google.
I'm wondering if Scrapinghub is able to do this, and if so, can someone point me in the right direction?
Answer
Pablo Hoffman (Director) 10 months ago

You can write a Scrapy spider to do this for you, and run it on Scrapy Cloud.
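As a rough illustration of that suggestion, here is a minimal sketch of such a spider. It assumes the site uses a standard HTML login form; the domain, login URL, form field names, credentials and keyword below are all placeholders you would replace with your own values.

```python
import scrapy


class KeywordSpider(scrapy.Spider):
    """Logs in, follows every internal link, and reports pages containing a target word."""
    name = "keyword_search"
    # Placeholders -- replace with the real site, credentials and word.
    allowed_domains = ["example.com"]
    start_urls = ["https://example.com/login"]
    keyword = "word"

    def parse(self, response):
        # Submit the login form (field names depend on the actual site).
        return scrapy.FormRequest.from_response(
            response,
            formdata={"username": "user", "password": "pass"},
            callback=self.after_login,
        )

    def after_login(self, response):
        # Start searching from the page reached after logging in.
        yield from self.search_page(response)

    def search_page(self, response):
        # Record the page if the keyword appears anywhere in its body.
        if self.keyword.lower() in response.text.lower():
            yield {"url": response.url, "keyword": self.keyword}
        # Follow every link; allowed_domains keeps the crawl on-site.
        for href in response.css("a::attr(href)").getall():
            yield response.follow(href, callback=self.search_page)
```

Once deployed to Scrapy Cloud (for example with the shub command-line tool), the scraped items listing the matching URLs can be reviewed or exported from the project dashboard.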

0
Answered
Beto Lopez 3 years ago • updated by Pablo Hoffman (Director) 10 months ago 1
I'm actually a user of Scrapinghub, and I have 2 years of experience scraping big sites.

I can reply to tickets in English, Spanish and German. I would love to work on the support team at Scrapinghub.

Where can I send my application? Thank you.