ScrapeBox and the right settings for proxies

bartsmitson

Newbie
Joined
Nov 24, 2022
Messages
9
Reaction score
2
Hi,
I'm stuck. I've used https://www.webshare.io/ and in the beginning I got great results with 100 proxies, around 300 URLs/s including Google.
After a while it slowed down, and now I can't get it above 30 URLs a second.

Now I'm using rotatingproxies.com, with a solid speed of 540 URLs a second off 75 proxies.
But it won't scrape Google.
Is there a better provider which can scrape at a decent URLs-per-second rate and can also scrape Google?

Or is there someone willing to help me set the settings for either webshare or rotatingproxies in a way that will work to mass scrape Google?
(Paid if the outcome is positive ;)
 

Steptoe

Jr. Executive VIP
Jr. VIP
Joined
Aug 9, 2017
Messages
1,122
Reaction score
1,208
Website
t2links.com
With datacenter proxies like Webshare, to keep them alive you'll want to impose delays of at least 60 seconds, and use 1 thread per 10 Google-passed proxies. Anything lower and they'll get burned quickly as you've experienced. But datacenter proxies aren't best for Google scraping.
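The pacing rule above (at least a 60-second delay per proxy, 1 thread per 10 Google-passed proxies) can be sketched roughly like this. This is illustrative Python, not actual ScrapeBox settings; the names and constants are assumptions based on the numbers in this post.

```python
import time

MIN_DELAY_SECONDS = 60    # minimum wait before reusing a proxy against Google
PROXIES_PER_THREAD = 10   # run 1 connection thread per 10 Google-passed proxies

def safe_thread_count(google_passed_proxies: int) -> int:
    """Thread count that keeps each proxy under Google's tolerance."""
    return max(1, google_passed_proxies // PROXIES_PER_THREAD)

def seconds_until_reusable(last_used: dict, proxy: str) -> float:
    """How long to wait before this proxy can safely hit Google again."""
    elapsed = time.monotonic() - last_used.get(proxy, float("-inf"))
    return max(0.0, MIN_DELAY_SECONDS - elapsed)
```

So 100 Google-passed proxies would give you 10 threads, and a proxy used a moment ago would need close to the full 60 seconds before its next Google request.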

It looks like rotatingproxies.com will only rotate IPs every 5 minutes, so you probably get an initial burst of successful scraping before they quickly get banned. They also offer unlimited bandwidth, which is a rarity these days, so the service is probably being massively overused by scrapers on a budget. You can look at a provider like Packetshare, which I've found more successful for scraping Google, but you'll pay by the GB of bandwidth you use rather than a monthly fee.
 

loopline

Jr. Executive VIP
Jr. VIP
Joined
Jan 25, 2009
Messages
6,130
Reaction score
3,510
Website
contactformmarketing.com
As noted above, it's just a matter of speed: if you go too fast, you get the IPs blocked, and thus you run into what you're experiencing. So finding the right balance is key.
 

bartsmitson

Newbie
Joined
Nov 24, 2022
Messages
9
Reaction score
2
Is there someone with experience who is willing to help? Of course I would pay for it!
Because now, on 600,000 keywords, I only get 1.2 million URLs, and when I delete the duplicate domains and URLs, there are only 10k URLs left.
 

loopline

Jr. Executive VIP
Jr. VIP
Joined
Jan 25, 2009
Messages
6,130
Reaction score
3,510
Website
contactformmarketing.com
Is there someone with experience who is willing to help? Of course I would pay for it!
Because now, on 600,000 keywords, I only get 1.2 million URLs, and when I delete the duplicate domains and URLs, there are only 10k URLs left.
You could try using other engines; Bing is especially useful alongside Google. But I'd guess you're getting a massive amount of failed keywords, probably around 98%, and the bulk of your results are coming from a lot of similar keywords.

That is worth checking: make sure your list of keywords is not full of duplicate keywords.
 

proxygo

Jr. Executive VIP
Jr. VIP
Joined
Nov 2, 2008
Messages
46,910
Reaction score
21,794
Website
www.localproxies.com
Who remembers the days when you slapped in some G proxies and got thousands of URLs a second, then as time went by the IPs got mass banned and URL rates dropped?
 