I'm planning to write my scraper in V and need to send an estimated ~2,500 requests per second, but I can't figure out what I'm doing wrong. It should be sending the requests concurrently, yet right now it is extremely slow. It feels like I'm doing something fundamentally wrong, but I can't spot it.
import net.http
import sync
import time

fn send_request(mut wg sync.WaitGroup) ?string {
	// Release the waitgroup even if http.get fails, otherwise
	// wg.wait() in main would hang forever on a failed request.
	defer {
		wg.done()
	}
	start := time.ticks()
	data := http.get('https://google.com')?
	finish := time.ticks()
	println('Finish getting time ${finish - start} ms')
	return data.text
}

fn main() {
	mut wg := sync.new_waitgroup()
	for i := 0; i < 50; i++ {
		wg.add(1)
		go send_request(mut wg)
	}
	wg.wait()
}
Output:
...
Finish getting time 2157 ms
Finish getting time 2173 ms
Finish getting time 2174 ms
Finish getting time 2200 ms
Finish getting time 2225 ms
Finish getting time 2380 ms
Finish getting time 2678 ms
Finish getting time 2770 ms
V Version: 0.1.29
System: Ubuntu 20.04
You're not doing anything wrong. I'm getting similar results in multiple languages, implemented in multiple ways. Many sites run rate-limiting software that throttles repeated requests like this; that's what you're running up against.
You could try using channels now that they're in the language, but you'll still hit the rate limiter.
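As a rough illustration of the channel suggestion, here is one way a bounded worker pool could look in V: a fixed number of workers pull URLs from a channel instead of spawning one thread per request. This is only a sketch, not tested against V 0.1.29 (the channel syntax shown follows more recent V releases, and the worker count and URL are arbitrary placeholders); even with this structure, the remote rate limiter still caps your effective throughput.

```v
import net.http
import sync

// Each worker drains the channel until it is closed,
// then signals the waitgroup exactly once.
fn worker(ch chan string, mut wg sync.WaitGroup) {
	defer {
		wg.done()
	}
	for {
		url := <-ch or { break } // channel closed: no more work
		resp := http.get(url) or {
			eprintln('request failed: $err')
			continue
		}
		println('got ${resp.text.len} bytes')
	}
}

fn main() {
	ch := chan string{cap: 100}
	mut wg := sync.new_waitgroup()
	workers := 10 // arbitrary pool size for the sketch
	wg.add(workers)
	for _ in 0 .. workers {
		go worker(ch, mut wg)
	}
	for _ in 0 .. 50 {
		ch <- 'https://google.com'
	}
	ch.close()
	wg.wait()
}
```

Bounding the pool keeps you from oversubscribing threads, and it also gives you one obvious place to add a delay per worker if you want to stay under a site's rate limit.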