I need to develop a utility in Python that takes a screenshot of a web page when requesting it through its URL. But I can't get a screenshot of the page after it has completely loaded; I get a partially loaded page.
I tried Selenium and Pyppeteer. With Selenium I got pop-ups for each request, but I need to request more than 500 URLs.
With Pyppeteer I tried every approach I could find, which I have listed below, but couldn't make it work.
1. `await asyncio.sleep(10)`
2. `await page.waitForXPath('//main')`
3. `await page.waitForSelector("body")`
4. `await page.waitFor(60000)`
5. `await page.goto(url, {'waitUntil': 'load'})`
6. `await page.waitForNavigation({'waitUntil': 'domcontentloaded', 'timeout': 60000})`
It sounds like you're struggling to take screenshots of web pages after they've fully loaded. From your description, you're hitting issues with both Selenium and Pyppeteer: the screenshot is captured before the page has finished loading, which is a real problem when you have more than 500 URLs to process.
Here's a solution that might work better for Pyppeteer:
You should consider using `networkidle0` or `networkidle2` as the `waitUntil` option during navigation. `networkidle0` waits until there have been no network connections for at least 500 ms, and `networkidle2` waits until there are no more than 2 network connections for 500 ms. This ensures the web page has finished loading and gone idle before you take the screenshot.
Your code could look something like this:
Or, if `networkidle2` suits your application better, pass `{'waitUntil': 'networkidle2'}` in the same `page.goto` call. This way, screenshots are only taken once network activity has died down, indicating that the page has completely loaded.