Scrolling Screenshots at Scale: Long Pages, Feeds, and Dashboards
Long pages and infinite feeds often render content on scroll, causing partial or cut-off screenshots. Supacrawler's Screenshots API supports reliable full‑length captures with scroll simulation and smart waits—no local browser rig required.
When to use scrolling screenshots
- Product/category pages with long lists
- News/social feeds with infinite scroll
- Analytics dashboards where widgets load progressively
Core parameters
- `full_page`: capture the entire page height
- `scroll_to_bottom`: simulate user scrolling to trigger lazy content
- `max_scroll_attempts`: limit scroll loops to bound time and cost
- `wait_until`, `wait_for_selector`, `delay`: ensure content is stable
Scrolling screenshot with lazy load
```typescript
import { SupacrawlerClient, ScreenshotCreateRequest } from '@supacrawler/js'

const client = new SupacrawlerClient({ apiKey: process.env.SUPACRAWLER_API_KEY || 'YOUR_API_KEY' })

const job = await client.createScreenshotJob({
  url: 'https://example.com/infinite-feed',
  device: ScreenshotCreateRequest.device.DESKTOP,
  format: ScreenshotCreateRequest.format.WEBP,
  full_page: true,
  scroll_to_bottom: true,
  max_scroll_attempts: 8,
  scroll_delay: 400,
  wait_until: 'networkidle',
  wait_for_selector: '.feed-item:last-child',
  block_ads: true,
})

const res = await client.waitForScreenshot(job.job_id!)
console.log('Screenshot URL:', res.screenshot)
```
Stabilize dynamic content
- Set a small `delay` (e.g., 0.5–1s) after conditions are met
- Hide sticky UI (`hide_selectors: ['.cookie-banner', '.chat-widget']`)
- For dashboards, wait for the slowest widget selector, not just `networkidle`
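Put together, those stabilization options form a request payload like the one below. This is an illustrative sketch as a plain object: the URL and widget selector are hypothetical, and the option names (`hide_selectors`, `delay`, `wait_for_selector`) follow the examples in this post—verify them against the Screenshots API reference before relying on them.

```typescript
// Shape of the options passed to createScreenshotJob in this post's examples.
interface ScreenshotOptions {
  url: string
  full_page: boolean
  scroll_to_bottom: boolean
  wait_until: string
  wait_for_selector: string // wait on the slowest widget, not just networkidle
  delay: number             // settle time in seconds after wait conditions pass
  hide_selectors: string[]  // sticky UI to hide before capture
}

const dashboardCapture: ScreenshotOptions = {
  url: 'https://example.com/analytics',   // hypothetical dashboard URL
  full_page: true,
  scroll_to_bottom: true,
  wait_until: 'networkidle',
  wait_for_selector: '.widget--slowest',  // hypothetical selector for the slowest-loading widget
  delay: 1,                               // 1s settle after conditions are met
  hide_selectors: ['.cookie-banner', '.chat-widget'],
}
```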
Batching lots of pages
For catalogs or sitemap coverage, batch jobs and poll asynchronously.
```typescript
import { SupacrawlerClient } from '@supacrawler/js'

const client = new SupacrawlerClient({ apiKey: process.env.SUPACRAWLER_API_KEY })

async function captureMany(urls: string[]) {
  const jobs = await Promise.all(urls.map(url => client.createScreenshotJob({
    url,
    full_page: true,
    scroll_to_bottom: true,
    max_scroll_attempts: 6,
    wait_until: 'networkidle',
  })))
  const results = await Promise.all(jobs.map(j => client.waitForScreenshot(j.job_id!)))
  return results
}
```
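Note that `Promise.all` launches every job at once, which can trip API rate limits on large catalogs. A small bounded-concurrency helper keeps a fixed number of captures in flight—this is plain TypeScript with no SDK dependency, and the limit you pick (e.g., 5) is up to your rate-limit budget:

```typescript
// Run async tasks over `items` with at most `limit` in flight at once.
async function mapWithLimit<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length)
  let next = 0
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++ // claim the next unprocessed index
      results[i] = await fn(items[i])
    }
  }
  // Start up to `limit` workers that drain the shared queue.
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker)
  await Promise.all(workers)
  return results
}
```

With this in place, a batching function could call `mapWithLimit(urls, 5, url => ...)` around the job-creation and polling steps instead of an unbounded `Promise.all`.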
File format and size
- Prefer `webp` for long pages to reduce size
- Use `jpeg` with `quality: 80–85` for photo-heavy feeds
- Use `png` for pixel-perfect UI documentation
With controlled scrolls and waits, you’ll capture the entire story of a page—cleanly and consistently—at any scale.
By Supacrawler Team
Published on July 10, 2025