Take a Full‑Page Screenshot of a JavaScript‑Heavy Website
Single‑page apps (SPAs) render most of their content in the browser. Traditional HTTP-only scrapers miss what JavaScript adds after page load. Supacrawler solves this by driving a real browser for you — no Playwright/Puppeteer setup, no headless Chrome to manage.
Note: If you’re using the SDKs (recommended), install them first or see our Install guide: Install the SDKs.
Goal
Take a full‑page screenshot of a JavaScript‑heavy page after it finishes rendering, with optional waits and device settings.
Full‑page screenshot of a JS‑heavy page
```bash
curl -X POST https://api.supacrawler.com/api/v1/screenshots \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://spa-example.com",
    "device": "desktop",
    "full_page": true,
    "format": "png",
    "wait_until": "networkidle",
    "wait_for_selector": "#content",
    "block_ads": true
  }'
```
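If you prefer to make the same call from code, here is a minimal Python sketch that hits the REST endpoint directly with the requests library. It mirrors the curl payload above; `YOUR_API_KEY` is a placeholder, and this shows the raw HTTP request rather than SDK usage.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: your Supacrawler API key

resp = requests.post(
    "https://api.supacrawler.com/api/v1/screenshots",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "url": "https://spa-example.com",
        "device": "desktop",
        "full_page": True,
        "format": "png",
        "wait_until": "networkidle",      # wait for the network to go quiet
        "wait_for_selector": "#content",  # and for the main content node to appear
        "block_ads": True,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```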
Note: When the API returns an image URL (instead of inline base64), it is a signed URL that expires after 15 minutes. If you need long-term access, download the image and store it in your own storage.
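Because the signed URL is short‑lived, a common pattern is to download the image right away and persist it in your own storage. A minimal sketch, assuming the response JSON exposes the image location under a key such as `screenshot` (the exact field name may differ in your API version):

```python
import requests

def save_screenshot(result: dict, path: str = "screenshot.png") -> None:
    """Download a signed screenshot URL to local disk before it expires."""
    # Assumption: the signed URL is returned under a key like "screenshot" or "url".
    image_url = result.get("screenshot") or result.get("url")
    if not image_url:
        raise ValueError("No screenshot URL found in the API response")
    image = requests.get(image_url, timeout=60)
    image.raise_for_status()
    with open(path, "wb") as f:
        f.write(image.content)

# Usage, continuing from the request above:
# save_screenshot(resp.json(), "spa-example.png")
```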
Tips for reliable screenshots
- Render timing: Use `wait_until='networkidle'` or `wait` (milliseconds) for heavy apps. Combine with `wait_for_selector` when a specific widget signals readiness.
- Devices & viewport: Set `device='desktop'` and adjust `width`/`height` or `device_scale` for crisp output. Use `is_mobile`/`has_touch` for mobile variants.
- Quality & format: Prefer `png` for UI docs, `jpeg` for photography, `webp` for smaller files; control with `quality` (for lossy formats).
- Remove distractions: `block_ads`, `block_cookies`, `block_chats`, and `hide_selectors` help produce clean images.
- Actions before capture: `click_selector`, `scroll_delay`, and `delay` let you open menus, lazy‑load content, or stabilize animations before the capture. A payload combining several of these options is sketched after this list.
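The options above can be combined in a single request. The following Python payload is an illustrative sketch: the parameter names come from the list above, but the values and selectors (`#load-more`, `.newsletter-popup`, the delays and quality level) are made up for the example, not defaults.

```python
# Illustrative payload combining the options above; selector values are made up.
payload = {
    "url": "https://spa-example.com",
    "device": "desktop",
    "device_scale": 2,                 # sharper output on high-DPI displays
    "full_page": True,
    "format": "webp",
    "quality": 85,                     # applies to lossy formats (jpeg/webp)
    "wait_until": "networkidle",
    "wait_for_selector": "#content",
    "click_selector": "#load-more",    # open a menu / trigger lazy content first
    "scroll_delay": 500,               # ms between scroll steps for lazy loading
    "delay": 1000,                     # let animations settle before capture
    "block_ads": True,
    "block_cookies": True,
    "block_chats": True,
    "hide_selectors": [".newsletter-popup"],
}
# POST this as JSON to https://api.supacrawler.com/api/v1/screenshots
# (see the requests example earlier in this guide).
```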
With Supacrawler, you get production‑grade screenshots without owning browser infrastructure. Point it at any page, tell it how to wait, and get a high‑quality image back.