
© 2026 Scrapernode


Use Case

The scraping API built for n8n workflows

Drop a single HTTP Request node into any n8n workflow to scrape LinkedIn, Instagram, TikTok, Twitter/X, YouTube, and more. Per-result credits, structured JSON output, zero scraper maintenance.

Build your first n8n workflow · View pricing

The Problem

Challenges teams face without Scrapernode

Complex HTTP node configs

Setting up Brightdata, Apify, or PhantomBuster in n8n means configuring OAuth, pagination, webhook callbacks, and custom headers. Each tool needs a different HTTP Request node pattern.

Code nodes that break

You wrote a Puppeteer script inside n8n's Code node to scrape profiles. It worked for a week. Then the site updated its markup and now you're debugging JavaScript at midnight.

No native social scraping nodes

n8n has nodes for Slack, Google Sheets, and Airtable, but nothing for extracting structured data from LinkedIn, Instagram, or TikTok. You're stuck with raw HTTP or Code nodes.

Async polling is tedious

Scraping APIs return results asynchronously. In n8n, that means building Wait → HTTP → IF → Loop patterns to poll for job completion. Every workflow has the same boilerplate.
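Outside n8n, that Wait → HTTP → IF → Loop boilerplate is just a polling loop. A minimal sketch, with the status-check call injected as a plain callable (the response field names here are illustrative, not a specific vendor's API):

```python
import time

def poll_until_complete(fetch_status, interval_s=5, max_attempts=12):
    """The Wait -> HTTP -> IF -> Loop pattern as code.

    `fetch_status` is any callable returning a dict such as
    {"status": "running"} or {"status": "complete", "results": [...]}.
    (Illustrative shape, not a real API contract.)
    """
    for _ in range(max_attempts):
        job = fetch_status()              # HTTP node: check the job
        if job["status"] == "complete":   # IF node: done yet?
            return job["results"]
        time.sleep(interval_s)            # Wait node: pause, then loop
    raise TimeoutError("job did not complete in time")
```

Every async scraping workflow repeats some version of this loop, which is exactly the boilerplate the two-request pattern below avoids hand-building.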

The Solution

How Scrapernode solves this

Two HTTP Request nodes, any platform

POST to create a job, GET to fetch results. Same node config for LinkedIn, Instagram, TikTok, Twitter/X, YouTube, Facebook, Indeed, Glassdoor, Yelp, GitHub, and Crunchbase. Copy-paste between workflows.

Clean JSON for Set and IF nodes

Scrapernode returns flat, structured JSON: names, titles, follower counts, post content. Use n8n's Set node to extract fields, IF node to filter, and merge directly into your downstream tools.
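Because the JSON is flat, a Set node only has to name the fields it keeps. The equivalent in plain Python, using an illustrative record shape (the field names are assumptions, not the documented schema):

```python
# One flat result record, as a Set node would receive it.
# Field names here are illustrative assumptions.
profile = {
    "name": "Ada Lovelace",
    "title": "Engineer",
    "followers": 1234,
    "scraped_at": "2026-01-01",  # fields you don't need downstream
}

# A Set node keeps only the fields you list:
WANTED = ("name", "title", "followers")
slim = {k: profile[k] for k in WANTED if k in profile}
```

No parsing, regexes, or nested-path expressions: flat keys map one-to-one onto Set node fields.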

Scrapers that never break

When platforms change their markup, Scrapernode updates the scraper. Your n8n workflow keeps running. No more midnight debugging sessions for broken Code nodes.

Example Workflows

Copy these n8n patterns to get started in minutes

Basic scrape-and-store workflow

The simplest Scrapernode + n8n pattern. Trigger, scrape, store. Works for any platform; just change the scraper slug.

  1. Manual or schedule trigger

     Kick off manually or run on a schedule (daily, hourly, etc.)

  2. HTTP Request: create job

     POST /api/jobs/create with scraper slug, URLs, and API key

  3. Wait node

     Pause for 30–60 seconds while Scrapernode processes the job

  4. HTTP Request: get results

     GET /api/jobs/results returns a structured JSON array

  5. Set node: extract fields

     Pull out the fields you need: name, title, bio, followers, etc.

  6. Google Sheets / Airtable

     Append rows to your tracking sheet or base
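The steps above can be sketched as one function. The endpoint paths come from the steps themselves; the payload field names, response shape, and the `http` helper are illustrative assumptions, with `http(method, path, body)` standing in for n8n's HTTP Request node:

```python
import time

def run_basic_workflow(http, api_key, scraper, urls, wait_s=30):
    """Trigger -> create job -> wait -> fetch -> extract, per the steps above.

    `http(method, path, body)` is an injected transport standing in for
    n8n's HTTP Request node. Payload and response field names are
    illustrative assumptions, not a documented schema.
    """
    # Step 2: POST /api/jobs/create with scraper slug, URLs, and API key
    job = http("POST", "/api/jobs/create",
               {"api_key": api_key, "scraper": scraper, "urls": urls})
    # Step 3: Wait node
    time.sleep(wait_s)
    # Step 4: GET /api/jobs/results for this job
    results = http("GET", f"/api/jobs/results?job_id={job['job_id']}", None)
    # Step 5: Set node -- keep only the fields we need
    return [{"name": r.get("name"), "title": r.get("title")} for r in results]
```

Step 6 is then a plain append of the returned rows to Google Sheets or Airtable. Swapping `scraper` from, say, `linkedin-profiles` to `instagram-profiles` is the only change needed to retarget the workflow.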

Multi-platform enrichment with merge

Enrich the same list of people across LinkedIn, Twitter, and Instagram, then merge results per person.

  1. Trigger

     Webhook receives a list of people with social URLs

  2. Split by platform

     Code node groups URLs by platform (LinkedIn, Twitter, Instagram)

  3. Parallel Scrapernode jobs

     Three HTTP Request nodes fire simultaneously, one per platform

  4. Wait and collect

     Wait for all three jobs to complete, fetch results from each

  5. Merge node

     Combine results per person using name or URL as the join key

  6. Output

     Write enriched profiles to Airtable, CRM, or send via webhook response
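The Merge-node step above is a join keyed on name or URL. A minimal sketch of that join logic, with platform-prefixed fields so the three result sets don't collide (field names are illustrative):

```python
def merge_by_key(result_sets, key="profile_url"):
    """Combine per-platform results into one record per person.

    `result_sets` maps a platform name to its list of result dicts;
    `key` is the join field (name or URL). Field names are illustrative.
    """
    people = {}
    for platform, results in result_sets.items():
        for r in results:
            merged = people.setdefault(r[key], {key: r[key]})
            # Prefix each field with its platform so fields don't collide
            for field, value in r.items():
                if field != key:
                    merged[f"{platform}_{field}"] = value
    return list(people.values())
```

In n8n this is what the Merge node does for two inputs; chaining two Merge nodes (or one Code node like the above) handles all three platforms.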

Conditional scraping with IF nodes

Scrape LinkedIn profiles, then conditionally scrape additional platforms based on the results.

  1. Scrape LinkedIn profiles

     Initial enrichment with linkedin-profiles scraper

  2. IF: has Twitter URL?

     Check if the LinkedIn profile includes a Twitter/X handle

  3. Scrape Twitter profiles

     For leads with Twitter handles, run twitter-profiles scraper

  4. IF: follower count > 1000?

     Filter for leads with significant social presence

  5. Tag as high-value

     Mark qualifying leads as high-priority in your CRM with the social context attached
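The two IF-node branches above reduce to two guards. A sketch of that branching, with the second scrape injected as a callable (the `twitter_url` and `followers` field names are assumptions for illustration):

```python
def qualify_leads(linkedin_profiles, scrape_twitter):
    """IF-node branching: scrape Twitter only when a handle exists,
    then keep leads with more than 1000 followers.

    `scrape_twitter(url)` stands in for the twitter-profiles job;
    field names are illustrative assumptions.
    """
    high_value = []
    for lead in linkedin_profiles:
        twitter_url = lead.get("twitter_url")
        if not twitter_url:            # IF: has Twitter URL? (false branch: skip)
            continue
        tw = scrape_twitter(twitter_url)
        if tw["followers"] > 1000:     # IF: follower count > 1000?
            high_value.append({**lead, "twitter_followers": tw["followers"]})
    return high_value
```

Only the leads that survive both guards reach the final "tag as high-value" step, so the second scrape spends credits only where a Twitter handle actually exists.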

Recommended Scrapers

The scrapers most relevant to this use case

  • LinkedIn Profiles · 2 credits / row

  • LinkedIn Companies · 2 credits / row

  • LinkedIn Posts · 1 credit / row

  • Instagram Profiles · 1 credit / row

  • Instagram Posts · 1 credit / row

  • TikTok Profiles · 1 credit / row

  • TikTok Videos · 1 credit / row

  • Twitter/X Profiles · 1 credit / row

  • Twitter/X Posts · 1 credit / row

  • YouTube Channels · 1 credit / row

  • Facebook Profiles/Posts · 1 credit / row

  • Crunchbase Companies · 2 credits / row

  • Indeed Job Listings · 1 credit / row

Integrations

Connect your scraped data to your favorite tools

  • Google Sheets: Auto-sync results to spreadsheets

  • Webhooks: Real-time delivery to any endpoint

  • REST API: Programmatic access for developers

  • n8n: Connect to 1000+ apps

  • CSV/JSON: Download in standard formats

Frequently Asked Questions

Common questions about n8n