n8n-nodes-scrapernode

Connect Scrapernode to 1,000+ Apps with n8n

The Scrapernode community node for n8n gives you drag-and-drop access to 23 platform-specific scraping nodes. No HTTP configuration needed — just pick a platform, paste URLs, and pipe the results into any other n8n node.

Installation

Install directly from your n8n instance. Works with both n8n Cloud and self-hosted.

  1. Open Community Nodes

     In n8n, go to Settings → Community Nodes → Install.

  2. Enter the package name

     Type n8n-nodes-scrapernode and click Install.

  3. Restart if self-hosted

     n8n Cloud handles this automatically. Self-hosted instances may need a restart for the nodes to appear.

Package name: n8n-nodes-scrapernode

Authentication Setup

Every Scrapernode node requires a Scrapernode API credential. You only need to set this up once.

  1. Create an API key

     Log in to the dashboard, go to Settings → API Keys, and click Create Key. Copy the key (it starts with sn_).

  2. Add the credential in n8n

     In n8n, open any Scrapernode node → click the credential dropdown → Create New. Paste your API key into the "API Key" field.

  3. Test the connection

     n8n automatically verifies your key by checking your credit balance. A green checkmark confirms it works.
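Under the hood, the credential amounts to attaching your API key to every request. The sketch below illustrates that idea; the "X-API-Key" header name and the sn_ prefix check are assumptions for illustration, not documented API details:

```javascript
// Sketch: build request headers from a Scrapernode credential.
// The "X-API-Key" header name is an assumption for illustration;
// consult the actual API reference for the real header.
function buildAuthHeaders(apiKey) {
  // Keys are documented to start with sn_, so reject anything else early.
  if (!apiKey || !apiKey.startsWith("sn_")) {
    throw new Error("Invalid API key: expected a key starting with sn_");
  }
  return {
    "X-API-Key": apiKey,
    "Content-Type": "application/json",
  };
}

// Example: the headers a node would attach to every request.
const headers = buildAuthHeaders("sn_example_123");
```

Failing fast on a malformed key mirrors the credential test n8n runs when you save it.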

Available Nodes

Each platform has its own dedicated node with tailored input fields, URL validation, color-coded icons, and AI agent descriptions.

Per-Platform Scraper Nodes

Each node supports three operations: Create, Get, and Get Results.

| Node | Scraper Slug | Input | Credits |
| --- | --- | --- | --- |
| Scrapernode LinkedIn Profiles | linkedin-profiles | Profile URLs | 5 |
| Scrapernode LinkedIn Companies | linkedin-companies | Company URLs | 5 |
| Scrapernode LinkedIn Posts | linkedin-posts | Profile/Company/Post URLs | 3 |
| Scrapernode Instagram Profiles | instagram-profiles | Profile URLs | 4 |
| Scrapernode Instagram Posts | instagram-posts | Post URLs | 3 |
| Scrapernode Instagram Comments | instagram-comments | Post URLs | 2 |
| Scrapernode TikTok Profiles | tiktok-profiles | Profile URLs | 4 |
| Scrapernode TikTok Videos | tiktok-posts | Video URLs | 3 |
| Scrapernode Twitter/X Profiles | twitter-profiles | Profile URLs | 4 |
| Scrapernode Twitter/X Posts | twitter-posts | Post URLs | 3 |
| Scrapernode YouTube Channels | youtube-channels | Channel URLs | 4 |
| Scrapernode YouTube Comments | youtube-comments | Video URLs | 2 |
| Scrapernode Facebook Profiles | facebook-profiles | Profile URLs | 4 |
| Scrapernode Facebook Groups | facebook-groups | Group URLs | 3 |
| Scrapernode Indeed Jobs | indeed-jobs | Search URLs | 5 |
| Scrapernode Indeed Companies | indeed-companies | Company URLs | 5 |
| Scrapernode Glassdoor Companies | glassdoor-companies | Company URLs | 5 |
| Scrapernode Glassdoor Reviews | glassdoor-reviews | Company URLs | 3 |
| Scrapernode Glassdoor Jobs | glassdoor-jobs | Search URLs | 3 |
| Scrapernode Yelp Businesses | yelp-businesses | Business URLs | 4 |
| Scrapernode Yelp Reviews | yelp-reviews | Business URLs | 3 |
| Scrapernode GitHub Repositories | github-repositories | Repo URLs | 3 |
| Scrapernode Crunchbase Companies | crunchbase-companies | Company URLs | 5 |

Scrapernode Jobs (Management Node)

The Scrapernode Jobs node handles cross-cutting job management and credit operations.

| Resource | Operation | Description |
| --- | --- | --- |
| Job | Get | Get job status and details by ID |
| Job | Get Results | Retrieve scraped data from a completed job |
| Job | List | List recent jobs with optional status filter |
| Job | Cancel | Cancel a pending/processing job and refund credits |
| Credits | Get Balance | Check your current credit balance |
| Credits | Get Transactions | View your credit transaction history |
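Two of these operations can be sketched as plain logic: the status filter behind List, and the cancelability rule behind Cancel (only pending/processing jobs can be cancelled, per the table). The job object shape here is an illustrative assumption:

```javascript
// Sketch: the filtering Jobs → List applies when a status filter is set.
// Job objects here are a minimal, assumed shape for illustration.
function filterJobsByStatus(jobs, status) {
  if (!status) return jobs; // no filter: return all recent jobs
  return jobs.filter((job) => job.status === status);
}

// Sketch: Jobs → Cancel only applies to jobs that have not finished.
function canCancel(job) {
  return job.status === "pending" || job.status === "processing";
}

const jobs = [
  { id: "k57a", status: "completed" },
  { id: "k58b", status: "processing" },
  { id: "k59c", status: "completed" },
];
const completedJobs = filterJobsByStatus(jobs, "completed");
```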

Quickstart: Scrape LinkedIn Profiles

Build a workflow that scrapes LinkedIn profiles and outputs the data as individual n8n items.

  1. Add a Scrapernode LinkedIn Profiles node

     Search for "Scrapernode LinkedIn" in the node palette and add it to your canvas.

  2. Select the Create operation

     This is the default. It creates a new scrape job.

  3. Enter profile URLs

     Paste LinkedIn profile URLs, one per line, e.g. https://linkedin.com/in/satyanadella

  4. Enable "Wait for Completion"

     Toggle this on to have the node poll until the job finishes and return each profile as a separate output item.

  5. Connect to your destination

     Wire the output into Google Sheets, Airtable, a database, Slack — or any of n8n's 1,000+ integrations.

Output per profile:

```json
// Example output item (one per profile)
{
  "_jobId": "k57a8b3c9d0e1f2g3h4",
  "_scraperId": "linkedin-profiles",
  "name": "Satya Nadella",
  "headline": "Chairman and CEO at Microsoft",
  "location": "Redmond, Washington",
  "about": "...",
  "experience": [...],
  "education": [...]
}
```
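Because each profile arrives as its own n8n item, you can reshape the data in a Code node before writing it anywhere. A minimal sketch, keeping only the columns a spreadsheet might need (field names follow the example output above):

```javascript
// Sketch: reshape Scrapernode output items in an n8n Code node.
// `items` is the array n8n passes into a Code node; each item wraps
// its data in a `json` property.
function toSheetRows(items) {
  return items.map((item) => ({
    json: {
      jobId: item.json._jobId,
      name: item.json.name,
      headline: item.json.headline,
      location: item.json.location,
    },
  }));
}

// In a Code node you would simply end with: return toSheetRows(items);
const rows = toSheetRows([
  {
    json: {
      _jobId: "k57a8b3c9d0e1f2g3h4",
      name: "Satya Nadella",
      headline: "Chairman and CEO at Microsoft",
      location: "Redmond, Washington",
    },
  },
]);
```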

Operations Reference

Create (Scraper Nodes)

| Parameter | Type | Description |
| --- | --- | --- |
| URLs | string | URLs to scrape, one per line or comma-separated. |
| Job Name | string | Optional label for the job. |
| Wait for Completion | boolean | Poll until done and return results as items. |
| Polling Interval | number | Seconds between status checks (default: 10). |
| Max Wait Time | number | Timeout in seconds (default: 300). |
| Result Limit | number | Max results to fetch after completion (default: 1000). |
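The URLs parameter accepts either one-per-line or comma-separated input, and invalid URLs are rejected before a job is created. A sketch of how that input might be parsed into a request body (the body shape itself is an assumption for illustration):

```javascript
// Sketch: assemble a Create request from the URLs and Job Name
// parameters. The request body shape is an illustrative assumption.
function buildCreateBody(urlsInput, jobName) {
  // URLs may arrive one per line or comma-separated; accept both.
  const urls = urlsInput
    .split(/[\n,]/)
    .map((u) => u.trim())
    .filter(Boolean);
  // Mirror the node's URL validation: http:// or https:// only.
  if (urls.some((u) => !/^https?:\/\//.test(u))) {
    throw new Error("Invalid URL: must start with http:// or https://");
  }
  const body = { urls };
  if (jobName) body.name = jobName; // Job Name is optional
  return body;
}

const body = buildCreateBody(
  "https://linkedin.com/in/satyanadella\nhttps://linkedin.com/in/sundarpichai",
  "CEO batch"
);
```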

Get / Get Results

| Parameter | Type | Description |
| --- | --- | --- |
| Job ID | string | The ID returned when the job was created. |
| Return All | boolean | Fetch all results (up to 10,000) instead of a limited set. |
| Limit | number | Max results to return (default: 50). Shown when Return All is off. |
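The interaction between Return All and Limit reduces to a single rule, sketched below (the 10,000 cap and the default of 50 come from the table above; the function itself is illustrative):

```javascript
// Sketch: resolve how many results Get Results should fetch.
// Return All overrides Limit, subject to the 10,000-result cap.
function resolveResultLimit(returnAll, limit = 50) {
  const HARD_CAP = 10000;
  if (returnAll) return HARD_CAP;
  return Math.min(limit, HARD_CAP);
}
```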

Wait for Completion

When enabled, the node polls your job status at the configured interval until the job completes, then fetches all results and outputs each record as a separate n8n item.

Holds an n8n worker. While polling, the node occupies a workflow execution slot. For long-running jobs (50+ URLs), prefer this pattern instead:

  1. Create the job without waiting

     Use a scraper node with "Wait for Completion" off. It returns immediately with a jobId.

  2. Schedule a check

     Use a Schedule Trigger (e.g. every 2 minutes) connected to a Scrapernode Jobs → Get node.

  3. Branch on status

     Use an IF node to check if status === "completed", then fetch results with Scrapernode Jobs → Get Results.

If the job doesn't complete within the max wait time, the node returns a timeout object with the jobId so you can check later.
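The poll-until-done-or-timeout behavior described above can be sketched as a small loop. The status and result calls are injected here so only the control flow is shown; the "failed" status and the exact API surface are assumptions for illustration:

```javascript
// Sketch of the Wait for Completion loop: poll at the configured
// interval until the job completes or the max wait time elapses.
// On timeout, return an object carrying the jobId so the caller can
// check again later, matching the node's documented behavior.
async function waitForCompletion(jobId, { getStatus, getResults,
    pollIntervalMs = 10000, maxWaitMs = 300000, sleep = delay }) {
  const deadline = Date.now() + maxWaitMs;
  while (Date.now() < deadline) {
    const { status } = await getStatus(jobId);
    if (status === "completed") {
      return { jobId, results: await getResults(jobId) };
    }
    if (status === "failed") throw new Error(`Job ${jobId} failed`);
    await sleep(pollIntervalMs);
  }
  // Timed out: hand back the jobId instead of throwing.
  return { jobId, timedOut: true };
}

function delay(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
```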

Consider webhooks instead. Rather than polling, you can pass a webhookUrl when creating a job and use n8n's Webhook Trigger to receive a push notification when the job finishes.

AI Agent Compatibility

Every scraper node has usableAsTool: true with a focused description optimized for AI tool selection.

When used with n8n's AI Agent node, the LLM can select the correct scraper based on the user's request. Each node's description is written to minimize ambiguity — the agent won't confuse LinkedIn Profiles with LinkedIn Posts.

Error Handling

The nodes handle transient failures automatically and surface clear error messages for issues you need to fix.

Automatic Retries

HTTP 5xx, 429 (rate limit), and network errors are retried up to 3 times with exponential backoff (2s, 4s, 8s).
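That retry policy can be sketched as a wrapper: retry only retryable outcomes, double the delay each attempt, and give up after three retries. The request and sleep functions are injected so the policy itself is the focus:

```javascript
// Sketch of the documented retry policy: retry HTTP 5xx, 429, and
// network errors up to 3 times with exponential backoff (2s, 4s, 8s).
async function withRetries(request, { retries = 3, baseDelayMs = 2000,
    sleep = (ms) => new Promise((r) => setTimeout(r, ms)) } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      const res = await request();
      const retryable = res.status === 429 || res.status >= 500;
      if (!retryable) return res;
      if (attempt >= retries) return res; // out of retries: surface it
    } catch (err) {
      // Network error: rethrow only once retries are exhausted.
      if (attempt >= retries) throw err;
    }
    await sleep(baseDelayMs * 2 ** attempt); // 2s, 4s, 8s
  }
}
```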

Continue on Fail

Enable Continue on Fail in node settings to convert errors into output items with an error field instead of stopping the workflow.
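In effect, Continue on Fail turns a per-item exception into a regular output item. A minimal sketch of that behavior (the item and error shapes are assumptions for illustration):

```javascript
// Sketch: with Continue on Fail enabled, a failing item becomes an
// output item carrying an `error` field; otherwise the error aborts
// the whole run, which is n8n's default behavior.
function processWithContinueOnFail(items, scrapeOne, continueOnFail) {
  const out = [];
  for (const item of items) {
    try {
      out.push({ json: scrapeOne(item) });
    } catch (err) {
      if (!continueOnFail) throw err; // default: stop the workflow
      out.push({ json: { ...item.json, error: err.message } });
    }
  }
  return out;
}
```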

Common Errors

| Error | Cause | Fix |
| --- | --- | --- |
| Invalid URL | URL doesn't start with http:// or https:// | Check URL format. |
| Wrong platform | URL domain doesn't match the scraper | Use the correct platform node. |
| Invalid API key | Key is missing, expired, or malformed | Regenerate key in Settings → API Keys. |
| Insufficient credits | Not enough credits for the job | Purchase credits in the Scrapernode dashboard. |
| Rate limit exceeded | Too many requests in a short window | Wait for Retry-After seconds, then retry. |
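The first two checks in the table are plain URL validation: scheme first, then platform domain. A sketch of how a node might implement them (the domain map is a partial, illustrative subset, not the node's actual lookup):

```javascript
// Sketch of the two URL checks above. The slug → domain map is an
// illustrative subset; the real node covers all 23 platforms.
const PLATFORM_DOMAINS = {
  "linkedin-profiles": "linkedin.com",
  "instagram-profiles": "instagram.com",
  "github-repositories": "github.com",
};

function validateUrl(url, scraperSlug) {
  // Check 1: scheme must be http:// or https://.
  if (!/^https?:\/\//.test(url)) {
    return { ok: false, error: "Invalid URL" };
  }
  // Check 2: the hostname must belong to the scraper's platform.
  const host = new URL(url).hostname.replace(/^www\./, "");
  const expected = PLATFORM_DOMAINS[scraperSlug];
  if (expected && !host.endsWith(expected)) {
    return { ok: false, error: "Wrong platform" };
  }
  return { ok: true };
}
```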

FAQ

Do I need the community node?

No — you can use n8n's HTTP Request node with the REST API directly. The community node just makes it easier with dedicated UI fields, URL validation, built-in polling, and per-platform icons.

How do I handle large jobs (50+ URLs)?

Use Create without "Wait for Completion", then poll with a Schedule Trigger + Scrapernode Jobs → Get. This avoids tying up an n8n worker.

What does each scrape cost?

Credit costs are listed in the Available Nodes table above. Costs range from 2 credits (comments) to 5 credits (LinkedIn profiles/companies) per input record.

Does it work with n8n Cloud?

Yes. Community nodes are supported on n8n Cloud. Install through Settings → Community Nodes.

Can I use it with n8n's AI Agent?

Yes. Every scraper node has usableAsTool: true with optimized descriptions for accurate tool selection by LLMs.