Connect Scrapernode to 1,000+ Apps with n8n
The Scrapernode community node for n8n gives you drag-and-drop access to 23 platform-specific scraping nodes. No HTTP configuration needed — just pick a platform, paste URLs, and pipe the results into any other n8n node.
Installation
Install directly from your n8n instance. Works with both n8n Cloud and self-hosted.
1. Open Community Nodes: In n8n, go to Settings → Community Nodes → Install.
2. Enter the package name: Type n8n-nodes-scrapernode and click Install.
3. Restart if self-hosted: n8n Cloud handles this automatically. Self-hosted instances may need a restart for the nodes to appear.
Authentication Setup
Every Scrapernode node requires a Scrapernode API credential. You only need to set this up once.
1. Create an API key: Log in to the dashboard, go to Settings → API Keys, and click Create Key. Copy the key (it starts with sn_).
2. Add credential in n8n: Open any Scrapernode node → click the credential dropdown → Create New. Paste your API key into the "API Key" field.
3. Test the connection: n8n automatically verifies your key by checking your credit balance. A green checkmark confirms it works.
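If you stage the key elsewhere first (for example in an environment variable), a quick format check can catch copy-paste mistakes before the credential test fails. A minimal sketch, assuming only the documented sn_ prefix; the helper name and length check are illustrative:

```javascript
// Sanity-check a Scrapernode API key before saving it as an n8n credential.
// The only documented format detail is the "sn_" prefix; anything beyond
// that is an assumption.
function looksLikeScrapernodeKey(key) {
  return (
    typeof key === "string" &&
    key.startsWith("sn_") &&
    key.length > "sn_".length // the prefix alone is not a key
  );
}
```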
Available Nodes
Each platform has its own dedicated node with tailored input fields, URL validation, color-coded icons, and AI agent descriptions.
Per-Platform Scraper Nodes
Each node supports three operations: Create, Get, and Get Results.
| Node | Scraper Slug | Input | Credits |
|---|---|---|---|
| Scrapernode LinkedIn Profiles | linkedin-profiles | Profile URLs | 5 |
| Scrapernode LinkedIn Companies | linkedin-companies | Company URLs | 5 |
| Scrapernode LinkedIn Posts | linkedin-posts | Profile/Company/Post URLs | 3 |
| Scrapernode Instagram Profiles | instagram-profiles | Profile URLs | 4 |
| Scrapernode Instagram Posts | instagram-posts | Post URLs | 3 |
| Scrapernode Instagram Comments | instagram-comments | Post URLs | 2 |
| Scrapernode TikTok Profiles | tiktok-profiles | Profile URLs | 4 |
| Scrapernode TikTok Videos | tiktok-posts | Video URLs | 3 |
| Scrapernode Twitter/X Profiles | twitter-profiles | Profile URLs | 4 |
| Scrapernode Twitter/X Posts | twitter-posts | Post URLs | 3 |
| Scrapernode YouTube Channels | youtube-channels | Channel URLs | 4 |
| Scrapernode YouTube Comments | youtube-comments | Video URLs | 2 |
| Scrapernode Facebook Profiles | facebook-profiles | Profile URLs | 4 |
| Scrapernode Facebook Groups | facebook-groups | Group URLs | 3 |
| Scrapernode Indeed Jobs | indeed-jobs | Search URLs | 5 |
| Scrapernode Indeed Companies | indeed-companies | Company URLs | 5 |
| Scrapernode Glassdoor Companies | glassdoor-companies | Company URLs | 5 |
| Scrapernode Glassdoor Reviews | glassdoor-reviews | Company URLs | 3 |
| Scrapernode Glassdoor Jobs | glassdoor-jobs | Search URLs | 3 |
| Scrapernode Yelp Businesses | yelp-businesses | Business URLs | 4 |
| Scrapernode Yelp Reviews | yelp-reviews | Business URLs | 3 |
| Scrapernode GitHub Repositories | github-repositories | Repo URLs | 3 |
| Scrapernode Crunchbase Companies | crunchbase-companies | Company URLs | 5 |
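For budgeting, the table above can be turned into a quick cost estimate before you submit a job. A minimal sketch (the per-record costs come straight from the table; the helper itself is illustrative):

```javascript
// Credit cost per input record, taken from the table above.
const CREDITS = {
  "linkedin-profiles": 5, "linkedin-companies": 5, "linkedin-posts": 3,
  "instagram-profiles": 4, "instagram-posts": 3, "instagram-comments": 2,
  "tiktok-profiles": 4, "tiktok-posts": 3,
  "twitter-profiles": 4, "twitter-posts": 3,
  "youtube-channels": 4, "youtube-comments": 2,
  "facebook-profiles": 4, "facebook-groups": 3,
  "indeed-jobs": 5, "indeed-companies": 5,
  "glassdoor-companies": 5, "glassdoor-reviews": 3, "glassdoor-jobs": 3,
  "yelp-businesses": 4, "yelp-reviews": 3,
  "github-repositories": 3, "crunchbase-companies": 5,
};

// Estimate the credits a job will consume: cost per record times URL count.
function estimateJobCost(scraperSlug, urls) {
  const perRecord = CREDITS[scraperSlug];
  if (perRecord === undefined) throw new Error(`Unknown scraper: ${scraperSlug}`);
  return perRecord * urls.length;
}
```

For example, three LinkedIn profile URLs cost 3 × 5 = 15 credits.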
Scrapernode Jobs (Management Node)
The Scrapernode Jobs node handles cross-cutting job management and credit operations.
| Resource | Operation | Description |
|---|---|---|
| Job | Get | Get job status and details by ID |
| Job | Get Results | Retrieve scraped data from a completed job |
| Job | List | List recent jobs with optional status filter |
| Job | Cancel | Cancel a pending/processing job and refund credits |
| Credits | Get Balance | Check your current credit balance |
| Credits | Get Transactions | View your credit transaction history |
Quickstart: Scrape LinkedIn Profiles
Build a workflow that scrapes LinkedIn profiles and outputs the data as individual n8n items.
1. Add a Scrapernode LinkedIn Profiles node: Search for "Scrapernode LinkedIn" in the node palette and add it to your canvas.
2. Select the Create operation: This is the default. It creates a new scrape job.
3. Enter profile URLs: Paste LinkedIn profile URLs, one per line, e.g. https://linkedin.com/in/satyanadella
4. Enable "Wait for Completion": Toggle this on to have the node poll until the job finishes and return each profile as a separate output item.
5. Connect to your destination: Wire the output into Google Sheets, Airtable, a database, Slack, or any of n8n's 1,000+ integrations.
```json
// Example output item (one per profile)
{
  "_jobId": "k57a8b3c9d0e1f2g3h4",
  "_scraperId": "linkedin-profiles",
  "name": "Satya Nadella",
  "headline": "Chairman and CEO at Microsoft",
  "location": "Redmond, Washington",
  "about": "...",
  "experience": [...],
  "education": [...]
}
```

Operations Reference
Create (Scraper Nodes)
| Parameter | Type | Description |
|---|---|---|
| URLs | string | URLs to scrape, one per line or comma-separated. |
| Job Name | string | Optional label for the job. |
| Wait for Completion | boolean | Poll until done and return results as items. |
| Polling Interval | number | Seconds between status checks (default: 10). |
| Max Wait Time | number | Timeout in seconds (default: 300). |
| Result Limit | number | Max results to fetch after completion (default: 1000). |
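The parameters above map onto a job payload roughly as follows. A sketch with illustrative field names (they are not the node's internal schema); only the defaults and the URL-splitting behavior come from the table:

```javascript
// Normalize Create parameters into a job payload, applying the table's
// defaults. Field names here are illustrative, not the node's real schema.
function createJobOptions(params) {
  const urls = params.urls
    .split(/[\n,]/)                // URLs come one per line or comma-separated
    .map((u) => u.trim())
    .filter(Boolean);
  return {
    urls,
    jobName: params.jobName ?? null,
    waitForCompletion: params.waitForCompletion ?? false,
    pollingIntervalSeconds: params.pollingInterval ?? 10, // default: 10
    maxWaitSeconds: params.maxWaitTime ?? 300,            // default: 300
    resultLimit: params.resultLimit ?? 1000,              // default: 1000
  };
}
```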
Get / Get Results
| Parameter | Type | Description |
|---|---|---|
| Job ID | string | The ID returned when the job was created. |
| Return All | boolean | Fetch all results (up to 10,000) instead of a limited set. |
| Limit | number | Max results to return (default: 50). Shown when Return All is off. |
Wait for Completion
When enabled, the node polls your job status at the configured interval until the job completes, then fetches all results and outputs each record as a separate n8n item.
For long-running jobs, you can poll manually instead of blocking the workflow:

1. Create the job without waiting: Use a scraper node with "Wait for Completion" off. It returns immediately with a jobId.
2. Schedule a check: Use a Schedule Trigger (e.g. every 2 minutes) connected to a Scrapernode Jobs → Get node.
3. Branch on status: Use an IF node to check whether status === "completed", then fetch results with Scrapernode Jobs → Get Results.
If the job doesn't complete within the max wait time, the node returns a timeout object with the jobId so you can check later.
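The polling behavior can be sketched as a loop. This is a sketch, not the node's actual implementation: getStatus and getResults stand in for the real API calls, and sleep is injectable so the loop can be exercised instantly in a test:

```javascript
// Sketch of the "Wait for Completion" loop: poll until the job completes,
// fails, or the max wait time elapses. On timeout, return an object that
// still carries the jobId so the job can be checked later via Jobs → Get.
async function waitForJob(jobId, {
  getStatus,
  getResults,
  intervalSeconds = 10,
  maxWaitSeconds = 300,
  sleep = (s) => new Promise((r) => setTimeout(r, s * 1000)),
}) {
  let elapsed = 0;
  for (;;) {
    const status = await getStatus(jobId);
    if (status === "completed") return { jobId, items: await getResults(jobId) };
    if (status === "failed") throw new Error(`Job ${jobId} failed`);
    if (elapsed >= maxWaitSeconds) return { jobId, timedOut: true };
    await sleep(intervalSeconds);
    elapsed += intervalSeconds;
  }
}
```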
Alternatively, pass a webhookUrl when creating a job and use n8n's Webhook Trigger to receive a push notification when the job finishes.

AI Agent Compatibility
Every scraper node is registered with usableAsTool: true and a focused description optimized for AI tool selection. When used with n8n's AI Agent node, the LLM can select the correct scraper based on the user's request. Each node's description is written to minimize ambiguity: the agent won't confuse LinkedIn Profiles with LinkedIn Posts.
Error Handling
The nodes handle transient failures automatically and surface clear error messages for issues you need to fix.
Automatic Retries
HTTP 5xx, 429 (rate limit), and network errors are retried up to 3 times with exponential backoff (2s, 4s, 8s).
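That retry policy can be sketched as a wrapper function. The request callback stands in for an actual HTTP call, and sleep is injectable so the backoff schedule can be verified without waiting:

```javascript
// Retry transient failures (HTTP 5xx, 429, network errors) up to 3 times
// with exponential backoff: 2s, then 4s, then 8s. A sketch of the policy,
// not the node's actual implementation.
async function withRetries(request, {
  retries = 3,
  sleep = (s) => new Promise((r) => setTimeout(r, s * 1000)),
} = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await request();
    } catch (err) {
      const transient =
        err.status === 429 ||
        (err.status >= 500 && err.status < 600) ||
        err.code === "ECONNRESET";
      if (!transient || attempt >= retries) throw err;
      await sleep(2 ** (attempt + 1)); // 2s, then 4s, then 8s
    }
  }
}
```

Non-transient errors such as a 400 for a malformed URL are thrown immediately rather than retried.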
Continue on Fail
Enable Continue on Fail in node settings to convert errors into output items with an error field instead of stopping the workflow.
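The effect of Continue on Fail looks roughly like this. An illustrative sketch (scrapeOne stands in for a single scrape call, and the function names are hypothetical):

```javascript
// With Continue on Fail enabled, a failing input becomes an output item
// carrying an `error` field instead of stopping the workflow.
async function runWithContinueOnFail(items, scrapeOne, continueOnFail) {
  const out = [];
  for (const item of items) {
    try {
      out.push(await scrapeOne(item));
    } catch (err) {
      if (!continueOnFail) throw err; // default: stop the workflow
      out.push({ ...item, error: err.message });
    }
  }
  return out;
}
```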
Common Errors
| Error | Cause | Fix |
|---|---|---|
| Invalid URL | URL doesn't start with http:// or https:// | Check URL format. |
| Wrong platform | URL domain doesn't match the scraper | Use the correct platform node. |
| Invalid API key | Key is missing, expired, or malformed | Regenerate key in Settings → API Keys. |
| Insufficient credits | Not enough credits for the job | Purchase credits in the Scrapernode dashboard. |
| Rate limit exceeded | Too many requests in a short window | Wait for Retry-After seconds, then retry. |
FAQ
Do I need the community node?
No — you can use n8n's HTTP Request node with the REST API directly. The community node just makes it easier with dedicated UI fields, URL validation, built-in polling, and per-platform icons.
How do I handle large jobs (50+ URLs)?
Use Create without "Wait for Completion", then poll with a Schedule Trigger + Scrapernode Jobs → Get. This avoids tying up an n8n worker.
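If you also want to split a very large URL list across several Create calls, a simple batching helper works. The batch size of 50 echoes this FAQ's threshold and is adjustable; the helper itself is illustrative:

```javascript
// Split a large URL list into batches so each Create call stays manageable.
function chunkUrls(urls, size = 50) {
  const batches = [];
  for (let i = 0; i < urls.length; i += size) {
    batches.push(urls.slice(i, i + size));
  }
  return batches;
}
```

Feed the batches through n8n's Loop Over Items pattern, creating one job per batch and collecting the jobIds for later polling.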
What does each scrape cost?
Credit costs are listed in the Available Nodes table above. Costs range from 2 credits (comments) to 5 credits (LinkedIn profiles/companies) per input record.
Does it work with n8n Cloud?
Yes. Community nodes are supported on n8n Cloud. Install through Settings → Community Nodes.
Can I use it with n8n's AI Agent?
Yes. Every scraper node has usableAsTool: true with optimized descriptions for accurate tool selection by LLMs.