Creating a LinkedIn Job Scraper Workflow with Apify and n8n

Automating job searches on LinkedIn can save hours of manual work and help you stay on top of new opportunities. By combining Apify for data extraction with n8n for workflow automation, you can build a LinkedIn job scraping system that collects job listings and organizes them in Google Sheets.

Getting Started with Apify

Apify is a web scraping platform that offers ready-made scraping tools (called Actors) for many websites. To begin scraping LinkedIn jobs:

  1. Create an account on Apify
  2. Navigate to the Apify Store
  3. Search for “LinkedIn jobs scraper”
  4. Select a suitable tool from the search results

The LinkedIn jobs scraper tool allows you to configure various search parameters including:

  • Job title (e.g., “automation specialist”)
  • Location
  • Company name
  • Published date filter
  • Number of jobs to scrape
  • Work type (on-site, remote, hybrid)
  • Job type (full-time, part-time, contract, etc.)
  • Experience level

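In practice, the parameters above become a JSON input object that you pass to the actor. As a sketch, assuming illustrative key names (the exact input schema varies between scrapers in the Apify Store, so check your chosen actor's documentation before copying these keys):

```python
# Illustrative input for a LinkedIn jobs scraper actor.
# NOTE: the field names below are assumptions for illustration --
# the exact input schema depends on which actor you pick in the Apify Store.
run_input = {
    "title": "automation specialist",  # job title to search for
    "location": "United States",
    "companyName": [],                 # optionally restrict to specific companies
    "publishedAt": "past-24-hours",    # published date filter
    "rows": 50,                        # number of jobs to scrape
    "workType": "remote",              # on-site / remote / hybrid
    "contractType": "full-time",       # full-time / part-time / contract, etc.
    "experienceLevel": "mid-senior",
}
```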
Running Your First Scrape

After configuring your search parameters, click “Start” to begin the scraping process. The tool will crawl LinkedIn and collect job listings matching your criteria. The data extracted typically includes:

  • Job title
  • Job location
  • Posted date
  • Job URL
  • Company name
  • Company URL
  • Job description
  • Application count
  • Employment type
  • Seniority level
  • Job function
  • Industry information
  • Application URL

Setting Up n8n for Automation

To automate the scraping process, you’ll need to integrate Apify with n8n:

  1. Create a new workflow in n8n
  2. Add an HTTP Request node
  3. Configure the node to call the Apify API

For a more streamlined approach, you can import the cURL command from Apify’s documentation to create the HTTP request in n8n. You’ll need to provide:

  • Your Apify Actor ID (found in the URL of your Apify tool)
  • Your Apify API token (generated in Settings > API and Integrations)
  • JSON input defining your search parameters
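Outside n8n, the same call can be sketched in Python against Apify's REST API. The `run-sync-get-dataset-items` endpoint runs an actor and returns its results in one request; the actor ID, token, and input fields below are placeholders you must replace with your own values:

```python
import requests

APIFY_API_BASE = "https://api.apify.com/v2"

def build_run_url(actor_id: str, token: str) -> str:
    """Build the Apify endpoint that runs an actor and returns its dataset items."""
    return f"{APIFY_API_BASE}/acts/{actor_id}/run-sync-get-dataset-items?token={token}"

def scrape_jobs(actor_id: str, token: str, run_input: dict) -> list:
    """Run the actor synchronously and return the scraped job listings as JSON."""
    response = requests.post(build_run_url(actor_id, token), json=run_input, timeout=300)
    response.raise_for_status()
    return response.json()

# Usage (placeholders -- substitute your own actor ID and API token):
# jobs = scrape_jobs("username~linkedin-jobs-scraper", "apify_api_XXXX",
#                    {"title": "automation specialist", "rows": 25})
```

This mirrors what the HTTP Request node does inside n8n: a POST with your JSON input in the body and your API token as a query parameter.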

Storing Results in Google Sheets

After successfully scraping job listings, you can automatically add them to Google Sheets:

  1. Add a Google Sheets node to your n8n workflow
  2. Select the “Append Row in Sheet” operation
  3. Connect your Google account
  4. Select your spreadsheet and worksheet
  5. Map the job data fields to your spreadsheet columns
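The field mapping in step 5 can be sketched as a small function that turns one scraped job item into a spreadsheet row. The item keys here are assumptions for illustration (they depend on your actor's output schema):

```python
# Column order for the spreadsheet. The dict keys used below are
# illustrative -- match them to the actual fields your actor returns.
COLUMNS = ["Job title", "Company name", "Employment type", "Industry",
           "Job function", "Job description", "Job URL"]

def job_to_row(item: dict) -> list:
    """Map one scraped job item to a spreadsheet row; missing fields become empty cells."""
    return [
        item.get("title", ""),
        item.get("companyName", ""),
        item.get("employmentType", ""),
        item.get("industry", ""),
        item.get("jobFunction", ""),
        item.get("description", ""),
        item.get("jobUrl", ""),
    ]
```

Using `.get()` with a default keeps the row aligned with the columns even when a listing is missing a field, which is common in scraped data.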

The typical columns you might want to include are:

  • Job title
  • Company name
  • Employment type
  • Industry
  • Job function
  • Job description
  • Job URL

Testing Your Workflow

Once your workflow is set up, test it by clicking “Test Workflow” in n8n. The system will scrape LinkedIn based on your parameters and automatically add the results to your Google Sheets document.

You can easily modify the search parameters to target different job titles or locations, for example switching from “automation specialist” jobs in the United States to “web developer” jobs in the United Kingdom.
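Re-targeting the search is just a matter of changing the input object and re-running the workflow; for example (again with illustrative key names):

```python
# Original search parameters (illustrative key names).
run_input = {"title": "automation specialist", "location": "United States", "rows": 50}

# Point the same workflow at a different search; other settings are kept.
run_input.update(title="web developer", location="United Kingdom")
```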

Benefits of Automated Job Scraping

This automated workflow offers several advantages:

  • Save time by eliminating manual job searches
  • Stay updated with the latest job postings
  • Organize job listings in a structured format
  • Filter and analyze job opportunities more efficiently
  • Run multiple searches for different job types or locations

With this LinkedIn job scraper workflow, you can streamline your job search process and focus on applying to the most relevant opportunities rather than spending hours manually searching for them.