Automated B2B Lead Generation Using Hiring Signals (Intent Data)
Why job postings are the ultimate intent data
If you sell B2B software, consulting services, or agency work, timing is everything. Reaching out to a prospect right when they need your help drastically increases your conversion rates.
Job postings are one of the strongest forms of intent data.
- Are you a Salesforce consultancy? A company posting a job for a "Salesforce Administrator" is a highly qualified lead.
- Do you sell cloud security software? A company hiring a "Cloud Security Engineer" with "AWS" in the description has the exact problem you solve.
In this tutorial, we will use our jobdata API to automatically hunt for these hiring signals, extract the companies posting them, and format them into a clean list of warm B2B leads ready for your CRM.
Prerequisites
To follow along, you will need:
- An active API access pro subscription.
- Your jobdata API Key (Get one from your dashboard).
- A Python environment (an IPython shell or Jupyter Notebook is highly recommended for quick testing).
- The requests and pandas Python libraries installed (if you don't have them, run pip install requests pandas in your terminal).
Step 1: Set Up Your IPython Environment
Open your IPython shell or Jupyter Notebook and import the necessary libraries. We will set up our API key and define our search parameters.
For this example, let's pretend we are a Datadog implementation agency. We want to find companies that have posted jobs mentioning "Datadog" in the last 7 days.
import requests
import pandas as pd
from datetime import datetime, timedelta
# 1. Set your API credentials and base URL
API_KEY = "YOUR_API_KEY_HERE"
BASE_URL = "https://jobdataapi.com/api"
HEADERS = {
    "Authorization": f"Api-Key {API_KEY}",
    "Content-Type": "application/json"
}
# 2. Calculate the date for 7 days ago
seven_days_ago = (datetime.now() - timedelta(days=7)).strftime('%Y-%m-%d')
print(f"Searching for leads posting jobs since: {seven_days_ago}")
Step 2: Query the API for Hiring Signals
We will use the /jobs/ endpoint. By utilizing the full-text search parameter (description) and the freshness parameter (published_since), we can zero in on our exact target audience.
# 3. Define the search parameters
params = {
    "description": "Datadog",           # The technology or keyword we are targeting
    "published_since": seven_days_ago,  # Only recent jobs
    "country_code": "US",               # U.S. jobs/companies only
    "page_size": 100                    # Fetch up to 100 results per page
}
# 4. Make the API request
response = requests.get(f"{BASE_URL}/jobs/", headers=HEADERS, params=params)
if response.status_code == 200:
    data = response.json()
    jobs = data.get("results", [])
    print(f"Success! Found {data.get('count')} total job postings mentioning 'Datadog'.")
else:
    print(f"Error {response.status_code}: {response.text}")
Tip: If the API returns a massive number of jobs, you can easily implement pagination by checking the data.get("next") URL to loop through subsequent pages.
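The pagination tip can be sketched as a small helper. This is a generic sketch, assuming the API follows the standard paginated style the response already suggests (a "results" list plus a "next" URL); the page-fetching callable is injected so you can plug in your own authenticated request:

```python
def fetch_all_pages(fetch_page, first_url, max_pages=10):
    """Follow paginated 'next' links, collecting the 'results' from each page.

    fetch_page: a callable that takes a URL and returns the decoded JSON dict.
    max_pages: safety cap so a broken 'next' chain cannot loop forever.
    """
    results = []
    url = first_url
    for _ in range(max_pages):
        data = fetch_page(url)
        results.extend(data.get("results", []))
        url = data.get("next")
        if not url:
            break
    return results

# Hypothetical usage with the HEADERS defined earlier; response.url is the
# full URL (including query parameters) of the first request from Step 2:
# jobs = fetch_all_pages(
#     lambda url: requests.get(url, headers=HEADERS).json(),
#     response.url,
# )
```

Note the "next" URL already carries the query string, so follow-up requests only need the auth headers, not the original params.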
Step 3: Extract and Deduplicate the Leads
A single company might be on a hiring spree and post 5 different jobs mentioning "Datadog." We don't want 5 duplicate leads in our CRM; we just want the company's information and a reference to what they are hiring for.
Let's process the JSON payload to extract unique companies.
# 5. Process the jobs to extract unique company leads
leads = {}

for job in jobs:
    # 1. Safely handle the company object (it might be None)
    company = job.get("company")
    if not company or not isinstance(company, dict):
        continue

    company_name = company.get("name")
    if not company_name:
        continue

    # 2. If we haven't seen this company yet, add them to our leads dictionary
    if company_name not in leads:
        # 3. Safely handle location (handles both string and dict formats)
        location_data = job.get("location", "Remote/Unknown")
        if isinstance(location_data, dict):
            location_name = location_data.get("name", "Remote/Unknown")
        else:
            location_name = str(location_data)

        # 4. Extract the website (docs mention "website URL")
        website = company.get("website") or company.get("website_url") or "N/A"

        leads[company_name] = {
            "Company Name": company_name,
            "Website": website,
            "First Seen Job Title": job.get("title", "N/A"),
            "Job Location": location_name,
            "Date Posted": job.get("published", "N/A")
            # Note: To get 'Industry', you would take the company ID
            # and make a separate request to the /api/companies/{id}/ endpoint.
        }

print(f"Extracted {len(leads)} unique company leads.")
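The comment about 'Industry' hints at an enrichment step. Here is a minimal sketch of it: the /companies/{id}/ path comes from that comment, but the "id" key on the company object and the "industry" field in the response are assumptions you should verify against the API docs:

```python
import requests

def company_detail_url(base_url, company_id):
    """Build the per-company endpoint URL, e.g. .../companies/123/."""
    return f"{base_url.rstrip('/')}/companies/{company_id}/"

def fetch_company(company_id, base_url, headers):
    """Fetch the full company record from the companies endpoint."""
    resp = requests.get(company_detail_url(base_url, company_id), headers=headers)
    resp.raise_for_status()
    return resp.json()

# Hypothetical usage inside the dedup loop (field names are assumptions):
# record = fetch_company(company["id"], BASE_URL, HEADERS)
# leads[company_name]["Industry"] = record.get("industry", "N/A")
```

Since this adds one extra request per unique company, do it after deduplication, not per job posting.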
Step 4: Export to a CSV for Sales Outreach
Now that we have a clean dictionary of highly qualified leads, let's use pandas to format it into a tidy table. From here, you can export it to a CSV to hand off to your Sales Development Representatives (SDRs) or upload it directly to HubSpot or Salesforce.
# 6. Convert the dictionary of leads into a Pandas DataFrame
df_leads = pd.DataFrame.from_dict(leads, orient='index').reset_index(drop=True)
# Let's preview the first 5 leads right in our IPython shell
display(df_leads.head())
# 7. Export the leads to a CSV file
export_filename = f"datadog_intent_leads_{datetime.now().strftime('%Y%m%d')}.csv"
df_leads.to_csv(export_filename, index=False)
print(f"Successfully saved leads to {export_filename}! Ready for outreach.")
The Output
If you run df_leads.head() in Jupyter/IPython, you'll see a clean table looking something like this:
Company Name Website ... Job Location Date Posted
0 Clayco https://claycorp.com/ ... St. Louis, MO, United States 2026-03-26T15:18:03Z
1 American Express https://www.americanexpress.com/ ... US-New York-New York 2026-03-26T13:05:14Z
2 Spring Health https://www.springhealth.com/ ... San Francisco, CA (Hybrid) 2026-03-26T02:05:36Z
3 Imply https://imply.io/ ... Burlingame, California, United States 2026-03-26T00:22:16Z
4 Inter Miami CF https://www.intermiamicf.com/ ... Fort Lauderdale, FL, US 2026-03-25T23:46:06Z
Next Steps: Fully Automating Your Lead Machine
Congratulations! You just turned a job market API into a powerful intent-data engine.
To take this to the next level, you can:
- Automate the Script: Wrap this code in a simple AWS Lambda function or a local cron job that runs every Monday at 8:00 AM.
- Push Directly to your CRM: Instead of exporting to CSV, use the requests library to POST these leads directly into the HubSpot or Salesforce API.
- Trigger Email Alerts: Have the script send a Slack message or email to your sales team the moment a high-value company posts a relevant job.
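To sketch the alerting idea, here is a minimal Slack notification via an incoming webhook. The webhook URL is a placeholder you would create in your own Slack workspace, and the lead dict reuses the column names from Step 3:

```python
import requests

def format_lead_alert(lead):
    """Build the alert text for a new hiring-signal lead."""
    return (
        f"New hiring signal: {lead['Company Name']} is hiring "
        f"'{lead['First Seen Job Title']}' in {lead['Job Location']}"
    )

def notify_slack(webhook_url, lead):
    """Post the alert to a Slack incoming webhook."""
    resp = requests.post(webhook_url, json={"text": format_lead_alert(lead)}, timeout=10)
    resp.raise_for_status()

# Hypothetical usage (placeholder webhook URL):
# notify_slack("https://hooks.slack.com/services/XXX/YYY/ZZZ", leads["Clayco"])
```

Keeping the message formatting in its own function makes it easy to reuse the same text for an email alert instead.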