Automated Job Search
Automate your job search by monitoring company career pages for new postings and getting instant notifications when openings appear.
Quick Example
```python
import os

import requests

companies = [
    {"name": "OpenAI", "url": "https://openai.com/careers", "selector": ".job-title"},
    {"name": "Anthropic", "url": "https://www.anthropic.com/careers", "selector": ".job-listing"},
    {"name": "Google", "url": "https://careers.google.com/", "selector": ".gc-card__title"},
]

for company in companies:
    requests.post(
        "https://api.supacrawler.com/api/v1/watch",
        headers={"Authorization": f"Bearer {os.environ['SUPACRAWLER_API_KEY']}"},
        json={
            "url": company["url"],
            "frequency": "daily",
            "selector": company["selector"],
            "notify_email": "[email protected]",
            "webhook_url": "https://your-api.com/job-alerts",
            "webhook_headers": {"X-Company": company["name"]},
        },
    )
    print(f"✅ Monitoring {company['name']}")
```
Or with curl:

```bash
curl -X POST https://api.supacrawler.com/api/v1/watch \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://openai.com/careers",
    "frequency": "daily",
    "selector": ".job-title",
    "notify_email": "[email protected]"
  }'
```
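When a watch fires, Supacrawler calls your `webhook_url`. What your endpoint does with the notification is up to you; a common pattern is forwarding it to a Slack incoming webhook. The sketch below assumes your endpoint has already extracted a company name and a list of job titles from the notification (those parameters, and the helper itself, are illustrative, not part of the Watch API):

```python
import json
from urllib import request


def post_slack_alert(company, jobs, slack_webhook_url=None):
    """Build a Slack incoming-webhook message for new job postings.

    `company` and `jobs` are assumed to have been extracted from the watch
    notification by your own /job-alerts endpoint. If `slack_webhook_url`
    is given, the message is also POSTed to it.
    """
    text = "\n".join([f"*New postings at {company}*"] + [f"• {job}" for job in jobs])
    message = {"text": text}
    if slack_webhook_url:
        req = request.Request(
            slack_webhook_url,
            data=json.dumps(message).encode(),
            headers={"Content-Type": "application/json"},
        )
        request.urlopen(req)
    return message
```

Calling `post_slack_alert("OpenAI", ["Senior Engineer, Applied AI"])` returns the message dict without sending, which makes the formatting easy to test before wiring in a real Slack webhook URL.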
Filter by Keywords
```python
# Assumes an initialized Supacrawler client (see the SDK quick start).
def filter_relevant_jobs(url, keywords):
    """Scrape a careers page and keep only the lines matching any keyword."""
    result = client.scrape(url, format="markdown")
    relevant_jobs = []
    for line in result.content.split("\n"):
        if any(keyword.lower() in line.lower() for keyword in keywords):
            relevant_jobs.append(line)
    return relevant_jobs


keywords = ["Senior Engineer", "Machine Learning", "Python", "Remote"]
jobs = filter_relevant_jobs("https://company.com/careers", keywords)
```
Best Practices
- Daily monitoring is sufficient for most companies
- Use webhooks for instant Slack/Discord notifications
- Filter by location, role, and keywords
- Track application deadlines
- Monitor smaller companies for less competition
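To alert only on genuinely new postings rather than every page change, keep a record of titles you have already seen and diff each scrape against it. A minimal sketch using a local JSON file (the file name and structure are just this example's choice, not anything the API requires):

```python
import json
from pathlib import Path


def new_postings(company, titles, state_file="seen_jobs.json"):
    """Return only job titles not seen before, and update the stored record.

    State is a JSON object mapping company name -> sorted list of known titles.
    """
    path = Path(state_file)
    seen = json.loads(path.read_text()) if path.exists() else {}
    previous = set(seen.get(company, []))
    fresh = [title for title in titles if title not in previous]
    seen[company] = sorted(previous | set(titles))
    path.write_text(json.dumps(seen, indent=2))
    return fresh
```

On the first run every title is "new"; on later runs only additions come back, so repeated daily checks stay quiet until a posting actually appears.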