

Agentic AI · LLM Tooling · 2026

Hybrid Agentic Job Search Pipeline

Local Ollama + Anthropic API for Personal Automation

Product · Engineering

Problem

Hundreds of job postings, most irrelevant, each needing slightly different framing. Frontier LLMs can help — but running every step through a paid API is wasteful for easy tasks and uncomfortable for private data.

Approach

Split the agent across two backends: a local Ollama model for cheap, privacy-sensitive operations (parsing resumes, screening listings, drafting first passes) and the Anthropic API for the hard reasoning (tailored cover letters, structured comparisons).
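The two-backend split can be sketched as a single interface with two implementations. This is a minimal illustration, not the project's actual types: the struct names, the stubbed `Complete` bodies, and the prompts are all hypothetical, and a real version would call the Ollama HTTP API and the Anthropic API respectively.

```go
package main

import "fmt"

// Backend abstracts a chat-completion provider so the rest of
// the pipeline doesn't care where inference runs.
type Backend interface {
	Name() string
	Complete(prompt string) (string, error)
}

// LocalOllama stands in for a client talking to a local Ollama
// model; stubbed here so the example is self-contained.
type LocalOllama struct{}

func (LocalOllama) Name() string { return "ollama" }
func (LocalOllama) Complete(prompt string) (string, error) {
	return "[local draft] " + prompt, nil
}

// AnthropicAPI stands in for the paid frontier endpoint.
type AnthropicAPI struct{}

func (AnthropicAPI) Name() string { return "anthropic" }
func (AnthropicAPI) Complete(prompt string) (string, error) {
	return "[frontier answer] " + prompt, nil
}

func main() {
	var b Backend = LocalOllama{}
	out, _ := b.Complete("screen this listing")
	fmt.Println(b.Name()+":", out)
}
```

Because both backends satisfy the same interface, swapping one for the other is a routing decision, not a code change.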

Implemented in Go for a tight, fast control loop — tool-use, retrieval, and routing live in code, not in a prompt.

Cost-aware routing rules pick the cheapest backend that meets the quality bar for each step, so frontier calls are reserved for the work that actually needs them.
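A routing rule of that shape reduces to a small pure function. The sketch below is a simplification under assumed flags (`Sensitive`, `Hard` are illustrative fields, not the project's real schema): anything touching personal data, or easy enough for the local model, stays on Ollama; only hard, non-sensitive reasoning is sent to the paid API.

```go
package main

import "fmt"

// Step describes one pipeline task; the fields are hypothetical.
type Step struct {
	Name      string
	Sensitive bool // touches resume / personal data
	Hard      bool // needs frontier-level reasoning
}

// route picks the cheapest backend that meets the quality bar.
func route(s Step) string {
	if s.Sensitive || !s.Hard {
		return "ollama" // local: free and private
	}
	return "anthropic" // frontier: pay only where it matters
}

func main() {
	for _, s := range []Step{
		{"parse-resume", true, false},
		{"screen-listing", false, false},
		{"cover-letter", false, true},
	} {
		fmt.Println(s.Name, "->", route(s))
	}
}
```

Keeping the rule in code rather than in a prompt makes it deterministic, testable, and cheap to evaluate on every step.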

Results

Hybrid local+cloud agent · cost- and privacy-aware

  • Functional end-to-end personal agent (parse → screen → draft → review)
  • Hybrid routing: local-first, frontier where it matters
  • Privacy: resume + personal data never leave the local node
  • Cost: pay-per-token only on the steps that genuinely benefit from frontier models
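The parse → screen → draft → review flow above is, structurally, a chain of stages. A minimal sketch, assuming each stage is a function over the running context (real stages would invoke an LLM backend; these stubs just tag their input so the ordering is visible):

```go
package main

import "fmt"

// Stage is one step of the parse -> screen -> draft -> review loop.
type Stage func(string) string

// pipeline threads the input through each stage in order.
func pipeline(stages []Stage, input string) string {
	for _, s := range stages {
		input = s(input)
	}
	return input
}

// tag builds a stub stage that records its name on the input.
func tag(name string) Stage {
	return func(in string) string { return in + " -> " + name }
}

func main() {
	stages := []Stage{tag("parse"), tag("screen"), tag("draft"), tag("review")}
	fmt.Println(pipeline(stages, "posting"))
	// prints: posting -> parse -> screen -> draft -> review
}
```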

Stack

Agentic AI · LLM Tooling · Go

What I learned

Agent design is mostly routing and constraints, not prompting. The interesting question isn't 'can the LLM do this?' It's 'which LLM, on which machine, with which tools, under which budget?'

View on GitHub ↗