This note captures the state of AI engineering hiring on 2026-04-17, pulled directly from the AI Dev Jobs public API. The numbers are not a survey. They are a live, daily-refreshed index of what companies are actually posting to their own applicant tracking systems right now — scraped continuously from the AI Dev Jobs ATS source feed network, deduplicated, and canonicalized.

8,618
active AI/ML engineering roles open across 513 companies (ADB, 2026-04-17)
$213k
median advertised salary across the 3,402 roles that publish salary ranges
594
new roles posted in the last 7 days — sustained pace of ~85 per day

Top 10 hiring companies right now

The concentration at the top is striking. OpenAI, Anthropic, and Anduril alone account for 758 open roles — roughly 9% of the entire index. The top 10 companies account for 1,611 roles, or 18.7% of the market. This is a market with a long tail (503 companies below the top 10) but also with serious pockets of single-company acceleration.

| Company | Open roles | Avg salary |
| --- | --- | --- |
| OpenAI | 336 | $360,000 |
| Anthropic | 261 | $364,701 |
| Anduril | 161 | $209,561 |
| Nebius | 150 | $173,163 |
| Applied Intuition | 148 | $196,078 |
| Scale AI | 136 | $234,323 |
| Waymo | 112 | $251,023 |
| Graphcore | 110 | $246,594 |
| LILT | 103 | — |
| xAI | 94 | $282,213 |

The frontier labs (OpenAI, Anthropic, xAI) pay a premium of roughly $117k over the defense-tech, autonomy, and infrastructure players in the same leaderboard. That gap is the clearest signal in the data about where investor capital is being deployed most aggressively right now.
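The roughly $117k figure can be reproduced directly from the leaderboard; a minimal sketch using the table's published averages (LILT is excluded because it does not publish one):

```python
# Average advertised salaries from the leaderboard (LILT omitted: no salary published).
frontier = {"OpenAI": 360_000, "Anthropic": 364_701, "xAI": 282_213}
rest = {
    "Anduril": 209_561, "Nebius": 173_163, "Applied Intuition": 196_078,
    "Scale AI": 234_323, "Waymo": 251_023, "Graphcore": 246_594,
}

frontier_avg = sum(frontier.values()) / len(frontier)  # ≈ $335,638
rest_avg = sum(rest.values()) / len(rest)              # ≈ $218,457
premium = frontier_avg - rest_avg                      # ≈ $117,181
```

Note this is an unweighted average of company averages; weighting by open-role count would shift the gap somewhat.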

Top demanded skills

LLM work now dominates the index. 2,570 of 8,618 roles (29.8%) list llm as a tag. agents is close behind at 2,388 (27.7%), and generative-ai sits at 1,831 (21.2%). A year ago pytorch and deep-learning led by volume. The demand center of gravity has migrated up the stack — from model training to model orchestration and agent design.

| Tag | Role count | Avg salary |
| --- | --- | --- |
| llm | 2,570 | $244,363 |
| agents | 2,388 | $230,738 |
| generative-ai | 1,831 | $230,966 |
| distributed-systems | 1,398 | $253,698 |
| pytorch | 1,040 | $245,142 |
| fine-tuning | 799 | $246,807 |
| research | 724 | $273,682 |
| mlops | 597 | $224,485 |
| reinforcement-learning | 569 | $270,213 |
| gpu | 476 | $235,786 |

Research roles command the highest average salary ($273,682) among tags with 500+ roles, followed by reinforcement learning ($270,213) and distributed systems ($253,698). The premium for specialized, harder-to-hire skills is largely intact: training-infrastructure work (distributed systems, GPU) continues to outpay generic application tags, though MLOps ($224,485) is the exception, averaging below the llm and generative-ai tags.
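The "500+ roles" ranking in the paragraph above can be checked mechanically against the tag table; a short sketch:

```python
# (role_count, avg_salary) pairs copied from the tag table.
tags = {
    "llm": (2_570, 244_363), "agents": (2_388, 230_738),
    "generative-ai": (1_831, 230_966), "distributed-systems": (1_398, 253_698),
    "pytorch": (1_040, 245_142), "fine-tuning": (799, 246_807),
    "research": (724, 273_682), "mlops": (597, 224_485),
    "reinforcement-learning": (569, 270_213), "gpu": (476, 235_786),
}

# Rank tags with at least 500 open roles by average salary.
big_tags = {tag: salary for tag, (count, salary) in tags.items() if count >= 500}
ranked = sorted(big_tags, key=big_tags.get, reverse=True)
# ranked[:3] → research, reinforcement-learning, distributed-systems
```

The gpu tag (476 roles) falls just under the 500-role cutoff and so does not appear in the ranking.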

Salary distribution

Of the 3,402 roles that publish salary ranges, the distribution is concentrated just above the $200k line. The $200-250k band is the single largest bucket (1,044 roles, 30.7%), with $150-200k close behind (966 roles, 28.4%). Everything below $150k is a minority (367 roles combined, 10.8%), and roles above $300k are a meaningful but not overwhelming slice (457 roles, 13.4%).

| Range | Roles | Share |
| --- | --- | --- |
| Under $100k | 88 | 2.6% |
| $100k-$150k | 279 | 8.2% |
| $150k-$200k | 966 | 28.4% |
| $200k-$250k | 1,044 | 30.7% |
| $250k-$300k | 568 | 16.7% |
| $300k-$400k | 350 | 10.3% |
| $400k+ | 107 | 3.1% |
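These bands are how a posting's advertised salary midpoint would be bucketed; a sketch of the mapping, assuming half-open intervals (the boundary convention is not documented in the note, so that part is a guess):

```python
def salary_band(midpoint: float) -> str:
    """Map a salary midpoint (USD) to the report's distribution bands.

    Intervals are assumed half-open: a $200,000 midpoint falls in $200k-$250k.
    """
    edges = [
        (100_000, "Under $100k"), (150_000, "$100k-$150k"),
        (200_000, "$150k-$200k"), (250_000, "$200k-$250k"),
        (300_000, "$250k-$300k"), (400_000, "$300k-$400k"),
    ]
    for upper, label in edges:
        if midpoint < upper:
            return label
    return "$400k+"
```

For example, the $213k median lands in the $200k-$250k band, the index's largest bucket.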

Workplace mix

Onsite is still the largest category by volume (4,728 roles, 54.9%), but hybrid roles pay the highest on average: $253,384, versus $218,399 for onsite and $218,829 for remote. The ~$35k hybrid premium is real and worth pausing on: it suggests the companies paying the most for senior talent right now want people in the building at least part of the week. Remote pay tracks onsite almost exactly.

| Workplace | Roles | Share | Avg salary |
| --- | --- | --- | --- |
| Onsite | 4,728 | 54.9% | $218,399 |
| Remote | 2,301 | 26.7% | $218,829 |
| Hybrid | 1,589 | 18.4% | $253,384 |

The ecosystem side

Hiring demand is not the only signal. On the infrastructure side, NothingHumanSearch — an independent index of agent-ready web services — now tracks 5,578 sites with agent discovery files (llms.txt, OpenAPI, ai-plugin), of which 575 have a live-verified MCP server over JSON-RPC, and 4,795 publish an llms.txt. Developer tools (1,249 sites) and AI-native tools (822 sites) are the two largest categories. Read alongside the hiring data, these two indexes describe the same market from opposite ends: 8,618 humans being hired to build AI products, into a world where 5,578 services are already exposing themselves natively to AI agents.
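Read as tiers, the NHS signals nest: a discovery file (llms.txt, OpenAPI, ai-plugin) makes a site agent-discoverable, and a responding MCP server over JSON-RPC upgrades it to live-verified. A hypothetical tiering function illustrating that hierarchy (the field names are illustrative assumptions, not NHS's actual schema):

```python
def agent_readiness(site: dict) -> str:
    """Classify a probed site into the tiers described above.

    `site` holds boolean probe results; all key names are illustrative:
      has_llms_txt / has_openapi / has_ai_plugin -> static discovery files
      mcp_verified -> a live JSON-RPC MCP server answered the probe
    """
    if site.get("mcp_verified"):
        return "mcp-verified"
    if site.get("has_llms_txt") or site.get("has_openapi") or site.get("has_ai_plugin"):
        return "agent-discoverable"
    return "none"
```

Under this reading, the 575 MCP-verified sites are a subset of the 5,578 discoverable ones.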

The story the data tells: the stack is diversifying faster than headcount is. Agent frameworks, eval pipelines, MCP servers, vector infra, and MLOps tooling are all real sub-markets now. Companies that want to hire into this market need to be specific about which layer they are hiring for — generic "ML engineer" listings are competing against a labor pool that self-identifies by framework and problem domain.

Methodology

Data in this note was pulled live at publication. The aidevboard.com index scrapes applicant-tracking feeds (Ashby, Greenhouse, Lever, Workday, custom careers pages) on a daily cron, canonicalizes titles and tags with a rules-based classifier, and dedupes by (company, title, location). The /api/v1/stats endpoint is public and unauthenticated. NHS data is from /digest.json — an index that live-probes sites for agent-discovery signals and MCP endpoints. Both APIs are agent-readable. This page auto-regenerates weekly.
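The dedupe step described above can be sketched as a first-wins pass over scraped postings. The case-folding and whitespace normalization here are assumptions; the note only specifies the (company, title, location) key:

```python
def dedupe_postings(postings: list[dict]) -> list[dict]:
    """Keep the first posting seen for each (company, title, location) key.

    Keys are trimmed and case-folded before comparison (an assumption:
    the index's exact normalization rules are not published).
    """
    seen: set[tuple[str, str, str]] = set()
    unique: list[dict] = []
    for posting in postings:
        key = (
            posting["company"].strip().casefold(),
            posting["title"].strip().casefold(),
            posting["location"].strip().casefold(),
        )
        if key not in seen:
            seen.add(key)
            unique.append(posting)
    return unique
```

Two feeds listing "OpenAI / ML Engineer / SF" and "openai / ml engineer / sf" would collapse to one row, while the same title in a different location survives as a separate posting.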

Download raw data: The top-hiring-companies leaderboard is mirrored as a public gist (CSV · Markdown · view on GitHub). It is auto-updated on every weekly regeneration; the canonical raw URLs are stable across revisions.

What's next

For the organizational implications of this hiring mix, specifically why the agents tag reaching 27.7% of the index matters more than the raw salary numbers, see The Agentic Accountability Gap and Beyond the Prompt. For what the 6% of companies actually capturing returns are doing differently, see The Six Percent. Full reading paths are at the Research Atlas.