Compute is the new equity
In a single week, Nvidia proposed paying engineers half their salary in AI inference tokens, OpenAI announced plans to nearly double its headcount to 8,000, and data showed Anthropic capturing 73% of new enterprise AI spending with a fraction of the workforce. The AI industry is splitting into three competing theories of value: raw compute, human talent, and enterprise relationships. Jensen Huang's token paycheck is the clearest signal yet that the boundaries between them are collapsing: when your raise comes in tokens and your productivity is measured in inference, the employment contract is being rewritten in real time.
CNBC
Nvidia pitches AI tokens worth half of salary as a new form of engineer compensation
Jensen Huang announced at GTC that Nvidia engineers will receive an annual inference budget worth roughly 50% of their base salary ($100K–$150K in compute credits), envisioning a future where 50,000 employees become 500,000 by augmenting each person with AI agents.
cnbc.com
The AI industry can't decide what's worth paying for — and that indecision is the story.
In the same week, Jensen Huang told his engineers their raise would come in tokens, Sam Altman's company started hiring 12 people a day, and data showed Anthropic eating OpenAI's lunch with a fraction of the staff. Three companies, three theories of value, all apparently correct at the same time.
Start with the most provocative signal. CNBC reported that Nvidia engineers will receive annual inference budgets worth $100K–$150K (roughly half their base salary) in AI compute credits. Huang's framing was characteristically blunt: give 50,000 employees 250,000 tokens each, and you get the output of 500,000. The implication is that compute is compensation, not in some abstract future but on next quarter's pay stub. When your employer values your AI budget at the same order of magnitude as your salary, the relationship between human labour and machine output isn't theoretical anymore. It's a line item.
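The pitch reduces to two numbers, and the back-of-envelope maths is worth making explicit. A minimal sketch, assuming an illustrative base salary of $250K (the salary figure is an assumption; only the 50% ratio, the $100K–$150K credit range, and the 50,000-to-500,000 claim come from the reporting):

```python
# Back-of-envelope on Nvidia's token-compensation pitch.
# The base salary below is an illustrative assumption; the 50% credit
# ratio and the 50K -> 500K "effective headcount" claim are from the article.

base_salary = 250_000          # assumed engineer base salary (illustrative)
credit_ratio = 0.50            # inference budget pitched at half of base salary
inference_budget = base_salary * credit_ratio  # 125,000 -- inside the reported $100K-$150K range

employees = 50_000
claimed_effective = 500_000    # Huang's claim: agents make 50K output like 500K
multiplier = claimed_effective / employees     # implied 10x productivity per person

print(f"Inference budget: ${inference_budget:,.0f}")
print(f"Implied productivity multiplier: {multiplier:.0f}x")
```

The point of the arithmetic: for the pay stub to make sense as compensation rather than a perk, the employer is implicitly pricing each engineer's agent leverage at an order of magnitude, which is the claim the rest of the piece tests.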
Now set that against OpenAI's approach, which could not be more different. CNBC reported that OpenAI plans to nearly double its headcount from 4,500 to 8,000 by year-end, adding roughly a dozen employees every day. The company is leasing over a million square feet of San Francisco office space. In a world where Nvidia's pitch is "fewer people, more tokens," OpenAI is making the opposite bet: more people, faster. New roles span product development, engineering, sales, and a "technical ambassador" position designed to embed OpenAI inside enterprise customers. The hiring blitz coincides with plans for a Q4 2026 IPO at a valuation of up to $1 trillion, with CEO of Applications Fidji Simo telling staff that ChatGPT must become a "productivity tool" and that the company is merging ChatGPT, Codex, and Atlas into a single desktop superapp. OpenAI's theory of value is headcount and product surface area. Growth now, margins later.
Then there's Anthropic, which is quietly invalidating both playbooks. Axios reported that Anthropic now captures 73% of spending among companies buying AI tools for the first time, up from a 50/50 split with OpenAI just ten weeks ago. It commands 40% of total enterprise LLM spending versus OpenAI's 27%. Eight of the Fortune 10 use Claude. Annualised revenue is approaching $20 billion. Anthropic has done this without an IPO roadshow, without 8,000 employees, and without paying anyone in inference tokens. Its theory of value is enterprise trust: reliability, safety positioning, and the kind of institutional relationships that compound.
What this means for builders
The pattern here is that the AI industry is running three experiments simultaneously. Nvidia says the unit of value is compute, and humans are the orchestration layer. OpenAI says the unit of value is talent, assembled at speed and pointed at product. Anthropic says the unit of value is the customer relationship, and everything else follows from winning it.
I think the interesting question is which of these theories survives contact with a downturn. Compute credits on a pay stub are worth exactly what the market says they're worth on any given Tuesday. A headcount assembled at IPO speed requires revenue to justify the burn. Enterprise relationships hold up well in recessions but require the product to keep winning benchmarks.
For anyone building products on top of these platforms, the practical takeaway is simpler: the companies providing your infrastructure don't agree on what they're selling, and those competing theories will determine pricing, reliability, and vendor lock-in for the next several years. Choose accordingly.
Read the original on CNBC