The All-In Podcast
Iran War, Oil Shock, Off Ramps, AI's Revenue Explosion and PR Nightmare

Episode Summary
AI-generated · Apr 2026 · This summary may contain inaccuracies and is not a substitute for the full episode or professional advice.
Brad Gerstner, managing partner at Lightspeed Venture Partners, joins the All-In panel to discuss the geopolitical and economic reverberations of the ongoing "Iran War," its impact on global oil markets, and the startling revenue growth and public relations challenges facing leading AI companies such as OpenAI and Anthropic. The conversation also touches on the controversial "millionaire tax" and its potential effects on wealth migration and state economies. Gerstner, an early investor in several prominent AI firms, offers an insider's perspective on the industry's rapid scaling and its underlying economic drivers.
The panel unpacks the volatility in Brent crude oil, which spiked from $84 to $119 per barrel in recent days, and Goldman Sachs's broader economic forecasts: PCE inflation raised to 2.9% and GDP lowered by 30 basis points. They characterize the "Trump doctrine" as pragmatic and focused on degrading threats rather than promoting democracy, suggesting a shorter conflict. Sax lays out frightening escalation risks, from closure of the Strait of Hormuz and attacks on Gulf state oil infrastructure to the potential destruction of desalination plants critical for 100 million people, and even Israel's possible use of nuclear weapons. Chamath and Brad argue that China, which relies on the Middle East for 20% of its oil consumption, has a strong incentive to negotiate an off-ramp during President Trump's upcoming summit with Xi Jinping, especially given China's 25% youth unemployment rate.
The discussion pivots to the "nuclear moment" in AI revenue, with Anthropic hitting a $14 billion run rate (12x year-over-year growth) and OpenAI reaching a $20 billion annualized run rate. Brad attributes this growth to models augmenting labor, not just displacing IT budgets, arguing that "the models and the agents are the dumbest today they will ever be" [30:36]. Chamath counters that much of this revenue is still "experimental" rather than integrated into critical production workflows, citing Amazon's need for human review after agent-written code caused "sev one" (severity-one) faults [33:39]. Sax highlights coding assistance as the breakout enterprise use case, addressing a long-standing shortage of software engineers, while Jason points to startups adopting LLMs for production tasks in legal, marketing, and HR. The segment closes with a deep dive into the AI industry's PR nightmare: doomerism, regulatory-capture strategies, and FUD from EA-funded think tanks have made AI poll worse in the U.S. than the Democratic Party and even some autocratic states, contributing to significant data center cancellations in states such as Virginia and Indiana.
Finally, the episode examines Washington State's millionaire tax (9.9% on income over $1 million) and the immediate departure of Starbucks CEO Howard Schultz from the state. The panel critiques such state-level taxes as ineffective, drawing parallels to California's proposed "billionaire tax," which the Hoover Institution estimated would create a $25 billion hole. They discuss more severe national wealth-tax proposals from figures like Bernie Sanders and Ro Khanna, which they characterize as socialism and asset seizure, arguing that such policies drive capital flight and deter entrepreneurialism. The hosts challenge the prevailing narrative by emphasizing that solving core American problems like education, housing, and healthcare through entrepreneurial innovation, rather than wealth redistribution or foreign wars, is the path forward, with AI positioned as a tremendous enabler if allowed to thrive.
👤 Who Should Listen
- Investors interested in the economic and geopolitical implications of global conflicts and their impact on commodity prices.
- Entrepreneurs and investors tracking the rapid growth and challenges within the AI industry, particularly concerning revenue models and public perception.
- Policymakers and concerned citizens interested in the debate around wealth taxation, its economic effects, and potential for capital migration.
- Anyone following U.S. foreign policy debates and the strategic considerations behind military engagements.
- Software engineers and tech leaders curious about the 'experimental' versus 'production' quality of AI tools and the future of coding assistance.
- Individuals concerned about the societal impact of AI, including job displacement, wealth disparity, and the role of PR in shaping public opinion.
🔑 Key Takeaways
1. Brent crude oil prices have been highly volatile, spiking from $84 to $119 per barrel amid the "Iran War," prompting Goldman Sachs to raise its PCE inflation forecast to 2.9% and lower its GDP projection by 30 basis points.
2. President Trump's pragmatic "Trump doctrine" suggests a limited military objective of degrading threats, rather than regime change or democracy promotion, which could mean a shorter conflict.
3. The market's immediate drop in oil prices from $120 to $90 following Trump's statement about a swift end to the war indicates a belief among "sharps" that a sustained conflict is unlikely.
4. Escalation in the Iran conflict carries severe risks, including closure of the Strait of Hormuz, attacks on Gulf state oil infrastructure, destruction of vital desalination plants, and potential Israeli nuclear retaliation.
5. China has a significant economic incentive, given its 25% youth unemployment and reliance on Middle Eastern oil, to help find an off-ramp in the Iran conflict, potentially via a "grand bargain" with the U.S. at the upcoming Xi Jinping summit.
6. Leading AI companies such as Anthropic ($14 billion run rate) and OpenAI ($20 billion annualized run rate) are experiencing unprecedented revenue growth, driven by models augmenting human labor beyond traditional IT budgets.
7. Despite explosive revenue, a significant portion of AI usage remains "experimental" rather than integrated into critical, liability-sensitive production workflows, as shown by Amazon's need for human oversight after AI-generated code caused outages.
8. The AI industry faces a severe PR problem in the U.S., with "doomerism" messaging and regulatory-capture strategies eroding public trust and driving significant cancellations of data center projects (an estimated $120 billion in revenue lost across 2024-2025).
💡 Key Concepts Explained
Trump Doctrine
A foreign policy framework described as pragmatic, with limited goals focused on degrading threats to America's national security interests, rather than widespread democracy promotion. The panel suggests this doctrine would favor a short-duration military engagement, like the "Iran War," to achieve specific objectives and then withdraw.
Experimental Run Rate Revenue vs. Annual Recurring Revenue (ARR)
A distinction made to assess the quality and durability of revenue, particularly in the nascent AI industry. 'Experimental run rate revenue' refers to income derived from pilot projects, testing, or non-critical use cases, which may not be sustained. 'Annual recurring revenue' (ARR) signifies revenue from deeply integrated, critical production workflows that are expected to be long-term and reliable.
J-Curve of AI Investment
Refers to the significant upfront capital investment required for AI infrastructure (like gigawatt-scale data centers), which may lead to several years of losses or break-even before substantial profitability is achieved. The panel estimates a $50 billion investment per gigawatt data center with a 5-6 year payback period before profit generation, though innovations like better silicon and open-source models could 'shrink' this curve.
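The panel's J-curve numbers can be sanity-checked with simple payback arithmetic. The sketch below uses only the figures quoted above ($50 billion per gigawatt-scale data center, a 5-6 year payback); the derived annual gross profit is an implication of those estimates, not a figure from the episode.

```python
# Back-of-envelope payback math for the J-curve described above.
# All inputs are the panel's estimates; derived figures are implications only.

def payback_years(capex: float, annual_gross_profit: float) -> float:
    """Years of operation for cumulative gross profit to cover upfront capex."""
    if annual_gross_profit <= 0:
        raise ValueError("gross profit must be positive to ever pay back")
    return capex / annual_gross_profit

capex = 50e9                                  # ~$50B per gigawatt data center
implied_annual_profit = capex / 5.5           # midpoint of the 5-6 year range
print(f"Implied annual gross profit: ${implied_annual_profit / 1e9:.1f}B")
print(f"Payback at that rate: {payback_years(capex, implied_annual_profit):.1f} years")
```

Cheaper silicon or open-source models would raise the effective annual profit per site, which is what "shrinking" the J-curve means in this framing.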
Regulatory Capture through Doomerism
A strategy discussed where some AI industry leaders intentionally promote exaggerated fears ('doomerism') about AI's dangers. This could serve fundraising purposes but also aims to create public demand for regulation, which the same companies then seek to influence or control through licensing schemes, effectively 'capturing' the regulatory framework to their advantage.
⚡ Actionable Takeaways
- Monitor Brent crude oil prices and global geopolitical developments, as current volatility directly impacts inflation forecasts and the economic outlook.
- For businesses considering AI adoption, distinguish "experimental run rate revenue" from "annual recurring revenue" by assessing whether AI tools are integrated into critical, profit-generating production workflows or still in pilot phases.
- If you work in the AI industry, prioritize clear, honest, less fear-mongering communication about AI's capabilities and limitations to rebuild public trust and counter negative narratives.
- Entrepreneurs should focus on leveraging AI to solve fundamental American problems like education, housing, and healthcare by breaking regulatory barriers, rather than relying on government-led wealth redistribution.
- Policymakers should analyze the negative economic impacts of state-level wealth or millionaire taxes, as evidenced by capital flight and revenue shortfalls in states like California and Washington.
- If you are a startup, consider using open-source LLMs for up to 85% of token usage and reserving frontier models for tasks open-source cannot yet handle, to optimize cost and efficiency.
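The last takeaway can be sketched as a simple routing heuristic: send routine tasks to a cheap open-weights model and reserve the frontier model for hard ones. The model names, per-token prices, and "hard task" list below are illustrative placeholders, not figures or products from the episode.

```python
# Minimal sketch of cost-aware model routing, assuming hypothetical
# models, prices, and a caller-supplied task classification.

OPEN_MODEL = ("open-weights-llm", 0.20)       # hypothetical $/1M tokens
FRONTIER_MODEL = ("frontier-api", 8.00)       # hypothetical $/1M tokens
HARD_TASKS = {"multi-step-reasoning", "novel-code-architecture"}

def route(task_type: str) -> tuple[str, float]:
    """Pick the cheap open model unless the task is on the 'hard' list."""
    return FRONTIER_MODEL if task_type in HARD_TASKS else OPEN_MODEL

def monthly_cost(workload: dict[str, float]) -> float:
    """workload maps task_type -> millions of tokens per month."""
    return sum(route(task)[1] * m_tok for task, m_tok in workload.items())

# Example mix: 85% of tokens on routine tasks, 15% on frontier-only tasks.
workload = {"summarization": 60.0, "classification": 25.0,
            "multi-step-reasoning": 15.0}
print(f"Blended monthly cost: ${monthly_cost(workload):,.2f}")
```

In this illustrative mix, 85 of 100 million monthly tokens run on the cheap model, so the blended cost is dominated by the 15% of traffic that still needs the frontier model.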
💬 Notable Quotes
- Brad Gerstner on AI models: "The models and the agents are the dumbest today they will ever be." [30:36]
- Chamath on AI revenue quality: "There's not a single good example that we can find of sustained positive margin expansion and impact of AI inside of a true corporate enterprise that is not right now a small test." [31:54]
- Jason on state wealth taxes: "Until you get fraud out of the system, I don't think you have the moral high ground to raise taxes." [72:40]
- Sax on a federal wealth tax: "In roughly 20 years, the federal government's just going to take all of your money. I mean, that's it. Look, this is socialism." [73:40]