Live updates from Elon Musk and Sam Altman’s court battle over the future of OpenAI

In a high-profile 2026 courtroom hearing, Elon Musk took the stand in his lawsuit against Sam Altman and other OpenAI founders, testifying about the nonprofit's origin, ownership and mission. Musk testified that he provided up to $38 million in early funding, recruited key researchers and intended OpenAI to be a 501(c)(3) charity focused on safety. The day's testimony covered email exhibits about equity splits, early fundraising ideas including a proposed cryptocoin, and conversations with Nvidia's Jensen Huang about early DGX hardware. The session ended with sharp exchanges over legal positions and warnings about the precedent that could be set if the jury sides with Altman and OpenAI.

Key takeaways

  • Elon Musk testified that he invested up to $38 million in OpenAI’s earliest phase and described himself as instrumental in recruiting core researchers, including Ilya Sutskever.
  • Musk said he envisioned OpenAI as a nonprofit 501(c)(3) that would accumulate reserves for safety work, and that he rejected turning the organization into a pure for-profit at founding.
  • Founders reportedly debated a four-way equal split (roughly 25% each); Musk maintained that equal ownership was unfair given his funding role and sought a larger initial stake that would dilute over time.
  • Emails entered into evidence show Musk solicited Nvidia founder Jensen Huang in 2016 to secure early DGX-class supercomputers for OpenAI, and Huang pledged priority access.
  • Musk recounted internal brainstorming about monetization, saying an ICO was discussed but that he opposed it as “scammy,” and that he was not categorically opposed to a small for-profit arm tied to a nonprofit.
  • On safety, Musk reiterated urgent concern about AGI, telling jurors he believed machines could match human general intelligence “as soon as next year,” a prediction preserved in the trial record as his testimony.
  • Judge YGR admonished OpenAI’s counsel for presenting inconsistent positions about the origin of the OpenAI name, instructing counsel not to take contradictory stances in court.
  • Sam Altman attended opening arguments but left during a break; Musk’s team proceeded with testimony and document presentation while the jury observed witnesses and exhibits.

Background

OpenAI was founded amid public debates about artificial intelligence risks and the best institutional form for safety-focused research. In 2015 the group announced a mission to develop beneficial AI; founders and early backers tried to reconcile open research principles with mechanisms to fund sustained work. From the start, the structure exposed tensions between a nonprofit mission and commercial incentives: founders considered various funding forms, including grants, corporate partnerships and hybrid approaches that would channel resources to the nonprofit.

Elon Musk was an early, visible backer who says he originated the name and concept for a nonprofit designed to prioritize safety over commercial gain. Relations later frayed as other founders and executives preferred structures and business models that allowed rapid development and capital flows — choices that ultimately led Musk to step away and later found xAI. The present litigation centers on those early decisions, whether founding commitments were altered without proper authority, and how equity and control were allocated among founders.

Main event

Musk’s testimony focused on a set of email and text exchanges entered as exhibits. He described repeated conversations with Sam Altman, Greg Brockman and Ilya Sutskever about funding models, ownership splits and governance. According to Musk, some cofounders pressed for a four-way equal distribution of equity; he objected, arguing his cash and recruiting contributions justified a larger stake that would dilute as the organization matured.

On fundraising mechanics, Musk told jurors the team brainstormed many ideas — from a small for-profit arm to more speculative proposals like a cryptocoin issuance. He said he opposed the ICO idea as “kinda scammy,” but did not rule out limited for-profit arrangements that would feed the nonprofit’s work, so long as commercial incentives did not dominate safety goals.

Musk sought to document his role in procuring early compute, reading into evidence a 2016 exchange with Jensen Huang of Nvidia in which Huang promised to prioritize OpenAI for early DGX systems. Musk framed that procurement as critical to accelerating research capability and as a concrete benefit tied to his involvement. He also reiterated that he believed OpenAI should remain a charity that “does not benefit any individual person,” emphasizing his view that transferring control would amount to wrongfully taking a philanthropic enterprise.

Several courtroom moments emphasized motive and tenor: Musk described long-standing worries about AI extinction risk and said he had discussed safety at length with others, including a meeting with President Barack Obama years earlier. He also stressed his work habits — claiming an 80–100 hour workweek across multiple companies — to underscore the depth of his personal commitment to technology and safety initiatives. At one point the judge sharply reproved OpenAI counsel for inconsistent legal positions about the origin of the OpenAI name, underscoring the trial’s attention to documentary detail.

Analysis & implications

Legally, the case will test whether early oral understandings and email discussions created enforceable rights over ownership, control or mission that the current OpenAI leadership allegedly frustrated. If the jury accepts Musk’s framing that he provided funding and shaped the nonprofit’s founding, that could support claims that changes in structure or control violated fiduciary expectations or donor intent. Conversely, OpenAI’s defense argues founding discussions were exploratory and that later governance choices were legitimate decisions by its board and leadership.

Beyond the courtroom, the dispute has broader implications for how AI organizations formalize mission, governance and commercial strategy. Hybrid models — nonprofit research arms paired with for-profit deployment entities — are increasingly common in AI. A ruling that constrains such transitions could make startups and investors more cautious about informal or verbal founding commitments, pushing teams to record governance terms earlier and more explicitly.

The trial also spotlights the symbolic stakes of AI safety advocacy. Musk’s testimony frames his involvement as principally safety-driven, and he warns that a verdict favoring Altman and OpenAI could create a legal precedent encouraging what he described as the “looting” of charities. That rhetoric raises the political temperature: regulators, donors and academic partners may re-evaluate their engagement strategies if the decision alters incentives around mission preservation.

Commercially, the proceedings could reshape competitive dynamics. The record on Nvidia access and early compute assistance underlines how supplier relationships and early hardware commitments can confer research advantages. If the jury credits Musk’s account of privileged access, businesses and hardware vendors may reexamine contractual clarity around priority allocations and partner commitments.

Comparison & data

| Proposal | Ownership implication | Notes |
| --- | --- | --- |
| Four-way equal split | ~25% per founder | Described in testimony as a cofounders’ demand Musk called “unfair.” |
| Musk-preferred structure | Higher initial stake for Musk, diluting over time | Presented by Musk as justified by his cash and recruitment contributions. |
| Nonprofit 501(c)(3) | No individual financial benefit | Musk testified he intended OpenAI as a charity accumulating reserves for safety work. |

Summary of ownership models discussed in trial testimony.

The table condenses competing early proposals recounted in court: an equal four-way split versus Musk’s preferred larger initial ownership that would dilute over time. The record ties the $38 million figure to Musk’s early funding claims, alongside email exhibits documenting both governance conversations and outreach to Nvidia for DGX hardware. These elements anchor the dispute in concrete acts (emails, fundraising pitches and recruitment) rather than pure recollection.

Reactions & quotes

The judicial admonition underscored the court’s concern for procedural precision: counsel for OpenAI was rebuked for offering inconsistent positions about the origin of the organization’s name, and the judge warned against taking contradictory stances in separate matters.

“Do not take inconsistent positions in front of me.”

Judge YGR (courtroom admonition)

The courtroom also heard pointed language from Musk asserting the moral stakes he attributes to the case. His characterization frames the dispute as more than a contract fight: it is a contest over how philanthropic missions are preserved.

“I have extreme concerns about AI.”

Elon Musk (testimony)

On the alleged risk of misappropriating a charitable entity, Musk warned of a wider precedent if the jury rules against him, arguing the decision could enable the improper transfer of charitable assets.

“It’s not okay to steal a charity.”

Elon Musk (testimony)

Unconfirmed

  • Exact enforceability of verbal or informal founder agreements: the court will determine whether early conversations legally bound later governance choices.
  • Musk’s AGI timing prediction that machines will be as smart as humans “as soon as next year” is a personal forecast, not an established fact.
  • The full scope of any informal promises made by Nvidia or other suppliers is documented in emails, but the long-term commercial terms and their enforceability remain subject to further evidence.

Bottom line

This trial is a rare, public unpacking of how an influential AI organization was conceived and how founders navigated competing values: open research, safety and commercial viability. The jury will have to weigh documentary evidence against competing recollections to decide whether early intentions were honored or altered in ways that breached obligations.

Beyond the legal outcome, the case is prompting the AI field to confront governance clarity: teams, funders and suppliers should formalize roles, equity and mission commitments earlier to avoid future disputes. For observers, the critical next steps are the jury’s decision and any appeals, which could reshape precedents affecting nonprofits, hybrid entities and large-scale AI projects.
