Sam Altman is arguing it’s not all about making money. Can that be a model for Silicon Valley?

Here’s the idea behind a “capped-profit” model.

Y Combinator chairman Sam Altman.
Matt Winkelmeyer/Getty Images for WIRED25

Sam Altman shocked Silicon Valley on Friday when he announced that he was stepping down as the president of Y Combinator, the Valley’s marquee startup accelerator.

And now we know exactly what he is trying to do next: building what he calls a “capped-profit” company that could become a model for Silicon Valley, where firms are struggling to determine whether they have any responsibilities to the world beyond making money for their shareholders.

Altman and the other leaders of OpenAI — a high-profile nonprofit that researches artificial general intelligence (AGI) and how to turn it into a force for good — are doing something pretty novel in the world of Silicon Valley startups: putting a maximum on the amount of money that its investors can make if the company is wildly successful.

OpenAI is launching a for-profit company, OpenAI LP, that will effectively replace the nonprofit but remain governed by it.

The big idea here — which companies from Facebook to Amazon would probably be wise to watch closely — is to protect the mission by erecting guardrails that keep the company from the overpursuit of money.

Altman, who is becoming OpenAI’s CEO, said it’s his “sincere hope” that Silicon Valley sees this as a model for how companies can balance profit with purpose. He told Recode that he wants OpenAI to release documents for a “capped-profit” structure that other companies can copy, too — calling it “a way to balance capitalism and sharing benefits.”

Now, to be clear, the guardrails are pretty low here: Investors can still make 100 times their money in the new for-profit entity. The company says it will need to raise billions of dollars, so that’s a lot of opportunities to strike it rich.

“That’s like saying we have a more than 1 in 100 chance of being successful in creating AGI, right? I’ll take that compliment!” Altman said. “This is a very, very hard thing to do.”
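The 100x cap can be sketched as a simple payout split. This is an illustration only, built on our own assumption that returns beyond the cap multiple revert to the nonprofit; the figures and the split logic are not OpenAI LP’s actual legal terms.

```python
# Illustrative sketch of a "capped-profit" payout (assumed mechanics, not
# OpenAI LP's disclosed terms): an investor's take is capped at a fixed
# multiple of their investment, and anything above the cap goes to the
# nonprofit that governs the company.

def capped_payout(investment: float, gross_return: float,
                  cap_multiple: float = 100.0) -> tuple[float, float]:
    """Split a gross return between the investor (up to the cap) and the nonprofit."""
    cap = investment * cap_multiple                  # most the investor can ever receive
    investor_share = min(gross_return, cap)          # investor paid up to the cap
    nonprofit_share = max(gross_return - cap, 0.0)   # excess flows back to the mission
    return investor_share, nonprofit_share

# A hypothetical $10M stake that returns 500x: the investor keeps 100x ($1B),
# and the remaining 400x ($4B) reverts to the nonprofit.
investor, nonprofit = capped_payout(10e6, 10e6 * 500)
```

Under this sketch, a merely successful outcome (say, 50x) never touches the cap, which is why the guardrail only binds in the wildly successful scenarios Altman describes.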

OpenAI LP is now structured so that the nonprofit’s charter is the governing document and should theoretically trump any financial motivations. That might be hard to do in practice, but the company says its investors and employees all signed a document stating that their “obligation to the Charter always comes first, even at the expense of some or all of their financial stake.”

The nonprofit’s board retains control of the company and oversees its work, too, which brings some comfort to its employees.

The organization’s mission is to make sure that artificial intelligence isn’t just another asset for a profit-seeking company that could prioritize research that is good for its bottom line but can be weaponized. And so the risk is that taking outside money would jeopardize that mission by making OpenAI just like any of the other startups and tech giants chasing the cash cow — a risk Altman has acknowledged.

“When I first heard about OpenAI LP, I was concerned, as OAI’s nonprofit status was part of what attracted me,” one OpenAI employee, Miles Brundage, wrote on Twitter. “But I’ve since come around about the idea for two reasons: 1. As I dug into the details, I realized many steps had been taken to ensure the org remains mission-centric. 2. It was reassuring that so many others at the org had raised such concerns already, as ultimately this is a matter of culture.”

Nonprofits using private-sector incentives to lure investment and improve company performance isn’t new. But what’s most compelling here is the reverse: the idea that for-profit companies could adopt the ethos of a nonprofit and “cap” their winnings, funneling back the remainder into the company.

“This is a hugely beneficial model for startups who wish to innovate for social good. The for-profit / nonprofit choice leaves founders conflicted, often asking the question ‘can i raise enough philanthropy?’” wrote Varun Arora, the CEO of education startup Open Curriculum. “Despite positive models in philanthropy for innovation (e.g. @gatesfoundation), they represent a small minority. Which means whether you are building a crucial healthcare or education or public services company, you often have no choice but to choose the for-profit model.”

The counterargument is obvious: Businesses exist to make money. Public companies have shareholders who buy stock because they want it to appreciate — and Americans’ 401(k) accounts might depend on it. Startups have venture capital backers who invest in risky ideas because they hope those ideas will make them money — which they can then return to college endowments, charities, hospitals, and other institutions that do good in the world.

But as technology invades the lives of people who aren’t shareholders, it raises questions about whether it should be accountable to a wider group of people.

Would Facebook be more of a force for good in the world if it were not profiting from a digital advertising behemoth built on a system that rapaciously collects any and all data, whatever the cost?

Sure, Juul was a phenomenal bet for early investors — but at the cost of addicting millions of teens to nicotine. Would venture capitalists back more humane products if there were an upper limit to the returns they could reap?

Silicon Valley is beginning to bat around these questions, and OpenAI’s announcement fits neatly into that vexed conversation. Facebook is changing its business model to emphasize privacy (which very well might imperil its profit engine). Investors have publicly criticized Juul and those who back it — a rare move of condemnation in the world of startups. And tech employees have confronted their employers over lucrative defense contracts with some success.

A capped-profit model seems like a pretty far stretch for Silicon Valley — which prizes squashing the competition and the invisible hand — but it is a product of the times.

“I certainly won’t invest in companies that will be successful but would be bad for the world. Sometimes I invest thinking it would be good for the world and I get it wrong. But I won’t go into it doing that,” Altman said on an episode of Recode Decode in December. “And I think that in the long run that does work and I think that it makes you more successful.”
