Acquisition terms and earn-out obligations
Krafton, the South Korean gaming company, acquired Unknown Worlds Entertainment in 2021 for roughly $500 million. The deal included an earn-out of up to $250 million, payable if Subnautica 2 met specified sales targets, and promised the studio continued day-to-day independence and security for its leadership.
The $250 million earn-out came with protections for Charlie Cleveland and Max McGuire, the studio’s co-founders, and CEO Ted Gill, who could not be removed without cause. Those protections took on real weight once the studio’s forecasts showed the bonus was likely to be earned.
Internal forecasts and executive concerns
As Subnautica 2 approached release, internal projections indicated the game was likely to hit those sales targets. Krafton CEO Changhan Kim came to see the earn-out as a liability and believed Krafton had been misled in the acquisition, and he began looking for ways to reduce or avoid the payment.
Krafton’s lawyers advised Kim that firing the studio’s leaders would not necessarily extinguish the $250 million obligation and could expose Krafton to legal liability and reputational damage. Despite that advice, he continued exploring ways to soften the financial hit.
Project X, ChatGPT advice, and leadership removals
The court found that Kim turned to ChatGPT for ideas. The chatbot initially said the earn-out would be hard to cancel, but later suggested forming an internal team, dubbed Project X, to gain leverage over the studio or take control of it.
Project X reportedly included securing publishing rights on platforms such as Steam, taking control of the game’s source code, reframing the dispute as a question of the game’s quality and fan reception, and preparing statements for the courts and the public. Krafton then removed the studio’s top leaders, a step the court later found wrongful.
The leadership changes alarmed players and developers alike, and concern spread across social media and the gaming community about whether the studio would survive. The court weighed that reaction in reaching its findings and in fashioning remedies.
Court ruling and remedies ordered
The Delaware Court of Chancery rebuked Krafton for its conduct, holding that corporate leaders must exercise their own judgment rather than delegate consequential decisions to an AI. Vice Chancellor Lori Will found the removals wrongful and ordered Krafton to reinstate the executives.
Ted Gill is once again CEO of Unknown Worlds and now has the authority to bring back the two founders. The court also extended the earn-out period to compensate for the disruption, preserving the studio’s rights under the original agreement while the dispute plays out.
Krafton says it disagrees with the ruling and is weighing its next steps while continuing work toward Subnautica 2’s early access release, to which it says it remains committed.
Implications for AI use and corporate governance
The case marks out legal and ethical boundaries for companies using AI in high-stakes planning. Courts expect executives to exercise independent judgment and to document legitimate, substantive reasons for leadership changes; simply executing a plan produced by an AI may not be enough.
For mergers-and-acquisitions teams, the ruling underscores the value of precise earn-out language and a sound governance framework for handling disputes. It also warns boards and CEOs that heavy-handed tactics can damage both employee morale and customer trust.
Regulators, investors, and corporate counsel will be watching whether the case spurs new rules on AI use in the boardroom. For now, the decision stands as a caution about how far ChatGPT-style advice can be trusted where contracts, shareholder accountability, and player communities are at stake.