🥟 Chao-Down #333 OpenAI disbands superalignment AI safety team; Sony Music Group pens open letter to tech companies warning against using their content to train AI; Why the obsession with AGI?
Plus, how web publishers are bracing for disruption as generative search answers upend internet traffic.
Less than a year after creating it, OpenAI is breaking up the team dedicated to making AI as safe as it can be.
Following the departures of OpenAI cofounder and chief scientist Ilya Sutskever and superalignment team lead Jan Leike, OpenAI has decided to disband the group entirely and redistribute its resources to other teams.
The Superalignment team was originally tasked with developing methods to ensure the safety and control of advanced AI systems. Since the very public leadership upheaval at OpenAI late last year, it’s come to light that there was significant disagreement over how to prioritize safety against the company’s economic and commercial opportunities.
Jan Leike even said on X that safety had “taken a backseat to shiny products” at OpenAI.
Now, with the release of GPT-4o and multiple tech giants racing to create the next “Her”-style voice assistant, who’s going to make sure these systems are safe for society to embrace?
Maybe Ilya and Jan will have something new to announce soon.
-Alex, your resident Chaos Coordinator.
What happened in AI? 📰
OpenAI team responsible for AI's existential dangers is disbanded (qz.com)
As Google AI search rolls out to more people, websites brace for carnage (The Washington Post)
Prepare to Get Manipulated by Emotionally Expressive Chatbots (WIRED)
Sony Music Group warns more than 700 companies against using its content to train AI (NBC News)
A.I.’s ‘Her’ Era Has Arrived (The New York Times)
Mistral's CEO Said the Obsession With AGI Is About Creating God (businessinsider.com)
Always be Learnin’ 📕 📖
How Generative AI Is Remaking UI/UX Design (a16z.com)
Evaluating Large Language Models using “CounterFactual Tasks” (AI: A Guide for Thinking Humans)
The Business Case for Longevity (NFX)
Projects to Keep an Eye On 🛠
MusicLang/musiclang_predict: MusicLang Predict, your controllable music copilot. (Github)
Mintplex-Labs/anything-llm: The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities. (Github)
archinetai/audio-diffusion-pytorch: Audio generation using diffusion models, in PyTorch. (Github)
The Latest in AI Research 💡
Fishing for Magikarp: Automatically Detecting Under-trained Tokens in Large Language Models (arxiv)
Rethinking Interpretability in the Era of Large Language Models (arxiv)
Direct Language Model Alignment from Online AI Feedback (arxiv)
The World Outside of AI 🌎
Tech Giants Start to Treat Southeast Asia Like Next Big Thing (Bloomberg)
The way whales communicate is closer to human language than we realized (MIT Technology Review)
Renters’ pessimism about owning a home hits new low, New York Fed says (qz.com)
Why During a Recession Is a Great Time to Start a New Company (WSJ)
How Scammers Are Stealing Money From Food Stamps Recipients (The New York Times)
Are recruiters better than a coin flip at judging resumes? Here's the data. (interviewing.io)