March 2026 AI links
1iii26
World Labs: Spatial Intelligence worldlabs.ai
...Marble, our first product, generates spatially consistent, high-fidelity, and persistent 3D worlds that you can move through, edit, and inhabit...
...World Labs was founded by visionary AI pioneer Fei-Fei Li along with Justin Johnson, Christoph Lassner, and Ben Mildenhall, each a world-renowned technologist in machine learning, generative AI, computer vision, and graphics. We bring together a world-class team, united by a shared curiosity, passion, and deep backgrounds in technology — from AI research to systems engineering to product design — creating a tight feedback loop between our cutting-edge research and products that empower our users.
Our first product, Marble, is powered by best-in-class generative 3D world models and enables anyone to create spatially cohesive, high-fidelity, and persistent 3D worlds from just a single image, video or text prompt.
The Economics of Technological Change Paul Krugman
...Vonnegut's fears about automation and employment were, in fact, largely right if we focus only on employment in manufacturing. In 1950 around 30 percent of U.S. workers were employed in manufacturing; it's less than 10 percent now. Trade deficits explain some of that decline, but mostly it reflects technological progress that made it possible to satisfy demand for manufactured goods with many fewer workers — just as we can now feed ourselves with only a small fraction of the work force employed on farms...
...The rise of information technology, however, created a new kind of monopoly power and a new class of robber barons. Where Standard Oil's market power rested on its control of physical infrastructure, mainly refineries, many of today's most valuable companies — Apple, Alphabet (Google), Microsoft, Amazon, Meta (Facebook) — are de facto monopolies or near-monopolies thanks to "network externalities." So are somewhat smaller but still big players like Uber or DoorDash (which dominates food delivery in much of the country, but not where I live). What the term network externalities means is that these companies offer products that many people use because so many other people use them. Few people love Word, PowerPoint or Excel, but most people, myself included, use them because they're so widely used that it's hard to do anything different. I am also to some extent a prisoner of the Apple and Amazon ecosystems. Network externalities create "moats" around a number of companies, which have allowed them to retain customers even when they charge high prices and degrade their product's quality. In 2023 the American Dialect Society named Cory Doctorow's term "enshittification" the word of the year.
AI Just Entered Its Manhattan Project Era Alberto Romero
...Even if nothing changes from this specific situation—even if the courts reverse the designation, the revenue impact is negligible, and everyone moves on in three weeks, as tends to happen with these things that feel huge in the moment but dissolve in the impossibly greater inertia of the world—these events reveal, indirectly, what I think is a very consequential shift in how we should think about the AI industry and AI as a technology. The US government just showed everyone what it can do, or, more importantly, what it's willing to do. That's AI transforming from mostly a technological matter to mostly a political matter, and not precisely in a democratic way (that's what Amodei was referring to when he said "disagreeing with the government is the most American thing in the world"). Of course, the US government wouldn't allow the AI industry to push forward without supervision forever. As soon as generative AI proved to be a significant geopolitical factor—and it's pretty clearly at that point—they'd seize control over it. Once they let Claude into their classified network and realized the level of capability frontier models have achieved, the fate of this technology was sealed. So even if "Anthropic vs the DoD" will be relegated to a footnote in history books, the higher-level shift from techne to politeia (with a sprinkle of kratos) is permanent: AI is no longer about the art of making models or nerds honing their tuning craft, but about the arena of political and geopolitical power. We're entering the Manhattan Project era of AI. I don't know if the US government will end up creating—or, failing that, nationalizing—a big AI project, but they are well aware that the technology has become quite useful and are acting accordingly.
"All Lawful Use": Much More Than You Wanted To Know Astral Codex Ten
America Isn't Ready for What AI Will Do to Jobs Josh Tyrangiel at The Atlantic
2iii26
Where Humane Failed, Qualcomm Imagines the Future Is Filled With AI Pins gizmodo
How to Bet Against the Bitter Lesson Tim O'Reilly
...It feels a bit like I'm assembling a picture puzzle where all the pieces aren't yet on the table. I am starting to see a pattern, but I'm not sure it's right, and I need help finding the missing pieces. Let me explain some of the shapes I have in hand and the pattern they are starting to show me, and then I want to ask for your help filling in the gaps....the work of Henry Farrell, Alison Gopnik, Cosma Shalizi, and James Evans. They make the case that large models should not be viewed primarily as intelligent agents, but as a new kind of cultural and social technology, allowing humans to take advantage of information other humans have accumulated. Yegge's observation fits right into this framework. Every new social and cultural technology tends to survive because it saves cognition. We learn from each other so we don't have to discover everything for the first time. Alfred Korzybski referred to language, the first of these human social and cultural technologies, and all of those that followed, as "time-binding."
...as Claude told me when I asked whether it was a worker or a tool, "I don't initiate. I've never woken up wanting to write a poem or solve a problem. My activity is entirely reactive — I exist in response to prompts. Even when given enormous latitude ('figure out the best approach'), the fact that I should figure something out comes from outside me."
...the idea that any knowledge that becomes available automatically becomes the property of any LLM is not foreordained. It is an artifact of an IP regime that the AI labs have adopted for their own benefit: a variation of the "empty lands" argument that European colonialists used to justify their taking of others' resources. AI has been developed in an IP wild west. That may not continue. The fulfillment of AI labs' vision of a world where their products absorb all human knowledge and then put humans out of work leaves them without many of the customers they currently rely on. Not only that, they themselves are being reminded why IP law exists, as Chinese models copy their advances by exfiltrating their weights. There is a historical parallel in the way that US publishing companies ignored European copyrights until they themselves had homegrown assets to protect.
What I'm starting to see are the first halting steps toward a new software ecosystem where the "programs" are mixtures of natural language and code, the "runtime" is a large language model, and the "users" are AI agents as well as humans.
OpenAI Leadership Defends Deal With Pentagon as Employees Wait in Limbo gizmodo