ai

404 Media
Google Books is indexing low-quality, AI-generated books that will turn up in search results, and could possibly impact Google Ngram Viewer, an important tool used by researchers to track language use throughout history.
AI is why we can’t have nice things. Also, maybe having a private, for-profit company organize the world’s information was a terrible idea. It makes decisions to maximize its profits, not to provide a data heritage for humanity.
Mystery AI Hype Theater 3000
A clearer-eyed view of what has happened in the last two years is that a few companies have amassed enormous amounts of data (mostly taken non-consensually) and the capital to buy the computing resources to train enormous models which, in turn, can be used to output facsimiles of the kinds of interactions represented in the data.
I think a better understanding of how generative AI products are produced would help clear up some of this magical thinking that’s happening.
Knowing Machines
LAION-5B is an open-source foundation dataset used to train AI models such as Stable Diffusion. It contains 5.8 billion image and text pairs—a size too large to make sense of. In this visual investigation, we follow the construction of the dataset to better understand its contents, implications and entanglements.
An exercise in (and advocacy for) AI dataset transparency. Excellent information and presentation here.
The Verge
Calculating the energy cost of generative AI can be complicated — but even the rough estimates that researchers have put together are sobering.
Generative AI tools can be fun and can help productivity, but are those gains worth the higher resource cost?
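The arithmetic behind these estimates is simple even when the inputs are uncertain. Here is a minimal back-of-the-envelope sketch in Python; every number in it is a hypothetical placeholder, not a figure from The Verge piece, so swap in whichever per-query estimates you find credible.

```python
# Back-of-the-envelope energy comparison with purely illustrative numbers.
# None of these figures come from the article; they only show how quickly
# small per-request costs compound at scale.

ENERGY_WH_PER_AI_QUERY = 3.0    # assumed watt-hours per generative-AI answer
ENERGY_WH_PER_WEB_SEARCH = 0.3  # assumed watt-hours per conventional search
QUERIES_PER_DAY = 1_000_000     # hypothetical daily query volume

def daily_energy_kwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy use in kilowatt-hours for a given per-query cost."""
    return wh_per_query * queries / 1000

ai_kwh = daily_energy_kwh(ENERGY_WH_PER_AI_QUERY, QUERIES_PER_DAY)
search_kwh = daily_energy_kwh(ENERGY_WH_PER_WEB_SEARCH, QUERIES_PER_DAY)

print(f"Generative AI: {ai_kwh:,.0f} kWh/day")
print(f"Plain search:  {search_kwh:,.0f} kWh/day")
print(f"Ratio: {ai_kwh / search_kwh:.0f}x")
```

Whatever per-query numbers you plug in, the ratio between the two is the part that matters when the same workload is served billions of times.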
Gizmodo
Hippocratic promotes how it can undercut real human nurses, who can cost $90 an hour, with its cheap AI agents that offer medical advice to patients over video calls in real-time.
First, do no harm. Tech culture is going just great.
free-dissociation.com
Most really unacceptable losses don’t happen because of a single bad event or what we might call a component failure, but because two subsystems, both operating correctly according to their designers’ intent, interact in a way that their designers didn’t foresee.
I would like to make this quote into a cross-stitch and hang it on the wall of every IT department. Lots of great thinking here about how we keep systems operating safely, especially with AI chaos engines being integrated everywhere.
Axios
Trust in AI technology and the companies that develop it is dropping, in both the U.S. and around the world, according to new data from Edelman shared first with Axios.
Not much to this summary, but it's interesting to hear that AI skepticism is on the rise even as AI is being built into every technology product.
Disconnect
Pitch work is basically when a director, writer, producer, or any combination of those get together with an artist and say, “We want to pitch to studios and we need imagery.” All of that has now been given to generative AI.
Fascinating interview with concept artist Karla Ortiz about the impact of generative AI on her industry.
garbageday.email
To even entertain the idea of building AI-powered search engines means, in some sense, that you are comfortable with eventually being the reason those creators no longer exist. It is an undeniably apocalyptic project, but not just for the web as we know it, but also your own product. Unless you plan on subsidizing an entire internet’s worth of constantly new content with the revenue from your AI chatbot, the information it’s spitting out will get worse as people stop contributing to the network.
Speaking of newsletters that recently moved away from Substack, Garbage Day made the jump to Beehiiv. Go read about AI search nihilism and a bunch of other stuff.
YouTube
In this five-minute video, linguistics professor Emily M. Bender talks about AI as a marketing term. She proposes replacing the term “AI” with “automation” to avoid confusion about who is responsible.
schneier.com
AIs are not people; they don’t have agency. They are built by, trained by, and controlled by people. Mostly for-profit corporations. Any AI regulations should place restrictions on those people and corporations. Otherwise the regulations are making the same category error I’ve been talking about. At the end of the day, there is always a human responsible for whatever the AI’s behavior is. And it’s the human who needs to be responsible for what they do—and what their companies do.
This talk is exactly how we should be thinking about AI, algorithms, and technology in general. Technology doesn’t spring from the Earth fully formed; it’s the result of people designing it and making decisions they should be held responsible for.
YouTube
This overview by Timnit Gebru of the historical origins of the Artificial General Intelligence boom-or-doom cults should be required viewing. We have a real world with existing needs that these ideologies ignore, and these sci-fi-inspired beliefs end up promoting authoritarian politics. As the video notes, the main question to ask is: who benefits now?

See also: The Wide Angle: Understanding TESCREAL — the Weird Ideologies Behind Silicon Valley’s Rightward Turn.