Seems like it’s primarily a tool to devalue (and deskill?) labor. Why aren’t we seeing improvements in the software we use every day?
Gartner projects that 50% of companies that attributed headcount cuts to AI will rehire for similar functions by 2027. Forrester found that over half of companies that cut staff for AI already regret the move. A Careerminds survey went further. One in three employers spent more on restaffing than they saved from the original layoffs. That is not efficiency. That is a wire transfer with extra steps.

CEOs need to log off their group texts. I think layoffs are more of a social contagion than a sound business strategy. The more room for error a company has, the more likely they are to indulge terrible ideas.
You can’t advertise people out of reacting to their own experiences. This is a fundamental disconnect between how tech people with software brains see the world and how regular people are living their lives.

Nilay Patel explains the disconnect between tech culture and culture at large around AI.
Most of the Americans surveyed believe that datacenters are bad for the environment, home energy costs, and the quality of life of people living nearby, and the numbers aren’t close. Only four percent of people thought datacenters were good for the environment, six percent good for jobs, and six percent good for people’s quality of life.

Oh, here’s another thing Americans can agree on.
I just don’t understand how anyone who has paid even the slightest bit of attention to the tech industry for the last thirty years can look at any of this stuff these huge companies are cranking out and think “Yeah, it’s gonna stay cheap and liberating and we will all be able to set ourselves free to experience utopia by embracing it!”

So true. Enshittification will come for the vibe-coding machines when the VC subsidies end.
Imagine you have two machines. One you can open up and examine all of its workings, and if you give it every picture of a cat on the whole internet, it can reliably distinguish cats from non-cats. The other is a black box and it can also reliably distinguish cats from non-cats if you give it half a dozen pictures of cats, some apple sauce, and a hug. These machines sort of do the same thing, but even without knowing how the second one works I am extremely confident in saying it doesn’t work the same way as the first one.

Rusty has a great way of cutting through bullshit.
Anthropomorphizing language influences how people perceive a system on multiple levels. It over-sells a system which is likely to under-deliver, and portrays a world view in which the people responsible for developing the systems are not held accountable for the system’s inaccurate, inappropriate, and sometimes deadly output. It promotes misplaced trust, over-reliance, and dehumanization.

This article addresses a major pet peeve of mine in AI discourse. These simulation technologies do not have emotions! The technology can’t feel regret, can’t apologize for bad behaviors, can’t feel pain. The owners and operators of these technologies are human, and transferring their real-world responsibility to a non-entity is causing a lot of problems.
In automation theory, a "centaur" is a person who is assisted by a machine. You're a human head being carried around on a tireless robot body. Driving a car makes you a centaur, and so does using autocomplete.

Cory Doctorow is an international treasure. I'm reading his book Enshittification and it's a very helpful explanation of how we got our modern tech hellscape. This talk about AI is also a nice encapsulation of why AI isn't an inevitable trajectory for tech, it's a deliberate choice by some of the worst people in that tech hellscape. This is a must-read!
And obviously, a reverse centaur is machine head on a human body, a person who is serving as a squishy meat appendage for an uncaring machine.
The German lender is looking at options including shorting a basket of AI-related stocks that would help mitigate downside risk by betting against companies in the sector.

That’s probably fine, when your backers start hedging against their upcoming losses. I think they teach that as a healthy-economy signal in business school.
I have been on many projects and teams where I have been immensely frustrated by the people I am collaborating with, and wished that I had the power to just tell them to do the thing I want them to do. The friction that the political project of AI promises to remove is, by and large, the same friction that authoritarianism promises to remove: other people.

This is an excellent description of the unspoken promise of AI and why we should all be leaning harder into life’s friction.
This is a list of writing and formatting conventions typical of AI chatbots such as ChatGPT, with real examples taken from Wikipedia articles and drafts. It is meant to act as a field guide to help detect undisclosed AI-generated content on Wikipedia. This list is descriptive, not prescriptive; it consists of observations, not rules.

Makes sense that Wikipedia is on the leading edge of AI cleanup and sharing what they've learned. Very interesting read.
We all need to transition to this way of cooking, because clearly this is where the future is going. I expect in a few short years kitchens will be much smaller. Gone will be stoves and ovens and flat tops. Restaurant kitchens will only be a small closet with a microwave.

[sensible chuckle]