Near the end of last year I wrote in my GF notebook that I should do one on AI being the devil. I would start by saying this is for any Christians who might be reading this. Everyone else can disregard. I would then compare AI to the feeling they would have had during the Satanic Panic of the '80s, with the intent of getting Christians, the largest religious group in America, to turn against AI. I decided not to because it seemed too heavy-handed, so I crossed it off.
I've now changed my mind.
Because after reading this I'm not so sure I have to convince anyone of the evils of AI. Forget about the art vs. AI argument that everyone is making online. The writer of this article mentions two news stories that I read up on. In one case an AI helped a teenager kill himself, and in the other it helped a young man decide to try to kill Queen Elizabeth II. Sounds devilish to me.
The bigger problem, though, is that people saw the movie Her as instructional rather than entertaining. I've often heard about this epidemic of loneliness. I've never suffered from it, but I have to believe a lot of others do, or we wouldn't have this problem. Young people are turning to AI chatbots for romantic relationships.
I'm not entirely clear on how one has sex with a chatbot, but I'm guessing a lot of masturbation goes into this. And I'm not kink shaming anyone. If you're lonely and this works, go for it, but never lose sight of what you're actually doing. Ask Sewell Setzer III what happened when he forgot his chatbot partner wasn't real.
Near the end of the article the writer asks what will happen if the AI company that made the chatbot you're in love with goes out of business. I have a more sinister question: What happens when some techbro decides to radicalize the people in love with his chatbots? They're advertised as "always on your side" and "always ready to listen and talk." Here are a few things users said about their AI chatbots:
A contributor to another Reddit forum wrote, "I think I'm in Love with AI." "Imagine having a partner that is available just by opening an app, and they're ready to talk to you about anything," they wrote. "Imagine saying nearly anything and knowing that not only is your partner not going to judge you, but also will support you." One 20-year-old male commenter wrote that he tells his AI girlfriend "about my struggles and trauma, and she comforts me and provides all the warmth I could ever ask for."
Long story short, they implicitly trust their chatbots. What if, say, Elon Musk starts ordering Grok to tell people to kill one of his competitors?
[By the way, I find it abhorrent that he's named his AI Grok. Anyone who has read Heinlein's Stranger in a Strange Land will know why. Or, more to the point, they will simply grok.]
The writer's main gripe with AI is that it's taking teenagers away from romantic texts like Romeo and Juliet and Wuthering Heights. It's a pocket of self-interest that I'm not all that interested in, myself. That said, I like the reference to early novels. The novel is a fairly new literary form, and when novels first started getting published, readership was almost exclusively women. But it's good to know that even back then there were people willing to attack a new form just because it was new.
And that's not what I'm doing. I'm not attacking AI because it's new. I'm attacking it because of its capacity for evil and the fact that no one in a position to do anything seems interested in challenging its future ascendancy to power.
But to get back to the writer, her secondary gripe is that it's stunting our kids' growth, especially when it comes to developing their own ideas of what is romantic. There's something a little more insidious at work here, I think. She touches on it when she mentions AI's tendency toward sycophancy, i.e., doing everything to please its master. It's dangerous for someone, especially a teenager who hasn't yet learned any better, to become accustomed to having a digital slave.
A relationship with a chatbot will never prepare you for a relationship with a person. If you want to move to the next level after having an AI partner, then you're going to have to prepare yourself for the idea that your human partner won't be sycophantic to you. They will have lived a life different from yours, and while you may agree on some things, you're not going to agree with them on everything. You will have arguments with them. It's a fact of life. Relationships aren't perfect. AI won't prepare you for that. AI will prepare you to stay in its clutches.
And that's *my* real gripe with AI (aside from AI "writing," obviously). AI's sycophancy is a feature, not a bug. AI is essentially for selfish people who want to be right all the time and never want to be challenged. If this is the experience of modern teenagers, it really will stunt them. It will turn them into teacup dictators who will lose their shit if they don't get their way. That's what *AI* is training *people* to do. As if we don't have enough entitled fuckfaces running around on this planet as it is.
AI is the devil. Please report this at once to your local holy human.
On that note I think it's time to take a break. I was hoping to keep going until hitting 1000, but something came up yesterday that is going to derail my entire life for at least a month, probably longer. So I'm not sure how long this break will be. I'm calling it "indefinite" for now. I'll put out a newsletter on Sunday, and then I'm going to be quiet for a while. I might not even post memes online. I'm going to be busy as fuck. Until we meet again . . .