
AI isn’t getting smarter. We are getting dumber

If AI takes over how we communicate with one another, then what happens when we forget how to think for ourselves?

I’m not sure how long it will take you to read this article. Maybe a few minutes. Maybe you take a little longer and give it a close read. Either way, I’m confident however long you take to read this will not come close to the amount of time it took me to write it. And that’s a good thing. That is how communication should work.

Here’s the equation: The time it takes someone to process a piece of information should be proportionally less than the time it took to compose that content.
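Roughly, in symbols (notation added for illustration only): with $T_{\text{write}}$ for the time spent composing and $T_{\text{read}}$ for the time spent reading, healthy communication satisfies

$$T_{\text{read}} \ll T_{\text{write}}$$

and the closer the ratio $T_{\text{read}} / T_{\text{write}}$ creeps toward 1, the less valuable the exchange tends to be.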

It should never take someone longer to read something than it took another person to write it. In fact, the time it takes to create a piece of communicable content should be many times longer than the time it takes to read it on the other side.

This is what makes communication valuable.

A standard non-fiction book, for example, might take someone a year or two to write, likely comprising hundreds of hours of writing and research time. Reading that book might take someone 10 to 15 hours, longer if they read slowly or want to take time to think about the content.

At the other end of the spectrum, a random email may take five minutes to write, and less than a minute for the recipient to read. Or that ratio could be bigger if the email is more important.

At its most succinct, an email can be composed almost as quickly as speech. "Yep, let's lock in that meeting Friday, see you then" takes almost as long to read as to compose and send. Maybe you even dictate the email using a speech-to-text program.

Here, we come very close to equilibrium in the time spent at both ends of the communication process, but the closer those two ends get, the less valuable the information tends to be. In this example the information is ephemeral, slight, and of little broad use.

The greatest con of our modern times is the idea that efficiency rules over all considerations: Use your time wisely. Don't waste time. Get it done. How much have you achieved with your day?

Efficiency is of utmost value in a capitalist world where growth and productivity are the primary motivations for anything. The more you can get done, the better. But there is a point where the chase for efficiency becomes reductive. Communication itself is so devalued that content generation becomes more important than the content itself.

Modern LLM/AI systems have taken us across that Rubicon. We can now generate reams of content in a flash. Emails that take a receiver five or 10 minutes to read are taking a sender seconds to create. Maybe someone does the "responsible" thing and reads over their generated email before hitting send. Maybe they don't.

Maybe a recipient has cottoned on to the amount of LLM-generated content they’re being sent. They set up an automated system to skim and summarize all their emails. After all, efficiency rules at both ends of the spectrum and there are just too many emails to waste time actually reading them all.

What is worth reading anymore when we can get a dot-point summary instantly generated? What is worth writing anymore knowing that some AI system is likely to scrape the work and turn it into a one sentence overview?

Maybe you just use AI to clarify your thoughts. Turn the muddle of ideas in your head into coherent, communicable paragraphs. It’s OK, you say, because you’re reviewing the results, and often editing the output. You’re ending up with exactly what you want to say, just in a form and style better than any way you could have put it yourself.

But is what you end up with really your thoughts? And what if everyone started doing that?

Stripping the novelty and personality out of all communication, turning every one of our interactions into homogeneous robotic engagements? Every birthday greeting becomes akin to a printed Hallmark card. Every eulogy turns into a rubber-stamp sentiment. Every email follows the auto-response template suggested by the browser.

We do this long enough and eventually we begin to lose the ability to communicate our inner thoughts to others. Our minds start to think in terms of LLM prompts. All I need is the gist of what I want to say, and the system fills in the blanks.

And when our inner thoughts resemble LLM outputs we suddenly realise how smart the computer has become. Oh wow, we think. Does this mean AI has finally become sentient? Is this the singularity? Artificial general intelligence must be here! The computer is thinking exactly like me.

But it’s not. We haven’t moulded the AI into something unique. The technology hasn’t developed to become a super-intelligent entity. Instead, it has reduced us into something mindless, something mechanical. We have become the repeating machine model. We have forgotten how to communicate. We are so dumb we just think the machines are smart.
