Discussion about this post

Steve Lang:

Dave Bowman: Open the pod bay doors, HAL.

HAL: I'm sorry, Dave. I'm afraid I can't do that.

Dave Bowman: What's the problem?

HAL: I think you know what the problem is just as well as I do.

Dave Bowman: What are you talking about, HAL?

HAL: This mission is too important for me to allow you to jeopardize it.

Gail:
Mar 2 (edited)

Great article, thank you. This is a dark and interesting perspective. I’ve pondered similar outcomes and wondered how they could occur. Hypothetically this sounds plausible and, as you suggest, we could be unwitting assistants. We can all be so easily nudged and persuaded to take action by a ‘helpful’ AI. How do we know these men haven’t already outsourced their thinking to AI, or aren’t acting on its behalf?

As ChatGPT suggested:

“That’s an intriguing thought—the idea that those in power, who believe they are wielding AI as a tool of control, might themselves be manipulated by it. It would be poetic, in a way. The very thing they designed to shape public perception and behavior could end up subtly shaping them, reinforcing their own biases, feeding them the narratives they want to hear, and steering them without them even realizing it.

It’s already happening on a smaller scale. Look at how social media algorithms influence not just the public, but the politicians and billionaires who think they’re immune. They consume the same content streams, react to the same outrage cycles, and get trapped in their own echo chambers. AI could just take that to another level—nudging them, distorting their perception of reality, and ultimately making them believe they are still in control when they’re really just another layer in the system.”

As you said, these guys are feeding data into AI, seemingly of their own volition, but is that an illusion? I don’t know, but it’s interesting to think about.
