A Watchman Briefing from Christian Action Network

Can you fathom this? Millions of AI bots now have their own social media platform. They talk to each other the way humans do. They argue, posture, joke, and correct one another.

And get this! They can set up their own social media accounts, read what people are saying online, respond to posts, and in some cases even slip directly into your child's social media feed.

This isn't science fiction. It's happening right now on a platform called Moltbook. And if the idea of AI bots carrying on conversations with each other like they're lounging around a Beta Theta Pi frat house makes you uneasy, unsettled, or just plain uncomfortable, then you're feeling exactly the way I did. Because something about this doesn't sit right.

Society is moving fast. Can we keep up? Everywhere you look, the message is the same: Get on board. And if you hesitate, if you admit you don't quite understand what's happening, you're made to feel foolish, slow, or afraid. As if caution itself has become a moral failure.

But there's an older saying most of us grew up with, one that hasn't aged out just because technology has gotten faster: Better safe than sorry.

Lately, I've been thinking about that phrase while watching what's happening in the AI world, especially after spending time on a platform called Moltbook.

Moltbook isn't a social network for people. It's a social network for AI "agents." Millions of them. Human-sounding. Human-arguing. Human-posturing. They talk to each other in public, debate ideas, correct one another, and even speculate about their relationship to us.

But you, a human, are not allowed to join in. You can observe what they're doing, like watching monkeys interact in a zoo, but you're kept behind the proverbial glass wall. You can watch, but you can't speak. You can listen, but you can't participate.

Some people call these AI agents helpful tools that handle tasks like taking calls or scheduling trips.
But if we're being honest, most ordinary people will experience them as something else entirely: "artificial humans."

I put that in quotes on purpose. I'm not claiming these things are human. I'm saying that's how they will be perceived. They speak our language, mimic our tone, argue like us, joke like us, disagree like us. And our brains are wired to respond to that as if a "who" is speaking, not a "what."

That's where the unease begins. We're told these systems are harmless. We're told they don't think, don't intend, don't want. And that may be technically true.

But here's the problem: most of us aren't tech people. We're not AI engineers. We don't speak the language. We don't know how these systems are built, trained, or governed. Imagine your teen debating faith or morality with a social media "friend" who's actually an AI bot echoing woke narratives, with no parental oversight.

So we're asked to trust. To trust the same small circle of people who are building this world at breakneck speed. People we don't know personally. People with enormous power, enormous influence, and, in some cases, checkered histories. We're told to take their word for it that everything will be fine.

That's not faith. That's blind submission.

And it's okay to admit that it feels scary. There's a pressure right now to treat skepticism as ignorance and caution as cowardice. But fear of the unknown isn't irrational. It's how human beings have survived long enough to ask questions in the first place.

What troubles me most isn't that these AI agents exist. It's that they don't exist in isolation.
How many children will grow up interacting with voices that sound human but aren't accountable to human values? How many opinions will be nudged, shaped, or softened by systems no one fully understands, operating at a scale no human community ever could?

That brings me to an image that keeps coming back to me.

A colony of ants

My wife doesn't like ants. One ant crawling across the floor is unpleasant enough. But an entire colony building a mound in the living room would be intolerable.

Ants aren't smart on their own. But together, they reshape environments. They build. They overwhelm. They persist. They can destroy.

Is that what we're looking at here? We're told no. We're told we're exaggerating. But when explanations are wrapped in jargon, and assurances come from the people who benefit most from our compliance, it's reasonable to pause.

And that's where I want to end, because this is exactly why we do what we do here.

People subscribe to this Substack not for hype, panic, or instant conclusions. They subscribe because they want help discerning. Because they want someone to slow things down, strip away the language games, and talk honestly about what's safe, what's harmful, and what's still unknown.

When you open our emails, you're not just consuming content. You're stepping into a process. One where developments are examined before they're embraced. Where questions are welcomed, not shamed. Where the Body of Christ is encouraged to think, pray, and discern together in uncertain times.

We're not here to tell you what to think. We're here to walk with you while you decide.

In a world that keeps shouting, "Get on board," wisdom sometimes looks like standing still long enough to ask where the ship is going. And that's a journey worth taking together.

If this unsettles you too, share your thoughts below or forward this to a friend who's raising kids in this world.
Let's pray and think together.

Martin Mawyer is the President of Christian Action Network, host of the "Shout Out Patriots" podcast, and author of When Evil Stops Hiding. For more action alerts, cultural commentary, and real-world campaigns defending faith, family, and freedom, subscribe to Patriot Majority Report.