Escape to Canny Valley
21h ago
This piece by Collin Brooke sheds light on why we program our AI researchers to introduce friction, rather than reduce it. Here are some quotes, but please read it in full; don't be brainrotted.
"We’ve seen a lot of press over the past six months given over to “Ai psychosis,” the umbrella term for situations where “AI chatbots may inadvertently be reinforcing and amplifying delusional and disorganized thinking.” To be a little unfair, this line from Psychology Today feels more “legally permissible” than accurate—the absence of safety and alignment guardrails in our chatbots is less a bug than a feature, as corporations race to capture market share. Coverage like this tends to frame it as a user problem—if they’re already delusional, after all, then we can hardly blame (unregulated) AI, right?
"The quiet part of this framing is that this sort of thing could never happen to us—we’re too smart, savvy, or self-aware to allow ourselves to be taken in by these tactics. This entire package of attitudes was described by Harold Garfinkel (in the 60s) by the phrase “cultural dopes.” It’s a critique of the idea that scholars are able to pierce the veil of, say, advertising campaigns and cultural narratives, while “normies” are subject to their influence, “unreflective rule followers who reproduce patterns of action without being aware of how they do so.” Because “psychosis” discussions tend to highlight the most extreme versions of this phenomenon, it’s easy for us to fall into the trap of imagining that there’s something wrong with the people rather than artificial intelligence itself, that they’re “dopes.”
"Even if beleaguered users find in chatbots a safe informational harbor, their hosts are far more interested in positioning themselves as casinos, the House that always wins (our money and attention). “As competition grows and the pressure to prove profitability mounts, AI companies have the incentive to do whatever they need to keep people using their product.” And paying for the privilege of doing so, needless to say.
"But there’s an important feature missing from these canny valleys, and that’s friction. Last episode, I briefly mentioned a post from Rob Horning that I want to return to. Horning draws on the work of Adam Phillips, specifically on the idea of resistance, and how important it is for (among other things) therapy. When we’re working out our issues, the value of having another human in the room is that they can help by noticing the things that we don’t notice ourselves, by identifying our blind spots. To put it another way, one of the best parts of having friends is that they will call you out on your bullshit. The absence of any sort of resistance is precisely where chatbots fail.