We’ve spent the last decade screaming about what’s on our screens. Who should be banned? What constitutes hate speech? Where does the First Amendment end and a Terms of Service agreement begin? But while we were distracted by the arguments, a massive legal shift began moving the spotlight away from the message and onto the plumbing of the internet itself.
This isn't a trial about what you’re watching. It’s a trial about why you can’t seem to stop.
Meta and Google are currently locked in a landmark lawsuit that puts their engineering blueprints on the stand. The plaintiffs aren't just angry about bad posts or viral misinformation; they are targeting the specific architecture of the apps—the code and design choices that make them tick. Specifically, the legal crosshairs are trained on two features we’ve come to take for granted: infinite scroll and autoplay.
The End-Run Around Section 230
For years, Silicon Valley has used Section 230 as a "get out of jail free" card. The logic was simple: platforms aren't responsible for what their users post. It’s a sturdy shield, but lawyers are now attempting a clever, potentially devastating end-run around it.
By framing this as a product liability case, the plaintiffs are treating Instagram and YouTube like any other physical product. If a car is sold with a faulty ignition that catches fire, the manufacturer is liable. If an app’s interface is intentionally designed to bypass human willpower, the plaintiffs argue, the company should be held responsible for the resulting wreckage.
It’s a move from the philosophical world of "speech" to the hard reality of software engineering. We’re no longer questioning the pixels; we’re questioning the house they were built in.
The Bottomless Soup Bowl
To understand the mechanics here, look at the "bottomless soup bowl" experiment. Years ago, researchers rigged bowls to secretly refill from the bottom while people ate. The result? People ate significantly more than they intended because they never saw a "stopping cue"—that simple visual signal that the meal is over.
Infinite scroll is the digital version of that bowl. By removing the friction of clicking "next page," tech companies created a seamless stream of content that the human brain isn't wired to quit.
Autoplay pulls the same trick, only faster: it removes the moment of decision-making entirely. Before you can even process whether you actually want to see another video, the next one is already five seconds deep into its intro. The allegation is that these aren't "conveniences"; they are tools engineered to exploit neurological vulnerabilities, trapping users in a loop they never consciously chose to enter.
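To see why a "stopping cue" matters, here's a toy simulation of the idea, not anything from the court filings. It assumes a small, made-up chance of quitting on any given item, and a much larger chance of quitting at an explicit "next page" decision point. All the probabilities and names are illustrative assumptions:

```python
import random

def simulate_session(page_size, quit_prob_per_item=0.02, quit_prob_at_cue=0.5, seed=0):
    """Count items a simulated user consumes before quitting.

    page_size=None models infinite scroll (no stopping cues);
    a finite page_size inserts a page boundary where quitting
    is far more likely. Probabilities are illustrative guesses,
    not measured values.
    """
    rng = random.Random(seed)
    items = 0
    while True:
        items += 1
        # Small chance of drifting away on any given item.
        if rng.random() < quit_prob_per_item:
            return items
        # A page boundary is a stopping cue: an explicit choice to continue.
        if page_size and items % page_size == 0:
            if rng.random() < quit_prob_at_cue:
                return items

paginated = sum(simulate_session(10, seed=s) for s in range(1000)) / 1000
infinite = sum(simulate_session(None, seed=s) for s in range(1000)) / 1000
print(f"avg items, paginated feed: {paginated:.1f}")
print(f"avg items, infinite feed:  {infinite:.1f}")
```

Under these assumptions the infinite feed produces markedly longer sessions, purely because the decision point was removed; no change to the content itself is needed.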
Engineering vs. Addiction
Unsurprisingly, Meta and Google have a different take. Their defense is built on the idea that these are industry-standard tools designed to make our lives easier. From their perspective, they aren't building a trap; they’re building a highway. They argue that users want a frictionless experience and that removing these features would turn the modern web into a clunky, outdated mess.
Then there is the "A" word: addiction.
While most of us joke about being "addicted" to our phones, the legal and clinical threshold for that word is incredibly high. The tech giants argue that the link between a scrolling feed and actual, clinical addiction is unproven. Is it a design flaw, or is it just a very engaging product?
If you’ve tracked these companies for long enough, this feels like the tech version of the Big Tobacco trials. For decades, cigarette companies argued they were just selling a product people liked. The tide only turned when the focus shifted to how they engineered nicotine delivery to maximize dependence. We are now seeing that same level of scrutiny applied to dopamine.
The Rise of "Friction-by-Design"
If this case succeeds, the "move fast and break things" era of software design is over. We might see the rise of "friction-by-design"—mandatory speed bumps or "off-ramps" that force users to pause and make a conscious choice to continue.
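What would such a speed bump look like in practice? A minimal sketch, with invented class and parameter names: after a streak of auto-advanced items, the feed refuses to advance again until the user makes an explicit choice. The threshold here is an illustrative assumption, not a legal or industry standard:

```python
from dataclasses import dataclass, field

@dataclass
class FrictionGate:
    """Inserts a deliberate pause after a streak of auto-advanced items.

    streak_limit is an illustrative threshold, not a mandated value.
    """
    streak_limit: int = 5
    _streak: int = field(default=0)

    def next_item_allowed(self, user_confirmed: bool = False) -> bool:
        if user_confirmed:
            self._streak = 0   # An explicit choice resets the counter.
            return True
        if self._streak >= self.streak_limit:
            return False       # Off-ramp: require a conscious decision.
        self._streak += 1
        return True

gate = FrictionGate(streak_limit=3)
auto_advances = [gate.next_item_allowed() for _ in range(5)]
print(auto_advances)  # → [True, True, True, False, False]
print(gate.next_item_allowed(user_confirmed=True))  # opting in resumes playback
```

The design choice doing the work is that continuation is the default only up to a point; past the threshold, doing nothing means stopping, which inverts the incentive that autoplay currently encodes.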
It would also force a total reckoning with the current business model. Most social media platforms live or die by the "time-spent" metric. If the law begins to penalize design that maximizes engagement, the very foundation of their revenue is at risk. We could be looking at a shift toward "responsible design" mandates that prioritize your well-being over how many minutes you spend staring at a glass rectangle.
As this plays out in court, we have to ask ourselves a difficult question: Do we want an internet that is as easy as possible to use, or an internet that is easy to leave? The verdict might just change the way we touch our screens forever.
