
Meta court ruling: Big Tech’s Big Tobacco moment has arrived - and the world must pay attention

26.03.2026

A JURY VERDICT in California this week may prove to be one of the most important moments yet in the global reckoning over social media and child safety.

For years, the public conversation about online harm has revolved around content. How much screen time? What are children seeing? Who are they talking to? What dangerous material are they being exposed to?

Those questions matter, of course. But this case marks a crucial shift. The focus is no longer only on content. It is now on design.

That is a profound change.

A California jury found Meta and YouTube liable in a case brought by a young woman who said she became addicted to their platforms as a child and suffered serious mental health harms as a result.

The jury found that the companies were negligent in the design of their platforms, knew aspects of that design were dangerous, failed to warn users, and caused substantial harm.

Meta was assigned 70 per cent of the responsibility, YouTube 30 per cent.

This is not just another lawsuit. It is a signal. In The Cyber Effect, published over a decade ago, I warned that digital technologies do not simply reflect human behaviour – they shape it.

They influence how we think, how we feel, how we see ourselves and how easily we fall into cycles of repetition and compulsion. That is not moral panic. It is cyberpsychology.

Meta founder and CEO Mark Zuckerberg. Alamy Stock Photo

Children are especially vulnerable to those effects. They are not entering social media as fully formed adults with mature judgement and strong self-regulation. They are entering as young people whose identities, emotions and impulse control are still developing, into systems deliberately designed to keep them engaged.

Every notification, every streak, every recommendation, every infinite scroll, and every appearance-altering filter serves the same goal: to keep the user there.

This is not an accidental by-product. It is the business model. It is the scroll economy. So when a court begins to examine not just what appeared on a platform but how that platform was designed to operate, we are entering new territory.

And it is long overdue.

For too long, tech companies have benefited from a convenient fiction: that they are simply passive hosts, reflecting the world back to us. But anyone who studies human behaviour in cyber contexts knows that is not true.

Platforms are not mirrors. They are behaviour-shaping systems. They cue, reward, steer, condition and arguably addict. They influence what we attend to, how long we scroll, how often we return, what we compare ourselves against and how we feel when we log off.

That is why this verdict matters.

It suggests that the legal system is beginning to grapple with what many parents, clinicians, educators and researchers have been saying for years: harm online is not always accidental. Sometimes it is structured into the design.

Julianna Arnold, whose daughter died from fentanyl she bought from someone on Instagram, speaks about watching Mark Zuckerberg testify outside the Los Angeles Superior Court on 18 February 2026. Alamy Stock Photo

This is not anti-technology. Let me be very clear about that. I am not arguing that social media is inherently bad, or that every child who uses it will be harmed. Technology can educate, connect and empower.

But if a product is designed in ways that intensify vulnerability, dependency or distress, especially among minors, then we have to be honest about that.

A lawsuit waiting to happen

And we have to stop pretending that child safety can be addressed by parental controls alone. The burden cannot rest entirely on parents trying to supervise systems designed by some of the most sophisticated behavioural engineers in the world.

It cannot rest on children themselves, who are being asked to exercise mature judgement inside environments optimised to bypass it.

This is where the Irish and European dimension becomes important. It would be a mistake to see this as merely an American legal drama with no relevance here. Ireland sits at the heart of Europe’s digital regulation architecture. Many major tech firms have substantial operations here. We have our own regulator in Coimisiún na Meán, an Online Safety Code, and a wider European framework through the Digital Services Act that is increasingly focused on the protection of minors.

The central question now is no longer whether online environments can affect young people psychologically. Of course they can. The real question is whether we are willing to insist that the companies designing these systems owe children a meaningful duty of care.


That is the issue. Not whether technology exists, but whether it is being built responsibly. We have seen this pattern before in other industries. Powerful companies resist scrutiny, deny causation, emphasise complexity and insist that responsibility lies elsewhere.

Sometimes they are right to say that human behaviour is complicated. It is. Mental health is complicated. Childhood is complicated. Family life is complicated. But complexity cannot become a shield against accountability.

If a platform is knowingly designed in ways that exploit developmental vulnerability, then complexity is not an excuse for inaction. It is a lawsuit waiting to happen.

What happens next matters enormously. This verdict may yet be appealed. Other cases will follow. The companies will defend themselves robustly. But the broader significance remains – a line has been crossed. A jury has looked at the evidence and accepted that the design of these systems can be central to the harm.

That should concentrate minds in boardrooms, courtrooms and government departments.

In Ireland, we should pay close attention. We should not wait passively for courts in California to decide the future of our children’s safety in the digital age. Given this verdict, our government and relevant authorities need to act urgently and regulate these platforms.

The safety tech sector can provide solutions right now – there are no more excuses. Most importantly, we need to stop social media companies from treating our children as an acceptable testing ground for products built to maximise engagement and instil addictive-type behaviours.

In The Cyber Effect, I warned that the internet would not simply change what we do. It would change who we are. This week’s verdict suggests that the law is finally beginning to catch up.

Cyberpsychologist Dr Mary Aiken is Professor and Chair of the Department of Cyberpsychology at Capitol Technology University, Washington DC. She is a Professor of Forensic Cyberpsychology in the Department of Law & Criminology at the University of East London (UEL). Prof. Aiken is a Member of the INTERPOL Global Cybercrime Expert Group and is an Academic Advisor to Europol’s European Cyber Crime Centre (EC3).


© TheJournal