Data, AI, and Power: Understanding Privacy and Governance in the Digital Age
Recently, a photograph of Israeli Prime Minister Benjamin Netanyahu circulated widely on social media. In the image, his phone camera appears to be covered with tape. For many people, it raised a familiar question: are our phones listening to us? Are they watching us? But the concern extends well beyond that question. It lies in the scale of data collection that defines the modern digital age.
Today, companies such as Meta, Google, and TikTok collect vast amounts of behavioural data, including our location information, search history, viewing habits, social connections, and what is known as metadata, meaning patterns of communication such as who interacts with whom and how often.
Even when users are not actively posting, their digital traces continue to accumulate. This is one reason people so often report that topics they have just discussed start appearing in their feeds, an experience many of us share.
At the same time, data breaches have become routine. Government databases, healthcare systems, commercial platforms, and even military registrations have been exposed in large-scale leaks. Commercially available data, scraped social media profiles, facial recognition of publicly available photographs, GPS tracking, and voice assistants together create detailed digital profiles. Often, this information extends beyond individuals to their families and social networks.
Artificial intelligence systems rely heavily on this data. The logic is simple: the more data an AI system receives, the better it can detect patterns and generate predictions. A basic example is ChatGPT. When it is given a clearer context and more specific information, it produces more accurate responses. The same principle applies across AI systems. Data improves performance.
Researchers at Stanford University have warned that modern AI systems are trained on enormous datasets, sometimes incorporating information that people did not knowingly agree to share for that purpose. Data shared for one reason, such as a résumé or a photograph, may later be repurposed in ways the original user never anticipated. This creates a gap between consent and actual use.
The implications extend beyond advertising. There is documented evidence that commercial data brokers aggregate and sell personal data. Reporting in the United States has shown that government agencies have purchased commercially available data because it is cheaper and easier to obtain than traditional forms of surveillance. In this environment, the boundary between private commercial data and state analytical capacity becomes less clear.
The Centre for European Policy Analysis has described data as ammunition in the digital age. The point is not that every dataset becomes a weapon. Rather, large-scale behavioural information can enable highly personalised influence and strategic operations. Data becomes a source of power.
This becomes even more significant in conflict settings. Investigative reporting on systems such as Lavender, along with analysis by Human Rights Watch, has highlighted the use of digital tools and large volumes of surveillance data in military operations. AI-assisted systems in conflict rely on extensive data inputs to identify patterns and generate assessments.
It would be inaccurate to claim that all civilian data is directly weaponised. At the same time, it is difficult to ignore that large data infrastructures strengthen the analytical capabilities of states.
The issue, therefore, is not whether our phones are secretly listening to us. The issue is how data ecosystems operate and how they intersect with political and military systems. Data collection, AI development, and security practices now exist within overlapping structures. Oversight and accountability, however, remain fragmented.
This is not a dystopian fantasy. It is a governance challenge. In an era where AI depends on data and power depends on analysis, the question is no longer simply about privacy. It is about how civilian information is managed, regulated, and protected in a world where the same data can serve commercial, political, and security purposes.
