
    Is Informed Consent Still Informed? What Happens When We Click on “I Agree” 

December 3, 2025 · 7 Mins Read

    Our culture rests on a shared conviction that a person is largely of their own making – lifestyles assembled from an almost endless menu of options. In philosopher Zygmunt Bauman’s terms, late-modern citizens are expected to construct biographies rather than inherit them – to navigate risk, select identities and repeatedly “reboot” their lives in a fluid world. But the more choices are made online, the more this independence relies on invisible data trails that few individuals genuinely control.

In that sense, personal data protection is the basic infrastructure of agency. Whoever can aggregate and trade on data can nudge what information appears, what prices are shown, and what opportunities are offered.

The paper “AI, big data and the future of consent” by Adam Andreotta and colleagues, on AI and big data ethics, points out that when data can be endlessly recombined and repurposed, control over that data becomes a precondition for meaningful self-determination. If informational power is asymmetrically held by platforms and brokers, the ideal of a self-directed life risks becoming largely symbolic.

Ownership of personal data has become an illusion, while the burden of consequences remains firmly individual. Users are told to “manage their privacy” – tick more boxes and “take responsibility” for their choices online. Yet the terms of those choices are defined in contracts that cannot be negotiated and whose technical uses even specialists struggle to predict. Philosophers of technology have argued that traditional informed consent is built for foreseeable interventions, and it breaks down when data can be recombined and fed into AI systems for purposes that are unknown at the time of collection. The legal risk, however, is still individualized: if something goes wrong, it is the individual who “agreed.”

Empirical data show that most people do not read privacy policies or terms and conditions in full; in a 2019 Pew survey, about 9% of adults said they always, and 13% often, read a privacy policy before agreeing to it. The majority reported feeling that they had little or no control over how companies use their data. Yet these unread documents constitute binding contracts and allow data to be used for product development, advertising or research.

    The art installation “I Agree” by Dima Yarovinsky materializes this abstraction. Yarovinsky printed the full terms of service of platforms like Facebook on color-coded A4 rolls, then hung them so they spilled from walls onto the floor. Next to each scroll, he added the number of words and estimated reading time. For some services, that time stretches beyond an hour. The cognitive cost makes genuine understanding structurally impossible.

A recent essay by Tim Green on the “consent paradox” argues that in a world saturated with cookie banners, consent itself becomes a kind of performance. People learn to treat privacy prompts as obstacles between them and the content or service they want. The act of clicking “Accept all” turns into a ritual that legitimizes data practices while doing little to enhance real agency. It is a response to choice overload, complexity, and a sense that data collection is unavoidable.

    What can actually go wrong when personal data is misused or repurposed? Some harms are “known unknowns”: discrimination in credit and insurance scoring, opaque risk profiling, and targeted political persuasion – all this we can foresee, yet not always prevent. Studies of reproductive and “female health” apps, for example, have documented lax security and vague privacy policies, with potential access by law enforcement or third parties that users did not reasonably anticipate.

Other harms fall into the “unknown unknowns” category: big data analytics can infer traits that neither users nor designers predicted when consent was originally given. The Andreotta et al. analysis describes this as a problem of repeatedly repurposed data and a lack of meaningful alternatives: once data enters complex AI pipelines, it becomes practically impossible to foresee all future uses or to opt out of the ecosystem.

Regulators have already dealt with a series of striking cases where misuse of personal data produced consequences. In September 2023, Ireland’s Data Protection Commission fined TikTok €345 million for GDPR violations in its handling of children’s accounts, including default public settings and insufficient transparency about how minors’ data were processed. In May 2025, the same authority imposed a further €530 million fine, ordering TikTok to suspend transfers of EU user data to China unless it could guarantee adequate protections.

Perhaps the most symbolically charged case came in 2025, when a California jury found that Meta had illegally intercepted sensitive data from the Flo period-tracking app. The verdict concluded that Meta had eavesdropped on in-app communications to build targeted advertising profiles, in violation of state privacy law; Flo and other defendants had previously settled. The combination of opaque SDKs and advertising infrastructure can create risks that are nowhere visible in friendly sign-up screens.

    Faced with these patterns, lawmakers tried to update the rulebook. In Europe, the GDPR defines consent as “freely given, specific, informed and unambiguous” and sets conditions for valid consent and withdrawal. But newer laws are increasingly targeting the environment where consent is sought. The EU Digital Services Act (DSA), fully applicable for most platforms since 2024, explicitly bans “dark patterns” – interface designs that deceive or manipulate users or materially impair their ability to make informed decisions. Draft initiatives, such as the proposed Digital Fairness Act, aim to go further against manipulative designs and exploitative targeted advertising. 

    The EU’s AI Act introduces obligations for high-risk AI systems, including requirements for data governance, documentation, and fundamental rights impact assessments. Taken together, these frameworks recognize that individual consent, on its own, cannot carry the full weight of protecting agency in data-driven environments.

Legal change is just part of the picture, though – and probably not the fastest-moving one. Scholars and practitioners are experimenting with other approaches to repair the consent model. Andreotta et al. propose supplementing consent with “soft governance” mechanisms – ethics committees for data-intensive projects, akin to human-research ethics boards, which can scrutinize secondary uses and repurposing of personal data. They also point to more usable consent formats, such as pictorial contracts and layered interfaces, which foreground key risks rather than bury them in dense text. The “consent paradox” analysis emphasizes privacy-by-design, participatory design with users, and technical tools such as differential privacy or federated learning to reduce dependence on centralized raw data; a minimal sketch of the differential-privacy idea follows below.
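To make the differential-privacy idea concrete, here is a minimal illustrative sketch, not drawn from any of the cited works: a service answers an aggregate question about its users only after adding calibrated random noise, so the published number reflects the overall trend without exposing any individual’s record. The function name, the epsilon value and the example data are all hypothetical.

```python
import random

def private_count(true_count: int, epsilon: float) -> float:
    """Return a noisy count using the Laplace mechanism.

    A counting query changes by at most 1 when one person's data is
    added or removed, so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for that single query.
    """
    scale = 1.0 / epsilon
    # The difference of two exponential draws with mean `scale`
    # is Laplace-distributed with location 0 and that scale.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

# Hypothetical example: report how many users in a batch enabled a
# sensitive setting, without revealing whether any one user did.
users = [True, False, True, True, False, True]
print(private_count(sum(users), epsilon=0.5))
```

The trade-off is visible in the single parameter: a smaller epsilon adds more noise and protects individuals more strongly, at the cost of a less precise published figure.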

    More systemic proposals include data trusts or cooperatives that negotiate on behalf of groups rather than isolated individuals, and collective governance arrangements where regulators, civil-society organizations, and user representatives oversee high-risk data ecosystems. Recent AI-governance reports argue that such institutional innovations are essential complements to consent. 

    Original book cover of “Liquid Modernity” by Zygmunt Bauman – used here for illustrative purposes.

Consider Bauman’s “Liquid Modernity,” in which he describes a world rich in instruments – markets, technologies, mobility – and poor in stable frameworks that would help people orient themselves and share risks. Individuals are freed from traditional structures, but that freedom comes with permanent uncertainty and a transfer of responsibility from institutions to individuals, who in reality have little collective capacity to define the underlying systems.

    If we want informed consent to become informed again, the burden has to shift. Contracts and interfaces can no longer function as one-way disclaimers that transform structural opacity into individual “choice.” In a liquid world, individuals will always need to make decisions under uncertainty. The question is whether those decisions are made against a background of institutions that genuinely constrain misuse of data or whether “I agree” continues to be a small, exhausted performance that hides how little room for maneuver there really is.

    Lidziya Tarasenka

    Healthcare professional with a strong background in medical journalism, media redaction, and fact-checking healthcare information. Medical advisor skilled in research, content creation, and policy analysis. Expertise in identifying systemic healthcare issues, drafting reports, and ensuring the accuracy of medical content for public and professional audiences.
