Meta’s Own Employees Called Instagram “A Drug”—Now a Jury Will Decide If They Were Right

Internal discussions among Meta employees reportedly referred to Instagram as a “drug”, according to newly highlighted records. The revelations deepen long-standing concerns in Europe about addictive platform design, the mental health of young users, and whether current EU digital regulation adequately addresses engagement-driven business models.

According to a report published by Technology.org, internal Meta communications compared Instagram to a “drug”. The characterisation, reportedly used by company staff themselves, adds weight to claims that the addictive qualities of social media platforms were not only foreseeable but internally recognised.

The report draws attention to internal conversations in which Instagram’s ability to keep users engaged for prolonged periods was discussed in terms commonly associated with dependency. While such language does not constitute a formal admission of wrongdoing, critics argue it reveals a candid internal awareness of the platform’s psychological pull—particularly among adolescents and young adults.

Instagram, owned by Meta Platforms, has been at the centre of controversy for years over its impact on youth mental health. Previous disclosures, including those brought to light by former employee Frances Haugen and reported by outlets such as The Wall Street Journal, suggested that internal research linked intensive Instagram use to anxiety, depression, and body-image issues among teenage users. These findings intensified public scrutiny after it emerged that some internal research results were not proactively disclosed.

The renewed focus on internal language comes at a critical moment for regulators on both sides of the Atlantic. In the European Union, the Digital Services Act (DSA) has introduced obligations for so-called very large online platforms to assess and mitigate systemic risks, including those affecting minors. These obligations include examining how recommendation algorithms, interface design, and engagement mechanics may amplify harmful patterns of use.

However, digital-rights advocates argue that the DSA, while a significant step forward, still leaves unresolved questions about “addictive design”. Unlike traditional consumer-protection law, EU digital regulation has so far focused more on content moderation, transparency, and risk assessments than on civil liability for product design choices that encourage compulsive behaviour.

For child-protection organisations, the reported internal description of Instagram as a “drug” reinforces calls for stronger enforcement and clearer standards. They argue that features such as infinite scrolling, algorithmic content ranking, and intermittent reward mechanisms are not neutral design elements but deliberate choices optimised to maximise attention—often at the expense of well-being.

Meta has consistently defended its approach, stating that it invests heavily in user safety and well-being tools. The company points to features such as screen-time reminders, content sensitivity controls, and parental supervision options as evidence of its commitment to protecting young users. Meta also maintains that social media can have positive effects when used responsibly, including fostering connection and creativity.

Yet critics counter that such measures function more as mitigations than solutions. They argue that optional controls do little to address the underlying incentives of an advertising-driven business model, where revenue is closely tied to time spent on the platform and the intensity of user engagement.

Within Europe, the issue has broader societal implications. As policymakers debate the next phase of digital governance, questions are increasingly being raised about whether platform accountability should extend beyond transparency and risk reporting to include clearer legal consequences when design choices foreseeably contribute to harm.

As Technology.org notes, the internal characterisation of Instagram as a “drug” may not in itself trigger legal action. Nevertheless, it adds to a growing body of evidence suggesting that concerns about addictive use were known within major technology companies well before they became central to public debate.

Ongoing coverage of digital regulation, platform accountability, and youth protection can be found at The European Times, which continues to follow EU-level policy responses to the social and human-rights impacts of large online platforms.