At a recent TikTok Philippines event on teen safety, I asked the company's head of public policy, Yvez Gonzalez, along with a panel of civil society organizations and content creators advocating for children's safety on social media, about recent European Union (EU) findings that key design features on the platform were addictive and should be changed.
Gonzalez said that the parent firm would be appealing the findings, and pointed to several features that protect under-18 users, such as a 60-minute app limit, parental controls on content, follower visibility, and fencing off minors' feeds and content from the 18-and-above population. "We believe that in the industry, we have one of the most robust safeguards," he said.
History content creator Mona Magno-Veluz offered her own take on the question, highlighting that, above all, parental guidance is key to the teen social media conundrum.
“I do agree that the platforms all have their action in place but we cannot also remove our responsibility from that equation. I genuinely feel that parents, the adults in the room also have a responsibility when it comes to restraint in the use of social media. So I think it’s not just a technology but it also has to be addressed in our culture and in societal norms,” Magno-Veluz said.
Riyan Portuguez, a psychologist and digital wellness advocate, noted that external factors may contribute to a child becoming addicted since social media has become one of the most accessible ways of coping.
She said: “Usually, the easiest way for us to cope is to use social media platforms because those are the most available. So that’s why there are addictions because every time there’s a problem, stress, use the phone.”
With social media’s always-on availability, it’s vital for parents and the family to be present for the child. “But if the communication is okay in the family, you have someone to talk to outside social media, you can regulate it. So it’s not just an issue of social media,” she said.
Portuguez's statement echoes the most recent testimony from the landmark social media addiction trial in the US: on Monday, February 24, plaintiff Kaley's former therapist, Victoria Burke, took the stand.
Burke identified the teen's mental health issues and said social media was a "contributing factor" in them, but she did not pin the blame entirely on the platforms. Kaley has also been described as having a fraught relationship with her divorced parents.
Meanwhile, at a local Senate hearing for social media regulation, Annalyn Capulong, assistant professor at UP Diliman’s Department of Psychology, told lawmakers: “Sentro ‘yung ambag ng magulang.” (The parents’ contribution is central.)
But Capulong herself acknowledged the reality that parents sometimes rely on devices and screens to keep a child occupied while they handle other tasks.
Ideally, the parent is present the majority of the time. We know that's not always possible, and not all families have an ideal setup, as seen in the Kaley case in the US.
What does all this mean? It means a certain part of the minor population will be, quite literally, left to their own devices, at the mercy of an app's design features. These are the most vulnerable members of the population, and platforms must be attuned to their safety.
"Are we making the platform safe for our most at-risk users?" should ideally be the question they pose to themselves. And given that mental health, environmental, and familial factors affect people beyond their minor years, it falls on the platforms to address this problem for all users, not just minors, even if legal and legislative talks are, for now, focused on the latter.
Internal, previously unpublished research from Meta, only recently revealed at trial, also found that both parental supervision and time limits were not as effective as believed.
Project MYST (Meta and Youth Social Emotional Trends) was a survey of 1,000 teens and parents on social media use, which concluded, “there is no association between either parental reports or teen reports of parental supervision, and teens’ survey measures of attentiveness or capability” and that “parental and household factors have little association with teens’ reported levels of attentiveness to their social media use.”
Even more vulnerable are youth in stressful environments, who exhibit far less control over their social media use, according to the survey's results.
(Photo caption: Supporters of plaintiff Kaley G.M. hold signs outside the courthouse on the day she takes the stand in a key test case accusing Meta and Google's YouTube of harming children's mental health through addictive social media platforms, in Los Angeles, California, US, February 25, 2026.)
Looking back at the EU report on TikTok, the Commission said, “current measures on TikTok, particularly the screentime management tools and parental control tools, do not seem to effectively reduce the risks stemming from TikTok’s addictive design… they are easy to dismiss and introduce limited friction.”
It comes back to the design features of social media platforms generally, not just TikTok: infinite scroll, autoplay, push notifications, and algorithmic recommendations, all increasingly being described as "addictive" in the US and EU.
Even in the Philippines, the words "addictive" and "compulsive usage" are already in at least two draft bills. Senate Bill (SB) 185 defines a social media platform as one that employs algorithmic content delivery and has "addictive features."
SB 601 orders platforms to “limit features that increase, sustain or extend the use of the SMP (social media platform) by the child such as automatic playing of media, rewards system for time spent on the platform, notifications, and other features that result in compulsive usage.”
TIME “100 Most Influential People in Health” honoree Laura Marquez-Garret, an attorney at the Seattle-based Social Media Victims Law Center, recently spoke to Democracy Now. She said that these design features are actually “defects,” and these are what must be addressed, short of an outright ban.
“And so ultimately fixing it — and they (social media companies) know this; it’s in their documents that are becoming public — they could remove the addictive mechanisms. It is as simple as — think of your television set at home. We have this remote control. We get to turn the volume down, up, change the channels. They’ve kept those controls on the back end. They could give you the option to slow down the algorithm,” Marquez-Garret said.
“So they are programming for engagement above all else… And in the case of vulnerable children, it’s deadly,” she added.
The current trial and the trend of social media bans worldwide show there is momentum against the tech companies.
And it’s momentum that platforms have been lobbying against aggressively.
What they want is “the least bad option,” Bram Vranken of the Corporate Europe Observatory, a group monitoring corporate lobbying, told the New York Times.
What is the “least bad” option?
The Times reported: “What these companies prefer are laws that require parents, not governments, to have the final say over children’s online habits… Silicon Valley titans are lobbying aggressively for an alternative to bans.”
Instead of a ban, the companies are pushing for a "Digital Majority Age" in the EU, which again puts the onus on parents to decide whether their 15- or 16-year-old children can use social media.
A victory there would steer attention once again away from the platforms' algorithms and the features that create conditions of compulsive use, and away from a proposed Digital Fairness Act that would ban infinite scroll, autoplay, and engagement-based recommendations.
“It would probably require complex redesigns of platforms and hobble their marketing machines,” the New York Times wrote.
Currently, in the Philippines, the firms are backing a similar “age-appropriate framework” — different access rules for different age groups.
But it seems not that different from the "Digital Majority Age" in that the responsibility to enforce falls on the parent, and it will require age verification processes that aren't always easy to implement, rather than addressing the core design features themselves.
So perhaps here too the focus can be on the addictive features themselves.
Facebook whistleblower Frances Haugen told Democracy Now: “When you have kids whose brains are just being basted in dopamine from scrolling all day at such a young age, it changes their ability to sit still in class, to interact meaningfully face-to-face with her family or friends… yet they continue to optimize for spending more and more time on these platforms.”
Platforms know what's at stake for them here. What happens at the US trial and in the EU investigations can have a ripple effect on the platforms globally.
There's evidence of a population that's vulnerable to social media addiction, whether because of a tough family life or personal mental health issues. There's also evidence that safety measures haven't been all that effective because the "addictive" features are, for lack of a better word, still more powerful.
And amid all this, platforms still seem to be lobbying to keep the responsibility on parents' shoulders rather than giving us the option to "slow down the algorithm." Parents and families need more help. – Rappler.com

