Nigeria has finally started asking a question many governments avoided for far too long: what exactly are children doing on social media platforms that were never built with their safety in mind?
The Federal Ministry of Communications, Innovation and Digital Economy has opened a national conversation on regulating how children interact with digital platforms. Through a public consultation and survey, the government is exploring possible safeguards ranging from age restrictions and stronger verification systems to greater platform accountability and regulatory oversight. On paper, the objective is straightforward: protect children while preserving the internet’s promise of learning, creativity, and connection. In practice, however, the issue is far less comfortable.
Social media companies have spent years cultivating a convenient myth that their platforms are neutral tools. According to that narrative, they simply provide space for communication while users decide how to behave within it. It sounds reasonable until one remembers that these platforms are not public squares. They are carefully engineered commercial environments designed to maximise attention, engagement, and data extraction.
Children are not simply users in this environment. They are prime targets.
The architecture of modern social media rewards emotional intensity, rapid reaction, and endless scrolling. Algorithms learn quickly what keeps people hooked and then deliver more of it with ruthless efficiency. Adults struggle with this dynamic. Teenagers and younger children, whose impulse control and risk awareness are still developing, have almost no chance of navigating it safely without guardrails.
Nigeria’s consultation recognises the most visible threats. Exposure to harmful or explicit content remains a constant concern. Cyberbullying has become relentless because the abuse no longer ends when a child leaves school. Online grooming thrives in the shadows of anonymity and private messaging systems. Personal data is harvested with remarkable sophistication, often from young users who barely understand what consent means in a digital environment.
Yet there is another issue that deserves far more attention in this debate: the deliberate design of digital addiction.
Features such as infinite scroll, algorithmic recommendation loops, autoplay videos, and persistent notifications are not accidental design quirks. They are behavioural mechanisms developed to keep users on the platform for as long as possible. More time on the platform means more data collected and more advertising revenue generated.
For children, this design logic can quietly reshape habits, attention spans, and mental wellbeing. Long before policymakers step in, the platform has already captured the user.
Artificial intelligence now complicates matters even further. AI systems are rapidly transforming how content is generated, distributed, and personalised. A teenager scrolling through a feed may soon encounter content written, voiced, or visually produced entirely by machines that are optimised for engagement rather than truth. Deepfakes and synthetic media will only make it harder for young users to distinguish reality from manipulation.
Against this backdrop, Nigeria’s move to gather public opinion is both sensible and overdue. The country must decide what level of responsibility technology companies should bear for the digital spaces they control. Consultation matters, but it cannot become an endless exercise in polite deliberation while children continue navigating systems designed without their protection in mind.
The Nigeria Data Protection Act already provides an important foundation. Section 31 acknowledges that children require enhanced protection when it comes to personal data processing. That principle reflects a simple truth: minors lack the capacity to understand the long-term implications of sharing personal information online. Safeguards therefore cannot rely solely on the individual user. They must be embedded into the systems collecting that data.
This is where the conversation inevitably becomes contentious.
Technology companies prefer self-regulation because it allows them to set the pace of reform. Governments often accept this arrangement because it avoids confrontation with powerful global platforms. The result, more often than not, is a series of voluntary safety promises that struggle to keep up with the commercial incentives driving the industry.
Nigeria should resist repeating that pattern.
A credible framework for child online protection must place enforceable obligations on the platforms themselves. Age assurance mechanisms need to move beyond superficial tick-box declarations. Algorithmic systems that shape children’s feeds should face transparency requirements so regulators can understand how those systems prioritise content. Behavioural advertising directed at minors deserves strict limits given how deeply it relies on personal data profiling.
Equally important is the question of enforcement. Regulation that exists only on paper rarely changes behaviour in a sector where scale and profit move faster than compliance. If Nigeria chooses to introduce new safeguards for children online, those rules must carry meaningful consequences for companies that ignore them.
None of this means the country should retreat from digital innovation. The internet remains an extraordinary tool for education, collaboration, and opportunity. Nigerian children deserve access to those possibilities, particularly in a nation that is investing heavily in its digital economy.
But access without protection is not empowerment. It is neglect dressed up as progress.
The deeper issue facing policymakers is philosophical as much as technical. Should the digital environment be shaped primarily by the commercial priorities of technology companies, or by the social priorities of the societies in which those platforms operate?
If Nigeria believes its children deserve safer digital spaces, then the answer cannot be left to Silicon Valley product teams and algorithm engineers. It must come from deliberate governance, clear rules, and the willingness to enforce them.
Children should be free to explore the internet. They should not have to survive it.
business a.m. commits to publishing a diversity of views, opinions and comments. It, therefore, welcomes your reaction to this and any of our articles via email: comment@businessamlive.com
Nigeria’s move to protect children from social media use