The internet is unquestionably getting younger. As of 2018, 175,000 kids were coming online for the first time every day, and today that number is likely higher. But despite this demographic shift, one huge issue remains: the internet was never designed for kids.
Safer Internet Day this year focused on exploring respect and relationships online, a timely theme as we begin to see the emergence of the ‘metaverse’. Immersive gaming platforms are rapidly absorbing much of the functionality that was once exclusive to social platforms.
More of our cultural lives are now happening in these environments that are all evolving towards a persistent, interoperable digital world.
Although the nature of the metaverse is still being debated and shaped, this juncture presents the opportunity to reset how we think about digital rights and responsibilities for younger audiences and ensure the next iteration of the internet is safer than the last.
The metaverse will be home to a universe of content, and it raises many questions for digital parenting.
Ultimately, children should be able to access any (appropriate) content with the supervision and consent of a parent or guardian, and have that experience tailored to that parent’s desired settings.
Some parents will want a very conservative experience for their child (no chat, no friend requests), while others will be comfortable with a more exploratory approach (e.g. friend requests are OK). All of that should just happen ‘magically’, without the need for parents to wade through a myriad of screens, settings and apps.
With the introduction of new child safety and privacy laws such as the UK’s Age Appropriate Design Code, we are seeing many major technology companies investing in new and innovative ways to make their services safe for kids under 18.
While that is a welcome development, these solutions are often highly customised for each company’s own product offering.
For many smaller developers, who don’t have access to millions of dollars in R&D budgets or large engineering teams, adding support for young audiences and their parents remains a challenge.
Today, nine years after we founded SuperAwesome to tackle some of these challenges, the internet is safer for kids than it was before, but in my view we are still behind where we truly should be.
As we think about the elemental building blocks of the metaverse, we want to ensure that support for engagement with younger audiences and their parents is there from the beginning. In fact, the ability to scale the tools that will enable all developers to do this is one of the reasons we joined Epic Games.
We commend every company working to make its own platforms available and safe for younger audiences, but we feel it’s important to look around us and invest in the future of the entire ecosystem as well.
Regulators in the US, UK and Europe are leading the charge in defining digital safety for young audiences, but it’s essential that there are kidtech tools available to everyone to enable these aspirations.
Ultimately, creating kid-safe digital experiences should not be limited to the developers who can afford to be compliant and safe; this is an ecosystem responsibility.
That’s why we made our Kids Web Services parent verification tool (a critical component for enabling young audience access) free for everyone through Epic Online Services last year, and we will continue to invest in scaling more ecosystem-first tools to help support a metaverse that is safe for all audiences.
Digital child safety shouldn’t be a competitive advantage; it should be an area of open collaboration between all stakeholders to ensure the best possible outcome for younger audiences.
Unicef believes that access to a safe internet is a child’s right. We want to make sure the metaverse is built on that same principle.
Dylan Collins is the founder and CEO of SuperAwesome, a ‘kidtech’ firm that was acquired by the Fortnite developer Epic Games in 2020