Young people are not equipped to deal with pornography. It can have a deeply damaging effect on their behaviour and their understanding of consent and healthy relationships. It can distort their views on body image. It can be downright dangerous when it comes to learning about sexual health.
NSPCC research shows that by the age of 16 nearly half of young people have viewed pornography, and those children are just as likely to find it accidentally as they are to deliberately search for it.
At the NSPCC we have worked hard to stress this to Government, so its pledge to introduce a regulator and age-verification measures to block children from accessing pornography websites is a vital and welcome first step towards keeping children safe online.
But there is much more work to do.
By the age of 13 three quarters of young people now have a social media account. For children these networks are a way of chatting with friends, watching funny videos or finding out about the world.
Yet all too often we hear of grooming, hate speech, self-harm content, cyber bullying and even child abuse images cropping up on social media.
Some social networks have designed their platforms with child safety in mind; others have not. Some are good at taking action when harmful content is reported; others are not.
The problem is that each network has its own rules for handling inappropriate content or abusive behaviour on its platform. This leads to inconsistencies in keeping children safe, and ultimately means social networks are marking their own homework.
We've had enough and children have had enough. Recent NSPCC research found four out of five children feel that social media companies aren't doing enough to protect them on their sites.
Government must now grasp the nettle: draw up a universal set of rules for all social networks and create an independent regulator with teeth to enforce those rules.
These rules should require social networks to offer Safe Accounts to under-18s, with high privacy settings as the default, location settings locked off, control over who can follow them, and clear, child-friendly rules and reporting buttons that are easy to find and easy to read.
Social networks should also bring in groomer and bully alerts for these accounts, to automatically flag behaviours to moderators and to the children being targeted.
And these companies must hire an army of online child safety moderators, and disclose to the regulator the number of reports they receive and how their moderation decisions are made. Within this framework, harmful, violent, abusive or adult content can be proactively filtered using keywords, either blocking it for Safe Accounts or issuing a pop-up warning to young people.
Out in the real world you'd be comfortable for your child to go to a youth club, but there are laws in place to stop them entering an adult environment, such as a nightclub. Yet online our children are finding themselves in a very adult arena, with few legal safeguards to protect them.
Proper regulation of social networks will make the internet a better place for young people by protecting and empowering them. Children must be as safe online as they are offline.
Claire Lilley is head of child safety online at the NSPCC