Why are those platforms not playing ball, and why does it take going to court to force them to — especially when it’s about protecting children?

As the legal upper guardian of all children, the high court really delivered in its action against Meta.
It’s frightening that the Digital Law Company had to go to court to deal with Instagram profiles and WhatsApp channels disseminating child sexual content, but I guess some people’s privacy is more important than others’?
Take nothing away from the massive achievement this is for protecting children, and thank goodness for the existence of the Digital Law Company, which ran this matter with urgency.
What are social media giants doing to stop the filth?
This is a victory to be celebrated.
Once the dust settles, the difficult questions need to be put to the social media giants: why was going to court necessary?
Our cyber laws really do go a long way in making these kinds of activities criminal and placing legal obligations on those who control the platforms to prevent and report such content. Not only do we have laws for these things, but the user policies of these platforms also prohibit child sexual content.
So even if South Africa were blind to the existence of child abuse, which thankfully it is not, these platforms are self-obligated to deal with the matter.
And yet, here we are, having to take up court time to deal with something that should never have happened, and when it did, should have taken less than a phone call to remove.
Let police and prosecutors work
What I’m so appreciative of is that the settlement agreement, now a court order, doesn’t allow for the immediate publication of the perpetrators’ details.
This may seem counterintuitive, but it is actually in line with a well-considered standard of care. Sometimes, the children involved can be identified when the accused is identified, and sometimes, there is collateral reputational damage to innocent people when names are simply released.
Allowing the specialised investigative unit and prosecuting authority to build a case will likely eventually reveal the names, but in a responsible way that will take into consideration the welfare of the kids and the possibility of re-traumatising them.
But again, why did the court need to get involved? Why is there a need for a hotline between the Digital Law Company and Meta? As great a result as that is, why must it come down to reporting before anything is done?
If Google can identify that I have an illegal copy of The Jungle Book on my GDrive, I don’t think it’s a far stretch to build AI models that can detect language that suggests child pornography or abuse.
Make it make sense why a legal team has to run to court to protect children from people who are not only breaking the law, but also breaching the terms and conditions of the platforms they’re using to hurt children.
Why are those platforms not playing ball, and why does it take going to court to force them to, especially in the interest of protecting children?
Perhaps they don’t want the bad publicity, or perhaps they don’t want to admit that things are as bad as they are.
Here’s a different suggestion.
Nobody is going to argue against the existence of bad people. Bad people will always be around; that’s something we can safely say society accepted long ago.
If we can admit that, then there’s no reason why Meta can’t say it wants to keep bad people off its platforms. There’s no reputational damage to a platform in accepting that it will sometimes get some awful person as a user, provided it then applies its terms of service, reports the person, and sees that person locked up.
We already know creeps are on every available platform. That’s not what will cause reputational damage in future. What will cause reputational damage is knowing that those platforms put their own names before the safety of our children.
Emma, Ben and Rorke: take a bow, and thanks for putting in the effort to protect our kids, even when you shouldn’t have to.