@dalias@hachyderm.io @Em0nM4stodon@infosec.exchange My approach is actually one of the former category - "trivially" bypassable.

By making the parents responsible. They can set up youth-protection software on their children's devices if they feel they need to. Just like now.

The only technical thing I'd ask for is that social networks describe themselves in some form of XML file, and that they respect a Do-Not-Track-like header.
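
To make the idea concrete: no such standard exists today, so everything below is a hypothetical sketch. The descriptor format, the well-known path, and the `Minor` header name are all assumptions, chosen only to mirror the Do-Not-Track convention mentioned above.

```python
# Hypothetical sketch of the two pieces asked for: a machine-readable
# self-description served by the platform, and a DNT-like request header
# set by client-side youth-protection software. Nothing here is a real
# standard; the XML schema, path, and header name are illustrative.
import xml.etree.ElementTree as ET

# A self-description a platform might serve, e.g. at a well-known path
# such as /.well-known/platform.xml (path is an assumption).
DESCRIPTOR = """\
<platform>
  <name>ExampleNet</name>
  <category>social-network</category>
  <features>
    <feature>short-video</feature>
    <feature>direct-messages</feature>
  </features>
</platform>
"""

def is_social_network(xml_text: str) -> bool:
    """Return True if the descriptor declares the site a social network."""
    root = ET.fromstring(xml_text)
    return root.findtext("category") == "social-network"

def request_headers(child_mode: bool) -> dict:
    """Headers a parental-control client would add to every request.

    The 'Minor: 1' header is modeled on Do-Not-Track ('DNT: 1') and is
    purely an assumption for this sketch.
    """
    return {"Minor": "1"} if child_mode else {}
```

A client would fetch the descriptor, call `is_social_network()`, and either block the site or attach `request_headers(True)` so the platform can restrict itself; a platform that honors the header does the rest server-side, just as DNT was meant to work.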

All else is on the client software. Which the parents may or may not install. And if the kids are old enough to have the kind of money to buy their own phone and pay for their own internet connection, they can of course trivially bypass it and I don't care.

And sorry for being a fascist. I don't want platforms like Roblox, TikTok and X to keep harming children. Honestly, I'd rather have them banned entirely (and also every single short-video platform or platform feature). But as that's not gonna happen, let's at least keep children out of there. Or else we'll be raising more fascists.

@divVerent You said the solution to your actual problem right there: ban these abusive platforms entirely. Or at least regulate them into not being able to do the really harmful things they do - to people of all ages. None of that has anything to do with policing children or policing whether users are adults.

