By Denise Lisi DeRosa, Founder, Cyber Sensible, LLC
The Children’s Online Privacy Protection Act (COPPA) provides safeguards for children under the age of thirteen. Facebook, Twitter, Instagram and even YouTube use COPPA to inform their age requirements. This assumes that at 13 years old, our teens should be able to understand the difference between advertising and content, truth and distortion, appropriate and offensive postings, and how to behave in an online environment. They should understand that it is within their power to (somewhat) control or (passively) consent to the sharing of information about themselves online. According to the Terms of Service for each of these platforms, a 13-year-old should be mature and intelligent enough to understand the ramifications of sharing information online.
How, then, can these platforms fail to meet those standards themselves? Facebook launched in 2004, so it is 14 years old; YouTube turns 13 this year; and Twitter is 12. At this age, shouldn’t they all understand the power they have to spread (mis)information to a global audience? The problem is that we keep discovering how the platforms we use every day fail to uphold the very principles of truth, authenticity and responsibility that they expect from their users. When a child uses any of these platforms irresponsibly, the first suspect is a parent who did not teach proper digital citizenship.
But wait: the platforms themselves are not meeting the standards they set for their users. They expect users to distinguish real news from fake, to police the content, and to tell real people from stolen profiles. Sure, in Utopia that might work. But these platforms are open to all voices; positive, beautiful, innocent, charitable, honest, powerful, deceptive, manipulative, disruptive and destructive expressions are all welcomed equally online. Everyone can have a say in these open forums. That gives bad actors the same power as those with good intentions, with no barriers or interference from the companies whose platforms they exploit. The social media companies only provide the opportunity for global communication; they are not responsible for what is shared. Or are they? Should they be? If they profit from the product, don’t they have some ownership in it? A common lesson in digital citizenship courses taught to students globally is the notion of ownership: every time you ‘like’, share or retweet a thought, you endorse that thought. Shouldn’t that be the standard for the platforms themselves? They are all at, or entering, the age of consent (13).
We teach our kids to be responsible with what they share online; why shouldn’t we expect exactly that, and more, from Facebook, Twitter and YouTube? If I had a chance to offer my advice, here is what I would say:
Dear Facebook,
We learned this year that a foreign government manipulated your platform to create and share false stories, fan the flames of division and undermine our democratic process. ‘Fake news’ was shared and liked more than real news on Facebook during the 2016 election. Did you not notice a large uptick in American-citizen accounts based in Russia? And can’t you determine where ads are placed and who is paying for them? Just a thought: maybe you should keep better track of who is using your platform and why. Of course, it wasn’t just political ads; political events were organized and political rants were posted by fake accounts. We see now that you intend to prioritize posts from friends and family in our news feeds, which amounts not so much to an overhaul of your algorithm as to a tweak. Wouldn’t it make more sense to identify and prioritize objectively trusted news sources over nonsense, to verify personal accounts, and to stop fake accounts from liking and sharing stories? Maybe you are not comfortable with that authority, but with global power comes global responsibility, and I think you need to step up.
Dear YouTube,
Three hundred hours of video are uploaded to your platform every minute. So of course not everything produced is appropriate for children, and some videos are simply not appropriate for a civil society. Recently, the popular YouTuber Logan Paul got into trouble for posting an insensitive video that seemed to make fun of suicide. Logan Paul gauged the reaction from fans and took the appropriate step of removing the offensive video. YouTube, you didn’t weigh in until after the fact. Unfortunately, Logan’s fans keep reposting the video in ways that don’t technically violate your terms of service, a snarky poke in the eye. The sheer volume of videos uploaded daily might make full policing impossible, but for the videos that bring in considerable profit through advertising, shouldn’t you? Broadcasters are responsible for the shows they produce or air on TV; shouldn’t YouTube be held to the same standard? How about paying attention to the video stars you’ve created, especially the ones followed by millions of (under-13) children? Assign an executive producer or overseer to each star with over a million followers to make sure what they share on your platform is appropriate for your enormous audience.
Dear Twitter,
With you, we can’t even trust the followers who appear to be our friends or family. There is an entire business model built on generating followers for celebrities and others looking to profit from advertisers, marketing campaigns and influencing political debates. The problem is that many, even millions, of these followers are fake ‘bot’ accounts programmed to follow, retweet and like for a price. In some cases, our teens’ stolen profiles are manipulated to increase the influence and power of shady figures. If there are so many fake accounts, how can we trust anything we see in our feeds? We teach our teens to be authentic online; isn’t it time Twitter authenticated individual accounts and required that they be certified real, live human beings? You are all pretty smart; I am sure you can come up with a way to defend your product from being overtaken by machines.
Dear Facebook, YouTube and Twitter,
You’ve got to grow up and recognize that YOU are responsible for what you created. You and only you are responsible for what is shared on the platforms you provide. It’s not up to the users to police them for you. Sure, we’ll do our part, but the buck stops with you. You’ve got to clean up your own mess. Have you been watching ‘Stranger Things’? Ever seen ‘Jurassic Park’? Sure, the idea was fun: let’s bring back dinosaurs for our kids to see and touch. But when things get out of hand, the people who created the mess are the ones to blame. That’s the same lesson our social media companies need to learn. You’ve done incredible work in creating opportunities for people to communicate and connect on a global level. But so far you have not adequately taken responsibility for the abuses prevalent in your systems. It’s time you fully addressed the misinformation, manipulation, offensive content and millions of fake accounts that have infected your products. Don’t you want to protect the integrity of what you built? You have a lot of money, resources and brainpower, so figure out how to get it under control. Be responsible for your stuff. Just like we tell our teenagers.
The article originally appeared on Cyber Sensible’s A Positive Digital Life blog
Bio: Denise Lisi DeRosa founded Cyber Sensible in 2015 to provide online safety and digital citizenship expertise to parents, teens, educators, athletes, and young professionals. Learn more here: https://www.cyber-