Editor’s note: Andy Parker is an activist and author of “For Alison: The Murder of a Young Journalist and a Father’s Fight for Gun Safety.” The opinions expressed in this commentary are his own.
My daughter Alison was murdered on live television on August 26, 2015. To this day, the video of her murder is still being hosted on YouTube and Facebook.
Both platforms took down the original videos, but the videos were reposted by conspiracy theorists and other bad actors. Along with my advocacy for gun violence prevention, I’ve spent the last five years trying to get Facebook and Google, which owns YouTube, to take down the reposted videos.
Facebook says it has banked and removed the content. It also claims it created a digital fingerprint so that any visually similar videos would be automatically detected and removed when uploaded. But despite this digital fingerprint, videos of my daughter’s murder continue to appear on Facebook.
Google says it has been training its machine-learning systems and human reviewers to flag this type of violent content. It says it also removes content flagged by users and terminates the channels of repeat offenders in accordance with its three-strikes policy.
But in early 2017, Google suggested that I view and flag the offensive content myself. The company wanted me to watch my daughter’s murder and explain why it should be removed. I never have, and I never will, watch any of those videos.
In late spring of 2017, I reached out to Lenny Pozner (whose son Noah was murdered in the Sandy Hook Elementary shooting) for help. And, in 2019, I also reached out to Eric Feinberg of the Coalition for a Safer Web. Both have worked long hours on my behalf flagging videos on Facebook and YouTube. Hundreds of videos have been taken down thanks to their diligence, but some still remain.
In the fall of 2018, I began collaborating with the Georgetown University Civil Rights Law Clinic and have engaged in direct communications with Google regarding the proliferation of these videos. But while the company has professed a desire to help, it hasn’t done enough. In May 2019, the Georgetown team and I had a videoconference with representatives from Google about specific content and our attempts to have it removed, which yielded no positive results. (Google declined to comment on this conversation.)
The main thing that has stood in my way is something called Section 230.
Section 230 of the Communications Decency Act says that websites that host users’ content are not responsible for what users post, and that companies can take down offensive content that may be protected by the First Amendment.
Senator Ron Wyden was one of the authors of Section 230. He has said his objective was to give up-and-coming tech companies a sword and a shield, and to foster free speech and innovation online. The shield ensures that users, not hosts, are responsible for the content they post, protecting social media platforms from almost all liability. The sword allows those companies to take down offensive content without fear of being sued or shut down.
Social media companies use the shield, but they have little incentive to use the sword and proactively self-police the content that appears on their platforms.
In my testimony last year before the Senate Judiciary Committee, I argued that as long as Google and Facebook are not liable for the content users post on their platforms, they will continue to play by their own rules.
That’s why we need to amend Section 230 to prevent killing videos, targeted harassment and hate speech from spreading on social media platforms.
I recognize the First Amendment gives everyone the right to publicly speculate that the moon landing was faked, or that the Earth is flat. But there is a difference between someone venting about a favorite conspiracy theory and anonymous users on Google and Facebook targeting and harassing victims of public tragedies — the former is free speech; the latter is violence.
The solution shouldn’t be that difficult. Not long ago, Section 230 was amended to restrict online sex trafficking. Restricting targeted harassment, incitement and killing videos should be an extension of that amendment. It’s a common-sense, narrowly focused change that should appeal to both sides of the aisle.
If social media platforms cannot properly protect citizens from online harassment, hate speech and “moment of death” videos, Congress must step in and make sure that proper protections are in place for private citizens who have been targeted, harassed and exploited.

Note: After CNN Business reached out for comment on this piece, Google removed two videos of Alison’s murder from YouTube.
"still" - Google News
July 28, 2020 at 08:21PM
https://ift.tt/3f84KOG
The video of my daughter’s murder is still on YouTube and Facebook. They should have to take it down - The Mercury News
"still" - Google News
https://ift.tt/35pEmfO
https://ift.tt/2YsogAP
Bagikan Berita Ini
0 Response to "The video of my daughter’s murder is still on YouTube and Facebook. They should have to take it down - The Mercury News"
Post a Comment