By Kendall Trammell
Even though she’s a social media editor for The Associated Press, Hannah Cushman did not watch the video of the shooting of two Virginia television journalists.
Her colleagues warned her that it was graphic, and there was no reason for her to have to experience the shock they did.
But even as she tried to avoid the video, she glimpsed it on Facebook or Twitter. Every time, she felt anxious.
“No matter where you went, you had no choice,” said Cushman. “That panic I felt every time I scrolled past it, I should’ve just watched it.”
A month after the shooter posted video of the killing of two Virginia television reporters, exposing many social media users to the footage, Twitter is more strongly policing content that may be too graphic for users.
But Niall Kennedy, a partner engineer at Twitter, said he was unsure whether those changes were made as a direct result of the shooting video.
Experts say this is a necessary change.
“Autoplay sucks,” said Bruce Shapiro, the executive director of the Dart Center for Journalism and Trauma.
Giving viewers the chance to choose whether or not they want to watch graphic media like this is important, said Kim Bui, deputy managing editor at reported.ly.
“That second matters a lot,” Bui said. “And some people want to see it regardless. Adding that friction is important as an undecided viewer.”
Bui, Shapiro and Cushman appeared on a panel Thursday at #ONA15 discussing vicarious trauma — the emotional stress developed from exposure to the pain, fear or terror others experience. People don’t have to experience the terror first-hand; they can experience stress from exposure through any medium.
Social media in particular, where exposure can be hard to control, presents its own set of issues.
“If you have autoplay, it makes it hard to minimize that trauma,” Bui said. “Just because you aren’t there, doesn’t mean you’re not experiencing something.”
It’s hard to predict when sensitive content will surface in a social media feed, Kennedy said. Twitter relies on both its editorial staff and its community to police for and flag graphic material.
Not only should social media networks cast a critical eye on highly graphic videos and photos, Bui said, but newsrooms should also consider whether the material has journalistic merit. At reported.ly, everything is evaluated on a case-by-case basis. The outlet initially posted screenshots from the video, but it eventually apologized and took them down.
“We’re looking at the value of the media in the story,” Bui said. “I think it’s ‘Do you want your mom, your daughter, your grandma to see this?’”
Whether you’re a journalist or a member of the news audience, Cushman said, you have a reasonable expectation of decency.
“If people want to watch the video, they’re going to look for it,” Cushman said. “But with autoplay, I’d like to have that choice.”