How hate ran riot on the internet following the Christchurch mosque shootings


Ten minutes after the gunshots had stopped ringing through the Masjid Al Noor in Christchurch on Friday, March 15, James was sent a link to the live-streamed attack. The accused gunman was still on the run, and emergency services were still arriving at the scene.

In the following minutes, he was sent more links, this time to message boards, where he watched an anonymous community begin to spread the video, along with the man’s manifesto, as far and wide as possible.

“As soon as it was posted there was a group of people who got to work making sure this material went viral,” says James, who asked that we not publish his surname. “No matter what Facebook or Twitter did, they were never going to be able to get rid of it.”

In the following hours, he saw the global giants of the tech world struggle to contain the video (which has been classified in New Zealand as “objectionable”, meaning it’s banned) as it was downloaded and uploaded again all over the internet. This is what the attacker had wanted, James assumes.

READ MORE:
* Alt-right-delete: Stopping Christchurch mosque shooting video’s spread
* Kiwis face charges related to sharing mosque massacre live stream
* Hate speech – we need to understand the damage it does

New technologies should be bringing us closer together. And often, they do. The internet allows us to overcome geographical and social barriers, and interact with new people and perspectives. But in other ways, it pulls us further apart.

Facebook, Twitter, Google and other online platforms are designed to give us more of what we want, and less content from strangers we disagree with. What we see in our personalised, online worlds is determined by our previous online behaviour and recommendation algorithms. For years, platforms have been promising to fix the likes of YouTube’s “Up Next” system, so they don’t so readily drag users down rabbit holes of hate speech, conspiracy theories, and hyperpartisan, misogynist, and racist videos.

Those rabbit holes aren’t as hard to find as you might think. While sites like the ones used by the accused are harder to find than Facebook and Twitter, alt-right ideology can easily be found on YouTube and Instagram. Popular online porn sites also have an array of videos with racist and white supremacist themes. Comments sections contain links to other sites, which promote more intense and vitriolic content.

It’s natural for people to seek out others who support their world views, making them feel safe and validated. It can be liberating, for example, for geographically isolated people to connect with like-minded individuals online. So we end up organising ourselves into homogenous groups online, known as echo chambers, says Walter Quattrociocchi, a computer scientist at the University of Venice and a world-leading researcher on the subject. This is where the problem starts.

A police officer stands guard near Masjid Al Noor following the attack. (CARL COURT/GETTY IMAGES)

The search for information online is driven by emotion rather than truth, he says. So countering ill-informed views with facts doesn’t change people’s minds. People who don’t trust official institutions in real life, even science, are drawn to conspiracy-related content online. When faced with dissenting information, they only become more committed to their erroneous beliefs.

When the rhetoric within these ecosystems escalates, “one bar at a time”, Quattrociocchi says, polarising and even radical opinions emerge. “The more you are engaged by an echo chamber the more you tend to be extreme with regards to the identity of the shared narrative.”

At this stage, a person is likely isolating themselves socially in real life too, he says. It’s not clear what sort of individual is particularly vulnerable to being radicalised online. “We’re in the middle of radical change and we’re still trying to understand what’s going on.”

The man responsible for Friday’s massacre represents the worst-case scenario for online extremism.

Prior to the attack, he’d set up social media accounts, posted photos of his weapons, and linked to a rambling manifesto laced with references to alt-right online communities. He began his Facebook live-stream of the killings with a casual internet reference. Anonymous users from across the world supported and encouraged him online.

Many of them then watched the massacre live or within minutes of it happening. They celebrated the video and shared it widely. (Facebook said it removed 1.5 million copies of the video from its platform within 24 hours.)

Police responding at the scene of the attack, which was live-streamed on the internet. (GEORGE HEARD/STUFF)

All signs point to the gunman being steeped in a culture of what cyberhate expert and “Troll Hunting” author Ginger Gorman describes as predator trolling: repeated, sustained threats or attacks on an individual or group through the use of electronic devices, which result in real-life harm to the target.

The social posts, manifesto, and video were likely meticulously planned to incite media attention and trick journalists into providing a mainstream platform for the gunman’s propaganda.

It’s difficult to separate the issues of radicalisation, terrorism, and predator trolling, Gorman says. Many of the world’s most notorious predator trolls are white supremacists. Often they have spent their childhoods online, “from a very young age imbibing torrents of hate and they get radicalised into these behaviours and ideologies,” Gorman says.

When politicians and public figures tout bigotry, and news outlets quote them, “this type of thinly veiled white supremacy” becomes an accepted part of the public discourse, she says. “It’s normalised and it shouldn’t be. There’s plenty of evidence to show it leads to real-life harm.

“Let’s just remember the Holocaust didn’t start with murder; it started with hate speech.”

It’s that very hate speech that Hunter Pollack believes social media platforms need to pay more attention to.

Pollack’s 18-year-old sister Meadow was killed by Nikolas Cruz when he opened fire at Marjory Stoneman Douglas High School in Parkland, Florida, last year.

Nikolas Cruz, who shot and killed a number of teenagers last year at a school in Florida, United States. (SUPPLIED)

Cruz had posted pictures of his arsenal of weapons and other threatening images on Instagram in the lead-up to the attack, but went unnoticed by the platform.

“Instagram should have blocked him and alerted police. Social media companies need to be more proactive and less reactive,” Pollack says.

Sites which promote violence and extremist views are the breeding ground for people like Cruz, he says. Those who follow through on their threats become idolised and celebrated in these communities, which encourages others to be like them, Pollack argues.

“These killers are motivated by the internet. They can read forums and fan pages about what other killers did.

“They are talked about so much and these isolated people want the same thing so much.

“They will kill because they want to be famous and be a celebrity.”

Gorman agrees it’s “high time law enforcement and social media companies connected the dots and started viewing predator trolling as a canary in a coal mine”.

But Netsafe CEO Martin Cocker says while sites like the ones used by the Christchurch attacker host graphic imagery and allow people to spout racist views, they are also used by people who need to communicate anonymously in countries with oppressive regimes.

“Some of these technologies enable people to do what we support — like fight for freedom.

“But they are the exact same sites which are used for the complete opposite.”

Trying to stamp out sites which promote extremist ideologies wouldn’t work, according to Cocker. Even a unified approach by multiple governments would struggle, because they would all have to agree on a consistent set of rules.

“People who want to run a site which contravenes those rules could easily find a country which would allow them.”

Cyberhate expert and author of “Troll Hunting”, Ginger Gorman. (SUPPLIED/HILARY WARDHAUGH)

Short-term fixes include telcos blocking certain sites, which happened after the Christchurch video was distributed. However, that was only ever a temporary move, because easily accessible tools such as VPNs allow users to bypass the blocks. There’s also a social dilemma when it comes to restricting an entire site.

“Our law typically leans towards the idea that anything which is not illegal can be accessed, even if we don’t like it.

“In blocking something like [the sites used by the accused] we are blocking stuff which is harmful. But if we block the whole site we are also blocking a whole lot of conversations which people have the right to have — even if we find it offensive.”

Working together with large social media platforms is better than throwing them under the bus, Cocker believes.

“Once the video was up, the big social media platforms worked very hard [to get it down]. I’m a lot happier with that than with the sites who hosted and promoted the video and who, when we requested they take it down, gave us two fingers and said ‘stuff off’.”

John Parsons, an internet safety and risk assessment consultant who lectures at schools across the country, says nobody could have conceived of these issues when Facebook and its ilk were built. But, he says, “we have to expect more from them.”

Parents often tell him their children have been exposed to objectionable material on social media. When he tells them to report it, they say they have, multiple times.

“That shows these systems are still focused on making money above all, and they need to better respond to our needs. I’m firmly of the opinion they need to employ more people. Thousands more.”

Martin Cocker from Netsafe says we need to work together with social media sites in the wake of the Christchurch attack. (GRAHAME COX/STUFF)

Hate speech doesn’t happen in one dark corner of the internet, Parsons says. It’s everywhere, and spreads “at the speed of light”. Because of this, it’s hard to get a sense of the scale of the problem.

If social media sites started recording incidents of content designed to spread hate and civil unrest when they removed it, and shared that data with governments, we’d better understand what we’re dealing with, he says.

“But it starts with good reporting mechanisms from the organisations themselves.”

In the meantime, parents need to explain to children, in a simple way they’ll understand, what has happened, and keep young children off social media. “Don’t dismiss their fears. But do point out all the people around them who support them and keep them safe.”

InternetNZ CEO Jordan Carter says serious analysis is needed to determine what could have been done differently.

“We need to have a broad-based conversation about this,” he says.

“We need to have Facebook and Google at the table, we need to have the government at the table, the broader tech community and also the Muslim community.”

But as Carter points out, New Zealanders need to grapple with the fact that the internet alone was not to blame for the massacre.

“As a society we are going to need to talk about how we deal and grapple with some of the underlying fears and hatred which gives space to these conversations.

“That is not a technology problem, that is a deep values conversation.”

 
