ITP Techblog

Brought to you by IT Professionals NZ

Brislen on Tech

Paul Brislen, Editor. 22 March 2019, 2:26 pm


This really isn't a free speech issue. Stopping terrorists from sharing obscene footage in accordance with our laws is not "stifling" nor is it "a slippery slope" and it certainly isn't "de-platforming a valuable contribution to the debate" or any of the other mad things I've been told over the past week.

A terrorist committed an atrocity and shared it with millions of viewers worldwide via social media channels.

This is a breach of the law here and in many other jurisdictions.

The New Zealand law (in this case, the Films, Videos, and Publications Classification Act 1993) is clear:

For the purposes of this Act, a publication is objectionable if it describes, depicts, expresses, or otherwise deals with matters such as sex, horror, crime, cruelty, or violence in such a manner that the availability of the publication is likely to be injurious to the public good.

It's up to the Chief Censor to decide whether this law applies - he made that call and we're seeing people arrested and charged for sharing the content.

(EDIT: There is some contention around whether the CC decides the content is objectionable or whether he simply clarifies that the content is objectionable. In practice you get the gist.)

But of course laws don't apply on the internet. Facebook, Twitter, Instagram, WhatsApp, YouTube and all the others, including Cloudflare which we'll come back to shortly, all get to shrug and say sorry, we did what we could but y'know ¯\_(ツ)_/¯

YouTube says it took down "tens of thousands of videos and terminated hundreds of accounts created to promote or glorify the shooter". The company, owned by Google's parent company, went so far as to automatically reject footage of the attacks and "temporarily suspend[ed] the ability to sort or filter searches by upload date".

This game of whack-a-mole is what passes for best efforts in this regard, and it's a no-win situation unless the video is tagged (or "hashed", as the cool kids call it) in some systematic way that enables fast removal.
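To make the idea concrete, here is a minimal sketch of hash-based content matching, the mechanism behind "hashing" uploads against a shared blocklist of known objectionable files. This is an illustration only, not any platform's actual API: real systems use perceptual fingerprints (PhotoDNA-style) that survive re-encoding, whereas the plain SHA-256 used here only catches byte-for-byte copies, which is exactly why trivially modified re-uploads slip through.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes, blocklist: set) -> bool:
    """Reject an upload whose fingerprint is on the blocklist."""
    return fingerprint(upload) in blocklist

# Known-bad content is hashed once and the hashes shared between platforms.
blocklist = {fingerprint(b"known objectionable video bytes")}

print(should_block(b"known objectionable video bytes", blocklist))   # exact copy: True
print(should_block(b"known objectionable video bytes!", blocklist))  # one byte changed: False
```

The second call shows the weakness: change a single byte (re-encode, crop, add a watermark) and an exact hash no longer matches, so the whack-a-mole continues unless a robust perceptual hash is used instead.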

Because of course it's the initial uploading and live broadcasting of such material that is the problem here, and given the number of live streaming services out there, why do we keep banging on about Facebook?

Well, two reasons. Firstly, Facebook has a massive audience and so the potential damage is huge. I could start a live video session on LinkedIn and nobody would notice or care, but on Facebook you'll get traffic. This makes it vital that Facebook manage this carefully.

And secondly, we keep going on about Facebook because this is not the first time Facebook Live has been used to share atrocities. It has happened time and again. Violent assaults, sexual assaults, murders, suicides, terrorist attacks. Each time Facebook says it will do better and each time it fails.

This isn't about free speech. This is about putting shareholder money ahead of the law, and Facebook and its ilk need to understand that we will draft laws that put the onus on them to fix this. One way or another this has to end.

This isn't about free speech. It's about decency.

CNN - The internet is radicalising white men. Big tech could be doing more

NZ Herald - Front up, Facebook CEO Mark Zuckerberg

The Guardian - Mark Zuckerberg, four days on, your silence on Christchurch is deafening

Google - Facebook Live murders (I'm not going to link to all the stories - see for yourself)

BBC - Facebook: New Zealand attack video viewed 4,000 times


Anti-Social Media

There is one way to expedite a take-down notice on social media and, unbelievably, it's through copyright threats.

Tell the social media giants that someone is threatening to kill you and they'll turn the other way. Tell them someone is inciting racial hatred and they'll point to their terms and conditions and say move along. Tell them someone is breaching your copyright and BAM! They will act within seconds, often before the post has gone up online.

Facebook is notorious for this. Try posting a video of your kids doing karaoke and it'll be blocked immediately. Post them shooting someone and well… it's difficult.

One man who discovered this and has been using it to his advantage is Lenny Pozner, the father of Noah Pozner who, along with 25 others, was slaughtered at Sandy Hook in 2012.

Pozner discovered the conspiracy theorists who claimed the attack was fake, that he was some kind of paid actor, that his son wasn't real and wasn't really killed.

I can only imagine how distressing it would be to lose a child like this, but to then have random strangers tell you (online and in person) that you're a liar and a fraud would be devastating.

He tried repeatedly to deal with the publishers of these lies, and he tried to get the social media platforms to take down the content whenever it popped up. But apparently it's OK to tell everyone a grieving father is part of a vast government conspiracy out to take away people's rights, it's OK to publish his home address and social security number, and it's even OK to encourage the incensed conspiracy theorists to visit him and explain why he's trash.

It was the publication of photographs of Noah and of his family that finally caught the social media platforms' attention. Pozner issued copyright infringement notices every time someone posted his son's photo. After three offences the accounts would be closed.

Can you imagine having to do this every day? You can't simply call up Google and say "hey, can you guys do this for me?" because no, they don't have the resources. You have to wade into the filth every day looking for photographs of your dead child so you can report them and have some semblance of balance in your life.

I've spent a week researching free speech and far right issues and I feel ill. I cannot begin to understand how Lenny Pozner is feeling after all this time, but he's still doing it because nobody else will.

Something needs to change in the way social media platforms handle take-down requests, how they moderate their own content and how they manage live broadcasts.

I was very pleased to see the telcos move immediately to block access to the video content wherever they could but I do not want the telcos to become arbiters of what I can and cannot see on the internet. I'm no fan of filtering - I'd much rather we teach our children the tools they need to manage themselves online, and the same is true for adults.

But at a time when the social media players aren't picking up the phone (presumably it's muffled under the pile of cash they made during the shooting spree), I was happy to have the telcos step up.

They went one step further, telling the world that they would not advertise on social media platforms and calling on Facebook, Twitter and Google to meet with government officials in New Zealand to discuss how they're going to manage things from here on out.

A number of other corporates have joined in, removing their advertising from social media. I've seen posts to that effect from ANZ, Ford New Zealand and many others and this is a good thing to see because these companies only respond to the sound of a wallet being opened or closed. Take away their money tree and they'll find a way to do what you want.

One company that has escaped any kind of acrimony is Cloudflare and that's rather odd.

Cloudflare is a content distribution platform that offers security services to websites. It doesn't host content itself but rather provides the services and support needed to keep sites online.

It claims to have more traffic than Apple, Facebook, Twitter, Amazon, Bing, Wikipedia and Instagram combined. It accounts for around 10% of all internet traffic and serves content right around the globe. It has points of presence in New Zealand and among its clients it includes many New Zealand companies. It also hosts several terrorist groups, a large number of malware hosting sites, plenty of torrents for the movie pirates and of course our terrorist's video footage.

Indeed, Cloudflare's own security protection suite will have helped keep the footage online while other agencies were working hard to take it down.

Cloudflare should be added to the list of social media platforms that operate a hands-off, "no care, no responsibility" model and which claim to be beyond the reach of the law. "Passively incompetent" is how one columnist described this approach but I suspect it's far more sinister than that.

TechBlog - Tech giants fail to stop spread of Christchurch video

TechBlog - Telcos write open letter to Social Media giants

This American Life - Beware the Jabberwock

New York Times - This Company Keeps Lies About Sandy Hook on the Web

Media Matters - Instagram is the new home for Alex Jones and Infowars

NZ Herald - Massive Kiwi funds with $90b consider dumping Facebook, Google, Twitter shares if firms broke law by hosting shooting video

New York Times - We're Asking the Wrong Questions of YouTube and Facebook After New Zealand

New York Times - The Attack That Broke the Net's Safety Net

Huffington Post - U.S. Tech Giant Cloudflare Provides Cybersecurity For At Least 7 Terror Groups

Techspot - Web giant Cloudflare reportedly providing service to seven terrorist organisations

The Register - Cloudflare speaks out amid allegations it safeguards banned terror gangs' websites


Where to from here?

The next steps are mostly in the hands of the government.

New gun laws are the most obvious and are well underway, but we'll also need a full inquiry (preferably a Royal Commission of Inquiry) into the security services and police, and how they failed to see this coming. Yes, they're under-funded and yes, they have to prioritise, but given the rise of ultra-right-wing fascism around the world and the propensity of alienated white men to go online and talk about how they're going to kill people before going out and actually killing people, surely some attention could have been paid to that end of the spectrum as well as to Keith Locke and Greenpeace?

What I don't want to see is this atrocity being used as an excuse to shave away yet more civil liberties. We have laws that already go too far in my opinion and which tread dangerously close to the line, if not already well over it. We need to use the tools we've got and be smarter about it. Anyone seeking a gun licence should have their social media profiles reviewed - it's what I would do before hiring someone, so why wouldn't you do it before giving them a gun?

We're going to need to review the laws relating to social media and how we can hold platforms accountable for the content they promote. Don't forget, they're making money off all of this - that means they're accountable in my book.

We've had court suppression orders flouted, we've seen privacy laws disregarded and now we've seen obscene material shared far and wide. If these companies want to continue to operate in New Zealand they need to be aware of, and abide by, New Zealand laws.

The tech giants will have to get better at enabling users to find and report content. It's not enough to say it doesn't breach the community terms and conditions - you need to do better. Facebook says nobody flagged the Facebook Live feed as it was going out - is that because it's too difficult to do? Is that a bug or is it a feature?

I'd start by revisiting the decision not to give the Privacy Commissioner some teeth and go from there. Perhaps the government could put together a group of people who will come up with solutions instead of hearing only from people saying "it's too difficult" and "we don't have the resources available". Social media companies are among the richest in the world - they can spend the time, effort and money needed to get their moderation and management services right.

And then we can have a conversation about making New Zealand into the place we thought it was for all New Zealanders.

Newsroom - How to make big tech do the right thing

NZ Herald - Analysis: Christchurch massacre - what did we miss and who missed it?

Newsroom - NZ's spies were watching, but maybe not close enough

RNZ - Far-right extremists were on the Government's radar before mosque attack

RNZ - Terror attack aftermath: 'Free speech' is great, but has its limits


Intolerant of Intolerance

Finally, one last note. Thanks to Dave and Eric from Twitter for pointing me in the direction of Karl Popper and his Paradox of Tolerance. Karl Popper is one of the last century's great philosophers and he wrote widely about the scientific method and how it could and should work.

He is also the father of the Paradox of Tolerance, which he described like this:

"Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them."

It is the great paradox of our age - as we preach tolerance, we cannot extend it to those who would destroy our way of life.

Indeed, Popper goes on to talk about the balance between free speech and regulation:

"I do not imply, for instance, that we should always suppress the utterance of intolerant philosophies; as long as we can counter them by rational argument and keep them in check by public opinion, suppression would certainly be unwise. But we should claim the right to suppress them if necessary even by force; for it may easily turn out that they are not prepared to meet us on the level of rational argument, but begin by denouncing all argument; they may forbid their followers to listen to rational argument, because it is deceptive, and teach them to answer arguments by the use of their fists or pistols."

I find this an excellent lens through which to view the events of the past week, not only because of the philosophy behind it and what it means for those of us trying to make sense of it all, but also because Karl Popper lived and worked in Christchurch during the late 1930s and early 1940s.

Wikipedia - Paradox of tolerance

CIO - A tribute to my friend Atta Elayyan

