Harmful Digital Communications: anti-free speech or anti-troll?
On 2 July 2015 the New Zealand Parliament passed the Harmful Digital Communications Act with an overwhelming majority. The Act implements some of the recommendations of the Law Commission's 2012 report "Harmful Digital Communications: The adequacy of the current sanctions and remedies", which described the growing problem of cyberbullying and outlined options for reform. It modernises some existing legislation and introduces a set of communication principles for material posted online, a civil enforcement regime and two new criminal offences.
The Act isn't relevant only to internet trolls and their victims; it also affects "online content hosts". The definition of "online content host" (…the person who has control over the part of the electronic retrieval system, such as a website or an online application, on which the communication is posted and accessible by the user) is very broad and arguably covers any person or entity that has control over a forum where an allegedly harmful communication is posted (e.g. an organisation with a Facebook page where third parties can post comments).
One of the issues that the law has been grappling with for some time is the extent to which operators of websites, forums and other content hosts will be liable for user-generated content. The Act attempts to bring some clarity by providing a statutory safe harbour for online content hosts (an approach which has been taken in a number of other jurisdictions). However, the requirements of the safe harbour are prescriptive, and there is considerable uncertainty about how a number of the Act's provisions (including the safe harbour) will be interpreted and will work in practice.
For the safe harbour to apply, hosts must have an easily accessible reporting mechanism that allows users to complain about harmful material. After receiving notice of a complaint, hosts are required to contact the author of the allegedly harmful communication within 48 hours and ask for it to be removed. The author then has a further 48 hours to respond. If the author provides a valid counter-notice, the host must inform the complainant that the communication will not be taken down. If no counter-notice is provided, the host must take down the allegedly harmful material.
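For content hosts building a complaints workflow, the notice/counter-notice timeline above can be sketched as a simple decision function. This is a hypothetical simplification for illustration only (the function, names and outcome strings are assumptions, and nothing here is legal advice):

```python
from datetime import datetime, timedelta

# Windows described in the Act's safe-harbour process (simplified sketch).
AUTHOR_CONTACT_WINDOW = timedelta(hours=48)   # host must contact the author
COUNTER_NOTICE_WINDOW = timedelta(hours=48)   # author may respond

def host_action(complaint_at, now, author_contacted_at=None,
                counter_notice_received=False):
    """Return the next step a host would take under this simplified model."""
    if author_contacted_at is None:
        if now - complaint_at <= AUTHOR_CONTACT_WINDOW:
            return "contact author and request removal"
        return "contact window missed: safe harbour may be lost"
    if counter_notice_received:
        return "inform complainant content stays up"
    if now - author_contacted_at > COUNTER_NOTICE_WINDOW:
        return "take down the material"
    return "await author's response"
```

In practice a host's workflow would also need to record notices, handle invalid counter-notices and deal with unreachable authors, none of which this sketch attempts.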
In practice, many content hosts already have in place moderation and takedown policies and some will continue to prefer to take down objectionable content as soon as they become aware of it (regardless of whether a complaint has been made and without contacting the author). Online content hosts shouldn't necessarily face civil or criminal liability simply because they haven't taken advantage of the safe harbour.
It's also worth noting that the District Court has broad powers to make orders against online content hosts, including orders requiring that public access to a communication be disabled, that the identity of an anonymous communicator be revealed, that corrections to information be posted and that victims be given a right of reply. Failure to comply is an offence that carries a maximum penalty of a $5,000 fine or six months' imprisonment for individuals, and a $20,000 fine for corporates.
Amy Ryburn is senior associate in Buddle Findlay's ICT practice. She really loves her job. Currently, she's particularly interested in how commercial contracts can best be crafted to support and promote ICT project success (and avoid failure) and in the role of these contracts in actually managing ICT projects after the contracts are signed. When Amy's not at work, she can generally be found hanging out with her husband and trying to keep up with her three small children.