
Fighting harmful comments online

Provided by Netsafe 18.11.19

Are you worried about the impact of harmful comments online?

Ever wondered what a company’s responsibilities are when it hosts comments that discriminate against someone because of their lived experience or identity? We did! We sat down with Netsafe’s CEO Martin Cocker to find out.

Netsafe is an independent, not-for-profit online safety organisation that receives about 60 harmful digital communications reports a week. Many of these cases involve discriminatory comments or threads on social media, blogs or news sites.

Netsafe often sees the negative effects of discriminatory threads when they are left unmoderated.

“What is published and commented on in the online world has very real impacts for people. Sometimes that can take the form of bullying, or it can be as simple as leaving people out, excluding them from things that they should be a part of,” Mr Cocker says.

“We certainly deal with people who are at risk or are experiencing increased mental distress because of discriminatory behaviour from others online.”

Comment moderation is key to keeping people safe, he says.

“If you’re hosting an online platform where you allow anybody to speak and people place harmful comments on your page, then you have a responsibility to act on that.”

Acting against discrimination

The Harmful Digital Communications Act outlines what communications are considered harmful. It sets out 10 communication principles defining what constitutes harmful communication, including comments that personally harass, cause harm or “denigrate” an individual by reason of their disability, colour, race, religion, gender or sexual orientation.

Although the Act doesn’t specifically say that discrimination based on mental distress constitutes harmful communication, Mr Cocker believes it is covered by the Act.

“The way that the Act is written captures such a broad range of things that discriminatory or prejudiced comments towards people who live with mental distress could certainly be a part of the mix.

“If you could reasonably foresee that the comments on your published content would be harmful and you failed to moderate them, you could be liable under the Act.”

Who can be an online content host?

Under the Act, anyone who moderates or controls posted content is an online content host. This includes large platforms such as Facebook, as well as individual companies that host comments on their own channels.

Netsafe regularly works with Facebook and other platforms to report and remove harmful content, a job that has become easier since the Christchurch attacks.

“There have been some announcements from Facebook and other companies to more clearly define certain content as objectionable, which has enabled us to report more harmful content and have it removed,” Mr Cocker says.

Netsafe also regularly works with media outlets to encourage comment moderation on their social media and online channels.

“As media companies have moved into allowing readers to comment online, they’ve struggled at times to create the experience that they’re looking for and to yield value out of those comments. We find that sometimes this can enable harmful comments to go unmoderated, and that’s something we’re working closely with them on.”

What can we do now?

Mr Cocker says the best way to tackle discriminatory comments online is to report them to their host.

“If your report is not taken seriously by the host, you can report it to us at Netsafe. We have a statutory responsibility under the Act to attempt to assist you and get a resolution.

“You can also contribute to the conversation online with information, which might help people to better understand your position and perhaps change the tone of the conversation. However, in a lot of places this can be a high-risk behaviour and can lead to you becoming the target of the attack.”

If your company runs an online account where anyone can leave comments, Mr Cocker suggests establishing some clear rules around what types of comments are and aren’t acceptable.  

“It’s also really important to enforce these rules, and make sure you’ve got the required resource to do that.

“There’s nothing worse than setting rules and then not enforcing them, because that then makes you liable for the impact of not doing so.”

Interested in working with the media to change discriminatory language?

Join our Media Watch page to stand up and speak out against harmful language.