Instagram introduces new tools meant to curb abuse and expand privacy

A woman takes a photograph with a digital SLR camera while standing against an illuminated wall bearing Instagram Inc.'s logo in this arranged photograph in London, UK, on Jan 5, 2016. PHOTO: BLOOMBERG

(NYTimes) - Instagram announced on Tuesday (Dec 6) that it would provide more tools for users to protect themselves and others from online abuse by allowing them to remove comments and remove unwanted followers.

The changes will take effect within a few weeks, Kevin Systrom, a founder and the chief executive of Instagram, said in a blog post.

The ability to remove comments builds on previous efforts: about three months ago, Instagram announced that it would allow users to filter abusive words out of their comments, and the platform has been experimenting with comment removal on high-profile accounts since this summer.

The crackdown on unwelcome comments reflects the large and persistent problem of abuse found in all corners of the web.

Across social media, comments are often a breeding ground for bullying and trolling: four in 10 internet users have experienced online harassment, with comment sections and social media sites among the most common venues, according to a 2014 study by the Pew Research Center.

"Comments are where the majority of conversation happens on Instagram," Systrom wrote. "While comments are largely positive, they're not always kind or welcome."

In an effort to encourage positivity, he said, Instagram will allow users to tap a heart icon to "like" individual comments.

People using private accounts will also soon be able to remove certain followers to ensure that their content stays private.

"In the past, once you approved a follower, there was no simple way to undo that decision without blocking them," Systrom said, adding: "The person will not be notified that you removed them as a follower."

Systrom also reiterated that Instagram had recently enabled tools allowing users to anonymously report people who may be at risk for suicide or self-harm.

One of the new tools, which gives users a way to seek help for someone showing signs of suicidal behavior, mirrors efforts Facebook made earlier this year. Systrom said that Instagram has teams working around the clock to monitor those reports.

The move by Instagram is the latest step in a slow but steady march by social media companies, including Facebook and Twitter, to curb online abuse and make life on their platforms safer - or at least freer of trolls - for users.

Facebook introduced its own suicide-prevention efforts this summer, and Twitter said in November that it would roll out new tools to protect users from hate speech and abuse.

And on Monday (Dec 5), a coalition of companies including Facebook, Google, Twitter and Microsoft said they had teamed up to stop the spread of terrorist propaganda around the web.
