NEW YORK (NYTIMES) - In Wikipedia's 18 years of existence, it has become a fixture in our lives: It ascends to the top of Google's search results and provides answers to the questions we ask Alexa and Siri.
For Wikipedia's editing community, the website is even more. It is a kind of social network where users debate the minutiae of history and modern life, climb the editorial hierarchy and even meet friends and romantic partners.
It is also a place where editors can experience relentless harassment. In 2016, Pax Ahimsa Gethen, a trans male Wikipedian, was persistently hit with personal attacks over several months. Gethen, 49, who uses the pronouns "they" and "them," said the anonymous harasser posted that they were "insufferable" and "unloved," that they belonged in an internment camp and that they should kill themself.
Gethen said the user also publicly posted their deadname, the name they used before transitioning.
"It was devastating to me because of the personal nature of it," Gethen said.
Unlike at social networks such as Facebook and Twitter, the people who handle reports of harassment on Wikipedia are largely unpaid volunteers.
In response to complaints about pervasive harassment, the Wikimedia Foundation, the San Francisco-based non-profit that operates Wikipedia and supports its community of volunteers, has promised new strategies to curb abuse.
In recent months, the foundation has rolled out a more sophisticated blocking tool that it hopes can better control the harassment plaguing some users.
Sydney Poore, a community health strategist with the foundation, said that when the free encyclopedia was established in 2001, it initially attracted lots of editors who were "tech-oriented" men. That led to a culture that was not always accepting of outside opinions, said Poore, who has edited Wikipedia for 13 years.
"We're making strong efforts to reverse that," she said, "but it doesn't happen overnight."
A BARRIER TO GENDER EQUITY
A few informed clicks on any Wikipedia article can reveal the lengthy discussions that shape a published narrative. According to interviews with Wikipedians around the world, those digital backrooms are where harassment often begins. A spirited debate over a detail in an article can spiral into one user spewing personal attacks against another.
"If you out yourself as a feminist or LGBT, you will tend to be more targeted," said Natacha Rault, a Wikipedia editor who lives in Geneva and founded a project that aims to reduce the gender gap on the website.
On French-language Wikipedia, where Rault does much of her editing, discussions about gender can often spark vitriol. Rault said there were six months of heated debate about whether to label the article on Britain's leader, Theresa May, with the feminine version of "prime minister" (première ministre), rather than the masculine one (premier ministre).
Another controversy has been simmering over the article "femme," the French word for woman, Rault said. At issue is whether the first paragraph should refer to gender in addition to biological sex and whether transgender women should be included in the definition of woman.
This debate devolved into an "edit war," a heated back-and-forth in which Wikipedians continuously edit an article to overwrite the other side's changes and reflect the language they want.
"Sometimes it can be so aggressive that you give up and run away from the article," she said.
The Wikimedia Foundation says it is deeply concerned that online abuse could drive cisgender women and transgender editors away from Wikipedia.
On its website, the foundation lists pervasive harassment as a barrier to gender equity. Sometimes, the harassment is explicitly sexual: According to anonymous interviews described by the foundation, users have had pornography posted on their personal Wikipedia userpage and emailed to them.
Camelia Boban, an editor on Italian-language Wikipedia, said another user once publicly used language to suggest she was a prostitute.
Studies on Wikipedia's contributor base from several years ago estimated that fewer than 20 per cent of editors were women. This research backed up an existing awareness in the Wikipedia community that female editors were seriously underrepresented, galvanising activists who set out to recruit more women to write and edit articles.
Groups like Art+Feminism were established to increase the representation of women and nonbinary individuals on Wikipedia.
Its organisers held sessions in which experienced editors taught aspiring ones the ways of Wikipedia, explaining how to navigate a website where editors sometimes appear to be communicating in code.
Wikipedians also began to discuss the "content gender gap," which includes an imbalance in the gender distribution of biographies on the site. The latest analysis, released this month, said about 18 per cent of 1.6 million biographies on the English-language Wikipedia were of women. That is up from about 15 per cent in 2014, partially because of activists trying to move the needle.
THE PERILS FOR LGBT EDITORS
Claudia Lo, a foundation researcher on a team that is building anti-harassment tools, said there was a pattern of harassment on Wikipedia stemming from debates over LGBT issues. When a celebrity comes out as transgender - notably, Chelsea Manning in 2013 and Caitlyn Jenner in 2015 - Wikipedians have extensively debated whether the individual's self-declared pronouns should be used.
Articles about transgender or nonbinary individuals are often targeted by vandals who change the subjects' pronouns back to those associated with their gender assigned at birth. But Wikipedia's guidelines make clear that editors should use the gender that the subject of the article most recently stated in a reliable source.
In countries where it is more dangerous for LGBT individuals to be open about their identities, harassment on Wikipedia can be particularly virulent.
Once, an administrator on one Wikipedia site blocked an editor simply because the editor's username suggested they could be gay, said Rachel Wexelbaum, a Wikipedian who works to improve LGBT content on the website. Eventually, she said, Wikimedia's Trust and Safety Team got involved, and the administrator was blocked for those actions.
IDENTIFYING AND PUNISHING HARASSERS
On English-language Wikipedia, one of roughly 300 languages with its own site, users are asked to report conflicts with one another on public noticeboards, where they are expected to post links to abuse so an administrator can decide whether to take action.
That method problematically forces complaints into the public sphere, said Lo, the foundation researcher.
"If you're being harassed," Lo said, "the last thing you want to do is tell your harasser that you're about to tell on them." In some cases, situations are dealt with by Wikimedia Foundation staff members or via private emails with volunteer administrators, she said.
If a volunteer administrator finds the allegations of abuse credible, the user can be barred from editing anywhere on the site. Administrators for some Wikipedias, such as the English-language site, can also declare a "topic ban," a socially enforced sanction in which other editors are responsible for making sure the sanctioned user stays out of articles that mention prohibited subjects. Violating a topic ban can result in a sitewide ban.
The new tool, which the foundation calls "partial blocks," allows administrators to restrict users from editing particular pages on which they have proved to be a problem. Those developing the tool hope it will be used more liberally to block editors from specific topics without entirely barring users who are productive in other areas of the site.
"The idea is to provide volunteer administrators with a more targeted, more nuanced ability to respond to conflicts," Lo said.