When I was in the hospital a few months ago—I’m fine, thanks—I asked the attending resident about a new drug that I’d heard treated my condition well. He didn’t know much about it, so he suggested we look it up—on Wikipedia.
I have no doubt that the doctor would have done more research before actually prescribing this drug. But the episode was a reminder that Wikipedia, which turns 15 years old on Jan. 15, is not just a resource for lazy students and casual readers. Everyone uses it and generally trusts it, which is why it’s the sixth most-popular site in the U.S. and seventh in the world. And that’s also why we should all be worried that it is fraying at the edges.
All 15-year-olds experience growing pains. For Wikipedia, this means the number of articles continues to grow rapidly, just recently surpassing 5 million on the English-language site alone, while the number of dedicated editors has been in decline since 2007.
This means a large proportion of articles contain some sort of warning that they are incomplete, poorly written or inadequately researched. Go to the article on NSYNC, for example (as if you hadn’t already today) and, as of this writing, you’ll be greeted at the door by a cautionary banner.
This warning has been sitting astride the article since November 2014, even though 288 revisions have been made since it was added.
A 2013 article in the MIT Technology Review laid out Wikipedia’s alleged crisis in stark terms:
The problem, most researchers and Wikipedia stewards seem to agree, is that the core community of Wikipedians is too hostile to newcomers, scaring them off with intractable guidelines and a general defensiveness. One detailed study from 2012 found that new editors often find that their first contributions to the site are quickly rejected by more experienced users, which directly correlates with a drop in the likelihood that they will continue to contribute to the site.
To get a handle on exactly how bad Wikipedia’s problems are, I did a little experiment: I downloaded the complete revision history for 25,000 randomly selected articles–a total of 2.3 million edits–and looked at how many had warnings about quality at a given time. At present, 12% of the articles in this sample have some documented problem, such as a lack of references or a “non-neutral point of view.”
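For the curious, the check itself can be as simple as scanning each sampled article’s wikitext for the maintenance templates that produce those banners. The Python sketch below is an illustration rather than the actual analysis code: the helper names and the short template list are assumptions, and it presumes the sampled revisions have already been downloaded and parsed into plain wikitext.

```python
import re

# A handful of common maintenance templates; the real family of cleanup
# templates on English Wikipedia is far larger than this illustrative list.
WARNING_TEMPLATES = re.compile(
    r"\{\{\s*(unreferenced|refimprove|citation needed|pov|npov|cleanup|orphan)",
    re.IGNORECASE,
)

def has_quality_warning(wikitext: str) -> bool:
    """True if a revision's wikitext carries a cleanup/maintenance template."""
    return bool(WARNING_TEMPLATES.search(wikitext))

def share_with_warnings(revision_text_by_article: dict) -> float:
    """Fraction of sampled articles whose text at a chosen point in time is flagged.

    `revision_text_by_article` maps article title -> wikitext of the revision
    that was current at that point in time.
    """
    flagged = sum(has_quality_warning(text) for text in revision_text_by_article.values())
    return flagged / len(revision_text_by_article)
```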
The good news is that, while still high, this figure has been declining since peaking around 2010, though that may be partly because these unsightly tags have fallen out of favor with editors, according to Aaron Halfaker, a senior research scientist at the Wikimedia Foundation, which oversees the site. “People don’t like to use those templates as much as they used to,” he said.
But the bigger issue is that it can be difficult to direct new editors to the pages that most need attention. If you’re looking for an answer to a common question, Halfaker says, “Wikipedia looks like it’s getting pretty close to done.” But as soon as you get sucked into a Wikipedia hole and start clicking through links, he says, you’ll quickly find articles that are in serious need of improvement.
While Wikipedia will always be a human-driven site, it may be bots and sophisticated artificial intelligence programs that save the day. Much of Halfaker’s work is currently directed at efforts to automatically score the quality of new edits so the remaining community of editors can more efficiently address them. At the same time, he emphasizes the need for new users to have positive interactions with other humans. To that end, Wikipedia has a page called the Teahouse where newcomers can interact directly with experienced editors and get answers to questions about how to contribute appropriately.
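On the scoring side, the article doesn’t name a specific tool, but one public example of this kind of machine scoring is Wikimedia’s ORES service. The sketch below is only an illustration under assumptions (the endpoint, the “damaging” model and the response layout may not match the current service) of how a triage tool might rank fresh revisions by their predicted probability of being damaging.

```python
import requests

# Assumed ORES v3 endpoint for English Wikipedia; verify before relying on it.
ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"

def damaging_probabilities(rev_ids):
    """Map each revision ID to its predicted probability of being damaging."""
    resp = requests.get(
        ORES_URL,
        params={"models": "damaging", "revids": "|".join(str(r) for r in rev_ids)},
        timeout=30,
    )
    resp.raise_for_status()
    scores = resp.json()["enwiki"]["scores"]  # assumed response layout
    return {
        int(rev): info["damaging"]["score"]["probability"]["true"]
        for rev, info in scores.items()
    }

if __name__ == "__main__":
    # Hypothetical revision IDs; a real triage tool would pull them from the
    # recent-changes feed and hand the highest-scoring edits to reviewers first.
    ranked = sorted(damaging_probabilities([641962088, 641962089]).items(),
                    key=lambda kv: kv[1], reverse=True)
    for rev, p in ranked:
        print(f"revision {rev}: P(damaging) = {p:.2f}")
```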
The survival of the site depends on a new kind of editor. Sydney Poore, who’s been active on the site since 2005, says editor attrition is a natural product of the changing needs of the site. She is involved in several efforts to diversify the pool of active editors, which skews heavily male and Western.
“It does worry me that we’ll lose people, but it may just be part of the process” of an evolving site, she said. Instead of a model of super-dedicated amateurs spending hours a day making the meat and potatoes of the site, she says, she sees groups of people organizing regular “edit-a-thons” around a particular subject. One of her focuses is on a project called Women in Red, a reference to the fact that a link to a person with no biography appears in red on the site. That group aspires to turn those links blue by filling out biographies for notable women who deserve pages on the site.
Poore suggests that groups of experts concerned with the quality of a particular topic organize monthly brown bags to get together and improve articles in their area of study. Meanwhile, Halfaker is working on automated ways of directing new, well-intentioned editors to pages in need of improvement, where they’re most likely to be able to make lasting contributions.
In its adolescence, this is what Wikipedia may look like under the hood: A collaboration of occasional editors and smart software that make the site friendly to edit—even if it means some of the people who built the site through hundreds of hours of volunteer work don’t see a place for themselves in the new order.
“Wikipedia is built by a rich community of people who care deeply about free knowledge,” Juliet Barbara, senior communications manager at the Wikimedia Foundation, said in an e-mail Thursday. “They are creative and innovative, and constantly coming up with ways to improve Wikipedia and ensure its sustainability. Trends change, but Wikipedia is not a set of KPIs [key performance indicators]. Wikipedia is not in trouble.”
Write to Chris Wilson at chris.wilson@time.com