Understanding how online communities function is a particularly vexing challenge. For example, is a wiki best understood as a volunteer club and nonprofit organization? Or does it more closely resemble a business that organizes the wisdom of crowds -- or tames the unruliness of a mob? It becomes even more complicated when you consider that wikis have porous boundaries, enabling people to become either formal members or casual lurkers.
People trying to grasp how collaborative spaces work often turn to the literature of the commons. It has a lot to say about the dynamics of collective action and sharing wealth, after all. But does the stewardship of land or water really resemble the stewardship of online information by quasi-strangers scattered around the world?

These are the sorts of conceptual challenges that tug at Benjamin Mako Hill, a computer science professor at the University of Washington and scholar of digital collaboration. Mako, as he’s usually called, brings special talents to this inquiry. He’s not just a scientist and scholar, but a committed free software activist and hacker who has made significant contributions to the Debian and Ubuntu projects, two versions of the GNU/Linux computer operating system. He advises the Wikipedia project and has received a prestigious US National Science Foundation grant to study the governance and lifecycles of knowledge commons.
I was curious to learn more about the social life and governance of digital commons, so I invited Hill to join me on my podcast Frontiers of Commoning (Episode #73). Mako told me how digital communities go through different developmental stages, which affects their strategic choices as commoners. He also described how fledgling commons must often struggle to gain public credibility and new contributors – but later, they must develop ways to protect against hostile forces seeking to monetize or control their collective wealth (code, curated information, webs of relationships).
When he first began to study digital commons, Hill wondered why some of them become really big and successful, such as Wikipedia, while the vast majority of them don’t. He soon realized that most collaborative websites start out with a very basic challenge: How do we attract people?
“If you look at the median number of contributors to an open source software project,” said Hill, “it's a single person. If you look at the median number of contributors to a wiki on a large wiki-hosting platform, it's very often no more than a small handful.” This situation naturally motivates collaborative sites to make themselves quite open to anyone who wants to contribute.
But as the number of participants on a site grows, perhaps acquiring a reputation in specialist circles or even mainstream culture, the needs of the community change. Its members need to protect their stock of knowledge from vandals and trolls, whose attacks tend to become more frequent as the community becomes larger and more prominent.
At this later stage of development, the goal of attracting newcomers often comes into conflict with the need to protect the shared wealth of the community from attacks. “Wikipedia is one of the top ten websites in the world,” said Hill, “so there are a lot of eyeballs on it and a lot of people who want to vandalize it….Epistemic legitimacy [of a collaborative website] can absolutely be appropriated.” So steps need to be taken to protect that commons.
He cited the cautionary example of the Croatian language edition of Wikipedia and its startling takeover. “For a period of about ten years, Wikipedia Croatia was completely captured by a cabal of far-right Croatian nationalists,” he said, which resulted in the site becoming a haven for Holocaust denialism, for example. “Croatia was the site of the third largest concentration camp in Europe during the Second World War, but you wouldn't have known that if you looked in Croatian Wikipedia between 2010 and 2020. I think that this represents a kind of threat that is actually common to a lot of peer production projects.”
But dealing with attacks or attempted takeovers can require serious tradeoffs in the governance of collaborative websites. A common response is to create new types of security and participation filters. This can secure the shared stock of information from vandalism and disruption, but it can also make it more difficult for people to contribute to the site. This, in turn, can lead to a decline in the quality of information collectively contributed.
One favored solution is to restrict participation to people who have registered accounts, thereby preventing anonymous contributions. Several Wikipedia editions have in fact done this, including the English language, Portuguese, and Farsi versions of the site. So have many alternative wikis such as Wookieepedia (https://starwars.fandom.com/wiki/Main_Page), a fan culture wiki dedicated to Star Wars trivia.
While account registration does in fact deter mischief and misinformation -- reducing low-quality and damaging contributions by 70%, said Hill -- it also tends to result in 20% to 40% decreases in the amount of good material contributed. There are difficult tradeoffs in moving from a regime open to all, including anonymous strangers, to a world of commoning that can protect shared wealth.
Perhaps the more significant challenge to collaborative websites is Big Tech’s use of its market power and ingenious platforms to capture and monetize social and collaborative communities. This is the story of Facebook and Twitter, along with many other social media platforms. The general idea is to attract users with free social exchange, then ramp up data surveillance, advertising, and data-driven algorithms that skew reader feeds as a way to boost online participation. This process of monetizing users’ attention and personal information leads to some dangerously skewed and often-politicized content. In effect, once-autonomous, self-organized social communities are enclosed and engineered to serve corporate profit-making.
A key event in the corporate enclosure of social media came in 2008, when Apple and other large tech firms began to “learn from the things that folks in the commons were doing,” as Hill put it.

Initially, in the early 2000s, commoners had figured out that the only way to scale collaboration to massive numbers of participants was to make content really open, put it out under a free license, and manage it all as a commons. “That, in fact, was the only way that we knew how to do it,” said Hill. Before corporate platforms arrived, that’s how open source software, blogs, wikis, and other collaborative innovations sustained themselves as commons.
When the iPhone launched in 2007, Hill noted, there was no App Store. The iPhone was sold with only the apps that Apple had installed on it. But many iPhone owners were unhappy with this arrangement. An estimated 25% of all iPhones sold were “jailbroken,” or hacked, to enable users to fix bugs and install homegrown, non-Apple apps.
After first fighting this development, Apple realized that this was a battle it couldn’t win. So it created the App Store, allowing anyone to make apps for the iPhone. But Apple insisted upon becoming the official gatekeeper of the apps, with its own terms of what could be sold. It also charged a 30% fee to developers to retail their apps via the App Store.
This became a rough model for other tech companies to exploit and refine: use one’s capital, market power, and tech to enclose a social community….monetize people's relationships and personal information through data surveillance and targeted marketing….and slowly privatize control over the community by converting it into a hybrid market of individuals. Cory Doctorow has incisively called this process (which has other dimensions) “enshittification.”
Despite such forces, Hill believes that digital commoners can secure control over their shared wealth and innovation ecosystems, even with the potent threats posed by new AI systems. You can listen to my full interview with Mako Hill here.