Next Wednesday, March 20, a fascinating new stage in transnational cooperation will arrive when scores of commoners in twenty countries take part in a Spanish P2P Wikisprint, a coordinated effort to document and map the myriad peer to peer initiatives that exist in Latin America and Spain.
The effort, hosted by the P2P Foundation, was originally going to be held in Spain only, but word got around in the Hispanic world, and presto, an inter-continental P2P collaboration was declared! (A Spanish-language version of the event can be found here.)
As described by Bernardo Gutiérrez on the P2P Foundation blog, the Wikisprint will bring together an indigenous collective in Chiapas with a co-working space in Quito; a crowdfunding platform in Barcelona with the open data movement in Montevideo; a hacktivist group in Madrid with permaculturists in Rio de Janeiro’s favelas; and a community of free software developers in Buenos Aires with Lima-based city planners, among many others.
The Wikisprint will map the Spanish-speaking world’s experiences with the commons, open innovation, co-creation, transparency, co-design, 3D printing, free licenses and p2politics, among other things. It will also feature debates, lectures, screenings, speeches, self-media coverage, workshops, network visualizations and videos.
Here is a list of the 20 participating cities. Anyone can add a new node from a new city. If you’d like to participate in the Wikisprint, check out this document on the P2P Foundation wiki to see the criteria for inclusion. There is a special website created for the occasion, Wikisprint.p2pf.org, and a Twitter hashtag, #P2PWikisprint.
The entire event will be peer-to-peer, meaning communication will take place through an open network topology in which each node is connected to the others without passing through any center. As Gutiérrez notes, “P2P – with its openness, decentralization and collective empowerment – is no longer something marginal. P2P is a philosophy, working trend and a solid reality. P2P is the nervous system of the new world.”
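The structural point here is easy to make concrete. The toy sketch below (in Python, with invented node names standing in for actual Wikisprint participants) contrasts a hub-and-spoke network, which goes dark if its center fails, with a peer-to-peer mesh, which has no single point of failure:

```python
from itertools import combinations

def is_connected(nodes, edges):
    """Breadth-first check that every node can reach every other."""
    nodes = set(nodes)
    if not nodes:
        return True
    seen = {next(iter(nodes))}
    frontier = list(seen)
    while frontier:
        n = frontier.pop()
        for a, b in edges:
            if a == n and b not in seen:
                seen.add(b)
                frontier.append(b)
            elif b == n and a not in seen:
                seen.add(a)
                frontier.append(a)
    return seen == nodes

def has_central_point_of_failure(nodes, edges):
    """True if removing some single node disconnects the survivors."""
    for n in nodes:
        rest = set(nodes) - {n}
        kept = [(a, b) for a, b in edges if n not in (a, b)]
        if not is_connected(rest, kept):
            return True
    return False

# Hub-and-spoke: all communication routes through "hub".
star_nodes = ["hub", "madrid", "quito", "lima"]
star_edges = [("hub", "madrid"), ("hub", "quito"), ("hub", "lima")]

# Peer-to-peer mesh: every node talks to every other directly.
mesh_nodes = ["madrid", "quito", "lima", "montevideo"]
mesh_edges = list(combinations(mesh_nodes, 2))

print(has_central_point_of_failure(star_nodes, star_edges))  # True
print(has_central_point_of_failure(mesh_nodes, mesh_edges))  # False
```

Remove the hub and the star network falls apart; remove any single node from the mesh and the rest keep talking, which is the resilience Gutiérrez is pointing to.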
For years I have been the rapporteur for the Aspen Institute’s Information Technology Roundtable conference, which every year brings together about 25 technologists, venture capitalists, policy wonks, management gurus, and others to discuss topics of pressing concern. The most recent topic was the “power curve” distributions that tend to result on open network platforms.
This is extensively discussed in my just-released report on the conference, Power-Curve Society: The Future of Innovation, Opportunity and Social Equity in the Emerging Networked Economy. The report notes how a globally networked economy allows greater ease of transactions but also requires fewer workers at lower pay, which tends to aggravate wealth and income inequality. As I write in the introduction to the report:
Although the new technologies are clearly driving economic growth and higher productivity, the distribution of these benefits is skewed in worrisome ways. Wealth and income distribution no longer resemble a familiar “bell curve” in which the bulk of the wealth accrues to a large middle class. Instead, the networked economy seems to be producing a “power-curve” distribution, sometimes known as a “winner-take-all” economy. A relative few players tend to excel and reap disproportionate benefits while the great mass of the population scrambles for lower-paid, lower-skilled jobs, if they can be found at all. Economic and social insecurity is widespread.
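A toy simulation makes the contrast vivid. The parameters below are invented purely for illustration (they are not drawn from the report’s data): one population’s incomes follow a bell curve clustered around a middle-class mean, the other a Pareto “power curve,” and we compare the share of total income captured by the top 1% of each.

```python
import random

random.seed(42)
N = 100_000

# "Bell curve" economy: incomes cluster around a middle-class mean.
bell = [max(0.0, random.gauss(50_000, 15_000)) for _ in range(N)]

# "Power curve" economy: Pareto-distributed, so a few draws are
# enormous. The scale and alpha here are arbitrary illustrations.
power = [20_000 * random.paretovariate(1.2) for _ in range(N)]

def top_share(incomes, fraction=0.01):
    """Share of total income held by the top `fraction` of earners."""
    ranked = sorted(incomes, reverse=True)
    k = int(len(ranked) * fraction)
    return sum(ranked[:k]) / sum(ranked)

print(f"top 1% share, bell curve:  {top_share(bell):.1%}")
print(f"top 1% share, power curve: {top_share(power):.1%}")
```

In the bell-curve population the top 1% hold only slightly more than 1% of total income; in the Pareto population they hold a large multiple of that, which is the “winner-take-all” pattern the report describes.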
The report also looks at Big Data and the coming personal data revolution beneath it that seeks to put individuals, and not companies or governments, at the forefront. Companies in the power-curve economy rely heavily on big databases of personal information to improve their marketing, product design, and corporate strategies. The unanswered question is whether the multiplying reservoirs of personal data will be used to benefit individuals as consumers and citizens, or whether large Internet companies will control and monetize Big Data for their private gain.
Aaron Swartz’s death is a sobering story about the collision of free culture activism with vindictive prosecutorial powers. It’s also about an amazing tech wizard and the personal costs of his idealism. Here’s hoping that Swartz’s tragic suicide at age 26 prompts some serious reflection about the grotesque penalties for a victimless computer crime and the unchecked power of federal prosecutors to intimidate defendants. Perhaps MIT, too, should reflect deeply on its core mission as an academic institution – to help share more knowledge, not fence it off.
Swartz was a hacker-wunderkind, a boy genius who played a significant role in many tech innovations affecting the Internet: RDF tags for Creative Commons licenses; a version of RSS software for syndicating web content; an early version of the platform that became Reddit, the user-driven news website. In 2006, when I interviewed Swartz for my book Viral Spiral, I was astonished to encounter a 19-year-old kid who had already done the path-breaking technical work that I just mentioned.
Swartz had been a junior high school student when he was doing mind-bending coding and design work for the Creative Commons licenses and their technical protocols. “I remember these moments when I was, like, sitting in the locker room, typing on my laptop, in these debates, and having to close it because the bell rang and I had to get back to class….”
When a windfall of cash came Swartz’s way following the sale of Reddit to Conde Nast, Swartz did not launch a new startup to make still more money. He intensified his activism and coding on behalf of free culture. He sought out new projects that would make information on the Internet more accessible to everyone.
In 2006, he worked with Brewster Kahle of the Internet Archive to post complete bibliographic data for every book held by the Library of Congress – information for which the Library charged fees. A few years later, working with guerrilla public-information activist Carl Malamud, Swartz legally downloaded a large fraction of the court decisions hosted by PACER (Public Access to Court Electronic Records), the repository of US court decisions. Swartz’s idea was to reclaim documents that taxpayers had already paid for. Why should we have to pay 10 cents per page to access them? (Those documents can now be found at Malamud’s site, www.public.resource.org.)
As more and more computing moves off our PCs and into “the Cloud,” Internet users are gaining access to a wealth of new software-based services that can exploit vast computing capacity and memory storage. That’s wonderful. But what about our freedom to create and share things as we wish, free from corporate or government surveillance or over-reaching copyright enforcement? The real danger of the Cloud is its potential to limit how we may create and share what we want, on our terms.
There are already signs that large corporations like Google, Facebook, Twitter and all the rest will quietly warp the design architecture of the Internet to serve their business interests first. A terrific overview of the troubling issues raised by the Cloud can be found in the essay “The Cloud: Boundless Digital Potential or Enclosure 3.0,” by David Lametti, a law professor at McGill University, published in the Virginia Journal of Law & Technology. An earlier version is available at the SSRN website.
Lametti states his thesis simply: “I argue that the Cloud, unless monitored and possibly directed, has the potential to go beyond undermining copyright and the public domain – Enclosure 2.0 – and to go beyond weakening privacy. This round, which I call ‘Enclosure 3.0’, has the potential to disempower Internet users and conversely empower a very small group of gatekeepers. Put bluntly, it has the potential to relegate Internet users to the status of digital sheep.”
Josh Wallaert, writing at the Places Journal (at the Design Observer Group) – “the online journal of architecture, landscape and urbanism,” has a wonderful post about nominally public spaces on the Internet. The post, called “State of the Commons,” notes:
…Flickr has become a ghost town in recent years, conservatively managed by its corporate parent Yahoo, which has ceded ground to photo-sharing alternatives like Facebook (and its subsidiary Instagram), Google Plus (and Picasa and Panoramio), and Twitter services (TwitPic and Yfrog). An increasing share of the Internet’s visual resources are now locked away in private cabinets, untagged and unsearchable, shared with a public no wider than the photographer’s personal sphere. Google’s Picasa and Panoramio support creative commons licenses, but finding the settings is not easy. And Facebook, the most social place to share photos, is the least public. Hundreds of millions of people who have photographed culturally significant events, people, buildings and landscapes, and who would happily give their work to the commons if they were prompted, are locked into sites that don’t even provide the option. The Internet (and the mobile appverse) is becoming a chain of walled gardens that trap even the most civic-minded person behind the hedges, with no view of the outside world…
For better and worse, public-making in the early 21st-century has been consigned to private actors: to activists, urban interventionists, community organizations and — here’s the really strange thing — online corporations. The body politic has retreated to nominally public spaces controlled by Google, Facebook, Twitter and Tumblr, which now constitute a vital but imperfect substitute for the town square. Jonathan Massey and Brett Snyder draw an analogy between these online spaces and the privately-owned public space of Zuccotti Park, the nerve center for Occupy Wall Street, and indeed online tools have been used effectively to support direct actions and participatory democracies around the world. Still, the closest most Americans get to the messy social activity of cooperative farm planning is the exchange of digital carrots in Farmville.
For anyone scratching their head about how to understand the deeper social and economic dynamics of online networks, a terrific new report has been released by Michel Bauwens called Synthetic Overview of the Collaborative Economy. Michel, who directs the Foundation for Peer to Peer Alternatives and works with me at the Commons Strategies Group, is a leading thinker and curator of developments in the emerging P2P economy.
The report was prepared for Orange Labs, a division of the French telecom company, as a comprehensive survey and analysis of new forms of collaborative production on the Internet. The report is a massive 346 pages (downloadable as a pdf file under a Creative Commons BY-NC-SA license) and contains 543 footnotes. But it is entirely clear and accessible to non-techies. Unlike so many popular books on this subject, which are either larded with colorful hyperbole and overly long anecdotes or mired in arcane technical detail, the Bauwens report cuts to the chase, giving tightly focused analyses of the key principles of online cooperation. The report is meaty, informative, comprehensive and well-documented.
Two paragraphs from the Introduction give a nice overview:
Two main agents of transformation guide this work. One is the emergence of community dynamics as an essential ingredient of doing business. It is no longer a matter of autonomous and separated corporations marketing to essentially isolated consumers, it is now a matter of deeply inter-networked economic actors involved in vocal and productive communities. The second is that the combined effect of digital reproduction and the increasingly 'socialized' production of value, makes the individual and corporate privatization of 'intellectual' property if not untenable, then certainly more difficult, and in all likelihood, ultimately unproductive. Hence the combined development of community-oriented and 'open' business models, which rely on more 'social' forms of intellectual property.
In this work, we therefore look at community dynamics that are mobilized by traditional actors (open innovation, crowdsourcing), and new models where the community's value creation is at its core (the free software, shared design and open hardware models). We then look at monetization in the absence of private IP. Linked to these developments are the emergence of distributed physical infrastructures, where the evolution of the networked computer is mirrored in the development of networked production and even financing. Indeed the mutualization of knowledge goes hand in hand with the mutualization of physical infrastructures, such as collaborative consumption and peer to peer marketplaces, used to mobilize idle resources and assets more effectively.
Shortly after I posted this, the State of Minnesota changed its mind, as reported here. Nice to know that officialdom can change its mind in the face of the blazingly obvious.
In a sign of just how deeply rooted cultural prejudices against free culture truly are, the State of Minnesota has banned Coursera, the free online course website, from offering its courses to Minnesota citizens. As a Slate magazine headline (drawn from the Chronicle of Higher Education) put it: “Free Online Education Illegal in Minnesota.” Coursera is a website that partners with Stanford, Columbia, the University of Michigan and other top universities around the world to offer some of their courses online for free.
Why is this so objectionable to the state of Minnesota? Technically, the state wants to enforce its right to approve anyone who offers educational instruction within its borders. It is especially concerned with preventing fly-by-night schools from bilking people with worthless degrees.
But if the courses offered are for free, and if no degrees are being offered, what’s the problem? The state official in charge of enforcing the law told the Slate reporter that Minnesota residents could be wasting their time by taking the courses. So it's come to this: state regulators are worried about our frittering away our time on free courses like “Principles of Macroeconomics” and “Modern and Contemporary American Poetry.”
The tech world frequently talks about open source software as a collaborative endeavor, but it is less apt to use the word “commons,” let alone engage in rigorous empirical analysis for understanding how software commons actually work. The arrival of Internet Success: A Study of Open-Source Software Commons (MIT Press) is therefore a welcome event. This book is the first large-scale empirical study to look at the social, technical and institutional aspects of free, libre and open source software (often known as “FLOSS”). It uses extensive firsthand survey research, statistical analysis and commons frameworks for studying this under-theorized realm.
While most people may associate open source software with Linux, there are in fact tens of thousands of open source projects in existence. Many consist of no more than two or three participants, and may have only an irregular existence. However, many thousands of others attract a small but spirited team, and still others are huge, robust social ecosystems in their own right.
The authors of Internet Success, UMass Professor Charles M. Schweik and consultant Robert C. English, looked at the large universe of FLOSS projects hosted on SourceForge.net, a website that functions as a kind of clearinghouse for over 260,000 FLOSS projects and 2.7 million registered software developers (as of February 2011). The site provides most of the tools that developers need to find colleagues and build a new FLOSS program – a Web repository of code, bug-tracking utilities, online forums, email mailing lists, a wiki, file downloading services, etc.
While SourceForge is not the only such site for FLOSS projects, it is the largest and arguably representative of the universe of such projects. With support from the National Science Foundation, Schweik and English set out to study the pool of software development projects on SourceForge to try to determine why some succeed, why others fail and why others simply languish. They explain in exacting technical and social-scientific detail how they assembled and analyzed their datasets, which originate in a vast collection of SourceForge data on more than 130,000 projects as well as their own survey questionnaire of programmers.
The publishers of research journals don’t get much attention because their products are not very exciting. Mentions of Science or Nature do not exactly quicken the pulse. But that doesn’t mean that the publishers of academic journals aren’t as predatory and profiteering as any Fortune 500 bank or oil company.
It now appears that the major universities that generate so much of the world’s research (only to buy it back from publishers at huge mark-ups) could be getting ready to fight back. Harvard University is publicly urging its faculty members to avoid publishing in journals that require paid access, and to publish instead in open access journals. Open access literature can be defined as works that are digital, online, free of charge, and free of most copyright and licensing restrictions.
As the Guardian (UK) reports, the Harvard Faculty Advisory Council has sent a memo to 2,100 professors and researchers informing them that “major periodical subscriptions, especially to electronic journals published by historically key providers, cannot be sustained: continuing these subscriptions on their current footing is financially untenable. Doing so would seriously erode collection efforts in many other areas, already compromised.”