Unleashing Public Sector Information

Many public sector institutions are beginning to realize that they don’t really know how to make their information and cultural treasures available to the public online — but they should. Governments, schools, libraries, museums and cultural institutions face all sorts of barriers of technology, law and social habit. And there are few existing models to emulate. So what’s a conscientious public institution to do?

Some possible answers were explored recently at COMMUNIA, a two-day workshop in London on March 26 and 27. COMMUNIA is a project that brings together stakeholders interested in the public domain in the digital environment. It is coordinated by the NEXA Research Center for Internet and Society at the Politecnico di Torino, Italy. At this COMMUNIA gathering, held at the London School of Economics, there was a colorful array of thinkers, activists, entrepreneurs, government officials, museum curators, and many others.

It’s impossible to distill the twenty-plus presentations, so I will summarize some of the more captivating projects and ideas presented.

One of the biggest issues facing governments around the world is how to make their information more accessible and useable. Bureaucrats tend to prefer secrecy, the technical complexities can be forbidding, and the counterintuitive dynamics of the online world can be confusing. And yet there are compelling reasons to make a government’s huge reservoir of information available online. It can promote productive new collaborations; spur economic development and innovations; and encourage citizen participation and government accountability.

Public sector information is one of the key “utilities” of the knowledge economy, Rufus Pollock of the University of Cambridge (and a co-organizer of the event) pointed out. But it’s not entirely clear, in a given circumstance, how governments should manage public sector information — or how it should be paid for (via taxes? user fees? public/private partnerships?).

Often, government doesn’t even appreciate the potential value or uses of the information that it manages. That’s a good argument for providing it to the public as cheaply and easily as possible, using open technical standards and formats. Then, entrepreneurs and others can “build on top of” the public-domain data, inventing new types of commercial and civic innovations.
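To make the idea concrete, here is a minimal sketch (with invented field names and data, not any real government feed) of how publishing records in a plain open format like CSV lets anyone compute answers the agency itself never thought to publish:

```python
import csv
import io

# Hypothetical sample of open-format public data: service requests
# published as plain CSV. All field names and values are invented.
RAW_CSV = """\
district,category,days_open
North,pothole,12
North,streetlight,3
South,pothole,40
South,pothole,8
"""

def average_days_open(csv_text, category):
    """Average resolution time, in days, for one request category."""
    rows = csv.DictReader(io.StringIO(csv_text))
    days = [int(r["days_open"]) for r in rows if r["category"] == category]
    return sum(days) / len(days) if days else None

# A civic developer can derive statistics the agency never published.
print(average_days_open(RAW_CSV, "pothole"))  # 20.0
```

Because the format is open and self-describing, no negotiation with the publishing agency is needed before building on top of the data.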

For countries like Australia and Great Britain that have a so-called “Crown Copyright” (meaning the government owns the works), governments must affirmatively allow works to be re-used by formally putting them into the public domain. The Creative Commons licenses are an excellent tool for this.

But governments should not just put their information up on websites, noted Brian Fitzgerald, a law professor at Queensland University of Technology in Australia. They should give citizens access to the raw data and information so that they can remix the materials to suit their own purposes — and in ways that might not occur to government bureaucrats.

When New South Wales made train timetable data available, some enterprising soul built an iPhone application around it, making the data immensely more useful. In Australia, there was a huge public outcry when people couldn’t access real-time information about raging fires in the nation’s bush country. Citizens wanted to mash up the data with Google Maps, so that everyone could quickly see where the fires were and where they were moving.
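The timetable case illustrates how little code it takes to turn raw released data into a useful tool. The sketch below uses an invented miniature timetable (real feeds, such as GTFS, are far richer) to show the kind of “next departure” lookup such an app performs:

```python
from datetime import time

# Invented miniature timetable of (station, departure time) pairs.
# A real open feed would supply thousands of these rows.
TIMETABLE = [
    ("Central", time(7, 15)),
    ("Central", time(7, 45)),
    ("Town Hall", time(7, 20)),
    ("Central", time(8, 5)),
]

def next_departure(station, now):
    """Earliest departure from `station` at or after `now`, or None."""
    upcoming = [t for s, t in TIMETABLE if s == station and t >= now]
    return min(upcoming, default=None)

print(next_departure("Central", time(7, 30)))  # 07:45:00
```

The government only had to release the raw rows; the query logic, the interface, and the phone delivery were all added by an outside developer.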

In an essay, “Government Data and the Invisible Hand,” David G. Robinson and three co-authors have this advice for governments:

“Rather than struggling, as it currently does, to design sites that meet each end-user need, we argue that the executive branch should focus on creating a simple, reliable and publicly accessible infrastructure that exposes the underlying data. Private actors, either nonprofit or commercial, are better suited to deliver government information to citizens and can constantly create and reshape the tools individuals use to find and leverage public data.”

Great Britain has a useful one-stop portal for accessing the various services that government provides — Directgov, which bills itself as “public services all in one place.” Brian Hoadley, who oversees product design at Directgov, said that he is currently working to develop applications that will let people do mashups of different sorts of datasets, and deliver results via mobile devices like cell phones. One ingenious inspiration was a mashup of bicycle accident data overlaid on a map, which drew on various data sources and was accessible in multiple formats.

Hoadley also envisions Directgov providing “open spaces for data collaborators,” particularly in the emerging “cloud computing” environment, in which huge amounts of data will be hosted centrally and be readily accessible via mobile devices. Government will not just be a “publisher” of information, but a hosting platform for an “open data space.” Under this framework, application developers will be able to share prototypes of new data applications, get comments rapidly, and then iterate and innovate quickly.

A new frontier for public sector information is “hyper-local” information. Under a new British law, some 400 local government authorities will soon have a legal obligation to publish more information about local decision-making. BeLocal.com — still in development — aims to provide “a simple portal to access a wide range of hyper-local information from planning alerts to traffic news, from health services to schools and from local newspapers to your local councillors and from county wide right down to parish level.” As newspapers face further economic woes, it will be interesting to see if the BeLocal model will rise to the challenge of providing better local news — or if it will be little more than a glorified community bulletin board.

Mathias Schindler of Wikimedia Germany described an ingenious collaboration between the German federal archive, the Bundesarchiv, and Wikipedia. The archive contracted with Wikimedia (the parent organization of Wikipedia) to put some 100,000 of its images on Wikipedia under Creative Commons Attribution-ShareAlike licenses. The images were deliberately scaled down to 800 pixels, a lower-quality version, so that the Bundesarchiv retains control of the original high-resolution images.

Since this partnership went into effect, the Bundesarchiv has seen a huge boost in its web traffic, mostly from Wikipedia. At the same time, government offices now have links on their webpages, which is sending new traffic to Wikipedia.

The Imperial War Museum in London hosts a vast collection of photos depicting armed conflicts from WWI to the present. The museum recently put many of its images onto the Flickr Commons site as a way to make them more accessible to the public. However, because some companies are funding the digitization of the photos, they are retaining commercial rights as part of the bargain. This serves the public interest in making old, non-digitized images more generally available — yet it also limits what future entrepreneurs may do with the images (because the commercial rights are privately held). One possible solution is to require that any third-party commercial licenses be time-limited, so that the curating institution won’t lose all rights forever.

The COMMUNIA workshop had the useful effect of showcasing a wide range of projects using government or nonprofit information. It also helped validate “public sector information,” or PSI, as a challenge that will need more focused attention in coming years. To help spur that process, participants made dozens of recommendations about what needs to be done.

As part of its attempt to educate legislators and other policymakers, the COMMUNIA participants also hammered out a short consensus statement: “Public sector content and data must be made available to all without delay and presented (legally and technologically) to encourage use and re-use.” A simple concept with devilishly complicated paths to implementation.

Update: Click here for an array of post-conference materials about the COMMUNIA event.