As you can imagine, we get a lot of website developers who make pretty, useful things that flash and jump and link. But the humble scraper can do the heavy lifting for the building blocks that ultimately make these centres of civic engagement most engaging. So our user of the week is one such developer – Philip John.
He’s a WordPress developer for Lichfield Live (please join our campaign to get WordPress to allow embeds of ScraperWiki views – send them emails!) and deeply engaged with the local community, which comes with the local-site territory. His experience working with public data has taught him a few things:
- Public data mostly comes with little context. It’s hard to understand because you don’t know how it was collected, for what purpose, or by whom – and all of that affects how far you can trust it.
- Once you start with one piece of data you want to mash it up with others, and that can be one heck of a headache: you run into all sorts of ID-matching issues, especially at the local level with things like Super Output Areas, postcodes, constituencies and authority boundaries.
- There are lots and lots of tools, and lots and lots of people, to help you.
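To make that ID-matching headache concrete, here’s a minimal sketch in Python. The datasets, postcodes and ward names are all invented for illustration; the point is simply that two sources keyed on the “same” identifier often format it differently, so a naive join finds nothing until you normalise the keys.

```python
def normalise_postcode(raw):
    """Strip spaces and uppercase so 'ws13 6qg' matches 'WS13 6QG'."""
    return raw.replace(" ", "").upper()

# Hypothetical dataset A: planning decisions keyed by postcode (one format)
planning = {"WS13 6QG": "Extension approved", "WS14 9DX": "New build refused"}

# Hypothetical dataset B: ward lookup keyed by postcode (another format)
wards = {"ws136qg": "Curborough", "ws149dx": "Boley Park"}

# A naive join fails: the raw keys never match, so every lookup is None
naive = {pc: wards.get(pc) for pc in planning}

# Normalising both sides first makes the join work
wards_norm = {normalise_postcode(pc): ward for pc, ward in wards.items()}
merged = {
    pc: (decision, wards_norm.get(normalise_postcode(pc)))
    for pc, decision in planning.items()
}
```

Real local data is messier still – Super Output Area codes, constituency names and authority boundaries each have their own quirks – but the same principle applies: agree on a canonical form of the ID before you mash anything together.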
So it’s no wonder he found ScraperWiki, and we’re glad he’s finding it useful. We hope he and other hyperlocal sites use us to build the tools their users need – and if one community wants local data liberated, others will too. So collaborate with your code (and pilfer)!
ScraperWiki makes gathering data that isn’t in a reusable format much easier. It makes learning how to code tons easier than anywhere else and not at all scary! It makes benefiting from the collective awesomeness of the open data community so much easier. – Philip John