ScraperWiki-ing Down Under

Strewth! You never know what’s been going on over on ScraperWiki. If you’ve been too busy hacking away on your own projects, you probably haven’t noticed a major undertaking right here on our wiki. OpenAustralia have put their PlanningAlerts scrapers on the site, and we’d like to take this moment to say: “G’day”. PlanningAlerts started life as a port of an existing codebase that scrapes development application information from councils and emails users alerts about planning applications near them. It has since evolved significantly and is now a Ruby on Rails application that has sent almost one million development applications to people throughout Australia.

“We’ve started to use ScraperWiki because it makes collaboratively fixing scraper problems much simpler”
– Henare Degan (OpenAustralia)

OpenAustralia were finding that councils update their development application pages quite regularly. As a result, almost 30 percent of their scrapers broke. With visibility into which scrapers are broken, OpenAustralia volunteers can quickly dive in and make the changes required to fix them on ScraperWiki. This is where our email alerts come into play.

“Because of the email alerts, I’ve been able to quickly correct [broken scrapers] meaning more development applications are flowing to PlanningAlerts. The real power for me is that ScraperWiki handles the regular runs of scrapers and the email alerts are a killer feature”
– Henare Degan (OpenAustralia)

OpenAustralia are hoping to get more and more scrapers onto ScraperWiki so that other contributors come on board to maintain them and write new ones. While this hasn’t been realised yet, it would mean PlanningAlerts covering more and more of the country. This kind of collaboration is at the core of ScraperWiki, so we wish them all the best.

Australia Planning Applications – you’ve been ScraperWikied!
