Open data and all that

PlanningAlerts is dead, long-live PlanningAlerts

with 29 comments

Planning Alerts screengrab

One of the first and best examples of how data could make a difference to ordinary people’s lives was the inspirational PlanningAlerts, built by Richard Pope, Mikel Maron, Sam Smith, Duncan Parkes, Tom Hughes and Andy Armstrong.

In doing one simple thing – allowing ordinary people to subscribe to an email alert when there was a planning application near them, regardless of council boundaries – it showed that data mattered and, more than that, had the power to improve the interaction between government and the community.

It did so many revolutionary things and fought so many important battles that everyone in the open data world (and not just the UK) owes all those who built it a massive debt of gratitude. Richard Pope and Duncan Parkes in particular put in masses of hours writing scrapers, fighting the battle to open postcodes and providing a simple but powerful user experience.

However, over the past year it had become increasingly difficult to keep the site going, with many of the scrapers falling into disrepair (aka scraper rot). Add to that the demands of a day job, and the cost of running a server, and it’s a tribute to both Richard and Duncan that they kept PlanningAlerts going for as long as they did.

So when Richard reached out to OpenlyLocal and asked if we were interested in taking over PlanningAlerts we were both flattered and delighted. Flattered and delighted, but also a little nervous. Could we take this on in a sustainable manner, and do as good a job as they had done?

Well after going through the figures, and looking at how we might architect it, we decided we could – there were parts of the problem that were similar to what we were already doing with OpenlyLocal – but we’d need to make sustainability a core goal right from the get-go. That would mean a business plan, and also a way for the community to help out.

Both of those had been given thought by both us and Richard, and we’d come to pretty much identical ideas: using a freemium model to generate income, and ScraperWiki to allow the community to help with writing scrapers, especially for those councils that didn’t use one of the common systems. But we also knew that we’d need to accelerate this process using a bounty model, such as the one that’s been so successful for OpenCorporates.

Now all we needed was the finance to kick-start the whole thing, and we contacted Nesta to see if they were interested in providing seed funding by way of a grant. I’ve been quite critical of Nesta’s processes in the past, but to their credit they didn’t hold this against us, and more than that showed they were capable of, and eager to, work in a fast, lightweight & agile way.

We didn’t quite manage to get the funding or do the transition before Richard’s server rental ran out, but we did save all the existing data, and are now hard at work building PlanningAlerts into OpenlyLocal, and gratifyingly making good progress. The domain is also in the middle of being transferred, and this should be completed in the next day or so.

We expect to start displaying the original scraped planning applications over the next few weeks, and have already started work on scrapers for the main systems used by councils. We’ll post here, and on the OpenlyLocal and PlanningAlert twitter accounts as we progress.
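To make the scraper work concrete, here’s a minimal sketch of the kind of thing involved. Everything here is hypothetical – the sample markup, field names and class names are invented for illustration, and each real council site needs its own parsing logic:

```python
from dataclasses import dataclass
from html.parser import HTMLParser

@dataclass
class PlanningApplication:
    reference: str
    address: str

# Hypothetical markup: each application sits in a table row whose
# cells hold the application reference and the site address.
SAMPLE_PAGE = """
<table class="applications">
  <tr><td>11/0123/FUL</td><td>1 High Street, Exampletown</td></tr>
  <tr><td>11/0124/OUT</td><td>2 Station Road, Exampletown</td></tr>
</table>
"""

class ApplicationListParser(HTMLParser):
    """Collects the text of each <td>, pairing cells up per <tr>."""
    def __init__(self):
        super().__init__()
        self.applications = []
        self._row = []
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False
        elif tag == "tr" and len(self._row) >= 2:
            self.applications.append(
                PlanningApplication(self._row[0], self._row[1]))
            self._row = []

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

parser = ApplicationListParser()
parser.feed(SAMPLE_PAGE)
for app in parser.applications:
    print(app.reference, "-", app.address)
```

The real scrapers will of course have to cope with pagination, sessions and the quirks of each vendor’s system – this only shows the basic shape of turning a listings page into structured records.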

We’re also liaising with PlanningAlerts Australia, who were originally inspired by PlanningAlerts UK, but have since considerably raised the bar. In particular we’ll be aiming to share a common data structure with them, making it easy to build applications based on planning applications from either source.

And, finally, of course, all the data will be available as open data, using the same Open Database Licence as the rest of OpenlyLocal.

29 Responses


  1. This is brilliant news. Planning Alerts was the site that got me interested in the whole world of civic hacking and open data in the first place, and it’s great to see it finding a new home. Obviously I’m happy to help in any way I can, both as council officer on the inside and a civic hacker on the outside.

  2. Brilliant, brilliant news.

    This is *really* useful for the CycleStreets cycle campaign group toolkit:

    Martin, CycleStreets

    October 13, 2011 at 8:05 pm

  3. Well you’ve managed to push me into finally getting around to rewriting my scraper for Broxbourne on ScraperWiki and you can find it here:

    Let us know when you know what schema you want the scrapers to generate, and how they should be tagged to get picked up, and I’ll make sure it is updated.

    Tom Hughes

    October 17, 2011 at 7:53 am

    • Excellent! Should hopefully have schema done in the next couple of weeks, and will post details on this blog (and tweet too, of course)


      October 17, 2011 at 8:00 am

  4. I’m looking for an API to access local planning information for an app mobile (Windows Phone 7) for the UK.

    Are you thinking of exposing this information as an API or as an email service?

    Ollie Riches

    November 4, 2011 at 6:34 pm

    • Both email and API access. Expect to have basic API done in the next month or so. If you want preview access drop us a line.


      November 4, 2011 at 8:52 pm

  5. I’m a bit new to all of this, I have been searching for a way to get either my council’s planning alerts (email) or the new Planning Alert APIs that will become available to display on a WordPress-powered hyperlocal blog. Is there already a widget/scraper/plugin to do this – I’ve tried searching but I’m a bit bamboozled.

    Clare Hill

    November 8, 2011 at 10:03 pm

    • I don’t think there’s a way off the peg to do this, but once we’ve got the planning applications, we’ll definitely be exposing the data via an API, and then it will be easy to do a widget. Watch this space.


      November 8, 2011 at 10:25 pm

  6. […] you may have read from our previous post, OpenlyLocal is taking over to enable this valuable service not just to […]

  7. I’ve been undertaking my own personal revival and re-development of some parts of the planning alerts setup using a base python scraper and configuration scripts hosted on scraperwiki. At the moment I have set up an API and map based interface into 2 databases of all London and many Welsh authorities.

    It looks like all my effort are in vain – but I’m very happy to see the professionals move in.

    Having been through this process I can say that one of the weaknesses of many authority sites and the previous planning alerts setup concerns lack of definition and inconsistent terminology about dates – it can be unclear whether search and store dates represent the date of receipt of an application or when it was registered/validated on the local system. Sometimes these dates can be very far apart (presumably if an application has to be revised). To this end I ended up specifying 2 types of date (received_date and validated_date).

    Andrew Speakman

    November 22, 2011 at 2:13 pm
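Andrew’s two-date distinction can be baked into the record structure itself. A hedged sketch – the field names `received_date` and `validated_date` come from his comment, everything else is illustrative:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PlanningApplication:
    reference: str
    description: str
    # Date the application was received by the authority.
    received_date: Optional[date] = None
    # Date it was registered/validated on the local system -- this can
    # lag received_date considerably if the application is revised.
    validated_date: Optional[date] = None

app = PlanningApplication(
    reference="2011/1234",
    description="Single-storey rear extension",
    received_date=date(2011, 9, 2),
    validated_date=date(2011, 11, 14),
)
lag = (app.validated_date - app.received_date).days
print(lag)  # days between receipt and validation
```

Keeping both dates explicit (and optional, since many sites publish only one) avoids the ambiguity he describes, where a single “date” field means different things on different council systems.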

    • Andrew
      Discovered your impressive work a couple of weeks ago. We’re planning on using ScraperWiki for some of the less standard planning systems, and would love it if you were interested in helping out. I’ll drop you an email in the next couple of days to explain our plans.


      November 22, 2011 at 6:16 pm

  8. Andrew, your scraperwiki map looks amazing. I write for a London hyperlocal, and the area, King’s Cross, straddles two local authorities, Camden and Islington.

    For Camden, I can get email alerts for a specific ward, for Islington, no such facility. With your map, I put in a postcode in the geographic centre of KX, which is roughly KX station, a 1 month date range, and 0.75km and get a really nicely presented, manageable number of results.

    Is it technically possible to embed this scraper into a hyperlocal site with a running date query? (ie maybe it could refresh every month). Or even show the results as a list, instead of a map, that kept on updating as things came in, and just showed the 30 latest results or something like that?

    At the moment, the only way we keep abreast of planning applications is through geeking around with ad hoc checking or being signed up to some email alerts, and we try not to miss stuff like this:

    Clare Hill

    December 4, 2011 at 2:10 am

    • Hi Clare

      The API will return a GeoRSS feed of the latest x planning applications within a search zone.

      Here is a link to a feed which returns the latest 30 centred 0.75 km around N1 9AP:

      You can embed this type of feed in a Google map and I believe there are wordpress plug-ins which might do this for your site (but I’m not an expert).

      Note I’m developing this site on and off in my spare time, so this is not a guaranteed service and the details of the API might change as time goes by.


      December 4, 2011 at 4:29 pm
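For anyone wanting to consume a feed like the one Andrew describes, here’s a sketch of parsing GeoRSS with the Python standard library. The sample feed is hand-made in the GeoRSS-Simple flavour – the real feed’s exact structure may differ:

```python
import xml.etree.ElementTree as ET

# A hand-made sample item; a live feed would be fetched over HTTP
# and may use a different GeoRSS encoding.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <title>Planning applications near N1 9AP</title>
    <item>
      <title>2011/1234: Rear extension, 1 Example Road</title>
      <georss:point>51.5308 -0.1238</georss:point>
    </item>
  </channel>
</rss>"""

NS = {"georss": "http://www.georss.org/georss"}

root = ET.fromstring(SAMPLE_FEED)
items = []
for item in root.iter("item"):
    title = item.findtext("title")
    # georss:point is "lat lon" separated by a space.
    point = item.findtext("georss:point", namespaces=NS)
    lat, lon = (float(v) for v in point.split())
    items.append((title, lat, lon))

for title, lat, lon in items:
    print(f"{title} @ {lat}, {lon}")
```

Once parsed like this, the points can be dropped straight onto a map or re-published as a plain list for a hyperlocal site.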

      • Andrew, your scraper is brilliant. I’ve just adapted the Kings Cross Rss link provided to Clare above to generate a map centred in Trafford near the Man U ground (boo hiss) and embedded the map on our website

        I used to have a similar map based on planning alerts but had to abandon it when Trafford updated their database. If anything, this is a better resource as I haven’t required latitude and longitude. I hope it will stay live. I’m really impressed and I don’t mean it about Man U – I’m their Councillor so don’t tell anyone I’m a blue in football.

        Councillor Mike Cordingley

        January 27, 2012 at 2:41 pm

      • Mike, your comments are much appreciated, apart from the bit about Man U – but then living in London I should think you would expect me to be a red supporter.


        January 27, 2012 at 4:54 pm

      • doh… It was ever thus hahaha

        Councillor Mike Cordingley

        January 27, 2012 at 5:15 pm

      • Hi Andrew,
        I feel very guilty about this as I suspect I’ve put too much demand upon the server at scraperwiki and they’ve disabled your planning applications api. I assume it’s me since it went down just 24 hours after I started hosting the resulting google map. Apologies to Clare also as the api disablement will affect King’s Cross too.
        I hope it can be rectified.


        Councillor Mike Cordingley

        January 30, 2012 at 2:35 pm

      • Hi Mike, yes scraperwiki said that a Google bot related to the map on your site was generating continuous requests to my interface which crashed the site. I’m not sure why this should be the case as I looked at your iframe implementation and it looked quite standard. I’m trying to get more information from scraperwiki – in the meantime no service I’m afraid. Andrew

        Andrew Speakman

        January 31, 2012 at 7:45 pm

    • Clare
      We’re planning on adding javascript widgets that can be included in any website — all you’ll need to do is submit a postcode or lat long (and optionally a distance) and it should populate a list of the most recent ones. Hopefully should be available very early in the new year (aiming for January).


      December 4, 2011 at 4:50 pm
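The widget described above would presumably query an API endpoint with a postcode or a lat/long pair plus an optional distance. A hedged sketch of building such a query – the endpoint URL and parameter names are invented, and the real API may well differ:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names, for illustration only.
BASE_URL = "http://openlylocal.com/planning_applications.xml"

def alerts_query(postcode=None, lat=None, lng=None, distance_km=None):
    """Build a query URL from either a postcode or a lat/long pair."""
    if postcode:
        params = {"postcode": postcode}
    elif lat is not None and lng is not None:
        params = {"lat": lat, "lng": lng}
    else:
        raise ValueError("need a postcode or a lat/long pair")
    if distance_km is not None:
        params["distance"] = distance_km
    return BASE_URL + "?" + urlencode(params)

print(alerts_query(postcode="N1 9AP", distance_km=0.75))
```

A javascript widget on a hyperlocal site would then just fetch that URL and render the returned applications as a list or map.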

      • Thanks to both of you. I’ll give Andrew’s suggestion a go but also stay tuned for the widgets. I hope I can at least be of help to test it out/put it into practice as a non-programming mere mortal. Yes I do this just for love too. I’ll be sandboxing away on the site in December, seems like the perfect month for it!

        Clare Hill

        December 4, 2011 at 4:55 pm

  9. Hi folks. It’s been quite exciting to read this discussion and see how things are progressing.

    I worked with a developer last year at a ScraperWiki hacks & hackers day in Glasgow to create a planning app map. I’m a hack, by the way, not a hacker. I wish I could code. However, it worked very well considering we only had a day. The two-person team of myself and Robert McWilliam were overall winners in the #hhhglas contest, much to our surprise. All we wanted to do was build something useful in which your average punter could bash in their postcode and see a map containing the active plans in their area. Amazingly, most council sites don’t have this option.

    Then, within a few weeks a local environmental charity used our map and the data to create an RSS feed which in turn triggered a Twitter account, tweeting the details of every new application as it went in. People seemed to love it. And I found it very useful as a journalist.

    Unfortunately the council (Edinburgh) moved over to a new site, so our scraper no longer works.

    I’m very keen to resurrect all of this but I’m not sure where to start. There’s a good version of the idea in Australia, and we have a pretty useful, if slightly anonymous, version here in the UK.

    But neither of those are ideal. I’d like users to be able to choose from:
    – postcode search
    Or a drop down menu with:
    – constituency name
    – colour status, ie green for approved, yellow for pending and red for declined apps

    I could go on…

    But what I’m seeking is some guidance, someone to point me in the right direction. I see ScraperWiki can now build custom apps at a cost, which is dependent on the amount of work involved. Has anybody done this and can you provide links to the final outcome?


    edinburghmacleod at gmail dot com

  10. Hi all,
    I too am not a hacker/can’t code but do lots of analysis for my local community action group on planning applications. I work in IT and am aware of scrapers and wished I’d had the facility you describe over the last four years.

    There are two other areas of data that also come into the planning application process that impact local communities – waste & mineral sites and HGV operating licences. Has anyone tried scraping the VOSA site for O Licence data? I’ve done it manually and put the data set into a Google Docs spreadsheet that has a standard map gadget that enables me to embed the map. Please look at the VOSA data set as I reckon 7-10% of the addresses are out of date or inaccurate.

    Once you get the local authority planning applications plus the VOSA operating centres plus the waste sites you get a very interesting map. We found our local authority had a hidden strategy of dumping all the nasty stuff in two or three wards away from where all the nice and rich people live, and certainly not where the planning officers and local authority executives live.

    This is becoming so much more important now that local plans at the parish and ward level are required under the NPPF.



    April 20, 2012 at 12:43 pm

  11. Hi all,

    A follow-up thought is on the data set held by the Planning Inspectorate for planning applications that go to appeal or are subject to enforcement and the enforcement is appealed against. Again I’ve done manual scraping. Has anyone had a look at the PI data to see what’s possible? They are releasing so much stuff as PDF documents that it’s getting more difficult to screen scrape. You’d think they didn’t want the public to know what’s going on.



    April 20, 2012 at 12:51 pm

  12. […] very excited about. OpenlyLocal have been making good progress with developing the new PlanningAlerts. We've done some initial work to determine how this will work, interface-wise, to avoid lots of […]

  13. Hi all, I wonder if this idea is still alive please? I would like to get involved as I have been working on covering all London councils. Thanks


    February 19, 2015 at 12:04 pm

    • Hi Michael

      As far as I am aware the project under the auspices of OpenlyLocal and Scraperwiki is somewhat defunct.

      However I have my own personal project which is very slowly starting to revive this work.

      You can view it at where there is up-to-date coverage of around 25% of the 471 planning authorities in the UK.

      Andrew Speakman

      February 20, 2015 at 10:08 am
