The ABC’s Data Journalism Play

Now in its 70th year, the Australian Broadcasting Corporation is Australia’s national public broadcaster. Annual funding of around AUS$1bn delivers seven radio networks, 60 local radio stations, three digital television services, a new international television service and an online platform to carry this ever-expanding offering of digital and user-generated content. At last count there were in excess of 4,500 full-time equivalent staff, and nearly 70% of them make content.

We are a national broadcaster fiercely proud of our independence; although funded by government, we are kept at arm’s length by law. Our tradition is independent public service journalism. The ABC is regarded as the most trusted news organization in the country.

These are exciting times, and under managing director Mark Scott, a former newspaper executive, content makers at the ABC have been encouraged, as the corporate mantra puts it, to be ‘agile’.

Of course, that’s easier said than done.

But one recent initiative designed to encourage this has been a competitive staff pitch for funding to develop multi-platform projects.

This is how the ABC’s first ever data journalism project was conceived.

Sometime early in 2010 I wandered into the pitch session to face three senior ‘ideas’ people with my proposal.

I’d been chewing it over for some time, greedily lapping up the data journalism that the now legendary Guardian data journalism blog was offering, and that was just for starters.

My argument was that, no doubt, within five years the ABC would have its own data journalism unit. It was inevitable, I opined. But the question was how we were going to get there, and who was going to start.

For those readers unfamiliar with the ABC, think of a vast bureaucracy built up over 70 years. Its primary offering was always radio and television. With the advent of online in the last decade, this content offering unfurled into text, stills and a degree of interactivity previously unimagined. The web space was forcing the ABC to rethink how it cut the cake (money) and what kind of cake it was baking (content).

It is of course a work in progress.

But something else was happening with data journalism. Government 2.0 (which, as we discovered, is largely observed in the breach in Australia) was starting to offer new ways of telling stories that were hitherto buried in the zeros and dots.

All this I said to the folks during my pitch. I also said we needed to identify new skill sets and train journalists in new tools. We needed a project to hit play.

And they gave me the money.

On 24 November 2011, the ABC’s multi-platform project and ABC News Online went live with ‘Coal Seam Gas by the Numbers’.

Figure 13. Coal Seam Gas by the Numbers (ABC News Online)

It was five pages of interactive maps, data visualizations and text.

It wasn’t exclusively data journalism, but a hybrid of journalisms born of the mix of people on the team and of the story, which, to put it in context, is raging as one of the hottest issues in Australia.

The jewel was an interactive map showing coal seam gas wells and leases in Australia. Users could search by location and switch between modes to show leases or wells. By zooming in, users could see who the explorer was, the status of the well and its drill date. Another map showed the location of coal seam gas activity compared to the location of groundwater systems in Australia.

Figure 14. Interactive map of gas wells and leases in Australia (ABC News Online)

We had data visualizations that specifically addressed the issue of the waste salt and water that would be produced under different scenarios.

Another section of the project investigated the release of chemicals into a local river system.

Our team:

  • A web developer and designer

  • A lead journalist

  • A part-time researcher with expertise in data extraction, Excel spreadsheets and data cleaning

  • A part-time junior journalist

  • A consultant executive producer

  • An academic consultant with expertise in data mining, graphic visualization and advanced research skills

  • The services of a project manager and the administrative assistance of the ABC’s multi-platform unit

  • Importantly, we also had a reference group of journalists and others whom we consulted on an as-needed basis

Where did we get the data from?

The data for the interactive maps were scraped from shapefiles (a common kind of file for geospatial data) downloaded from government websites.
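For the curious, here is a minimal sketch of that kind of extraction, assuming Python and the pyshp library (not necessarily the tools we used); the gas_wells.shp file name and its attribute fields are hypothetical, since real government shapefiles each carry their own schema.

```python
# Minimal sketch: pull well attributes and coordinates out of a shapefile
# and write them to a CSV for cleaning. Assumes the pyshp library
# (pip install pyshp); file name and fields are hypothetical.
import csv
import shapefile  # pyshp

reader = shapefile.Reader("gas_wells.shp")

# reader.fields[0] is pyshp's internal deletion flag, so skip it.
field_names = [f[0] for f in reader.fields[1:]]

with open("gas_wells.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(field_names + ["longitude", "latitude"])
    for shape_rec in reader.shapeRecords():
        # Point geometry: each shape holds a single coordinate pair.
        lon, lat = shape_rec.shape.points[0]
        writer.writerow(list(shape_rec.record) + [lon, lat])
```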

Other data on salt and water were taken from a variety of reports.

The data on chemical releases were taken from environmental permits issued by the government.

What did we learn?

‘Coal Seam Gas by the Numbers’ was ambitious in content and scale. Uppermost in my mind was: what did we learn, and how might we do it differently next time?

The data journalism project brought a lot of people into the room who do not normally meet at the ABC. In lay terms — the hacks and the hackers. Many of us did not speak the same language or even appreciate what the other does. Data journalism is disruptive!

The practical things:

  • Co-location of the team is vital. Our developer and designer were off-site and came in for meetings. This is definitely not optimal! Place them in the same room as the journalists.

  • Our consultant EP was also on another level of the building. We needed to be much closer, just for the drop-by factor.

  • Choose a story that is solely data-driven.

The big picture: some ideas

Big media organizations need to engage in capacity building to meet the challenges of data journalism. My hunch is that there are a lot of geeks and hackers hiding in media technical departments desperate to get out. So we need ‘hacks and hackers’ workshops where the secret geeks, younger journalists, web developers and designers come out to play with more experienced journalists for skill sharing and mentoring. Task: download this data set and go for it!

Ipso facto, data journalism is interdisciplinary. Data journalism teams are made up of people who would not in the past have worked together. The digital space has blurred the boundaries.

We live in a fractured, distrustful body politic. The business model that formerly delivered professional independent journalism, imperfect as it is, is on the verge of collapse. We ought to ask ourselves, as many now are, what the world might look like without a viable fourth estate. The American journalist and intellectual Walter Lippmann remarked in the 1920s that “it is admitted that a sound public opinion cannot exist without access to news”. That statement is no less true now.

In the 21st century everyone’s hanging out in the blogosphere. It’s hard to tell the spinners, liars, dissemblers and vested interest groups from the professional journalists. Pretty much any site or source can be made to look credible, slick and honest. The trustworthy mastheads are dying in the ditch. And in this new space of junk journalism, hyperlinks can endlessly take the reader to other more useless but brilliant-looking sources that keep hyperlinking back into the digital hall of mirrors. The technical term for this is: bullshit baffles brains. In the digital space everyone’s a storyteller now, right? Wrong.

If professional journalism, and by that I mean those who embrace ethical, balanced, courageous truth-seeking storytelling, is to survive, then the craft must reassert itself in the digital space. Data journalism is just another tool by which we will navigate the digital space. It’s where we will map, flip, sort, filter, extract and see the story amidst all those 0s and 1s. In the future we’ll be working side by side with the hackers, the developers, the designers and the coders. It’s a transition that requires serious capacity building. We need news managers who “get” the digital/journalism connection to start investing in the build.

