Scraped data is often the backbone of an investigation, but some websites are more difficult to scrape than others. This session covers best practices for dealing with tricky sites, including coping with CAPTCHAs, using proxies and other scraping services, and the tradeoffs and costs of these approaches.
As data journalism has become mainstream, more data editor positions have been created. But what makes a good data editor? In this panel we discussed what it takes to do the job effectively, the different things it can involve, and the different routes to getting there. With Marie-Louise Timcke, Jan Strozyk, Helena Bengtsson, Eva Belmonte, and Dominik Balmer, moderated by me.
Guest lecture covering the origins of investigative data journalism, the nature of data in investigations, where it comes from, plus what code is and how it is used in the newsroom to do this kind of work.
This session explained concepts as well as covered tips, tricks, and traps to avoid when working with data. Together they can help you get more organised, better understand your data, ease the friction of collaborating with others, see new opportunities, and develop working practices that make it harder to be wrong. Slides here.
Guest lecture on the data processing pipelines that powered the Financial Times’ 2020 US election poll tracker and live results page.
Some of the most interesting datasets started life ‘unstructured’ – as documents, emails, web pages, images, videos, and other formats that look nothing like a spreadsheet. This session covered the challenges in extracting data from these formats, what tools are available, and approaches for verifying the results. Slides here.
For those taking their first steps with data and code, the command line is essential. Many useful applications are command-line based, so understanding it opens the door to these power tools. This session covered how it works, the basic commands and concepts, and some of the tools which can be useful in data investigations, including story examples. Slides here.
How is code being used in newsrooms to find stories? If you’re just starting out, where should you start, and how should you approach learning such skills? Panel with Helena Bengtsson and Niamh McIntyre, moderated by Leila Haddou.
An introduction to how code is used in the newsroom, with recent story examples, explaining the fundamental concepts and demystifying the jargon. We also guided attendees through the most common programming languages, and gave a roadmap for deciding which to pursue. Slides here.
This talk explained the ways automation is already being used in newsrooms, why the coming wave of automation is not a threat, and how we can embrace this new technology to improve the quality of investigative reporting at a time of shrinking newsroom resources. Slides here.
As data becomes increasingly important to journalism, reporters need to keep their skills up to date. However, newsrooms have less budget for training and conferences than ever before. This was a lightning talk on how Journocoders tries to solve these problems. Video here. Slides here.
Like our reality, our data is often messy. Finding meaningful connections between such datasets often means using fuzzy matching algorithms. This was a high-level look at some of the most commonly used algorithms, their pros and cons, and how they are used in practice. Slides here.
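The talk itself isn’t reproduced here, but as a minimal illustration of the idea, Python’s standard-library `difflib` exposes a simple similarity ratio that can match inconsistently spelled names (the company names below are invented for the example):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a similarity ratio in [0, 1] based on matching subsequences."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical company names recorded with inconsistent spellings
names = ["Acme Holdings Ltd", "ACME Holdings Limited", "Apex Industries"]
query = "Acme Holdings Limited"

# Pick the candidate most similar to the query
best = max(names, key=lambda name: similarity(query, name))
```

In practice you would also set a minimum similarity threshold, since the most similar candidate can still be a bad match.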
Though much data-led reporting is done in Excel, some can only be reported using other tools. This talk ran through a few stories which took different approaches. Write-up here. Slides here.
Communication difficulties are common between journalists and technologists. This was a talk with an investigative reporter on our experiences working together at the Guardian.
Panel discussing how news organisations have been challenged and transformed by the web, and how this has changed the way they interact with readers. Video here.
An introduction to the Guardian's Content API, which lets developers build their own applications using Guardian content. I also judged a prize for the best use of the API. Write-up here.