A new paper by Rob Kitchin, Thinking Critically About and Researching Algorithms, has been posted as open access on SSRN as The Programmable City Working Paper 5.
Abstract
The era of ubiquitous computing and big data is now firmly established, with more and more aspects of our everyday lives being mediated, augmented, produced and regulated by digital devices and networked systems powered by software. Software is fundamentally composed of algorithms — sets of defined steps structured to process instructions/data to produce an output. And yet, to date, there has been little critical reflection on algorithms, and little empirical research into their nature and work. This paper synthesises and extends initial critical thinking about algorithms and considers how best to research them in practice. It makes a case for thinking about algorithms in ways that extend far beyond a technical understanding and approach. It then details four key challenges in conducting research on the specificities of algorithms — they are often: ‘black boxed’; heterogeneous, contingent on hundreds of other algorithms, and embedded in complex socio-technical assemblages; ontogenetic and performative; and ‘out of control’ in their work. Finally, it considers six approaches to empirically researching algorithms: examining source code (both deconstructing code and producing genealogies of production); reflexively producing code; reverse engineering; interviewing designers and conducting ethnographies of coding teams; unpacking the wider socio-technical assemblages framing algorithms; and examining how algorithms do work in the world.
Key words: algorithm, code, epistemology, research
As noted on the blog, the Dublin Dashboard was launched last Friday (19th Sept) in Wood Quay in Dublin. There was a full house at the launch and it was also covered in the media, including the RTE 1 TV news, radio coverage on RTE 1 Drivetime, FM104, and KFM, and newspaper reports in the Irish Times, Irish Independent, and The Irish Sun. The Irish Times website also included a short video.
We’re proud at the Programmable City project to be launching the Dublin Dashboard, one of the first major outputs from the project. Below is a short video overview, part of the press release, and our slides from the launch.
The Dublin Dashboard provides citizens, researchers, planners, policy makers and companies with real-time information, time-series data, and interactive maps about all aspects of the city. It allows users to gain a better understanding of how the city is performing, to undertake evidence-informed analysis, and to improve their everyday decision making. For example, you can learn at a glance how the city economy is performing, visualise crime levels by garda station and district, and monitor traffic flows and car parking spaces in real time.
Owen Keegan, Chief Executive of Dublin City Council, officially launched the Dublin Dashboard. “The Dublin Dashboard is a great example of a local authority and a university working together for the benefit of citizens. The real value for city leaders is how the dashboard will help Dublin to monitor performance.
“A massive wealth of data is being made available, including real-time data, about the economy, transport, planning, housing, health, population, the environment, and emergency services. We are committed to further developing and improving the range of city data and information available on the platform.”
The dashboard is made up of a number of modules that can be easily used to explore hundreds of graphs, maps and apps concerning how Dublin is performing over time and in relation to other locales, what is happening in the city right now, the location of all kinds of facilities, and how to report on particular issues.
Professor Rob Kitchin, the project principal investigator, welcomed Dublin City Council’s support in making data available for the Dashboard. “The aim of the site is to empower people living and working in Dublin by providing them with easy-to-access intelligence about the city,” he said. “For example, users can jump onto the system before they leave for work to see how the traffic is flowing or see what spaces are available in different car parks. Prospective house buyers can explore the characteristics of an area and how close different amenities are.”
Dr Gavin McArdle, the lead developer for the Dashboard, explained that the site is based on a principle of openness. “We wanted to create an open platform where anyone can take the data we use and build their own apps, or connect their own apps back into the site to add new functionality,” he said. “Our approach has sought to avoid re-inventing the wheel, so if a good app already exists we just link to that rather than creating our own version.”
The data underpinning the website is drawn from a number of providers, including Dublin City Council, Dublinked, the Central Statistics Office, Eurostat, and government departments; the site also links to a variety of existing applications. The underlying data is freely available so others can undertake their own analysis and build their own applications and visualisations.
There are plans in place to add new real-time datasets, including maps of social media, and new interactive mapping modules.
Last Friday I acted as a discussant for three sessions (no. 1, no. 2, no. 3) on Digital Geography presented at the RGS/IBG conference in London. The papers were quite diverse and some of the discussion in the sessions centred on how to frame and make sense of digital geographies.
In their overview paper, Elisabeth Roberts and David Beel categorised the post-2000 geographical literature which engages with the digital into six classes: conceptualisation, unevenness, governance, economy, performativity, and the everyday. To my mind, this is quite a haphazard way of dividing up the literature. Instead, I think it might be more productive to divide the wide range of studies which consider the relationship between the digital and geography into three bodies of work:
Geography of the digital
These works seek to apply geographical ideas and methodologies to make sense of the digital. As such, this work focuses on mapping out the geographies of digital technologies, their associated socio-technical assemblages, and their production. Such work includes the mapping of cyberspace, charting the spatialities of social media, plotting the material geographies of ubiquitous computing, detailing the economic geographies of component resources, technologies and infrastructures, tracing the generation and flows of big data, and so on.
Geography produced by the digital
This body of work focuses on how digital technologies and infrastructures are transforming the geographies of everyday life and the production of space. Such work includes examining how digital technologies and ICTs are increasingly being embedded into different spatial domains – the workplace, home, transport systems, the street, shops, etc.; how they mediate and augment socio-spatial practices and relations such as producing, consuming, communicating, playing, etc; how they shape and remediate geographical imaginaries and how spaces are visioned, planned and built; and so on.
Geography produced through the digital
An increasing amount of geographical scholarship, praxis and communication is now undertaken using digital technologies. For example, generating, recording and analyzing data using digital devices and associated software and databases; the collection and sharing of datasets and outputs through digital archives and repositories; discussing ideas and conducting debate via mailing lists and social media; writing papers and presentations, producing maps and other visualizations using computers; etc. A fairly substantial body of work thus reflects on the difference digital technologies make to the production of geographical scholarship.
Taken together, these three bodies of work, I would argue, constitute digital geography.
At the same time, however, I wonder about the utility of bounding digital geography and corralling studies within its bounds. To what extent is it useful to delimit it as a defined field of research?
It might be more productive to reframe much of what is being claimed as digital geography with respect to its substantive focus. For example, examining the ways in which digital technologies are being pervasively embedded into the fabric of cities and how they modulate the production of urban socio-spatial relations is perhaps best framed within urban geography. Similarly, a study looking at the use of digital technology in the delivery of aid in parts of the Global South is perhaps best understood as being centrally concerned with development geography. In other words, it may well be more profitable to think about how the digital reshapes many geographies, rather than to cast all of those geographies as digital geography.
Nonetheless, it is clear that geographers still have much work to do with respect to thinking about the digital. That is a central task of my own research agenda as I work on the Programmable City project. I’d be interested in your own thoughts as to how you conceive and position digital geography, so if you’re inclined to share your views please leave a comment.
Rob Kitchin’s latest book The Data Revolution: Big Data, Open Data, Data Infrastructure and Their Consequences was published on August 23rd by Sage. There’s a dedicated website with a bunch of resources (including open access links to related papers and a hyperlinked bibliography), plus a promo video (below). The publisher has made the preface and chapters one (Conceptualising Data) and four (Big Data) open access. The website has a full table of contents and chapter outlines though the title gives a pretty good description as to what it’s about. The book has had a decent amount of advance praise. The site includes details about buying the book, including electronically through just about every format going.
Rob Kitchin and Tracey Lauriault have just published the second Programmable City Working Paper – Towards Critical Data Studies: Charting and Unpacking Data Assemblages and Their Work. It is a pre-print of a chapter written for the book, Geoweb and Big Data, edited by Joe Eckert, Andy Shears and Jim Thatcher, to be published by University of Nebraska Press.
Abstract
The growth of big data and the development of digital data infrastructures raise numerous questions about the nature of data, how they are being produced, organized, analyzed and employed, and how best to make sense of them and the work they do. Critical data studies endeavours to answer such questions. This paper sets out a vision for critical data studies, building on the initial provocations of Dalton and Thatcher (2014). It is divided into three sections. The first details the recent step change in the production and employment of data and how data and databases are being reconceptualised. The second forwards the notion of a data assemblage that encompasses all of the technological, political, social and economic apparatuses and elements that constitute and frame the generation, circulation and deployment of data. Drawing on the ideas of Michel Foucault and Ian Hacking, it is posited that one way to enact critical data studies is to chart and unpack data assemblages. The third starts to unpack some of the ways that data assemblages do work in the world with respect to dataveillance and the erosion of privacy, profiling and social sorting, anticipatory governance, and secondary uses and control creep. The paper concludes by arguing for greater conceptual work and empirical research to underpin and flesh out critical data studies.
Key words
big data, critical data studies, data assemblages, data infrastructures, civil liberties