Abstract: This paper provides a short introductory overview of urban science. It defines urban science, details its practitioners and their aims, sets out its relationship to urban informatics and urban studies, and explains its epistemology and the analysis of urban big data. It then summarizes criticism of urban science with respect to epistemology, instrumental rationality, data issues, and ethics. It is concluded that urban science research will continue to grow for the foreseeable future, providing a valuable means of making sense of cities, but that it is unlikely to become a new paradigm producing an integrative approach that replaces the diverse philosophical traditions within urban studies.
Rob Kitchin and Gavin McArdle have published a new Programmable City working paper (no. 21) – Urban data and city dashboards: Six key issues – on SocArXiv today. It is a pre-print of a chapter that will be published in Kitchin, R., Lauriault, T.P. and McArdle, G. (eds) (forthcoming) Data and the City. Routledge, London.
This chapter considers the relationship between data and the city by critically examining six key issues with respect to city dashboards: epistemology, scope and access, veracity and validity, usability and literacy, use and utility, and ethics. While city dashboards provide useful tools for evaluating and managing urban services, understanding and formulating policy, and creating public knowledge and counter-narratives, our analysis reveals a number of conceptual and practical shortcomings. In order for city dashboards to reach their full potential, we advocate a number of related shifts in thinking and praxes and put forward an agenda for addressing the issues we highlight. Our analysis is informed by our endeavours in building the Dublin Dashboard.
More and more aspects of our everyday lives are being mediated, augmented, produced and regulated by software-enabled technologies. Software is fundamentally composed of algorithms: sets of defined steps structured to process instructions/data to produce an output. This paper synthesises and extends emerging critical thinking about algorithms and considers how best to research them in practice. Four main arguments are developed. First, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do given their increasing importance in shaping social and economic life. Second, algorithms can be conceived in a number of ways – technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically – but are best understood as being contingent, ontogenetic and performative in nature, and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research about algorithms (gaining access to their formulation; they are heterogeneous and embedded in wider systems; their work unfolds contextually and contingently), which require practical and epistemological attention. Fourth, the constitution and work of algorithms can be empirically studied in a number of ways, each of which has strengths and weaknesses that need to be systematically evaluated. Six methodological approaches designed to produce insights into the nature and work of algorithms are critically appraised. It is contended that these methods are best used in combination in order to help overcome epistemological and practical challenges.
The final version is available here and a pre-print version of the paper can be downloaded here.
A new paper by Rob Kitchin has been posted as open access on SSRN. ‘Thinking critically about and researching algorithms’ is The Programmable City Working Paper 5.
The era of ubiquitous computing and big data is now firmly established, with more and more aspects of our everyday lives being mediated, augmented, produced and regulated by digital devices and networked systems powered by software. Software is fundamentally composed of algorithms: sets of defined steps structured to process instructions/data to produce an output. And yet, to date, there has been little critical reflection on algorithms and little empirical research into their nature and work. This paper synthesises and extends initial critical thinking about algorithms and considers how best to research them in practice. It makes a case for thinking about algorithms in ways that extend far beyond a technical understanding and approach. It then details four key challenges in conducting research on the specificities of algorithms, namely that they are often: ‘black boxed’; heterogeneous, contingent on hundreds of other algorithms, and embedded in complex socio-technical assemblages; ontogenetic and performative; and ‘out of control’ in their work. Finally, it considers six approaches to researching algorithms empirically: examining source code (both deconstructing code and producing genealogies of production); reflexively producing code; reverse engineering; interviewing designers and conducting ethnographies of coding teams; unpacking the wider socio-technical assemblages framing algorithms; and examining how algorithms do work in the world.
Key words: algorithm, code, epistemology, research
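The paper's core definition, an algorithm as a set of defined steps structured to process instructions/data to produce an output, can be made concrete with a minimal sketch of our own (it is not taken from the paper; the indicator, thresholds and readings below are hypothetical, chosen purely for illustration):

```python
# Illustrative only: a toy "algorithm" in the paper's sense -- a defined
# sequence of steps that processes input data to produce an output.

def air_quality_band(readings):
    """Classify a day's air quality from hourly PM2.5 readings (ug/m3)."""
    # Step 1: validate the input data, discarding missing or negative values.
    valid = [r for r in readings if r is not None and r >= 0]
    if not valid:
        return "no data"
    # Step 2: reduce the readings to a single daily figure.
    daily_mean = sum(valid) / len(valid)
    # Step 3: map that figure onto bands chosen by the designer
    # (the thresholds here are invented for the example).
    if daily_mean <= 12:
        return "good"
    elif daily_mean <= 35:
        return "moderate"
    else:
        return "poor"

print(air_quality_band([8.0, 10.5, None, 14.2]))  # -> "good"
```

Even in so small an example, the choice of validation rule, averaging step and banding thresholds embeds designer judgements, the kind of contingency the paper argues warrants critical and empirical scrutiny.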
Rob Kitchin’s paper ‘Big data, new epistemologies and paradigm shifts’ has been published in the first issue of a new journal, Big Data and Society, published by Sage. The paper is open access and can be downloaded by clicking here. A video abstract is below. The paper is also accompanied by a blog post ‘Is big data going to radically transform how knowledge is produced across all disciplines of the academy?’ on the Big Data and Society blog.
This article examines how the availability of Big Data, coupled with new data analytics, challenges established epistemologies across the sciences, social sciences and humanities, and assesses the extent to which they are engendering paradigm shifts across multiple disciplines. In particular, it critically explores new forms of empiricism that declare ‘the end of theory’, the creation of data-driven rather than knowledge-driven science, and the development of digital humanities and computational social sciences that propose radically different ways to make sense of culture, history, economy and society. It is argued that: (1) Big Data and new data analytics are disruptive innovations which are, in many instances, reconfiguring how research is conducted; and (2) there is an urgent need for wider critical reflection within the academy on the epistemological implications of the unfolding data revolution, a task that has barely begun to be tackled despite the rapid changes in research practices presently taking place. After critically reviewing emerging epistemological positions, it is contended that a potentially fruitful approach would be the development of a situated, reflexive and contextually nuanced epistemology.
The Programmable City team delivered four papers at the Conference of the Association of American Geographers held in Tampa, April 8-12. Here are the slides for Kitchin, R., Lauriault, T. and McArdle, G. (2014). “Urban indicators, city benchmarking, and real-time dashboards: Knowing and governing cities through open and big data” delivered in the session “Thinking the ‘smart city’: power, politics and networked urbanism II” organized by Taylor Shelton and Alan Wiig. The paper is a work in progress and was the first attempt at presenting work that is presently being written up for submission to a journal. No doubt it’ll evolve over time, but the central argument should hold.