James Ash, Rob Kitchin and Agnieszka Leszczynski have published a new paper entitled ‘Digital Turn, Digital Geography?’, available as Programmable City Working Paper 17 on SSRN.
Abstract
In this paper, we examine the relationship between the digital and geography. Our analysis provides an overview of the rich scholarship that has examined: (1) geographies of the digital, (2) geographies produced by the digital, and (3) geographies produced through the digital. Using this material, we reflect on two questions: has there been a digital turn in geography, and would it be productive to delimit ‘digital geography’ as a field of study within the discipline, as has recently occurred with the attempts to establish ‘digital anthropology’ and ‘digital sociology’? We argue that while there has been a digital turn across geographical sub-disciplines, the digital is now so pervasive in mediating the production of space and in producing geographic knowledge that it makes little sense to delimit digital geography as a distinct field. Instead, we believe it is more productive to think about how the digital reshapes many geographies.
Keywords: digital, geography, computing, digital turn, digital geography
Over the past two decades, urban social life has undergone a rapid and pervasive geocoding, becoming mediated, augmented and anticipated by location-sensitive technologies and services that generate and utilise big, personal, locative data. The production of these data has prompted the development of exploratory, data-driven computing experiments that seek to find ways to extract value and insight from them. These projects often start from the data, rather than from a question or theory, and try to imagine and identify the data’s potential utility. In this paper, we explore the desires and mechanics of data-driven computing experiments. We demonstrate how both locative media data and computing experiments are ‘staged’ to create new values and computing techniques, which in turn are used to try to derive possible futures that are riddled with unintended consequences. We argue that using computing experiments to imagine potential urban futures produces effects that often have little to do with creating new urban practices. Instead, these experiments promote big data science and the prospect that data produced for one purpose can be recast for another, and act as alternative mechanisms of envisioning urban futures.
Keywords: Data analytics, computing experiments, locative media, location-based social network (LBSN), staging, urban future, critical data studies
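To give a concrete flavour of what a data-driven computing experiment over locative media data might look like, here is a minimal, hypothetical sketch in Python. It starts from check-in records rather than from a research question and simply scans them for temporal patterns that might later be framed as 'insights'. The file name, column names and the choice of metric are all assumptions made for illustration; they are not the authors' method.

```python
import csv
from collections import Counter
from datetime import datetime

# Hypothetical LBSN check-in export: one row per check-in.
# Assumed columns: user_id, venue_id, venue_category, timestamp (ISO 8601).
CHECKINS_CSV = "foursquare_checkins.csv"  # illustrative file name

def load_checkins(path):
    """Read check-in rows and parse their timestamps."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            row["timestamp"] = datetime.fromisoformat(row["timestamp"])
            yield row

def busiest_hours_by_category(checkins):
    """Count check-ins per (venue_category, hour-of-day) pair.

    This is the 'start from the data' move the abstract describes:
    no hypothesis, just a scan for rhythms that might later be cast
    as commercially or civically useful.
    """
    counts = Counter()
    for c in checkins:
        counts[(c["venue_category"], c["timestamp"].hour)] += 1
    return counts

if __name__ == "__main__":
    counts = busiest_hours_by_category(load_checkins(CHECKINS_CSV))
    # Surface the ten densest category/hour combinations as candidate 'insights'.
    for (category, hour), n in counts.most_common(10):
        print(f"{category:<20} {hour:02d}:00  {n} check-ins")
```

The point of the sketch is not the result but the posture: the data are repurposed and probed first, and a question or use is imagined afterwards.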
Rob Kitchin and Gavin McArdle have published a new paper entitled ‘The diverse nature of big data’, available as Programmable City Working Paper 15 on SSRN.
Abstract: Big data has been variously defined in the literature. In the main, definitions suggest that big data are those that possess a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes big data, big data?’, applying Kitchin’s (2013, 2014) taxonomy of seven big data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute big data. The results demonstrate that only a handful of datasets possess all seven traits, and that some lack volume and/or variety. Instead, there are multiple forms of big data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that big data as an analytical category needs to be unpacked, with the genus of big data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes big data, formulate how best to make sense of it, and identify how it might best be used to make sense of the world.
Keywords: big data, ontology, taxonomy, types, characteristics
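As a rough illustration of the kind of ontological audit the paper performs, the sketch below scores a few hypothetical dataset profiles against seven traits and reports how many each possesses. The grouping of the nine named characteristics into seven traits (pairing resolution with indexicality, and extensionality with scalability), the example datasets and the trait judgements are all assumptions for illustration; they do not reproduce the paper's 26-dataset analysis.

```python
# Seven big data traits, grouped from the characteristics named in the abstract
# (an assumed grouping, not taken verbatim from the paper).
TRAITS = (
    "volume", "velocity", "variety", "exhaustivity",
    "resolution/indexicality", "relationality", "extensionality/scalability",
)

# Hypothetical trait profiles (True = the dataset exhibits the trait).
DATASETS = {
    "mobile phone records": {t: True for t in TRAITS},
    "national census": {
        "volume": True, "velocity": False, "variety": False,
        "exhaustivity": True, "resolution/indexicality": True,
        "relationality": True, "extensionality/scalability": False,
    },
    "city sensor feed": {
        "volume": False, "velocity": True, "variety": False,
        "exhaustivity": True, "resolution/indexicality": True,
        "relationality": False, "extensionality/scalability": True,
    },
}

def audit(datasets):
    """Report, per dataset, which traits hold and how many of the seven are present."""
    for name, profile in datasets.items():
        held = [t for t in TRAITS if profile[t]]
        print(f"{name}: {len(held)}/7 -> {', '.join(held)}")

audit(DATASETS)
```

Run over the invented profiles above, only the first dataset qualifies on all seven traits, echoing the abstract's point that few datasets do and that 'big data' covers multiple species.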
The smart city encompasses a broad range of technological innovations which might be applied to any city for a broad range of reasons. In this article, I make a distinction between local efforts to affect the urban landscape and a global smart city imaginary which those efforts draw upon and help sustain. While attention has been given to the malleability of the smart city concept at this global scale, there has been little effort to interrogate the way that the future is used to sanction specific solutions. Through a critical engagement with smart city marketing materials, industry documents and consultancy reports, I explore how the future is recruited, rearranged and represented as a rationalisation for technological intervention in the present. This is done across three recurring crises: massive demographic shifts and subsequent resource pressure; global climate change; and the conflicting demands of fiscal austerity and the desire of many cities to attract foreign direct investment and highly skilled workers. In revealing how crises are pre-empted, precautioned and prepared for, I argue that the smart city imaginary normalises a style and scale of response deemed appropriate under liberal capitalism.
Keywords: smart cities, the urban age, anticipation, risk
Locative Social Media offers a critical analysis of the effect of using locative social media on the perceptions and phenomenal experience of lived-in spaces and places. It includes a comprehensive overview of the historical development from traditional mapping and global positioning technology to smartphone-based application services that incorporate social networking features, framed as a series of modes of understanding place. Drawing on users’ accounts of the location-based social network Foursquare, a digital post-phenomenology of place is developed to explain how place is mediated in the digital age. This draws upon both the phenomenology of Martin Heidegger and post-phenomenology to encompass the materiality and computationality of the smartphone. The functioning and surfacing of place by the device and application, along with the orientation of the user, allows for a particular experiencing of place when using locative social media, termed attunement, in contrast to an instrumentalist conception of place.
Reviews
“Locative Social Media is a fine book that is theoretically sophisticated and empirically grounded. In it, Leighton Evans develops a rigorous post-phenomenology of location-based social media, and explores how mood or orientation, embodied practices involving mobile technology use, and the data-infused environment, are all ‘co-constitutive of place’.” – Rowan Wilken, Senior Lecturer, Faculty of Health, Arts and Design, Swinburne University of Technology, Australia
“In this book, Leighton Evans accomplishes something very ambitious: a deep theoretical reflection on the phenomenology of place experience as it occurs in the context of physical/digital interactions, interwoven with a thorough empirical account of situated use of location-based social networks. Evans’ study of Foursquare users details complex place-related agencies in the age of what he calls a ‘computationally infused world’, including gathering, mapping, bridging, broadcasting, reputation management and building social capital. His findings resonate with and holistically consolidate the state of the art of interdisciplinary investigations of locative social media. The most impressive achievement in this book, however, is how the empirical evidence builds the basis for an exciting conceptual revisitation of the phenomenology of place; Evans proposes an original ‘digital post-phenomenology of place’ that connects key aspects of situated socio-technical systems: from embodied practices, to new and emergent mappings, occurrences and representations enabled by code and by locative infrastructures.” – Luigina Ciolfi, Reader in Communication in the Cultural Communication and Computing Research Institute, Sheffield Hallam University, UK
“Transporting Heidegger from the Black Forest to the urban Foursquare-world, Leighton Evans discusses the persistently collective nature of space and place in digital culture. This important study opens different ways how location based social networks function to frame space for us but also how users participate in this process of defining belonging. Evans’ book addresses algorithmic situations as digital post-phenomenology of place; the book is a valuable research text for scholars and students in media, sociology and cultural studies of technology.” – Jussi Parikka, Professor of Technological Culture and Aesthetics, Winchester School of Art, University of Southampton, UK
A new paper, ‘Improving the Veracity of Open & Real-Time Urban Data’, has been published by Gavin McArdle and Rob Kitchin as Programmable City Working Paper 13. The paper has been prepared for the Data and the City workshop to be held at Maynooth University, August 31st to September 1st.
Abstract
Within the context of the smart city, data are an integral part of the digital economy and are used as input for decision making, policy formation, and to inform citizens, city managers and commercial organisations. Reflecting on our experience of developing real-world software applications which rely heavily on urban data, this article critically examines the veracity of such data (their authenticity and the extent to which they accurately (precision) and faithfully (fidelity, reliability) represent what they are meant to) and how they can be assessed in the absence of quality reports from data providers. While data quality needs to be considered at all stages of the data lifecycle and in the development and use of applications, open data are often provided ‘as-is’, with no guarantees about their veracity, continuity or lineage (documentation that establishes provenance and fitness for use). This allows data providers to share data with undocumented errors, absences and biases. If left unchecked, these data quality issues can propagate through multiple systems and lead to poor smart city applications and unreliable ‘evidence-based’ decisions, with the danger that open government data portals come to be seen by users and critics as untrusted, unverified and uncurated data dumps. Drawing on our own experiences, we detail the process we used to detect and handle errors, highlighting the janitorial role carried out by data scientists and developers to ensure that data are cleaned, parsed, validated and transformed for use. This important process requires effort, knowledge, skill and time, and is often hidden in the resulting application and not shared with other data users. In this paper, we propose that, rather than lose this knowledge in the absence of data providers documenting it in metadata and user guides, data portals should provide a crowdsourcing mechanism through which users can generate and record observations and fixes that improve the quality of urban data and open government portals.
Keywords: open data, big data, real-time data, veracity, quality, fidelity, metadata, urban data, transport, smart cities
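To make the kind of janitorial work the paper describes more concrete, here is a minimal, hypothetical sketch of the sort of check a developer might run over an open real-time feed before using it: it flags missing fields, implausible readings and stale timestamps. The feed format, field names and thresholds are assumptions for illustration, not the checks used in the paper.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical record from an open real-time transport feed.
# Assumed fields: stop_id, vehicle_count, observed_at (ISO 8601, UTC).
MAX_PLAUSIBLE_VEHICLES = 200           # illustrative sanity threshold
MAX_STALENESS = timedelta(minutes=10)  # readings older than this are flagged

def validate(record, now=None):
    """Return a list of data-quality issues found in one record."""
    now = now or datetime.now(timezone.utc)
    issues = []

    if not record.get("stop_id"):
        issues.append("missing stop_id")

    count = record.get("vehicle_count")
    if count is None:
        issues.append("missing vehicle_count")
    elif count < 0 or count > MAX_PLAUSIBLE_VEHICLES:
        issues.append(f"implausible vehicle_count: {count}")

    observed = record.get("observed_at")
    if observed is None:
        issues.append("missing observed_at")
    else:
        age = now - datetime.fromisoformat(observed)
        if age > MAX_STALENESS:
            issues.append(f"stale reading ({age} old)")

    return issues

# Example: a record with a negative count and an out-of-date timestamp.
sample = {"stop_id": "D01", "vehicle_count": -3,
          "observed_at": "2015-08-31T09:00:00+00:00"}
print(validate(sample))
```

In the spirit of the paper's crowdsourcing proposal, the issues such a check surfaces, and the fixes applied, are exactly the kind of knowledge that could be recorded against the dataset on the portal rather than remaining hidden inside individual applications.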