
Two invited talks in Boston on smart cities

Rob Kitchin gave two invited talks in Boston last week concerning Programmable City research.

The first was at UMass Boston, sponsored by APA-MA, BSA, MAPC, MAPD, MACDC, BRA, and the Mel King Institute, and was entitled ‘Planning in the era of smart urbanism’. The slides for the talk can be found at: http://www.slideshare.net/robkitchin/planning-in-an-era-of-smart-urbanism

The second was at the launch of MIT’s new Institute for Data, Systems, and Society and was entitled ‘Ethics and risks of urban big data and smart cities’. The slides for the talk can be found at: http://www.slideshare.net/robkitchin/the-ethics-and-risks-of-urban-big-data-and-smart-cities. A video of the event can be found at: https://idss2016.mit.edu/

New paper: Urban data and city dashboards: Six key issues

Rob Kitchin and Gavin McArdle have published a new Programmable City working paper (no. 21) – Urban data and city dashboards: Six key issues – on SocArXiv today. It is a pre-print of a chapter that will be published in Kitchin, R., Lauriault, T.P. and McArdle, G. (eds) (forthcoming) Data and the City. Routledge, London.

Abstract

This chapter considers the relationship between data and the city by critically examining six key issues with respect to city dashboards: epistemology, scope and access, veracity and validity, usability and literacy, use and utility, and ethics. While city dashboards provide useful tools for evaluating and managing urban services, understanding and formulating policy, and creating public knowledge and counter-narratives, our analysis reveals a number of conceptual and practical shortcomings. In order for city dashboards to reach their full potential we advocate a number of related shifts in thinking and praxes and put forward an agenda for addressing the issues we highlight. Our analysis is informed by our endeavours in building the Dublin Dashboard.

Key words: dashboards, cities, access, epistemology, ethics, open data, scope, usability, utility, veracity, validity

New paper: Reframing, reimagining and remaking smart cities

Rob Kitchin has published a new Programmable City working paper (no. 20) – Reframing, reimagining and remaking smart cities – on SocArXiv today.  It is an introductory framing/provocation essay for the ‘Creating smart cities’ workshop to be hosted at Maynooth University, 5-6 September 2016.

Abstract

Over the past decade the concept and development of smart cities has unfolded rapidly, with many city administrations implementing smart city initiatives and strategies and a diverse ecology of companies and researchers producing and deploying smart city technologies. In contrast to those who seek to realise the benefits of a smart city vision, critics have highlighted a range of shortcomings, challenges and risks with such endeavours. This short paper outlines a third path, one that aims to realise the benefits of smart city initiatives while recasting the thinking and ethos underpinning them and addressing their deficiencies and limitations. It argues that smart city thinking and initiatives need to be reframed, reimagined and remade in six ways. Three of these concern normative and conceptual thinking with regards to goals, cities and epistemology, and three concern more practical and political thinking and praxes with regards to management/governance, ethics and security, and stakeholders and working relationships. The paper does not seek to be definitive or comprehensive, but rather to provide conceptual and practical suggestions and stimulate debate about how to productively recast smart urbanism and the creation of smart cities.

Key words: smart cities, framing, vision, ethos, politics, urbanism

Two new postdoctoral posts on ProgCity project

The Programmable City project is seeking two postdoctoral researchers (14-month contracts). Preferably the posts will critically examine one of the following:

• the production of software underpinning smart city technologies and how software developers translate rules, procedures and policies into a complex architecture of interlinked algorithms that manage and govern how people traverse or interact with urban systems; or,

• the political economy of smart city technologies and initiatives; the creation of smart city markets; the inter-relation of urban (re)development and smart city initiatives; the relationship between vendors, business lobby groups, economic development agencies, and city administrations; financialization and new business models; or,

• the relationship between the political geography of city administration, governance arrangements, and smart city initiatives; political and legal geographies of testbed urbanism and smart city initiatives; smart city technologies and governmentality.

We are prepared to consider any other proposal that critically interrogates the relationship between software, data and the production of smart cities, and there will be some latitude to negotiate the exact focus of the research with the principal investigator.

While some of the research will require primary fieldwork, it is anticipated that it will also involve the secondary analysis of data already generated by the project.

The project will be based in the National Institute for Regional and Spatial Analysis (NIRSA) at Maynooth University.

More details on how to apply can be found on the University’s human resources site. The closing date is 5th August.

New paper in Information, Communication and Society: Thinking critically about and researching algorithms

A new paper by Rob Kitchin – Thinking critically about and researching algorithms – has just been published (online first) in Information, Communication and Society.

Abstract

More and more aspects of our everyday lives are being mediated, augmented, produced and regulated by software-enabled technologies. Software is fundamentally composed of algorithms: sets of defined steps structured to process instructions/data to produce an output. This paper synthesises and extends emerging critical thinking about algorithms and considers how best to research them in practice. Four main arguments are developed. First, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do given their increasing importance in shaping social and economic life. Second, algorithms can be conceived in a number of ways – technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically – but are best understood as being contingent, ontogenetic and performative in nature, and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research about algorithms (gaining access to their formulation; they are heterogeneous and embedded in wider systems; their work unfolds contextually and contingently), which require practical and epistemological attention. Fourth, the constitution and work of algorithms can be empirically studied in a number of ways, each of which has strengths and weaknesses that need to be systematically evaluated. Six methodological approaches designed to produce insights into the nature and work of algorithms are critically appraised. It is contended that these methods are best used in combination in order to help overcome epistemological and practical challenges.
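
To make the abstract’s definition concrete, here is a minimal sketch of an algorithm in exactly that sense: a set of defined steps that processes input data to produce an output. It is not drawn from the paper; the scenario (smoothing a stream of traffic-sensor counts) and all names are hypothetical.

```python
# A minimal illustration of the definition above: an algorithm as a set of
# defined steps that processes input data to produce an output. The scenario
# (smoothing a stream of traffic-sensor counts) is hypothetical.

def moving_average(counts, window=3):
    """Step through the input, averaging each fixed-size window of readings."""
    if window <= 0 or window > len(counts):
        raise ValueError("window must be between 1 and len(counts)")
    averages = []
    for i in range(len(counts) - window + 1):
        # defined step: aggregate one window of readings into a single value
        averages.append(sum(counts[i:i + window]) / window)
    return averages  # the output produced from the input data

print(moving_average([12, 15, 11, 20, 18]))  # -> [12.66..., 15.33..., 16.33...]
```

Even a toy rule like this illustrates the paper’s point that the formal steps are only part of the story: decisions such as the window size are contextual choices embedded in the code.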

The final version is available here, and a pre-print version of the paper can be downloaded here.

New paper in Big Data and Society: What makes big data, big data?

Rob Kitchin and Gavin McArdle have a new paper – What makes big data, big data? Exploring the ontological characteristics of 26 datasets – published in Big Data and Society.

Abstract

Big Data has been variously defined in the literature. In the main, definitions suggest that Big Data possess a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes Big Data, Big Data?’, applying Kitchin’s taxonomy of seven Big Data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute Big Data. The results demonstrate that only a handful of datasets possess all seven traits, and some lack volume and/or variety. Instead, there are multiple forms of Big Data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that Big Data as an analytical category needs to be unpacked, with the genus of Big Data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes Big Data, formulate how best to make sense of it, and identify how it might be best used to make sense of the world.
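
As a loose illustration of the kind of trait-by-trait audit the abstract describes, one might tabulate each dataset against the seven traits and flag what it lacks. This is a hypothetical sketch only: the dataset names and trait assignments below are invented for illustration and are not the paper’s findings.

```python
# Hypothetical sketch of a trait-by-trait audit against Kitchin's seven
# Big Data traits. The datasets and trait assignments are invented for
# illustration; the paper assesses 26 real datasets from seven domains.

TRAITS = ["volume", "velocity", "variety", "exhaustivity",
          "resolution/indexicality", "relationality",
          "extensionality/scalability"]

datasets = {
    "city traffic sensors": {"volume": True, "velocity": True, "variety": False,
                             "exhaustivity": True, "resolution/indexicality": True,
                             "relationality": True, "extensionality/scalability": True},
    "national census": {"volume": True, "velocity": False, "variety": False,
                        "exhaustivity": True, "resolution/indexicality": True,
                        "relationality": True, "extensionality/scalability": False},
}

for name, traits in datasets.items():
    missing = [t for t in TRAITS if not traits.get(t, False)]
    status = "all seven traits" if not missing else f"missing: {', '.join(missing)}"
    print(f"{name}: {status}")
```

Even on invented assignments, the audit makes the abstract’s conclusion legible: different datasets labelled Big Data possess quite different trait profiles, so the category covers multiple species rather than one genus.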

The paper is available for download as a PDF here.