Rob Kitchin and Gavin McArdle have published a new Programmable City working paper (no. 21) – Urban data and city dashboards: Six key issues – on SocArXiv today. It is a pre-print of a chapter that will appear in Kitchin, R., Lauriault, T.P. and McArdle, G. (eds) (forthcoming) Data and the City. Routledge, London.
This chapter considers the relationship between data and the city by critically examining six key issues with respect to city dashboards: epistemology, scope and access, veracity and validity, usability and literacy, use and utility, and ethics. While city dashboards provide useful tools for evaluating and managing urban services, understanding and formulating policy, and creating public knowledge and counter-narratives, our analysis reveals a number of conceptual and practical shortcomings. In order for city dashboards to reach their full potential, we advocate a number of related shifts in thinking and praxes and put forward an agenda for addressing the issues we highlight. Our analysis is informed by our endeavours in building the Dublin Dashboard.
A new paper, ‘Improving the Veracity of Open & Real-Time Urban Data’, has been published by Gavin McArdle and Rob Kitchin as Programmable City Working Paper 13. The paper has been prepared for the Data and the City workshop to be held at Maynooth University, Aug 31st–Sept 1st.
Within the context of the smart city, data are an integral part of the digital economy and are used as input for decision making, policy formation, and to inform citizens, city managers and commercial organisations. Reflecting on our experience of developing real-world software applications which rely heavily on urban data, this article critically examines the veracity of such data (their authenticity and the extent to which they accurately (precision) and faithfully (fidelity, reliability) represent what they are meant to) and how they can be assessed in the absence of quality reports from data providers. While data quality needs to be considered at all stages of the data lifecycle and in the development and use of applications, open data are often provided ‘as-is’ with no guarantees about their veracity, continuity or lineage (documentation that establishes provenance and fitness for use). This allows data providers to share data with undocumented errors, absences, and biases. If left unchecked, these data quality issues can propagate through multiple systems and lead to poor smart city applications and unreliable ‘evidence-based’ decisions. There is a consequent danger that open government data portals will come to be seen as untrusted, unverified and uncurated data-dumps by users and critics. Drawing on our own experiences, we highlight the process we used to detect and handle errors. This work highlights the necessary janitorial role carried out by data scientists and developers to ensure that data are cleaned, parsed, validated and transformed for use. This important process requires effort, knowledge, skill and time, and is often hidden in the resulting application and not shared with other data users.
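To give a flavour of the kind of janitorial checks the abstract describes, the sketch below shows how a single record from a real-time urban data feed might be validated before use. It is a minimal illustration only: the field names (`stop_id`, `timestamp`, `delay_seconds`) and the thresholds are our own assumptions, not taken from the paper.

```python
from datetime import datetime, timedelta, timezone

def validate_record(record, now=None):
    """Return a list of quality issues found in one open-data record.

    A hypothetical example of veracity checks on a real-time transit
    feed; field names and thresholds are illustrative assumptions.
    """
    issues = []
    now = now or datetime.now(timezone.utc)

    # Completeness: required fields must be present and non-empty.
    for field in ("stop_id", "timestamp", "delay_seconds"):
        if record.get(field) in (None, ""):
            issues.append(f"missing:{field}")

    # Fidelity: a 'real-time' reading should be neither stale nor
    # stamped in the future.
    ts = record.get("timestamp")
    if isinstance(ts, datetime):
        if ts > now:
            issues.append("timestamp:future")
        elif now - ts > timedelta(minutes=10):
            issues.append("timestamp:stale")

    # Plausibility: delays outside a sane range suggest sensor or
    # encoding error (here, between -10 minutes and 2 hours).
    delay = record.get("delay_seconds")
    if isinstance(delay, (int, float)) and not -600 <= delay <= 7200:
        issues.append("delay:out_of_range")

    return issues
```

In practice such checks would run as records arrive, with flagged records quarantined or corrected rather than silently fed into downstream applications.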
In this paper, we propose that, rather than losing this knowledge, and in the absence of data providers documenting such issues in metadata and user guides, data portals should provide a crowdsourcing mechanism to generate and record user observations and fixes, improving the quality of urban data and open government portals.
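The crowdsourcing mechanism proposed above could be imagined as a simple annotation record attached to a portal dataset. The schema below is purely speculative, a sketch of what such a record might contain; none of the field names come from the paper.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical schema for a crowdsourced data-quality annotation on an
# open data portal; all field names are illustrative assumptions.
@dataclass
class QualityAnnotation:
    dataset_id: str          # portal identifier of the affected dataset
    issue_type: str          # e.g. "missing_values", "unit_mismatch", "stale_feed"
    description: str         # free-text observation from the data user
    suggested_fix: str = ""  # optional transformation or correction applied
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
```

Collecting such annotations alongside a dataset would let later users benefit from the cleaning and validation work that is otherwise hidden inside individual applications.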
Key words: open data, big data, real-time data, veracity, quality, fidelity, metadata, urban data, transport, smart cities