We had a wonderful Code and the City workshop in September, and we will be making video recordings of the presentations available from today and on the following Fridays!
Today, we will be sharing videos from the opening talk and the first session: Code, coding and interfaces.
Opening talk
Code and the city: Reframing the conceptual terrain
Rob Kitchin, NIRSA, National University of Ireland Maynooth
Abstract
Software has become essential to the functioning of cities. It is deeply and pervasively embedded into the systems and infrastructure of the built environment and in the management and governance of urban societies. Software-enabled technologies and services augment and facilitate how we understand and plan cities, how we manage urban services and utilities, and how we live urban lives. This paper will provide an overarching overview of the ways in which software has become an indispensable mediator of urban systems and the consequent implications, and will make the case for the study of computational algorithms and how cities are captured in and processed through code.
Session 1: Code, coding and interfaces
Code-crowd: How software repositories express urban life
Adrian Mackenzie, Sociology, Lancaster University
Abstract
Is code an expression of urban life? This paper analyses around 10 million software repositories on Github.com from the perspective of how they include cities. The methodology here relies on data-intensive work with bodies of code at a number of different levels. It maps the geographies of Github organisations and users to see how location anchors coding work. More experimentally, it tracks how urban spaces, movements and architectures figure in and configure code. The paper’s focus is less on how code shapes cities and more on apprehending code and coding as a way of experientially inhabiting cities. This approach might better highlight how code expresses urban experiences of proximity, mixing, movement, nearness, distance, and location. It might also shed light on the plural forms of spatiality arising from code, particularly as algorithmic processes become more entangled with each other.
Encountering the city at hackathons
Sophia Maalsen and Sung-Yueh Perng, National University of Ireland, Maynooth
Abstract
The significance of hackathons is growing in two mutually informing ways. On the one hand, there is an increasing use of hackathons to address issues of city governance – Chris Vein, US CTO for government innovation, has described them as ‘sensemaking’ tools for government, encouraging agencies to make use of hackathons and “let the collective energy of the people in the room come together and really take that data and solve things in creative and imaginative ways” (Llewellyn 2012). On the other hand, regular hack nights appear as creative urban spaces where citizens can discuss problems they encounter that are not necessarily considered by government, and produce solutions to tackle these issues.
In this paper, we explore potential opportunities and tensions, as well as excitement and inattentiveness, emerging as solutions are proposed and pursued. Through this, we reflect upon how such processes translate the city and transform ways of living in places where the solutions are applied. We further ask whether the positive discourse surrounding hackathons is justified or whether there are limits to their ability to deal with the complexity of urban issues.
Interfacing urban intelligence
Shannon Mattern, Media Studies, New School NY
Abstract
Technology companies, city governments, and design firms – the entities teaming up to construct our highly-networked cities of the future – have prototyped interfaces through which citizens can engage with the smart city. But those prototypes, almost always envisioned as screens of some sort, embody institutional values that aren’t always aligned with those of citizens who rightfully claim a “right to the city.” Based on promotional materials from Cisco, Siemens, IBM, Microsoft, and their smart-city-making counterparts, it seems that one of the chief preoccupations of our future-cities is to reflect their data consumption and hyper-efficient (often “widgetized”) activity back to themselves. We thus see city “control centers” lined with screens that serve in part to visualize, and celebrate, the city’s own supposedly hyper-rational operation. Public-facing interfaces, meanwhile, are typically rendered via schematic mock-ups, with little consideration given to interface design. They’re portrayed as conduits for transit information, commercial and service locations and reviews, and information about cultural resources and tourist attractions; and as portals for gathering user-generated data. Across the board, these interfacing platforms tend to frame their users as sources of data that feed the urban algorithmic machines, and as consumers of data concerned primarily with their own efficient navigation and consumption of the city.
In this talk, I’ll consider how we might design urban interfaces for urban citizens, who have a right to know what’s going on inside “‘black boxed’ [urban] control systems” – and even to engage with the operating system as more than mere data-generators or reporters-of-potholes-and-power-outages. In considering what constitutes an ideal urban interface, we need to examine those platforms that are already in existence, and those that are proposed for future cities. Even the purely hypothetical, the speculative – the “design fiction” – can illuminate what’s possible, technologically, aesthetically, and ideologically; and can allow us to ask ourselves what kind of a “public face” we want to front our cities, and, even more important, what kinds of intelligence and agency – technological and human – we want our cities to embody.
Do come back next Friday! The next session awaits!
We have the video and slides ready for you now! And we will start our seminar series for the coming year too. So do come back for the latest updates on our events!
You can write down equations that predict what people will do. That’s the huge change. So I have been running the big data conversation … It’s about the fact that you can now understand customers, employees, how we organise, in a quantitative, predictive way for the first time.
Predictive analytics is fervently discussed in the business world, if not fully taken up, and increasingly by public services, governments and medical practices seeking to exploit the value hidden in public archives or even in social media. In New York, for example, there is a geek squad in the Mayor’s office, seeking to uncover deep and detailed relationships between the people living there and the government, and at the same time realising “how insanely complicated this city is”. Here, an intriguing question remains as to the effectiveness of predictive analytics, the extent to which it can support and facilitate urban life, and the consequences for cities that are immersed in a deep sea of data, predictions and humans.
Let’s start with an Australian example. The Commonwealth Scientific and Industrial Research Organisation (CSIRO) has partnered with Queensland Health, Griffith University and Queensland University of Technology to develop the Patient Admission Prediction Tool (PAPT), which estimates the presentations of schoolies – Australian high-school leavers going on week-long holidays after their final exams – to nearby hospitals. The PAPT derives its estimates from Queensland Health data on schoolies presentations in previous years, including statistics about the numbers of presentations, the parts of the body injured and the types of these injuries. Using the data, the PAPT benefits hospitals, their employees and patients through improved scheduling of hospital beds, procedures and staff, with the potential of saving $23 million per annum if implemented in hospitals across Australia. As characterised by Dr James Lind, one of the benefits of adopting predictive analytics is a proactive rather than reactive approach towards planning and management:
People like working in a system that is proactive rather than reactive. When we are expecting a patient load everyone knows what their jobs [are], and you are more efficient with your time.
The patients are happy too, because they receive and finish treatment quickly.
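To make the idea a little more concrete, here is a toy sketch (in Python) of the general approach of estimating expected presentations from previous years’ counts. It is only an illustration – the PAPT itself is far more sophisticated, and all of the numbers below are invented:

```python
# A toy sketch of predicting daily emergency presentations from historical
# counts, in the spirit of (but much simpler than) the PAPT.
# All figures are invented for illustration.
import statistics

# Presentations per day of schoolies week in three previous years (hypothetical).
history = {
    2011: [34, 41, 39, 45, 52, 48, 37],
    2012: [36, 44, 42, 47, 55, 51, 40],
    2013: [38, 46, 43, 50, 58, 53, 41],
}

# Predict each day of the coming week as the mean of the same day in past
# years, with a crude planning band of +/- one standard deviation.
days = len(next(iter(history.values())))
for day in range(days):
    samples = [history[year][day] for year in sorted(history)]
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    print(f"Day {day + 1}: expect ~{mean:.0f} presentations "
          f"(plan for {mean - sd:.0f}-{mean + sd:.0f})")
```

Even this crude averaging shows why staff scheduling benefits: the expected load for each day is known before the week begins, rather than discovered as patients arrive.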
Can we find similar success when predictive analytics is practised in various forms of urban governance? Let’s move the discussion back to US cities, using policing as an example. Policing work is shifting from reactive to proactive in many cities, at experimental or implementation stages. PredPol is predictive policing software produced by a startup company, and it has caught a considerable amount of attention from police departments in the US and other parts of the world. Its success as a business, however, is partly due to its “energetic” marketing strategies, contractual obligations that clients refer the company to other law enforcement agencies, and so on.
Above all, the company’s claims of success are difficult to sustain on closer examination. The crimes the software focuses on are very specific: burglaries, robberies, vehicle thefts, thefts from vehicles and gun crimes. In other words, these are crimes that have “plenty of data to chew on” for making predictions, and they are opportunistic crimes that are easier to deter through the presence of patrolling police (more details here).
This further brings us to the issue of the “proven” and “continued” aspects of success. These are even more difficult and problematic aspects of policing work when it comes to evaluating the “effectiveness” and “success” of predictive policing. To prove that an algorithm performs well, the expectations for which it is built and tweaked have to be specified, not only for those who build the algorithm, but also for the people who will be entangled in the experiments in intended and unintended ways. In this sense, transparency and objectivity are important to predictive policing. Without acknowledging, considering and evaluating how both crimes and everyday life, or both normality and abnormality, are transformed into algorithms, and disclosing them for validation and consultation, a system of computational criminal justice can turn into, if not witch-hunting, alchemy – let’s put two or more elements into a magical pot, stir them and see what happens! This is further complicated by the fact that there are existing and inherent inequalities in crime data, such as in reporting or sentencing, and the perceived neutrality of algorithms can easily justify the cognitive biases that are well documented in the justice system – biases that could justify the rationale that someone should be treated more harshly because the person is already on a blacklist, without reconsidering how the person got onto such a list in the first place.
There is an even darker side to predictive policing when mundane social activities are constantly treated as crime data, for instance when social network analysis is used to profile and incriminate groups and groupings of individuals. This is also a dynamic and contested field of play: while crime prediction practitioners (coders, private companies, government agencies and so on) appropriate personal data and private social media messages for purposes they were not intended for, criminals (or activists for that matter) play with social media – if not yet with prediction results obtained by reverse engineering the algorithms – to plan coups, protests, attacks, etc.
For those who want to look further into how predictive policing is set up, proven, run and evaluated, there are ways of opening up the black box, at least partially, for critically reflecting upon what exactly it can achieve and how its “success” is managed both in computer simulation and in police practice. The chief scientist of PredPol gave a lecture in which, as has been pointed out:
He discusses the mathematics/statistics behind the algorithm and, at one point, invites the audience not to take his word for its accuracy because he is employed by PredPol, but to take the equations discussed and plug in crime data (e.g. Chicago’s open source crime data) to see if the model has any accuracy.
The video of the lecture is here.
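For readers who want to try a version of the exercise he suggests, here is a minimal sketch. To be clear, this is not PredPol’s algorithm: it simply ranks grid cells by past crime counts and checks how many of the following month’s crimes fall into the top-ranked cells. The file name and column names are assumptions loosely modelled on Chicago’s open crime data export:

```python
# A minimal hotspot baseline (not PredPol's model): rank grid cells by
# historical crime counts and test how many later crimes fall in the top cells.
# "chicago_crimes.csv" and its column names are assumptions for illustration.
import pandas as pd

df = pd.read_csv("chicago_crimes.csv", parse_dates=["Date"])
df = df.dropna(subset=["Latitude", "Longitude"])

# Bin incidents into roughly 500m grid cells (~0.005 degrees).
cell = 0.005
df["cell"] = list(zip((df["Latitude"] // cell).astype(int),
                      (df["Longitude"] // cell).astype(int)))

# Train on everything before 2014, evaluate on January 2014.
train = df[df["Date"] < "2014-01-01"]
test = df[(df["Date"] >= "2014-01-01") & (df["Date"] < "2014-02-01")]

# "Predict" the 100 historically busiest cells as next month's hotspots.
hotspots = set(train["cell"].value_counts().head(100).index)

hit_rate = test["cell"].isin(hotspots).mean()
print(f"{hit_rate:.1%} of January 2014 crimes fell in the 100 predicted cells")
```

A baseline like this is exactly the kind of yardstick against which any commercial claim of accuracy ought to be compared: if a simple ranking of past counts does nearly as well, the added value of the proprietary model is small.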
Furthermore, RAND provides a review of predictive policing methods and practices across many US cities. The report can be found here; it analyses the advantages gained by various crime prediction methods as well as their limitations. Predictive policing, as shown in the report, is far from a crystal ball, and it has various levels of complexity to run and implement, mathematically, computationally and organisationally. Predictions can range from crime mapping to predicting crime hotspots given certain spatiotemporal characteristics of crimes (see the taxonomy on p. 19). As far as predictions are concerned, they are good as long as crimes in the future look similar to those in the past – in their types, temporality and geographic prevalence – and provided the data is good, which is a big if! Predictions are also better when they are further contextualised. Compared with predicting crimes without any help (not even the intelligence that agents in the field can gather), applying mathematics to the guessing game creates a significant advantage, but the differences among the various methods are not as dramatic. Therefore, one of the crucial messages of reviewing and contextualising predictive methods is that:
It is important to bear in mind that the predictive methods discussed here do not predict where and when the next crime will be committed. Rather, they predict the relative level of risk that a crime will be associated with a particular time and place. The assumption is always that the past is prologue; predictions are made based on the analysis of past data. If the criminal adapts quickly to police interventions, then only data from the recent past will be useful to police departments. (p. 55)
Therefore, however automated the system, human and organisational effort is still required in many areas of practice. Activities such as finding relevant data, preparing them for analysis, and tweaking factors, variables and parameters all require human effort, teamwork and the translation of decisions into actions for reducing crime at the organisational level. Similarly, human and organisational effort is again needed when types and patterns of crime change, when targeted crimes shift, and when results have to be interpreted and integrated in relation to the changing availability of resources.
Furthermore, the report reviews the issues of privacy, transparency, trust and civil liberties within existing legal and policy frameworks. However, it becomes apparent that predictions and predictive analytics need careful and mindful design, responding to emerging ethical, legal and social issues (ELSI), since the impacts of predictive policing occur at individual and organisational levels, affecting the day-to-day life of residents, communities and frontline officers. While it is important to maintain and revisit existing legal requirements and frameworks, it is also important to respond to emerging information and data practices, and notions of “obscurity by design” and “procedural data due process” are ways of rethinking and designing the relationships between privacy, data, algorithms and predictions. Even the term transparency needs further reflection to make progress on what it means in the context of predictive analytics and how it can be achieved, taking into account renewed theoretical, ethical, practical and legal considerations. In this context, “transparent predictions” have been proposed, outlining the importance and the potential unintended consequences of rendering prediction processes interpretable to humans and driven by causation rather than correlation. Critical reflections on such a proposal are useful, for example this two-part series – (1)(2) – which further contextualises transparency both in prediction processes and in case-specific situations.
Additionally, IBM has partnered with the New York Fire Department and the Lisbon Fire Brigade. The main goal is to use predictive analytics to make smart cities safer by allocating emergency response resources more effectively. Similarly, crowd behaviours have already been simulated to understand and predict how people would react in various crowd events in places such as major traffic and travel terminals, sports and concert venues, shopping malls and streets, busy traffic areas, etc. Simulation tools take into account data generated by sensors, as well as quantified embodied actions such as walking speeds, body sizes or disabilities, and it is not difficult to imagine that further data and analysis could take advantage of social media, where sentiments and psychological aspects are expected to refine simulation results (a review of simulation tools).
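As a deliberately minimal illustration of how such quantified embodied parameters feed a prediction – and not a stand-in for any of the professional tools reviewed – one could sketch an evacuation model in a few lines:

```python
# A deliberately minimal crowd-evacuation sketch: agents with different walking
# speeds move along a corridor towards a single exit, and we record how long
# clearing the space takes. Real simulation tools model far more (collisions,
# body sizes, route choice, sensor-derived densities); this only illustrates
# how embodied parameters such as walking speed shape the prediction.
import random

random.seed(1)
corridor_length = 50.0           # metres from starting area to exit
agents = [
    {"position": random.uniform(0, 10),   # metres from the start
     "speed": random.uniform(0.6, 1.6)}   # walking speed, m/s
    for _ in range(200)
]

time = 0.0
step = 1.0                        # one-second time steps
while any(a["position"] < corridor_length for a in agents):
    for a in agents:
        if a["position"] < corridor_length:
            a["position"] += a["speed"] * step
    time += step

print(f"All 200 agents reached the exit after {time:.0f} seconds")
```

Changing the assumed speed distribution (for instance to reflect mobility impairments) changes the predicted clearance time, which is precisely why such embodied data matters to the planners using these tools.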
To bring the discussion to a pause: data, algorithms and predictions are quickly turning not only cities but also many other places into testbeds, as long as there are sensors (human and nonhuman) and the Internet. Data become available and different kinds of tests can then be run to verify ideas and hypotheses. As many articles have pointed out, data and algorithms are flawed, revealing and reinforcing unequal parts and aspects of cities and city lives. Tests and experiments, such as Facebook’s manipulation of user emotions in its experiments, can also make cities vulnerable when they are run without regard to embodied and emotionally charged humans. Therefore, there is a great deal more to say about data, algorithms and experiments, because the production of data and the experiments that make use of them are always an intervention rather than an evaluation. We will be coming back to these topics in subsequent posts.
The fifth Programmable City seminar will take place on May 7th. Based on some detailed ethnographic work, the paper will focus on the workings of control rooms in governing events.
Events and Urban Control
Ben Anderson and Rachel Gordon
Time: 16:00 – 18:00, Wednesday, 7 May, 2014
Venue: Room 2.31, 2nd Floor Iontas Building, North Campus, NUI Maynooth (Map)
Abstract
How do control rooms enable today’s networked urban life? And how are events grasped and handled from within control rooms as cities become known in new ways? The paper will home in on how the events that interrupt urban life in the global north – the traffic accident, the delayed train, the power outage – are governed through control rooms; control rooms that are increasingly integrating an array of ‘smart’ technologies. Control rooms are sites for detecting and diagnosing events, where action to manage events is initiated in the midst of multiple forms of ambiguity and uncertainty. By focusing on the work of control rooms, the paper will ask what counts as an event of interruption or disruption and trace how forms of control are enacted.
About the speakers
Dr Ben Anderson is a Reader in Human Geography at Durham University. Recently, he has become fascinated by how emergencies are governed and how emergencies govern. He currently leads a Leverhulme Trust International Network on the theme of ‘Governing Emergencies’, and is conducting a genealogy of the government of and by emergency, supported by a 2012 Philip Leverhulme Prize. Previous research has explored the implications of theories of affect and emotion for contemporary human geography. This work will be published in a monograph in 2014: Encountering Affect: Capacities, Apparatuses, Conditions (Ashgate, Aldershot). He is also co-editor (with Dr Paul Harrison) of Taking-Place: Non-Representational Theories and Geography (2010, Ashgate, Aldershot).
Dr Rachel Gordon completed an ESRC-funded PhD on the situated work of control rooms, with particular reference to transport systems and to how control rooms deal with complex urban systems. She currently coordinates a Leverhulme Trust international network on Governing Emergencies, after completing an EPSRC-funded project on the relation between control rooms and smart technologies.
The Programmable City team delivered four papers at the Conference of the Association of American Geographers held in Tampa, April 8-12. Here are the slides for Kitchin, R., Lauriault, T. and McArdle, G. (2014). “Urban indicators, city benchmarking, and real-time dashboards: Knowing and governing cities through open and big data” delivered in the session “Thinking the ‘smart city’: power, politics and networked urbanism II” organized by Taylor Shelton and Alan Wiig. The paper is a work in progress and was the first attempt at presenting work that is presently being written up for submission to a journal. No doubt it’ll evolve over time, but the central argument should hold.