Moving applications: A multilayered approach to mobile computing Jim Merricks White, National University of Ireland, Maynooth
Abstract
Mobile computing plays an increasingly important role in the way that space is experienced in the city. This has political consequences, both at the micro level of everyday production and consumption, and at the macro level of institutional and political economy. While geographers have explored the ontological role which might be played by hardware, software, data and mapping within this spatial paradigm, there remains little concerted effort to explore mobile computing as a technological system which incorporates all of these socio-technical assemblages. By drawing on adjacent disciplines of science and technology studies (STS) and media and communication studies, this essay proposes a multilayered model for such a holistic inquiry: hardware—software—data(base)—GUI (graphical user interface).
By applying this model to a self-reflexive exploration of the taxi service Hailo and the mobility tracking application Moves, I attempt to demonstrate how it might be put to work as a heuristic tool. Following on from my desire to expose and explore the politics of mobile computing, the model is used to draw attention to the networks of power which make up these mobile computing services.
Digital urbanism in crises: A hopeful monster? Monika Büscher, With Michael Liegl, Katrina Petersen, Mobilities.Lab, Lancaster University, UK
Abstract
Intersecting mobilities of data, people and resources are an integral part of a new digital urbanism. Thrift speaks of Lifeworld.Inc, a new entertainment-security sector driven contexture where people’s everyday activities, movements, physiological data, thoughts, desires and fears are so richly documented in real time that commercial enterprise as well as urban services (transport, energy, security) can dynamically anticipate and shape them ‘just-in-time’ (2011). While this opens up novel opportunities for more efficiency, comfort, and sustainability in networked urban mobilities, it also provides new leverage for mobilizing disaster response. In a ‘century of disasters’ (eScience 2012), where urbanization has increased vulnerability and climate change contributes to increased frequency and severity of disasters, this opens up a perspicuous site for investigations of post-human practices, phenomenologies and ethics. Big data analytics and information sharing for risk prevention and disaster response can exacerbate the unprecedented surveillance contemporary societies practice (Harding 2014), Kafkaesque transformations of privacy and civil liberties (Solove 2004) and a splintering urbanism (Graham & Marvin 2001). At the heart of these transformations is a digital phenomenology of invisibility, immateriality and ‘intelligence’ that does not lend itself to human control. ‘Smart cities’ may depend on smart citizens (Greenfield 2013), but the technologies contemporary societies produce do not support human intelligence. We report from ‘inside the belly of the beast’ of innovation in mobilizing Lifeworld.Inc data for disaster response (Balka 2006). Drawing on experience from collaborative research and design projects (e.g. http://www.bridgeproject.eu/en), we discuss the relationship between lived cyborg practice, phenomenology and ethics in networked urban mobilities. Using a disaster perspective for a disclosive ethical investigation (Introna 2007) does disclose some potentially disastrous transformations, but it also highlights avenues for alternative, radically careful as well as carefully radical design (Latour 2009).
Abstract
The urban riots of the USA in the late 1960s were some of the most powerful political events of that era. As well as drawing numerous responses from media, the civil rights movement, black nationalists, and groups such as the Situationist International, the uprisings also triggered a range of research responses, including some of the first computational models of cities. T.C. Schelling’s “Models of Segregation” attempted to provide a logical model for racial segregation and laid much of the groundwork for what later became agent-based modeling. Such work is expressed contemporarily, for instance, in the riot and insurgency modeling of J.M. Epstein and others. For the state, such events mark a schizophrenic relationship to the contingency of riot and to how algorithms play out in such a scenario. How can it govern events that both demonstrate and excite its power and also undermine it? This paper will propose a tracing of the genealogy of such models alongside a reading of other ways of using urban modeling in relation to the urban riots of that era and now. A parallel reference point here will be the work of W. Bunge, a quantitative geographer and spatial theorist. Bunge consistently argued that geometrical patterns and morphological laws express disadvantage and injustice under contemporary capitalism, and that identified patterns could be remedied by rational methods.
The history of computing, from G.W. Leibniz onwards, tangles with the problematic of developing rational approaches to complex, multi-dimensional problems with a high degree of what J. Law describes as “messiness”. This paper will examine the ways in which rationality, or ratio, is positioned in relation to urban conflict as a means of discussing the relations between the city and software. The paper will develop a discussion of ratio in relation to questions of abstraction, reduction and empiricism. We are especially concerned to find a relationship between abstraction and the empirical that, by working with the materiality of computational systems, recognises, and perhaps works with, the tendency to reduction(ism), but through which modes of abstraction may also work with the highly and complexly empirical.
We had a wonderful Code and the City workshop in September and we will be making the video recordings of the presentations available from today and on the following Fridays!
Today, we will be sharing videos from the Opening talk and First session: Code, coding and interfaces
Opening talk
Code and the city: Reframing the conceptual terrain Rob Kitchin, NIRSA, National University of Ireland Maynooth
Abstract
Software has become essential to the functioning of cities. It is deeply and pervasively embedded into the systems and infrastructure of the built environment and in the management and governance of urban societies. Software-enabled technologies and services augment and facilitate how we understand and plan cities, how we manage urban services and utilities, and how we live urban lives. This paper will provide an overarching overview of the ways in which software has become an indispensable mediator of urban systems and the consequent implications, and will make the case for the study of computational algorithms and how cities are captured in and processed through code.
Session 1: Code, coding and interfaces
Code-crowd: How software repositories express urban life Adrian Mackenzie, Sociology, Lancaster University
Abstract
Is code an expression of urban life? This paper analyses around 10 million software repositories on Github.com from the perspective of how they include cities. The methodology here relies on data-intensive work with bodies of code at a number of different levels. It maps the geographies of Github organisations and users to see how location anchors coding work. More experimentally, it tracks how urban spaces, movements and architectures figure in and configure code. The paper’s focus is less on how code shapes cities and more on apprehending code and coding as a way of experientially inhabiting cities. This approach might better highlight how code expresses urban experiences of proximity, mixing, movement, nearness, distance, and location. It might also shed light on the plural forms of spatiality arising from code, particularly as algorithmic processes become more entangled with each other.
Encountering the city at hackathons
Sophia Maalsen and Sung-Yueh Perng, National University of Ireland, Maynooth
Abstract
The significance of hackathons is currently growing in two mutually informing ways. On the one hand, there is an increasing use of hackathons to address issues of city governance – Chris Vein, US CTO for government innovation, has described them as ‘sensemaking’ tools for government, encouraging agencies to make use of hackathons and “let the collective energy of the people in the room come together and really take that data and solve things in creative and imaginative ways” (Llewellyn 2012). On the other, regular hack nights appear as creative urban spaces for citizens to discuss problems they encounter, and which are not necessarily considered by government, and to produce solutions to tackle these issues.
In this paper, we explore potential opportunities and tensions, as well as excitement and inattentiveness, emerging as solutions are proposed and pursued. Through this, we reflect upon how such processes translate the city and transform ways of living in places where the solutions are applied. We further ask whether the positive discourse surrounding hackathons is justified or whether there are limits to their ability to deal with the complexity of urban issues.
Interfacing urban intelligence Shannon Mattern, Media Studies, New School NY
Abstract
Technology companies, city governments, and design firms – the entities teaming up to construct our highly networked cities of the future – have prototyped interfaces through which citizens can engage with the smart city. But those prototypes, almost always envisioned as screens of some sort, embody institutional values that aren’t always aligned with those of citizens who rightfully claim a “right to the city.” Based on promotional materials from Cisco, Siemens, IBM, Microsoft, and their smart-city-making counterparts, it seems that one of the chief preoccupations of our future cities is to reflect their data consumption and hyper-efficient (often “widgetized”) activity back to themselves. We thus see city “control centers” lined with screens that serve in part to visualize, and celebrate, the city’s own supposedly hyper-rational operation. Public-facing interfaces, meanwhile, are typically rendered via schematic mock-ups, with little consideration given to interface design. They’re portrayed as conduits for transit information, commercial and service locations and reviews, and information about cultural resources and tourist attractions; and as portals for gathering user-generated data. Across the board, these interfacing platforms tend to frame their users as sources of data that feed the urban algorithmic machines, and as consumers of data concerned primarily with their own efficient navigation and consumption of the city.
In this talk, I’ll consider how we might design urban interfaces for urban citizens, who have a right to know what’s going on inside “’black boxed’ [urban] control systems” – and even to engage with the operating system as more than mere data-generators or reporters-of-potholes-and-power-outages. In considering what constitutes an ideal urban interface, we need to examine those platforms that are already in existence, and those that are proposed for future cities. Even the purely hypothetical, the speculative – the “design fiction” – can illuminate what’s possible, technologically, aesthetically, and ideologically; and can allow us to ask ourselves what kind of a “public face” we want to front our cities, and, even more important, what kinds of intelligence and agency – technological and human – we want our cities to embody.
Do come back next Friday! The next session awaits!
We have the video and slides ready for you now! We will also be starting our seminar series for the coming year, so do come back for the latest updates on our events!
You can write down equations that predict what people will do. That’s the huge change. So I have been running the big data conversation … It’s about the fact that you can now understand customers, employees, how we organise, in a quantitative, predictive way for the first time.
Predictive analytics is fervently discussed in the business world, if not yet fully taken up, and increasingly by public services, governments and medical practices seeking to exploit the value hidden in public archives or even in social media. In New York, for example, a ‘geek squad’ attached to the Mayor’s office seeks to uncover deep and detailed relationships between the people living there and the government, while at the same time realising “how insanely complicated this city is”. Here, an intriguing question remains as to the effectiveness of predictive analytics, the extent to which it can support and facilitate urban life, and the consequences for cities that are immersed in a deep sea of data, predictions and humans.
Let’s start with an Australian example. The Commonwealth Scientific and Industrial Research Organisation (CSIRO) has partnered with Queensland Health, Griffith University and Queensland University of Technology to develop the Patient Admission Prediction Tool (PAPT), which estimates presentations to nearby hospitals by ‘schoolies’, Australian high-school leavers on week-long holidays after their final exams. The PAPT derives its estimates from Queensland Health data on schoolies presentations in previous years, including statistics about the numbers of presentations, the parts of the body injured and the types of injury. Using these data, the PAPT benefits hospitals, their employees and patients through improved scheduling of hospital beds, procedures and staff, with the potential of saving $23 million per annum if implemented in hospitals across Australia. As characterised by Dr James Lind, one of the benefits of adopting predictive analytics is a proactive rather than reactive approach to planning and management:
People like working in a system that is proactive rather than reactive. When we are expecting a patient load everyone knows what their jobs [are], and you are more efficient with your time.
Patients are happy too, because they receive and finish treatment more quickly.
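The actual PAPT model is not public, but the basic idea – forecasting presentations from previous years’ data so that beds and staff can be scheduled in advance – can be illustrated with a minimal sketch. Everything here is hypothetical: the file name, the column names and the simple averaging model are assumptions for illustration only, not CSIRO’s method.

```python
# A minimal, hypothetical sketch of forecasting daily emergency-department
# presentations from previous years' counts. This is NOT the PAPT model;
# the file and column names (schoolies_presentations.csv, day, count)
# are invented for illustration.
import csv
from collections import defaultdict
from statistics import mean

# Collect historical counts keyed by day of the schoolies period.
history = defaultdict(list)
with open("schoolies_presentations.csv") as f:
    for row in csv.DictReader(f):
        history[int(row["day"])].append(int(row["count"]))

# Forecast each day as the mean of previous years, and roster staff to a
# simple safety margin above the expected load.
for day in sorted(history):
    expected = mean(history[day])
    print(f"day {day}: expect ~{expected:.0f} presentations, "
          f"roster for {1.2 * expected:.0f}")
```

Even a toy like this makes the ‘proactive’ logic visible: the forecast exists before the patients arrive, so the planning decision shifts from reacting to a queue to sizing capacity in advance.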
Can we find similar success when predictive analytics is applied to other forms of urban governance? Let us move the discussion back to US cities and take policing as an example. In many cities, policing is shifting from reactive to proactive, at experimental or implementation stages. PredPol is predictive policing software produced by a startup company, and it has attracted considerable attention from police departments in the US and other parts of the world. Its success as a business, however, has partly to do with its “energetic” marketing strategies, contractual obligations requiring customers to refer the company to other law enforcement agencies, and so on.
Above all, the company’s claims of success are difficult to sustain on closer examination. The subjects of the analytics are very specific: burglaries, robberies, vehicle thefts, thefts from vehicles and gun crimes. In other words, these are crimes that offer “plenty of data to chew on” for making predictions, and they are opportunistic crimes that are easier to prevent through the presence of patrolling police (more details here).
This further brings us to the issue of the “proven” and “continued” aspects of success, which are even more difficult and problematic to pin down when evaluating the “effectiveness” and “success” of predictive policing. To prove that an algorithm performs well, the expectations for which it is built and tweaked have to be specified, not only for those who build the algorithm, but also for the people who will be entangled in the experiments in intended and unintended ways. In this sense, transparency and objectivity are important for predictive policing. Without acknowledging, considering and evaluating how both crime and everyday life, both normality and abnormality, are transformed into algorithms, and without disclosing these transformations for validation and consultation, a system of computational criminal justice can turn into, if not witch-hunting, then alchemy: put two or more elements into a magical pot, stir them and see what happens!
This is further complicated by existing and inherent inequalities in crime data, for example in reporting or sentencing, and the perceived neutrality of algorithms can serve to justify cognitive biases that are well documented in the justice system, such as the rationale that someone should be treated more harshly because they are already on a blacklist, without reconsidering how the person got onto such a list in the first place. There is an even darker side to predictive policing when mundane social activities are routinely treated as crime data, as when social network analysis is used to profile and incriminate groups and groupings of individuals. This is also a dynamic and contested field of play: while crime prediction practitioners (coders, private companies, government agencies and so on) appropriate personal data and private social media messages for purposes they were never intended for, criminals (or activists, for that matter) play with social media, if not yet with prediction results obtained by reverse engineering the algorithms, to plan coups, protests, attacks, etc.
For those who want to look further into how predictive policing is set up, proven, run and evaluated, there are ways of opening up the black box, at least partially, and critically reflecting upon what exactly it can achieve and how “success” is managed, both in computer simulation and in police practice. The chief scientist of PredPol gave a lecture in which, as has been pointed out:
He discusses the mathematics/statistics behind the algorithm and, at one point, invites the audience not to take his word for its accuracy because he is employed by PredPol, but to take the equations discussed and plug in crime data (e.g. Chicago’s open source crime data) to see if the model has any accuracy.
The video of the lecture is here.
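For readers who want a feel for what “plugging in open crime data” might involve, the sketch below is a deliberately simplified illustration in the spirit of self-exciting hotspot models, not PredPol’s proprietary implementation. It bins past events into grid cells, scores each cell as a long-run background rate plus an exponentially decaying boost from recent events, and then checks how many of the final day’s crimes fall inside the top-ranked cells. The input format is an assumption (a CSV export with date, latitude and longitude columns), as are the grid size, decay rate and number of flagged cells.

```python
# Simplified hotspot scoring in the spirit of self-exciting point-process
# models: score(cell) = background rate + decaying boost from recent events.
# Illustrative sketch only; parameters and input format are assumptions.
import csv
import math
from collections import Counter
from datetime import datetime

CELL = 0.005   # grid cell size in degrees (roughly 500 m), an assumed choice
DECAY = 0.2    # per-day decay of the "near repeat" boost
TOP_K = 20     # number of cells flagged for extra patrols

def cell_of(lat, lon):
    return (round(lat / CELL), round(lon / CELL))

events = []
with open("crimes.csv") as f:   # hypothetical export of open crime data
    for row in csv.DictReader(f):
        try:
            day = datetime.strptime(row["date"][:10], "%Y-%m-%d")
            events.append((day, cell_of(float(row["latitude"]),
                                        float(row["longitude"]))))
        except (ValueError, KeyError):
            continue            # skip rows with missing dates or coordinates

events.sort()
target_day = events[-1][0]                        # "predict" the final day
train = [e for e in events if e[0] < target_day]
test = [e for e in events if e[0] == target_day]

days_observed = max((target_day - train[0][0]).days, 1)
counts = Counter(c for _, c in train)

# Background intensity per cell, plus a self-exciting boost per recent event.
scores = {c: n / days_observed for c, n in counts.items()}
for t, c in train:
    scores[c] += math.exp(-DECAY * (target_day - t).days)

hotspots = set(sorted(scores, key=scores.get, reverse=True)[:TOP_K])
hits = sum(1 for _, c in test if c in hotspots)
print(f"{hits}/{len(test)} of the day's events fell in the top {TOP_K} cells")
```

A sketch like this also makes the evaluation problem concrete: the “hit rate” it prints depends entirely on choices that sit outside the mathematics (which crime types are in the file, how large the cells are, how many cells are flagged), which is exactly why claims of success need to be opened up rather than taken on trust.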
Furthermore, RAND provides a review of predictive policing methods and practices across many US cities. The report can be found here; it analyses the advantages gained by various crime prediction methods as well as their limitations. Predictive policing, as shown in the report, is far from a crystal ball, and involves various levels of complexity to run and implement, mathematically, computationally and organisationally. Predictions range from crime mapping to predicting crime hotspots given certain spatiotemporal characteristics of crimes (see the taxonomy on p. 19). As far as predictions are concerned, they are good as long as future crimes look similar to those of the past in their types, temporality and geographic prevalence, and as long as the data are good, which is a big if! Predictions are also better when they are further contextualised. Compared with predicting crimes without any help (not even the intelligence that agents in the field can gather), applying mathematics to the guessing game creates a significant advantage, but the differences among the various methods are not as dramatic. Therefore, one of the crucial messages of reviewing and contextualising predictive methods is that:
It is important to bear in mind that the predictive methods discussed here do not predict where and when the next crime will be committed. Rather, they predict the relative level of risk that a crime will be associated with a particular time and place. The assumption is always that the past is prologue; predictions are made based on the analysis of past data. If the criminal adapts quickly to police interventions, then only data from the recent past will be useful to police departments. (p. 55)
Therefore, however automated the system, human and organisational efforts are still required in many areas of practice. Activities such as finding relevant data, preparing them for analysis, and tweaking factors, variables and parameters all require human effort, teamwork, and the translation of decisions into actions for reducing crime at the organisational level. Similarly, human and organisational efforts are needed again when types and patterns of crime change, when targeted crimes shift, and when results have to be interpreted and integrated in relation to the changing availability of resources.
Furthermore, the report reviews the issues of privacy, transparency, trust and civil liberties within existing legal and policy frameworks. It becomes apparent, however, that predictions and predictive analytics need careful and mindful design, responding to the emerging ethical, legal and social issues (ELSI) that arise when the impacts of predictive policing occur at individual and organisational levels, affecting the day-to-day life of residents, communities and frontline officers. While it is important to maintain and revisit existing legal requirements and frameworks, it is also important to respond to emerging information and data practices, and notions of “obscurity by design” and “procedural data due process” offer ways of rethinking and redesigning the relationships between privacy, data, algorithms and predictions. Even the term transparency needs further reflection if progress is to be made on what it means in the context of predictive analytics and how it can be achieved, taking into account renewed theoretical, ethical, practical and legal considerations. In this context, “transparent predictions” have been proposed, outlining the importance, and the potential unintended consequences, of rendering prediction processes interpretable to humans and driven by causation rather than correlation. Critical reflections on such a proposal are useful, for example this two-part series – (1) (2) – which further contextualises transparency both in prediction processes and in case-specific situations.
Additionally, IBM has partnered with the New York Fire Department and the Lisbon Fire Brigade. The main goal is to use predictive analytics to make smart cities safer by allocating emergency response resources more effectively. Similarly, crowd behaviours have already been simulated to understand and predict how people would react in various crowd events, in places such as major traffic and travel terminals, sports and concert venues, shopping malls and streets, busy traffic areas, etc. Simulation tools take into account data generated by sensors, as well as quantified embodied actions such as walking speeds, body sizes or disabilities, and it is not difficult to imagine further data and analysis taking advantage of social media, where sentiments and psychological aspects are expected to refine simulation results (a review of simulation tools).
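To give a flavour of what such simulation tools compute, here is a toy sketch of agents with individual walking speeds heading for an exit and slowing down as their local neighbourhood becomes crowded. It is a cartoon of the quantified embodied parameters mentioned above (speeds, spacing), not any vendor’s pedestrian or crowd-simulation engine; all the numbers are assumptions.

```python
# Toy crowd model: agents with individual walking speeds head for an exit,
# slowing when their neighbourhood gets dense. Illustrative only; parameter
# values are assumptions, and this is not a production simulation engine.
import random

EXIT = (0.0, 0.0)
STEP = 1.0            # seconds per simulation tick
CROWD_RADIUS = 2.0    # metres within which neighbours slow an agent down

class Agent:
    def __init__(self):
        self.x, self.y = random.uniform(5, 50), random.uniform(5, 50)
        self.speed = random.uniform(0.8, 1.6)    # walking speed in m/s

    def dist_to_exit(self):
        return ((self.x - EXIT[0]) ** 2 + (self.y - EXIT[1]) ** 2) ** 0.5

    def step(self, others):
        d = self.dist_to_exit()
        if d < 0.5:
            return True                          # reached the exit
        crowding = sum(1 for o in others if o is not self and
                       abs(o.x - self.x) + abs(o.y - self.y) < CROWD_RADIUS)
        v = self.speed / (1 + 0.5 * crowding)    # denser crowd, slower pace
        self.x -= v * STEP * (self.x - EXIT[0]) / d
        self.y -= v * STEP * (self.y - EXIT[1]) / d
        return False

agents = [Agent() for _ in range(200)]
for tick in range(300):
    agents = [a for a in agents if not a.step(agents)]
    if not agents:
        print(f"everyone out after {tick + 1} seconds")
        break
else:
    print(f"{len(agents)} agents still inside after 300 seconds")
```

Real tools add far richer behaviour and calibrate against sensor data, but the structure is similar: embodied parameters go in, aggregate movement patterns come out, and those outputs then inform decisions about how venues and streets are managed.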
To bring the discussion to a pause: data, algorithms and predictions are quickly turning not only cities but also many other places into testbeds, as long as there are sensors (human and nonhuman) and the Internet. Data become available and different kinds of tests can then be run to verify ideas and hypotheses. As many articles have pointed out, data and algorithms are flawed, revealing and reinforcing unequal parts and aspects of cities and city lives. Tests and experiments, such as Facebook’s manipulation of user emotions in its experiments, can also make cities vulnerable when they are run without regard for embodied and emotionally charged humans. There is, therefore, a great deal more to say about data, algorithms and experiments, because the production of data and the experiments that make use of it are always an intervention rather than an evaluation. We will be coming back to these topics in subsequent posts.
We are delighted to report that, as part of Silicon Republic’s Women Invent Tomorrow campaign, Dr Tracey P. Lauriault has been named one of the top 100 women in science, technology, engineering and maths (STEM).
In Ireland she has been actively engaged in critical data research on topics related to Open Data, Big Data, Open Government and Infrastructures. Like other members of the Programmable City team she is doing fieldwork in Dublin and Boston, and because Canada is her home country, it has been added as a further field site. She is also a big supporter of evidence-based decision making and deliberative democracy, and considers numbers, data and open and interoperable infrastructures to be part of that process.
Congratulations to Tracey! And you can read more about other inspiring women over at Silicon Republic.
ERC Advanced Investigator on the Programmable City project, Professor Rob Kitchin, was awarded the Royal Irish Academy Gold Medal last month, and we now have video footage of the event from the Royal Irish Academy, in which the contributions of the gold medalists not only to research communities but also to Irish society were highlighted.