Category Archives: news

Boston fieldwork

Boston and Brookline from the tower in Mount Auburn Cemetery.

Boston from the tower in Mount Auburn Cemetery

From April 2nd to 30th, five of the Programmable City team travelled to Boston (or rather, as we quickly learned, the Metro-Boston area, which is a conglomerate of 101 municipalities) to undertake fieldwork, staying in Cambridge.  Over the course of a busy month the team:

  • conducted 75 interviews/focus groups;
  • had 25 informal meetings;
  • undertook participant observation at 3 civic hacks;
  • were given 4 tours of facilities and 2 of the city;
  • presented 7 invited talks (at MIT (3), Harvard, Northeastern, UMass Boston and Analog Devices);
  • attended 8 other workshops/conferences (Bits and Bricks at MIT; Using Technology to Engage Constituents and Improve Governance at Northeastern; Civic Media meetup at MIT; Urban Mobility in Green Cities at Boston Univ; Microsoft Civic Innovation; Climate Change Policy after Paris at Boston Univ; Digital GeoHumanities at Harvard; City Mart at NY Civic Hall).

The interviews were conducted with a range of different stakeholders including municipal, regional and state-level government officials, various agencies, university researchers, and companies.  The research focused on mapping out the smart city landscape in general terms, with a particular in-depth focus on various data-driven initiatives in the metro area, transportation solutions, civic hacking, the development of civic tech, procurement of smart city technologies, and emergency management response.

Along with the 29 interviews conducted on previous visits, we now have a rich dataset of over 100 interviews to analyse in order to make sense of the Boston Metro area’s use of smart city technologies and to compare with Dublin (for which we have a couple of hundred interviews).  That said, we’ve not quite finished with the fieldwork and a couple of team members will be back at some point to extend their work.  We’ll also be returning for the Association of American Geographers conference which is being held in Boston in 2017 to present some of our findings.

We would like to thank everyone who agreed to take part in our research and for generously sharing their knowledge, insights and time, and also for helping to introduce us to other potential interviewees and generally steer us in the right direction.  We very much appreciate the excellent hospitality we received during our visit.  The next task is to get all the interviews transcribed and to start the coding work.  No small task!

Rob Kitchin

Seminar: “Smartcontracts and smartcities, displacing power through authentication?”

We are delighted to have Dr. Gianluca Miscione as a guest speaker on Wednesday 18th May at 3pm in the Iontas Building, room 2.31, for the fourth of our Programmable City seminars this year.

Gianluca Miscione joined the Management Information Systems group at the School of Business of University College Dublin in June 2012. Previously, he worked as Assistant Professor in Geo-Information and Organization at the Department of Urban and Regional Planning and Geo-Information Management, Faculty of Geo-Information Science and Earth Observation, University of Twente, Netherlands. He received his Ph.D. in Information Systems and Organization from the Sociology Department of the University of Trento, in collaboration with the Sociology Department of Binghamton University New York and the School of International Service of American University in Washington DC. While at the Department of Informatics of the University of Oslo, he broadened his research on information infrastructures to the global scale. Gianluca has conducted and contributed to research in Europe, Latin America, India, East Africa, and on the Internet. The focus has remained on the interplay between technologies and organizing processes, with a specific interest in innovation, development, organizational change and trust.

Gianluca will be talking about the organizing processes related to the automation of authentication in “smart contracts”, exploring what novel forms of ‘sociation’ smart contracts entangle with.


Unpacking Dublin as an emergent smart city

The Smart Dublin (SD) initiative has been promoted by Dublin City Council in collaboration with the other three local authorities of the Dublin city region to identify “open challenges” and to “drive innovation and collaboration in the development of new urban solutions, using open data and with the city region as a test bed”. (1)

Since its commencement in June 2015, the Smart Dublin initiative has conducted four one-day workshops with employees from each local authority (Dublin City Council, Dún Laoghaire Rathdown County Council, Fingal County Council, South Dublin County Council) to draw on their practical knowledge of the challenges facing the Dublin region, and to document the existing cases of smart city technologies and practices in each area for a new website, SmartDublin.ie, which explains Dublin’s merits as a smart city and the challenges ahead.

From June to December 2015, a number of case studies (2) and challenges were collected and identified, and then further studied by Prog City project researchers to create case study texts for SmartDublin.ie. The ‘soft launch’ of SmartDublin.ie took place on the 5th of October 2015 with a number of these case studies, and the final, more complete website will be launched on the 8th of March 2016 at Dublin City Hall.

Wow! Tickets for @smartdublin launch event snapped up in a few hours. We want you there! Last few tickets click here https://t.co/C7UhD0UGgX

— SmartDublin (@smartdublin) February 19, 2016

SD intends to act as a driver and connector for a step-change, coordinative transformation in Dublin’s smart city policies, moving from an approach based on the ‘creative city’ and entrepreneurism towards a greater emphasis on service delivery and efficiency, while keeping links with start-ups and open innovation processes and developing new forms of procurement and deployment of smart technology in an urban setting.

In particular, a specific form of procurement, called “procurement by challenge”, has been adopted by SD from Citymart, a consultancy located in Barcelona. Traditionally, procurement is based on identifying both the problem and its solution, and then tendering for the chosen solution. In contrast, “procurement by challenge” is based upon, firstly, identifying problems as “open challenges to entrepreneurs and citizens”, and secondly, seeking solutions through this process, awarding the development contract to the team that came up with the best solution. (3)

Thus conceived, SD is at the centre of various events and projects occurring in Dublin since autumn 2015 (Web Summit, SD soft launch, Open Agile Smart Cities seminar, Future of Cities seminar, Smart City tour, Smart District etc.). Its mandate is to provide a platform for smart city governance and innovation in order to make Dublin a global player in smart cities and the Internet of Things, while coping at the same time with the limited role of the public sector in urban transformation due to the recent recession and related austerity drive and the commensurate need to reduce the costs of public services.

The new ‘smart city atmosphere’ created and promoted through SD shows the following interrelated features, marking a significant change in how Dublin tackles governance and innovation:

  • a challenge-driven form of urban innovation: it reframes the procurement relations between public and private sector to mobilise resources focused on “problems instead of solutions” and to establish shared governance practices and standards;
  • a test-bedding approach: urban space becomes a distributed laboratory in which to test smart city technologies based on big data and the Internet of Things, creating test sites that might help solve challenges faced by Dublin; “allowing to explore smart city solutions in a space small enough to trial and wide enough to prove”;
  • mutable scales: a shift from the Dublin city core to the Dublin city region scale as a joint endeavour of the four local authorities. This changes to the scale of “networked cities” when engaging with global settings, such as in the case of Open Agile Smart Cities.

A number of recently initiated ProgCity case study projects aim to explore how these changes affect Dublin’s urban space and management, starting from the settings where the new forms of procurement and test-bedding are generated and adopted.

The objective is to understand how smart city management ideas circulate and interact with the adoption of smart technologies, thus shaping Dublin’s organizational, technological and everyday settings. Research will focus on different processes occurring in test-bedding and procurement:

  • accidental smart urbanism through multiple co-existing, co-evolving and conflicting forms of algorithmic governance applied to traffic control, environmental monitoring and crowd management;
  • anticipation and demonstration as coordination and performative devices: how procurement and test-bedding embody and enact anticipation and demonstration dynamics; how they interact with the changing scale of Dublin and perform its specific material, social and cultural urban arrangements; and finally how they make sense of an accidental and fragmented smart city landscape.

Two other projects are looking at existing and emerging Smart City case studies in Dublin:

  • Real-Time Passenger Information (RTPI): this looks at the interaction between code and space resulting from the implementation of this technology into Dublin’s transport systems. This case study will seek to examine a real-world data assemblage in relation to how data flows interact with spatial flows;
  • Smart Districts: this work follows an emerging project that seeks to harness the large-scale urban developments in the Dublin Docklands as an exemplar for trialling smart technologies. This will look at how smart technologies become part of urban masterplanning in the context of a large urban development with many actors involved in planning and decision-making.

These two projects will examine real-world examples of transduction and translation; how the city interacts with code, each continually reshaping the other. In the case of RTPI, this is concerned with how code and physical movement interact, and in the case of Smart Districts, how urban space is co-configured with smart technologies.

Together, these projects will seek to unpack Dublin as an emerging ‘Smart City’, following how the concept itself takes form through the interplay of new technologies and new forms of procurement. They will also look at how urban big data are tested and used to regulate and shape the temporal and spatial dimensions of urban space, as well as social relations.

Claudio Coletta, Liam Heaphy

References

(1) SD report “Local authority challenge identification workshops” (2015, unpublished).

(2) http://progcity.maynoothuniversity.ie/2015/12/dublin-as-a-smart-city/

(3) http://www.citymart.com

New project: Citizen-related data privacy/protection concerns arising from the development of smart cities

Prof. Rob Kitchin has recently been awarded a research contract by the Department of the Taoiseach to examine citizen-related data privacy/protection concerns arising from the development of smart cities.  More specifically, the research is to help inform the work of the new Government Data Forum, an initiative of Dara Murphy TD, the Minister for EU Affairs and Data Protection.  The research is to:

  • Identify, document and summarise key developments in the area of smart cities in Ireland, at EU level and internationally;
  • Identify and report on concerns and challenges for citizens regarding key data protection, data privacy and associated issues that arise in the context of smart cities;
  • Identify and report on best practice initiatives that have been undertaken to address these issues.

The project will run for two months from October 1st to November 30th and result in a detailed report setting out the main data privacy/protection issues associated with smart city technologies and initiatives.


Data and the City workshop (day 2)

Reblogged from Po Ve Sham – Muki Haklay’s personal blog:

The second day of the Data and City Workshop (here are the notes from day 1) started with the session Data Models and the City.

Pouria Amirian started with Service Oriented Design and Polyglot Binding for Efficient Sharing and Analysing of Data in Cities. The starting point is that the management of the city needs data, and therefore technologies to handle data are necessary. In the traditional pipeline, we start from sources, then use tools to move the data to a data warehouse, and then do the analytics. The problems with the traditional approach are the size of the data (management of the data warehouse is very difficult), the need to deal with real-time data that must be answered very fast, and finally new data types – from sensors, social media and cloud-born data that originate outside the organisation. Therefore, it is imperative to stop moving data around and instead analyse it where it is. Big Data technologies aim to resolve these issues – e.g. from the development of the Google distributed file system that led to Hadoop, to similar technologies. Big Data relates to the technologies that are used to manage and analyse it. The stack for managing big data now includes over 40 projects to support different aspects of governance, data management, analysis etc. Data science spans many areas – statistics, machine learning, visualisation and so on – and no one expert can know all these areas (such experts are about as common as unicorns). There is interaction between data science researchers and domain experts, and that is necessary for ensuring reasonable analysis. In the city context, these technologies can be used for different purposes – for example, deciding on the allocation of bikes in the city using real-time information that includes social media (Barcelona). We can think of data scientists as active actors, but there are also opportunities for citizen data scientists using tools and technologies to perform the analysis. Citizen data scientists need data and tools – such as a visual analysis language (AzureML) that allows them to create models graphically and set a process in motion.
Access to data is required – both finding the data and accessing it – so interoperability is important. Service-oriented architecture (which uses web services) is an enabling technology for this, and the current Open Geospatial Consortium (OGC) standards require some further development and changes to make them relevant to this environment. Different services can be provided to different users with different needs [comment: but that increases maintenance and complexity]. No single stack provides all the needs.
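As a hedged aside (not from the talk): in the OGC's service-oriented model, a Web Feature Service request is just a parameterised HTTP URL, which is what makes it interoperable across clients. The sketch below only builds such a URL – the endpoint and layer name are hypothetical, and no network call is made; only the query-string parameters follow the WFS 2.0 convention.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint: str, type_name: str, count: int = 10) -> str:
    """Build an OGC WFS 2.0 GetFeature request URL (no network call is made)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,  # the feature layer to query
        "count": count,          # limit the number of returned features
        "outputFormat": "application/json",
    }
    return f"{endpoint}?{urlencode(params)}"

# Hypothetical endpoint and layer name, for illustration only.
url = wfs_getfeature_url("https://example.org/geoserver/wfs", "city:bike_stations")
print(url)
```

Because every parameter lives in the URL, any HTTP client – a browser, a script, a GIS package – can consume the same service, which is the interoperability point being made above.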

Next Mike Batty talked about Data about Cities: Redefining Big, Recasting Small (his paper is available here) – exploring how Big Data was always there: locations can be seen as bundles of interactions – flows in systems. However, visualisation of flows is very difficult, which makes it challenging to understand the results and check them. The core issue is that with N locations there are N^2 interactions, and this quadratic growth as N grows is a continuing challenge in understanding and managing cities. In 1964, Brian Berry suggested a system based on location, attributes and time – but the temporal dimension was suppressed for a long time. With Big Data, the temporal dimension is becoming very important. How difficult it is to understand such data is demonstrated by travel flows – the more regions are included, the bigger the interaction matrix, and it then becomes difficult to show and make sense of all the interactions. Even scatter plots become complex and do not reveal much.
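A minimal sketch (not from the talk) of why flow data scales so badly: a full origin–destination matrix over N zones has N^2 cells, so each doubling of the number of zones quadruples the number of flows to visualise.

```python
def flow_cells(n_zones: int) -> int:
    """Number of cells in a full origin-destination matrix over n_zones."""
    return n_zones ** 2

# Each doubling of the zone count quadruples the flows to display.
for n in (10, 20, 40, 80):
    print(n, flow_cells(n))
```

This is the practical sense in which finer regional partitions quickly outrun what a flow map or scatter plot can legibly show.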

The final talk was from Jo Walsh, titled Putting Out Data Fires: life with the OpenStreetMap Data Working Group (DWG). Jo noted that she was talking from her position as a volunteer in OSM, and recalled that 10 years ago she gave a talk about technological determinism with a not completely utopian picture of cities, in which OpenStreetMap (OSM) was considered part of the picture. Now, in order to review the current state of OSM activities relevant to her talk, she asked on the OSM mailing list for examples. She also highlighted that OSM is big, but it’s not Big Data – it can still fit in a single PostGres installation. There is no anonymity in the system – you can find out quite a lot about people from their activity, and that is built into the system. There are all sorts of projects that demonstrate how OSM data is relevant to cities – such as OSM Buildings, which creates 3D buildings from the database, or the use of OSM with 3D modelling data such as DTMs. OSM provides support for editing in the browser or with an offline editor (JOSM). Importantly, it’s not only a map: OSM is also a database (like the new OSi database) – as can be shown by running searches on the database from a web interface. There are unexpected projects, such as custom clothing from maps, or Dressmap. More serious surprises are projects like the Humanitarian OSM Team and the Missing Maps project – there are issues with the quality of the data, but also concerns that mapping is imposed from the outside on an area that is not mapped, with some elements of colonial thinking in it (see Gwilym Eddes’ critique). The InaSAFE project is an example of disaster modelling with OSM. In Poland, they extended the model to mark details of road areas and other features. All these demonstrate that OSM is getting close to the next level of using geographic information, and there are current experimentations with it. Projects such as the UTC of Mappa Marcia are linking OSM to transport simulations.
Another activity is the use of historical maps – townland.ie.
One of the roles that Jo plays in OSM is being part of the Data Working Group, which she joined following a discussion about diversity in OSM within the community. The DWG needs some help, and its role is ‘geodata thought police / janitorial judicial service / social work arm of the volunteer fire force’. The DWG cleans up messy imports and deals with vandalism, but also handles dispute resolution. They are similar to a volunteer fire service: when something happens, you can see the sysadmins spring into action to deal with an emerging issue. For example, someone from Uzbekistan reported finding corruption in some new information, so you need to find the changeset and ask people to annotate more – to say what they are changing and why. OSM is self-policing and self-regulating – but different people have different ideas about what they are doing, and different groups have different views of what they want to do. There are also clashes between armchair mappers and surveying mappers – a discussion between someone who is doing things remotely and a local person who knows the road and asks to change the classification. The DWG doesn’t have a legal basis, and some issues come up because of global cases – for example, translated names that do not reflect local practices. There are tensions between commercial actors who work on OSM and ordinary volunteer mappers. The DWG doesn’t have privileges over other users – it is recognised by the community and gathers authority through consensus.
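As a hedged illustration of the "OSM is a database, not just a map" point (not part of the talk): the Overpass API lets anyone query OSM features declaratively. The sketch below only constructs an Overpass QL query string – the tag and bounding box are illustrative choices, and no network request is made.

```python
def overpass_query(key: str, value: str, bbox: tuple) -> str:
    """Build an Overpass QL query for nodes tagged key=value inside bbox.

    bbox is (south, west, north, east) in decimal degrees.
    """
    south, west, north, east = bbox
    return (
        "[out:json][timeout:25];"
        f'node["{key}"="{value}"]({south},{west},{north},{east});'
        "out body;"
    )

# Illustrative query: drinking-water taps in a rough central-Dublin bounding box.
q = overpass_query("amenity", "drinking_water", (53.33, -6.30, 53.36, -6.22))
print(q)
```

Sent to an Overpass endpoint, such a query returns matching features as JSON – the kind of "running searches on the database from a web interface" mentioned above.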

The discussion that followed this session explored examples of OSM: there are conflicted areas such as Crimea and other contested territories. Pouria explained that in current distributed computing models there are data nodes; the data is kept static, and the code, rather than the data, is transferred to it. There is a growing bottleneck in network latency due to the amount of data. There is a hierarchy of packaging systems that you need to use in order to work with a distributed web system, so tightening up code is an issue.
Rob – there are limits to Big Data, such as hardware and software, as well as the analytics of information, and limits to how far you can foster community when the size is very large and the organisation is managed by volunteers. Mike – the quality of big data poses rather different problems from traditional data, so while things are automated, making sense of the data is difficult – e.g. a tap-in without a tap-out in the Oyster data. The bigger the dataset, the bigger the issues with it might be. The knowledge that we get is heterogeneous in time and transfers the focus to the routine. But evidence is important to policy making and making cases. Martijn – how do we move the technical systems to allow a move to focal community practice? Mike – the transport modelling is based on promoting digital technology use by the funders, and it can be done for a specific place; the question is who the users are. There is no clear view of who they are, and there is wide variety, with different users playing different roles. First, ‘policy analysts’ are the first users of models – they are domain experts who advise policy people – with less thinking about informed citizens. How do people react to big infrastructure projects? The articulation of the policy is different from what comes out of the models, and there are projects with open and with closed mandates. Jo – OSM has a tradition of mapping parties bringing people together, but this needs a critical mass already there – so how do you bootstrap the process, for instance how do you support a single mapper in Houston, Texas? There are cases of companies using the data while local people used historical information, which created conflict over how the data is used. Sometimes the tension runs very high, and it does need negotiation. Rob – there are issues around the concepts of data citizens and digital citizenship.
Jo – in terms of community governance, the OSM Foundation is very hands-off, and there isn’t a detailed process for dealing with corporate employees who are mapping as part of their job. Evelyn – the conventions are matters of dispute and negotiation between participants, and they are being challenged all the time. One of the challenges of dealing with citizenship is to challenge the boundaries and protocols that go beyond the state – retaining the term while separating it from the subject.

The last session in the workshop focused on Data Issues: surveillance and crime.

David Wood talked about Smart City, Surveillance City: human flourishing in a data-driven urban world. The smart city is considered here as an archetype of the surveillance society; because it is part of the surveillance society, one way to deal with it is to consider resistance and abolishing it, to allow human flourishing. His interest is in rights – beyond privacy. What is it that we really want for human beings in this data-driven environment? We want everyone to flourish, and that means starting from the most marginalised, at the bottom of the social order. The idea of flourishing comes from Spinoza and also from Luciano Floridi – his anti-entropic information principle. Starting with smart cities: business and government depend on large quantities of data, which increases surveillance. Social science ignores that these technologies provide the ground for social life. The smart city concept includes multiple visions: for example, a European vision that is about government first – how to make good government in cities, with technology as part of a wider whole. The US approach asks how we can use information management for complex urban systems; this relies on other technologies – pervasive computing, IoT and things that are woven into the fabric of life. The third vision is the smart security vision – technology used to control urban terrain, with military techniques applied in cities (as well as in war zones), for example biometric systems for refugees in Afghanistan, which are also for control and the provision of services. The history goes back to cybernetics and to policing initiatives from the colonial era. The visions overlap – security is not overtly part of them (apart from military actors). Smart cities are inevitably surveillance cities – a collection of data for the purposeful control of population.
Specific concerns of researchers are the targeting of people who fit a certain kind of profile, and the aggregation of private data for profit at the expense of those involved. The critique of surveillance concerns sorting, the unfair treatment of people, etc. Beyond that – as discussed in the special issue on surveillance and empowerment – there are positive potentials, and many of these systems have a role for the common good. We need to think about the city within neoliberal capitalism, which separates people in space along specific lines and areas, from borders to buildings – trying to make the city into a tamed zone, although the dangerous parts of city life are also a source of opportunities and creativity. The smart city fits well with this aspect – stopping the city from being disorderly. A paper from 1995 critiqued pervasive computing as surveillance: the more it reduces the distance between us and things, the more the world becomes a surveillance device and stops us from acting on it politically. In many of the visions of pervasive computing the human is actually marginalised, and this is still the case. There are opportunities for social empowerment – say, allowing the elderly to return to areas they had stopped exploring, or overcoming disability. Participation, however, is flawed – who can participate, in what, where and how? Participation by highly technical people is limited to a very small group, and participation can also become instrumental – ‘sensors on legs’. The smart city could enable us to discover the beach under the pavement (a concept from the situationists) – and some are being hardened. The problem is corporate ‘walled garden’ systems, and we need to remember that we might need to bring them down.

Next, Francisco Klauser talked about Michel Foucault and the smart city: power dynamics inherent in contemporary governing through code. He is interested in the power dynamics of governing through data, taking from Foucault the question of how we can explain power put into action. He also considers different modes of power: referentiality – how does security relate to governing? Normativity – what is the norm and where did it come from? Spatiality – how are discipline and security spread across space? Discipline imposes a model of behaviour on others (the panopticon); security works in another way – it frees things up within limits. The two modes work together, and power starts from the study of a given reality. Data is about the management of flows. The specific relevance to data in cities is shown by looking at refrigerated warehouses that are used within the framework of the smart grid to balance energy consumption – storing and releasing the energy preserved in them. The whole warehouse has been objectified and quantified – down to specific products and the opening and closing of doors. He sees the core of the control in connections, processes and flows – think of liquid surveillance, beyond the human.

Finally, Teresa Scassa explored Crime Data and Analytics: Accounting for Crime in the City. Crime data is used in planning, the allocation of resources and public policy making – a broad range of uses. It is part of oppositional social justice narratives, and it is an artefact of the interaction of citizen and state, as understood and recorded by the agents of the state operating within particular institutional cultures. She looked at crime statistics that are provided to the public as open data – derived from police files under certain guidelines – and also at emergency call data, which is made from calls to the police and used to provide crime maps. The data used in visualisations about the city is not the same data that is used for official crime statistics. There are limits to the data. Institutional factors: it measures the performance of the police, not crime – it reflects how police are doing their job, and there are plenty of acts of ‘massaging’ the data by those who are observed; the stats are manipulated to produce the results that are requested. The police are the sensors, and there is under-reporting of crime according to the judgement of the police officer – e.g. sexual assault – and also through the privatisation of policing by actors who don’t report. Crime maps are offered by private sector companies that sell analytics and then provide a public-facing option – the narrative is controlled: what will be shared, and how. Crime maps are presented as ‘public awareness or civic engagement’ but not as transparency or accountability, and they focus on property offences rather than white-collar ones. There are ‘alternalytics’ – using other sources, such as victimisation surveys, legislation, data from hospitals and sexual assault crisis centres, and crowdsourcing. An example of bottom-up reporting is Harassmap, which started in Egypt to report cases of harassment. Legal questions include how the relationship between private and public sector data affects ownership, access and control; how the structure of the state affects data comparability and interoperability; and how law prescribes and limits what data points can be collected or reported.

The session closed with a discussion that explored examples of solutionism, like crowdsourcing that asks the most vulnerable people in society to contribute data about assaults against them, which is highly problematic. Crime data is popular in portals such as the London one, but it is mixed in with multiple other concerns, such as property prices. David – the utopian concept of platform independence, and the assumption that platforms are without values, is inherently wrong.

The workshop closed with a discussion of the main ideas and lessons that emerged from it – how all these things are playing out. Questions started emerging about how crowdsourcing can be bottom-up (OSM) and sometimes top-down, with issues about data cultures in citizen science, for example, and about the degree to which the political aspects of citizenship and subjectivity play out in citizen science. Re-engineering information in new ways and the rural/urban divide are issues that bodies such as the Ordnance Survey need to face; the conflicts within data are an interesting piece, as is ensuring that the data is useful. ‘Sensors on legs’ is a concept that can be relevant to bodies such as the Ordnance Survey. The concept of the stack is also relevant to where we position our research and what different researchers do, from the technical aspects to how people engage – the workshop gave a slicing through these layers. An issue left outside is the business aspect: who will use it, and how is it paid for? We need public libraries with the information, but also the skills to do things with the data. The data economy is important, and some data will only be produced by the state, but there are issues with data practices within the state’s data agencies – the data is not ready to get out. If data is garbage, you can’t do much with it – no economy can be based on it. Open questions include: when does data produce software? When does it fail? Can we produce data with and without a connection to software? There is also physical presence and environmental impact. Citizen engagement about infrastructure is lacking, and we need to tease out how things are open for people to get involved. There was also a need to be as nuanced about the city as we were about data.
Try to think about the ways the city is framed: as a site of activities, subjectivity and practices; as a source of data to be mined; as a political jurisdiction; as an aspiration – the city of tomorrow; as a concentration of flows; as a socio-cultural system; as a scale for analysis, a laboratory. And ‘data and the city’ – is it data for a city? Back to environmental issues: data is not ephemeral and does have tangible impacts (e.g. energy use in blockchain, inefficient algorithms, electronic waste (WEEE) that is left in the city). There are also issues of access and control over huge volumes of data – issues covered in papers such as ‘Device Democracy’. Wider issues link technology to broader systems of thought and consideration.

Data and the City workshop (day 1)

Reblogged from Po Ve Sham – Muki Haklay’s personal blog:

The workshop, which is part of the Programmable City project (funded by the European Research Council), is being held in Maynooth today and tomorrow. The papers and discussions touched on multiple current aspects of technology and the city: Big Data, Open Data, crowdsourcing, and critical studies of data and software. The notes below focus on aspects relevant to Volunteered Geographic Information (VGI), Citizen Science and participatory sensing – aspects of Big Data/Open Data are noted more briefly.

Rob Kitchin opened with a talk framing the workshop, highlighting the history of city data (see his paper on which the talk is based). We are witnessing a transformation from data-informed cities to data-driven cities. These data streams include Big Data, official data, sensors, drones and other sources, as well as volunteered information such as social media, mapping, and citizen science. Cities are becoming instrumented and networked, and the data is assembled through urban informatics (focusing on interaction and visualisation) and urban science (which focuses on modelling and analysis). There is a lot of critique: in relation to data, there are questions about the politics of urban data, the corporatisation of governance, the use of buggy, brittle and hackable urban systems, and social and ethical aspects. Examples of these issues include politics: accepting that data is not value-free or objective, and is influenced by organisations with specific interests and goals. Another issue is the corporatisation of data, with questions about data ownership and control. Further issues of data security and integrity arise when systems are buggy and brittle – there have already been cases of hacking into city systems. Social, political, and ethical aspects include data protection and privacy, dataveillance/surveillance, social sorting through algorithms, control creep, dynamic pricing and anticipatory governance (expecting someone to be a criminal). There are also technical questions: coverage, integration between systems, data quality and governance (and the communication of information about quality), and the skills and organisational capabilities to deal with the data.
The aim of the workshop is to think critically about this data, asking how it is constructed and used.

The talk by Jim Thatcher & Craig Dalton explored provenance models of data. A core question is how to demonstrate that data is what it says it is and where it came from; in particular, they consider how provenance applies to urban data. There is an epistemological leap from an individual person to data points – a corporate database can hold up to 1,500 data attributes per person. City governance requires more provenance information than commercial imperatives do. They suggest that data users and producers need to be aware of the data and how it is used.

Evelyn Ruppert asked: where are the data citizens? She discussed the politics of data, thinking about people as subjects in data – seeing them as actors who are intentional and political in their acts of creating data. Being digital mediates between people, technology and what they do. There are myriad forms of subjectivation, and there are issues of rights and how people exercise them. Being a digital citizen means not only being a recipient of rights but also having the ability to take and assert rights. She used the concept of cyberspace, as it is useful for understanding the rights of the people who use it, while being careful about what it means: there is a conflation of cyberspace and the Internet, and a failure to distinguish between the two. She sees cyberspace as the set of relations and engagements that happen over the Internet. She referred to her recent book 'Being Digital Citizens'. Cyberspace has relationships to real space, in relation to Lefebvre's concepts of space. She used speech-act theory, which explores the ability to act through saying things and the theoretical possibility of performativity in speech. We are not in command of what will happen with speech and what the act will be. We can assert acts through the things we do, and not only the things we say, and that is what is happening as people use the Internet and construct cyberspace.

Jo Bates talked about data cultures and power in the city, starting from the hierarchy of data and information. Data can be thought of as 'alleged evidence' (Buckland) – data can also be thought of as material, as specific things: data have dimensionality, weight and texture; they exist as something. Cox (1981) viewed the relationship between ideas, institutions and material capabilities – and the tensions between them – with institutions seen as a stabilising force compared to ideas and material capabilities, although institutions may be outdated. She noted that sites of data cultures are historically constituted but also dynamic and porous – we need to look at who participates and how data move.

The session was followed by a discussion. Among the issues raised: I made the point about the impact of methodological individualism on Evelyn's and Jim's analyses – for Evelyn, digital citizenship is for collectives, and for Jim, the provenance and use of devices happens as part of collectives and data cultures. Jo explored the idea of a 'progressive data culture' and suggested that we don't yet understand what its conditions are – the inclusive, participatory culture is not there. For Evelyn, data is only possible through the action of the people involved in its making, and the private ownership of this data does not necessarily make sense in the long run. Regarding the hybrid-space view of cyberspace/urban spaces – they overlap, and it is not helpful to try to separate them. Progressive data cultures require organisational change in government and other organisations. Tracey asked about work on indigenous data and the way it is owned by the collective, noting that there are examples in the Arctic with a whole setup for changing practices towards traditional and local knowledge. The provenance goes all the way to the community; in the Arctic Spatial Data Infrastructure there are lots of issues with integrating indigenous knowledge into the general data culture of the system. The discussion ended with an exploration of the special case of urban/rural, noting the code/space nature of agricultural spaces – the remote control of John Deere tractors, the use of precision agriculture, control over space (so people can't get into it), tagged livestock, as well as variable access to the Internet, broadband speeds, etc.

The second session looked at data infrastructure and platforms, starting with Till Straube, who looked at Situating Data Infrastructure. He highlighted that Git (GitHub) blurs the lines between code and data – a blurring also found in functional programming, where code is data and data is code. He also looked at software and conceptual technology stacks, with hardware at the bottom, and therefore used the concept of topology from Science and Technology Studies and Actor-Network Theory to understand the interactions.
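The code-is-data idea that Straube draws on can be illustrated with a small sketch (my own illustrative example, not from the talk): in a functional style, a processing pipeline is itself a plain data structure that can be serialised, diffed and version-controlled (e.g. in Git) exactly like the data it transforms.

```python
# Illustrative only: a pipeline of transformations held as plain data.
# The function names and records are hypothetical.

def clean(record):
    """Strip whitespace from all string values."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def add_city(record):
    """Tag each record with a city attribute."""
    return {**record, "city": "Dublin"}

pipeline = [clean, add_city]  # code stored as data: a list of steps

def run(pipeline, record):
    """Apply each step of the pipeline in order."""
    for step in pipeline:
        record = step(record)
    return record

result = run(pipeline, {"name": "  Maynooth  "})
print(result)  # {'name': 'Maynooth', 'city': 'Dublin'}
```

Because `pipeline` is just a list, the line between the code that processes urban data and the data itself is exactly as blurred as Straube suggests.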

Tracey Lauriault talked about ontologizing the city. Her research looks at the transition of Ordnance Survey Ireland (OSi) with their core GIS – the move towards an object-oriented, rules-based database. How is the city translated into data, and how does the code influence the city? She looked at OSi and the way it produces data for the island, providing infrastructure for other bodies. OSi started as a colonial project, and has moved from cartographic maps and a digital data model to a full object-oriented structure. The change is about understanding and conceptualising the mapping process. The ontology defines what is important for OSi to record and encode, and the new model allows space to be reconceptualised. She had access to a lot of information about the engineering, tendering and implementation process, and also followed some specific places in Dublin. She explored her analysis methods and the problems of trying to understand how the process works even when you have access to information.

The discussion that followed explored the concept of the 'stack', including the idea of considering the stack at a planetary scale. The stack is pervading other ways of thinking – it is more than a metaphor: it's a way of thinking about IT development, though it can be flattened, and it gets people to think about the inter-relations between different parts. Tracey: it is difficult to separate the different parts of the system because there is so much interconnection. Evelyn suggested that we can think about the way maps were assembled and for what purpose, and understand how the new system aims to give certain outcomes; Tracey responded that the system has moved from a map to a database, and that Ian Hacking's approach to classification systems needs to be tweaked to make it relevant and effective for understanding systems like the one she is exploring. The discussion expanded to questions about how large systems are developed and what methodologies can be used to create systems that can deal with urban data, including discussion of software engineering approaches, organisational and people change over time, 'war stories' of building and implementing different systems, etc.

The third and last session was about data analytics and the city – although the content wasn’t exactly that!

Gavin McArdle covered his and Rob Kitchin's paper on the veracity of open and real-time urban data. He highlighted the value of open data – from claims of transparency and enlightened citizens to very large estimates of its business value. Yet, while data portals are opening in many cities, there are issues with the veracity of the data – metadata is not provided alongside the data. He covered spatial data quality indicators from ISO, the ICA and transport systems, but questioned whether typical data standards are relevant in the context of urban data, and whether we need to reconsider how quality is recorded. Looking at two case studies, he demonstrated that the data is problematic (e.g. indicating a journey across the city of 6 km in 30 seconds). Communicating changes in the data to other users is an issue, as is getting information from the data providers – it may be possible to have a metadata catalogue that adds information about a dataset and explains how to report veracity issues. Such facilities exist in Paris and Washington DC, but they are not used extensively.
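The kind of veracity problem McArdle describes (6 km in 30 seconds implies 720 km/h) is easy to screen for automatically. A minimal sketch of such a plausibility check – my own illustration, not the method from their paper, with an assumed speed threshold – might look like:

```python
# Illustrative sketch: flag trace segments with implausible implied speeds.
# The threshold and input format are assumptions, not from the paper.

MAX_URBAN_SPEED_KMH = 130  # anything above this is suspect within a city

def implausible_segments(segments):
    """segments: list of (distance_km, elapsed_seconds) between successive fixes.
    Returns the indices of segments whose implied speed exceeds the threshold."""
    flagged = []
    for i, (dist_km, secs) in enumerate(segments):
        if secs <= 0:
            flagged.append(i)  # zero or negative time is itself a veracity error
            continue
        speed_kmh = dist_km / (secs / 3600.0)
        if speed_kmh > MAX_URBAN_SPEED_KMH:
            flagged.append(i)
    return flagged

# The example from the talk: 6 km covered in 30 seconds -> 720 km/h, flagged.
print(implausible_segments([(6.0, 30.0), (0.5, 120.0)]))  # [0]
```

A check like this could feed the metadata catalogue idea mentioned above, recording how many records in a dataset fail basic plausibility tests.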

Next, Chris Speed talked about the blockchain city – spatial, social and cognitive ledgers – exploring the potential of the distributed recording of information as a way to create all sorts of markets in information that can be controlled by different actors.
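The 'ledger' idea behind Speed's talk can be sketched minimally (my own illustration, not his material): an append-only chain in which each entry commits to the previous one via a hash, so that retroactive edits to recorded city data become detectable.

```python
# Illustrative sketch of a hash-chained ledger; records are hypothetical.
import hashlib
import json

def add_entry(chain, record):
    """Append a record to the chain, linking it to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute each hash; any tampered entry breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        body = {"record": entry["record"], "prev": entry["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

chain = []
add_entry(chain, {"sensor": "bike-counter-3", "count": 42})
add_entry(chain, {"sensor": "bike-counter-3", "count": 57})
print(verify(chain))  # True
chain[0]["record"]["count"] = 99  # tampering with an earlier entry
print(verify(chain))  # False
```

This is only the tamper-evidence half of a blockchain; the distributed-consensus part, which lets multiple actors control such markets in information, is a separate layer.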

I closed the session with a talk based on my paper for the workshop; the slides are available below.

The discussion that followed explored aspects of representation and noise (produced by people who are being monitored, by instruments, or by 'dirty' open data), and some clarification of the link between the citizen-science part and the philosophy-of-technology part of my talk – highlighting that Borgmann's use of 'natural', 'cultural' and 'technological' information should not be confused with the everyday use of these words.