
Code and the City workshop videos: Session 4

If you missed any of the videos from the first three sessions, they are here: Session 1, Session 2 and Session 3.

Session 4: Cities, knowledge classification and ontology

Cities and context: The codification of small areas through geodemographic classification
Alex Singleton, Geography, University of Liverpool

Abstract
Geodemographic classifications group small areas into categories based on shared population and built environment characteristics. This process of “codification” aims to create a common language for describing the salient internal structure of places and, by extension, to enable their comparison across geographic contexts. The typological study of areas is not a new phenomenon, and contemporary geodemographics emerged from research conducted in the 1970s that aimed to provide a new method of targeting deprivation relief funding within the city of Liverpool. This city-level model was later extended to the national context and became the antecedent of contemporary geodemographic classification. This paper explores the origins of geodemographics, first illustrating that the coding of areas is not just a contemporary practice, and then extending this discussion to consider how methodological choices influence classification structure. Openness about such methods is argued to be essential if classifications are to engender greater social responsibility.
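As a concrete illustration of the kind of grouping the abstract describes, below is a minimal sketch that clusters hypothetical small areas on standardised attributes. The variables, the use of k-means and the choice of eight clusters are assumptions made purely for illustration; they are not the method behind any particular geodemographic classification.

```python
# Illustrative sketch only: group hypothetical small areas on standardised
# census-style variables. k-means and k=8 are assumptions, not the method of
# any specific geodemographic classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical attributes per small area, e.g. % aged 65+, % renting,
# % unemployed, population density (500 areas, 4 variables).
rng = np.random.default_rng(0)
areas = rng.random((500, 4))

# Standardising the inputs is itself a methodological choice that shapes the
# resulting classification structure.
scaled = StandardScaler().fit_transform(areas)

# Assign each small area to one of eight geodemographic groups.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(scaled)
print(np.bincount(labels))  # number of areas falling into each group
```

Changing the input variables, the standardisation or the number of clusters yields a different set of groups, which is precisely the sensitivity to methodological choices that the paper highlights.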

The city and the Feudal Internet: Examining institutional materialities
Paul Dourish, Informatics, UC Irvine

Abstract
In “Seeing like a City,” Mariana Valverde turns to urban regulation to counter some of James Scott’s arguments about the homogenizing gaze of high modern statehood. Cities, she notes, are highly regulated, but without the panoptic order that Scott suggests. They operate instead as a splintered patchwork of regulatory boundaries – postal codes, tax assessment districts, business improvement zones, school catchment areas, zoning blocks, sanitation districts, and similar divisions that don’t quite line up. Arguments about online experience and the consequences of the Internet have a similar air to Scott’s analysis of statehood – they posit a world of consistent, compliant, and compatible information systems, in which the free flow of information and the homogenizing gaze of the digital erase boundaries (both for good and ill).

In fact, the organization of the Internet – that is, of our technologically and historically specific internet – is one of boundaries, barriers, and fiefdoms. We have erected all sorts of internal barriers to the free flow of information for a range of reasons, including the desire for autonomy and the extraction of tolls and rents. In this talk I want to explore some aspects of the historical specificity of our Internet and consider what this has to tell us about the ways that we talk about code and the city.

Semantic cities: Coded geopolitics and the rise of the semantic web
Heather Ford and Mark Graham, Oxford Internet Institute, University of Oxford

Abstract
In 2012, Google rolled out a service called Knowledge Graph, which would enable users to have their search queries resolved without having to navigate to other websites. So, instead of just presenting users with a diverse list of possible answers to any query, Google selects and frames data about cities, countries and millions of other objects – sourced from sites including Wikipedia, the CIA World Factbook and Freebase – under its own banner.

For many, this heralded Google’s eventual recognition of the benefits of the Semantic Web: an idea and ideal that the Web could be made more efficient and interconnected when websites share a common framework that would allow data to be shared and reused across application, enterprise, community, and geographic boundaries. This move towards the Semantic Web can be starkly seen in the ways that Wikipedia, as one of the foundations for Google’s Knowledge Graph, has begun to make significant epistemic changes. With a Google-funded project called Wikidata, Wikipedia has begun to use Semantic Web principles to centralise ‘factual’ data across all language versions of the encyclopaedia. For instance, this would mean that the population of a city need only be altered once in Wikidata rather than in all places where it occurs in Wikipedia’s 285 language versions.

For Google, these efficiencies provide a faster experience for users, who will stay on its website rather than navigating away. For Wikipedia, such efficiencies promise to centralise the updating process so that data are consistent and so that smaller language Wikipedias can obtain automated assistance in translating essential data for articles more rapidly.
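As a concrete illustration of what that centralisation looks like in practice, here is a minimal sketch (not from the paper) that reads a single centrally held ‘fact’, a city’s population, from Wikidata’s public SPARQL endpoint, so the same value can be reused anywhere rather than edited separately in every language edition. P1082 is Wikidata’s population property; the entity ID used is just an example.

```python
# Illustrative sketch: fetch one centrally held "fact" (a city's population)
# from Wikidata's public SPARQL endpoint instead of maintaining it separately
# in each language edition of Wikipedia.
import requests

WIKIDATA_SPARQL = "https://query.wikidata.org/sparql"

query = """
SELECT ?population WHERE {
  wd:Q1761 wdt:P1082 ?population .   # Q1761: Dublin (example entity); P1082: population
}
"""

response = requests.get(
    WIKIDATA_SPARQL,
    params={"query": query, "format": "json"},
    headers={"User-Agent": "code-and-the-city-example/0.1"},
    timeout=30,
)
response.raise_for_status()

for row in response.json()["results"]["bindings"]:
    print("Population:", row["population"]["value"])
```

Whoever edits that one statement changes what every downstream consumer of it subsequently reports, which is one way the shifts in power discussed below become tangible.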

This paper seeks to critically interrogate these changes in the digital architectures and infrastructures of our increasingly augmented cities. What shifts in power result from these changes in digital infrastructures? How are semantic standardisations increasingly encoded into our urban environments and experiences? And what space remains for digital counter-narratives, conflict, and contention?

To tackle those questions, we trace data about two cities as they travel through Google’s algorithms and the Semantic Web platforms of Wikidata and Wikipedia. In each case, we seek to understand how particular reflections of the city are made visible or invisible and how particular publics are given voice or silenced. Doing so leads us to ultimately reflect on how these new alignments of code and content shape how cities are presented, experienced, and brought into being.

Job: Three year postdoc on the Programmable City project

We’re pleased to announce the advertisement of a three-year postdoc position on the Programmable City project. Full details of the post can be found on the Maynooth University HR page, but essentially the postholder will study algorithms and code used in smart city initiatives (broadly conceived) from a software studies perspective. As such, the project will critically examine how software developers translate rules, procedures and policies into a complex architecture of interlinked algorithms that manage and govern how people traverse or interact with urban systems. It will thus provide an in-depth analysis of how software and data are being produced to aid the regulation of city life in an age of software and ‘big data’. The primary methods will be a selection from those set out in the paper ‘Thinking critically about and researching algorithms’.

We are seeking applications from researchers with an interest in software studies, critical data studies, urban studies, and smart cities to work in an interdisciplinary team. Applicants will:

  • have a keen interest in understanding software from a social science perspective;
  • be a proficient programmer and able to comprehend other developers’ code;
  • have a good, broad range of qualitative data creation and analysis skills;
  • be interested in theory building;
  • have an aptitude to work well in an interdisciplinary team;
  • be prepared to undertake overseas fieldwork;
  • have a commitment to publishing and presenting their work;
  • have a willingness to communicate through new social media;
  • be prepared to archive their data for future re-use by others;
  • be prepared to help organise and attend workshops and conferences.

The closing date is 5th December. See the full job description here for more details.

We would encourage any interested candidates to apply for the post, and we ask readers of the blog to bring it to the attention of anyone who might be interested, or to circulate it in their networks and on social media.

Session 4: Programmable City Project Team

Session 4, Programmable City Project Team, included project introductions from the postdoctoral researchers and PhD students. Here are links to the slides and to the complete program.

  • Robert Bradshaw, Smart Bikeshare
  • Dr Sophia Maalsen, How are discourses and practices of city governance translated into code?
  • Jim Merricks White, Towards a Digital Urban Commons: Developing a situated computing praxis for a more direct democracy
  • Alan Moore, The Role of Dublin in the Global Innovation Network of Cloud Computing
  • Dr Leighton Evans, How does software alter the forms and nature of work?
  • Darach Mac Donncha, How software is discursively produced and legitimised by vested interests
  • Dr Sung-Yueh Perng, Programming Urban Lives
  • Dr Gavin McArdle (NCG, NIRSA, NUIM), Dublin Dashboard Performance Indicators & Metrics