A new paper by Rob Kitchin, "Thinking critically about and researching algorithms", has been posted as open access on SSRN as The Programmable City Working Paper 5.
The era of ubiquitous computing and big data is now firmly established, with more and more aspects of our everyday lives being mediated, augmented, produced and regulated by digital devices and networked systems powered by software. Software is fundamentally composed of algorithms — sets of defined steps structured to process instructions/data to produce an output. And yet, to date, there has been little critical reflection on algorithms and little empirical research into their nature and work. This paper synthesises and extends initial critical thinking about algorithms and considers how best to research them in practice. It makes a case for thinking about algorithms in ways that extend far beyond a technical understanding and approach. It then details four key challenges in conducting research on the specificities of algorithms — they are often: 'black boxed'; heterogeneous, contingent on hundreds of other algorithms, and embedded in complex socio-technical assemblages; ontogenetic and performative; and 'out of control' in their work. Finally, it considers six approaches to empirically researching algorithms: examining source code (both deconstructing code and producing genealogies of production); reflexively producing code; reverse engineering; interviewing designers and conducting ethnographies of coding teams; unpacking the wider socio-technical assemblages framing algorithms; and examining how algorithms do work in the world.
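The abstract defines an algorithm as a set of defined steps structured to process instructions/data to produce an output. As a minimal sketch of that definition (a generic example of my own, not drawn from the paper), an insertion sort makes each defined step explicit:

```python
# Generic illustration of "an algorithm" in the sense used above:
# defined steps that process input data to produce an output.

def insertion_sort(values):
    """Sort a list by inserting each item into its place, one step at a time."""
    result = list(values)                   # step 1: copy the input data
    for i in range(1, len(result)):         # step 2: visit each item in turn
        item = result[i]
        j = i - 1
        while j >= 0 and result[j] > item:  # step 3: shift larger items right
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = item                # step 4: insert the item in its slot
    return result                           # the output

print(insertion_sort([3, 1, 2]))  # → [1, 2, 3]
```

Even this trivial example shows why the paper argues for looking beyond the technical: the same sorting logic could be written in many ways, and which version ships depends on the people, tools, and institutions producing it.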
Key words: algorithm, code, epistemology, research