Category Archives: information

Predictive analytics in the city

Predictive analytics is a way of responding to and taking advantage of historical and emerging large datasets by using “a variety of statistical, modeling, data mining, and machine learning techniques … to make predictions about the future”. One of the advantages of drawing predictions from big data, according to the vision of social physics, is that

You can write down equations that predict what people will do. That’s the huge change. So I have been running the big data conversation … It’s about the fact that you can now understand customers, employees, how we organise, in a quantitative, predictive way for the first time.

Predictive analytics is fervently discussed in the business world, if not yet fully taken up, and increasingly by public services, governments and medical practices seeking to exploit the value hidden in public archives or even in social media. In New York, for example, there is a geek squad in the Mayor’s office seeking to uncover deep and detailed relationships between the people living there and the government, while at the same time realising “how insanely complicated this city is”. Here, an intriguing question remains as to the effectiveness of predictive analytics, the extent to which it can support and facilitate urban life, and the consequences for cities immersed in a deep sea of data, predictions and humans.

Let’s start with an Australian example. The Commonwealth Scientific and Industrial Research Organisation (CSIRO) has partnered with Queensland Health, Griffith University and Queensland University of Technology to develop the Patient Admission Prediction Tool (PAPT), which estimates the presentations of schoolies – Australian high-school leavers on week-long holidays after their final exams – to nearby hospitals. The PAPT derives its estimates from Queensland Health data on schoolies presentations in previous years, including statistics about the numbers of presentations, the parts of the body injured and the types of these injuries. Using the data, the PAPT benefits hospitals, their employees and patients through improved scheduling of hospital beds, procedures and staff, with the potential of saving $23 million per annum if implemented in hospitals across Australia. As characterised by Dr James Lind, one of the benefits of adopting predictive analytics is a proactive rather than reactive approach towards planning and management:

People like working in a system that is proactive rather than reactive. When we are expecting a patient load everyone knows what their jobs [are], and you are more efficient with your time.

The patients are happy too, because they receive and finish treatment more quickly.
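The core idea behind such a tool can be sketched very simply. The code below is a toy illustration of the principle behind the PAPT, not its actual model: it estimates expected presentations for each day of the schoolies week by averaging counts from previous years (all numbers are invented).

```python
from statistics import mean

# Invented historical data: presentations per day of the schoolies week,
# one list per previous year (day 0 .. day 6).
history = [
    [42, 55, 61, 58, 49, 40, 35],  # three years ago
    [45, 59, 66, 60, 52, 44, 38],  # two years ago
    [47, 62, 70, 64, 55, 46, 41],  # last year
]

def expected_presentations(history):
    """Average presentations per event day across previous years."""
    return [mean(day) for day in zip(*history)]

forecast = expected_presentations(history)
peak_day = forecast.index(max(forecast))  # the day needing most beds and staff
```

A real system would, as the PAPT does, also draw on injury types and the body parts injured, so that procedures and specialist staff can be scheduled alongside beds.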

Can we find similar success when predictive analytics is practised in other forms of urban governance? Let’s move the discussion back to US cities, using policing as an example. Policing work is shifting from reactive to proactive in many cities, whether at experimental or implementation stages. PredPol is predictive policing software produced by a startup company that has caught a considerable amount of attention from police departments in the US and other parts of the world. Its success as a business, however, is partly down to “energetic” marketing strategies, contractual obligations requiring customers to refer the company to other law enforcement agencies, and so on.

Above all, the company’s claims of success are difficult to sustain on closer examination. The subjects on which the software focuses are very specific: burglaries, robberies, vehicle thefts, thefts from vehicles and gun crimes. In other words, crimes that offer “plenty of data to chew on” for making predictions, and opportunistic crimes that are easier to prevent through the presence of patrolling police (more details here).

This further brings us to the “proven” and “continued” aspects of success. These are even more difficult and problematic aspects of policing work when it comes to evaluating the “effectiveness” and “success” of predictive policing. To prove that an algorithm performs well, the expectations for which it is built and tweaked have to be specified, not only for those who build the algorithm, but also for the people who will be entangled in the experiments in intended and unintended ways. In this sense, transparency and objectivity are important to predictive policing. Without acknowledging, considering and evaluating how both crime and everyday life – both normality and abnormality – are transformed into algorithms, and without disclosing them for validation and consultation, a system of computational criminal justice can turn into, if not witch-hunting, alchemy: put two or more elements into a magical pot, stir them and see what happens! This is further complicated by the inequalities already inherent in crime data, such as those arising from reporting or sentencing, and by the perceived neutrality of algorithms, which can serve to justify cognitive biases well documented in the justice system – biases that would treat someone more harshly because the person is already on a blacklist, without reconsidering how the person got onto such a list in the first place. There is an even darker side to predictive policing when mundane social activities are constantly treated as crime data, for instance when social network analysis is used to profile and incriminate groups and groupings of individuals.

This is also a more dynamic and contested field of play, considering that while crime prediction practitioners (coders, private companies, government agencies and so on) appropriate personal data and private social media messages for purposes they were not intended for, criminals (or activists for that matter) play with social media – if not yet with prediction results obtained by reverse engineering the algorithms – to plan coups, protests, attacks, etc.

For those who want to look further into how predictive policing is set up, proven, run and evaluated, there are ways of opening up the black box, at least partially, for critically reflecting on what exactly it can achieve and how “success” is managed, both in computer simulation and in police practice. The chief scientist of PredPol gave a lecture in which, as one commentator points out:

He discusses the mathematics/statistics behind the algorithm and, at one point, invites the audience not to take his word for its accuracy because he is employed by PredPol, but to take the equations discussed and plug in crime data (e.g. Chicago’s open source crime data) to see if the model has any accuracy.

The video of the lecture is here
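The published research behind PredPol models crime as a self-exciting point process, in which each offence temporarily raises the risk of near-repeat offences nearby. The sketch below is only a one-dimensional toy with invented parameter values, nothing like the production system, but it shows the shape of the idea one would test against open crime data:

```python
import math

def conditional_intensity(t, past_event_times, mu=0.2, k=0.5, omega=1.0):
    """Toy Hawkes-style intensity: a constant background rate mu plus an
    exponentially decaying boost from each past event, capturing the
    near-repeat intuition that a burglary raises short-term local risk.
    Parameter values are illustrative, not PredPol's."""
    boost = sum(k * omega * math.exp(-omega * (t - ti))
                for ti in past_event_times if ti < t)
    return mu + boost

# Risk shortly after a cluster of break-ins exceeds the background rate.
risk_after_cluster = conditional_intensity(2.0, [0.0, 1.0, 1.5])
baseline = conditional_intensity(2.0, [])
```

Anyone can, as the lecture suggests, fit a model of this family to open crime data and check its accuracy for themselves.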

Furthermore, RAND provides a review of predictive policing methods and practices across many US cities. The report can be found here; it analyses the advantages gained by various crime prediction methods as well as their limitations. Predictive policing, as shown in the report, is far from a crystal ball, and involves various levels of mathematical, computational and organisational complexity to run and implement. Predictions can range from crime mapping to predicting crime hotspots given certain spatiotemporal characteristics of crimes (see the taxonomy on p. 19). As far as predictions are concerned, they are good as long as crimes in the future look similar to those in the past – in their types, temporality and geographic prevalence – and if the data is good, which is a big if! Predictions are also better when they are further contextualised. Compared with predicting crimes without any help (not even the intelligence that agents in the field can gather), applying mathematics to the guessing game creates a significant advantage, but the differences among the methods themselves are not as dramatic. Therefore, one of the crucial messages intended by reviewing and contextualising predictive methods is that:

It is important to bear in mind that the predictive methods discussed here do not predict where and when the next crime will be committed. Rather, they predict the relative level of risk that a crime will be associated with a particular time and place. The assumption is always that the past is prologue; predictions are made based on the analysis of past data. If the criminal adapts quickly to police interventions, then only data from the recent past will be useful to police departments. (p. 55)
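The distinction the report draws can be made concrete with a toy example (incidents invented): such methods rank place-and-time slots by their share of past incidents – a relative level of risk – rather than naming where and when the next crime will occur.

```python
from collections import Counter

# Invented past incidents as (grid_cell, time_band) pairs.
past_crimes = [
    ("cell_A", "night"), ("cell_A", "night"), ("cell_A", "day"),
    ("cell_B", "night"), ("cell_C", "day"),
]

def rank_by_relative_risk(past_crimes):
    """Order (cell, time) slots by their share of past incidents.
    The past is prologue: the ranking only holds while future crime
    resembles the data it was computed from."""
    counts = Counter(past_crimes)
    total = sum(counts.values())
    return sorted(((n / total, slot) for slot, n in counts.items()),
                  reverse=True)

ranking = rank_by_relative_risk(past_crimes)
top_share, top_slot = ranking[0]  # the riskiest slot, not the next crime
```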

Therefore, however automated, human and organisational efforts are still required in many areas of practice. Activities such as finding relevant data, preparing them for analysis, and tweaking factors, variables and parameters all require human effort, collaboration as a team, and the transformation of decisions into crime-reducing actions at organisational levels. Similarly, human and organisational efforts are needed again when types and patterns of crime change, targeted crimes shift, and results have to be interpreted and integrated in relation to changing availabilities of resources.

Furthermore, the report reviews the issues of privacy, transparency, trust and civil liberties within existing legal and policy frameworks. It becomes apparent that predictions and predictive analytics need careful and mindful design, responding to emerging ethical, legal and social issues (ELSI), because the impacts of predictive policing occur at individual and organisational levels, affecting the day-to-day life of residents, communities and frontline officers. While it is important to maintain and revisit existing legal requirements and frameworks, it is also important to respond to emerging information and data practices, and notions of “obscurity by design” and “procedural data due process” are ways of rethinking and redesigning the relationships between privacy, data, algorithms and predictions. Even the term transparency needs further reflection to make progress on what it means in the context of predictive analytics and how it can be achieved, taking into account renewed theoretical, ethical, practical and legal considerations. In this context, “transparent predictions” have been proposed, outlining the importance, and the potential unintended consequences, of rendering prediction processes interpretable to humans and driven by causation rather than correlation. Critical reflections on this proposal are useful, for example this two-part series – (1)(2) – which further contextualises transparency both in prediction processes and in case-specific situations.

Additionally, IBM has partnered with the New York Fire Department and the Lisbon Fire Brigade. The main goal is to use predictive analytics to make smart cities safer by allocating emergency response resources better and more effectively. Similarly, crowd behaviours have already been simulated to understand and predict how people react in various crowd events in places such as major traffic and travel terminals, sports and concert venues, shopping malls and streets, busy traffic areas, etc. Simulation tools take into account data generated by sensors, as well as quantified embodied actions such as walking speeds, body sizes or disabilities, and it is not difficult to imagine analyses taking further advantage of social media data, where sentiments and psychological aspects are expected to refine simulation results (a review of simulation tools).
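A minimal flavour of such simulation, assuming nothing about any particular product: agents with different walking speeds (a stand-in for quantified embodied differences such as reduced mobility) clearing a hypothetical 100 m concourse.

```python
import random

random.seed(42)  # reproducible toy run

def simulate_exit_times(n_agents=100, length_m=100.0, dt=1.0):
    """Time (in dt steps) for each agent to walk the concourse at its own
    speed. Real tools model 2D space, collisions, density and route choice;
    this sketch only shows how embodied parameters enter a simulation."""
    speeds = [random.uniform(0.6, 1.6) for _ in range(n_agents)]  # m/s
    exit_times = []
    for v in speeds:
        pos, t = 0.0, 0.0
        while pos < length_m:
            pos += v * dt
            t += dt
        exit_times.append(t)
    return exit_times

times = simulate_exit_times()
clearance_time = max(times)  # how long until the whole crowd has cleared
```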

To bring the discussion to a pause: data, algorithms and predictions are quickly turning not only cities but many other places into testbeds, as long as there are sensors (human and nonhuman) and the Internet. Data become available and different kinds of tests can then be run to verify ideas and hypotheses. As many articles have pointed out, data and algorithms are flawed, revealing and reinforcing the unequal parts and aspects of cities and city lives. Tests and experiments – such as Facebook’s manipulation of user emotions in its experiments – can also make cities vulnerable when they are run without regard for embodied and emotionally charged humans. There is therefore a great deal more to say about data, algorithms and experiments, because the production of data and the experiments that make use of them are always an intervention rather than an evaluation. We will return to these topics in subsequent posts.

Mapping Openness and Transparency

by: Tracey P. Lauriault

I attended the European Regional Meeting of the Open Government Partnership at the Dublin Castle Conference Centre in May of this year.  The meeting was a place for performance and evaluation wonks to show their wares, especially at the following sessions: Open Government Standards and Indicators for Measuring Progress, The EU’s Role in Promoting Transparency and Accountability and Engagement with the OGP, and Open Contracting: Towards a New Global Norm.  I did not attend the Independent Reporting Mechanism (IRM) sessions, but having read the IRM report for Canada, I know that it too is an emerging performance evaluation indicator space, which is affirmed by a cursory examination of the IRM’s two major databases.  The most promising, yet most disappointing, session was the Economic Impact of Open Data session.  This is unfortunate as there are now a number of models by which the value of sharing, disseminating and curating data has been measured.  It would have been great to have heard a critical analysis or review of the newly released Ordnance Survey Ireland report, Assessment of the Economic Value of the Geospatial Information Industry in Ireland, of the many economic impact models listed here in the World Bank Toolkit, or of the often cited McKinsey Global Institute report Open Data: Unlocking Innovation and Performance with Liquid Information.  Oh well!

While there I was struck by the number of times maps were displayed.  The mapping of public policy issues related to openness seems to have become a normalized way of communicating how nation states fare according to a number of indicators: how transparent they are, how prone to corruption, how engaged their civil society is, and how open they are in terms of data, information and government.

What the maps show is how jurisdictionally bound up policy, law and regulatory matters concerning data are.  The maps reveal how techno-political processes are sociospatial practices and how these sociospatial matters are delineated by territorial boundaries.  What is less obvious, are the narratives about how the particularities of the spatial relations within these territories shape how the same policies, laws and regulation are differentially enacted.

Below are 10 world maps which depict a wide range of indicators and sub-indicators, indices, scorecards, and standards.  Some simply show if a country is a member of an institution or is a signatory to an international agreement.  All but one are interactive, and they all provide links to reports and methodologies, some more extensive than others.  Some of the maps are a call to action; others are created to solicit input from the crowd, while most are created to demonstrate how countries fare against each other according to their schemes.  One map is a discovery map to a large number of indicators found in an indicator portal while another shows the breadth of civil society participation.  These maps are created in a variety of customized systems, while three rely on third party platforms such as Google Maps or Open Street Map.  They are published by a variety of organizations such as transnational institutions, well-resourced think tanks and civil society organizations.

We do not know the impact these maps have on the minds of the decision makers at whom they are aimed, but I do know that they are often shown as backdrops to discussions at international meetings such as the OGP to make a point about who is and who is not in the open and transparent club.  They are therefore political tools, used to do discursive work.  They do not simply represent the open data landscape, but actively help (re)produce it.  As such, they demand further scrutiny as to the data assemblage surrounding them (amalgams of systems of thought, forms of knowledge, finance, political economies, governmentalities and legalities, materialities and infrastructures, practices, organisations and institutions, subjectivities and communities, places, and marketplaces), the instrumental rationality underpinning them, and the power/knowledge exercised through them.

This is work that we are presently conducting on the Programmable City project, which will  complement a critical study concerning city data, indicators, benchmarking and dashboards, and we’ll return to them in future blog posts.

1.       The Transparency International Corruption by Country / Territory Map

Users land on a blank blue world map of countries delineated by a thick white line, from which they select a country of interest.  Once a country is selected, a series of indicators and indices appears, such as ‘Corruption measurement tools’, ‘Measuring transparency’ and ‘Other governance and development indicators’.  These are measured variously as rankings out of a given n, as percentage scores, or by whether or not the country is a signatory to a convention and whether it is enforced.  The numbers are derived from national statistics and surveys.  The indicators are:

  • Corruption Perceptions Index (2013), Transparency International
  • Control of Corruption (2010), World Bank dimension of Worldwide Governance Indicators
  • The Bribe Payer’s Index (2011), Transparency International
  • Global Corruption Barometer (2013), Transparency International
  • OECD Anti-Bribery Convention (2011)
  • Financial Secrecy Index (2011), Tax Justice Network
  • Open Budget Index (2010), International Budget Partnership
  • Global Competitiveness Index (2012-2013), World Economic Forum Global Competitiveness Index
  • Judicial Independence (2011-2012), World Economic Forum Global Competitiveness Index
  • Human Development Index (2011), United Nations
  • Rule of Law (2010), World Bank dimension of Worldwide Governance Indicators
  • Press Freedom Index (2011-2012) Reporters Without Borders
  • Voice & Accountability (2010), World Bank dimension of Worldwide Governance Indicators

By clicking on the question mark beside the indicators, a pop up window with some basic metadata appears. The window describes what is being measured and points to its source.

The page includes links to related reports, and a comments section where numerous and colourful opinions are provided!

2.      Open Government Standards

Users land on a Google Map API mashup of Government, Citizen and Private Open Government initiatives.  They are given the option to zoom in to see local initiatives.  In this case, users are led to a typology of initiatives which define what Open Government means from civil society’s point of view.

Initiatives are classified with respect to the following categories 1) Transparency, 2) Participation and 3) Accountability.  The development of the Open Government Standards are being coordinated by “Access Info Europe, a human rights organisation dedicated to the promotion and protection of the right of access to information in Europe and the defence of civil liberties and human rights with the aim of facilitating public participation in the decision-making process and demanding responsibility from governments”.

Definitions, parameters and criteria for a number of sub-indicators are being crowdsourced in online forms like the following for Openness:

The following is a list of standards that are currently under development.

  • Recognition of the Right to Know
  • Openness
  • Codes of Conduct: Clear standards of behaviour
  • All information available from all public bodies
  • Clear and reasonable Timelines
  • Conflict of Interest Prevention Mechanisms
  • Access is the Rule – Secrecy is the Exception
  • Clear and comprehensive information
  • Assets Disclosure
  • Proactive Publication of Information
  • Active collaboration
  • Transparency and Regulation of Lobbying
  • Free of charge and free for reuse
  • Appropriate and Clear Procedures
  • Whistleblower mechanisms and protections
  • Open Formats
  • Empowerment
  • Procurement Transparency
  • Compilation of information
  • Transparency and Accountability
  • Independent Enforcement Bodies
  • Independent review mechanism

3.      The Global Integrity Report Map

This is an interactive Open Street Map (OSM) Mapbox map depicting the locations of Global Integrity national fieldwork reports, arranged by the year they were published.  Reports are called Country Assessments and each includes a qualitative Reporter’s Notebook and a quantitative Integrity Indicators scorecard.

The Integrity Indicators scorecard assesses “the existence, effectiveness, and citizen access to key governance and anti-corruption mechanisms through more than 300 actionable indicators. They are scored by a lead in-country researcher and blindly reviewed by a panel of peer reviewers, a mix of other in-country experts as well as outside experts. Reporter’s Notebooks are reported and written by in-country journalists and blindly reviewed by the same peer review panel”.

Users select a country, and below the map a number of scorecard indicators appear.  Scorecard indicators are arranged into 6 major categories:

  1. Non-Governmental Organizations, Public Information and Media
  2. Elections
  3. Government Conflicts of Interest Safeguards & Checks and Balances
  4. Public Administration and Professionalism
  5. Government Oversight and Controls
  6. Anti-Corruption Legal Framework, Judicial Impartiality, and Law Enforcement Professionalism

Users can then see how each score was derived by following the sub-category links.  Below is an example of legislation and the score associated with the Political Financing Transparency indicator, which is a sub-class of the Elections category.

PDF copies of the reports are also available, as are spreadsheets of the data used to derive them.

4.      The World Bank Global Integrity Index Map

This is an interactive map depicting the World Bank’s Global Integrity Index, which is one of its Actionable Governance Indicators (AGIs).  AGIs “focus on specific and narrowly-defined aspects of governance, rather than broad dimensions. These indicators are clearly defined, providing information on the discrete elements of governance reforms, often capturing data on the “missing middle” in the outcome chain”.  The map allows users to select from a drop-down menu which includes a subset of AGI indicators – the portal contains thousands.  The interactive and downloadable map aims to graphically demonstrate the progress of governance reform worldwide.  The map is but a small picture of what the Portal contains, and below is a Governance At A Glance country report for Ireland.

And here is a data availability table, also for Ireland.

5.      The Open Government Partnership Participating Countries map

The interactive map depicts the countries that have signed on to the Open Government Partnership and the cohort to which each belongs.  Users can select a country of choice, which hyperlinks to that country’s membership status and its progress to date in meeting the criteria for membership, where it ranks in terms of commitment, and links to related documents such as action plans and progress reports.  It is interesting to note that the Independent Reporting Mechanism (IRM) reports are not included in this list.  Canada’s IRM report was submitted in 2014.

6.      The Open Data Barometer Data Map

The Open Data Barometer map depicts the 77 countries the Open Data Institute has evaluated.  This map assesses how open data policies are implemented in these countries according to three main indicators:

  • Readiness of:
    • government,
    • citizens and civil society, and
    • entrepreneurs and business.
  • Implementation, based on the availability of a variety of datasets within the following sub-categories:
    • accountability
    • social policy
    • innovation
  • Emerging impacts:
    • political
    • social
    • economic
These are also graphically depicted in a radar chart, as well as a bubble chart where the size of the bubble represents the availability of the datasets per category and if these sets meet the Open Definition Criteria.  The data and associated methodologies are explained in the website about the report.

7.      Open Contracting Implementation and Supporting Tools Map

This is a static map depicting where Open Contracting Support and Tools are implemented.  Sadly, the 5 indicators depicted on the map were not explained or described.  I would have to contact them at a later time to find out!

8.      Reporters Without Borders World Press Freedom Index Map

This map depicts the findings of the World Press Freedom Index report for 2014, with a particular focus on how countries rose and fell from the previous year.  180 countries are scored against the following criteria, based partly on a questionnaire and partly on measures of violence committed against journalists; the scoring algorithm is clearly defined in the PDF copy of the report.  The data and the report are fully downloadable, and more detailed maps with a legend are provided in the report itself and in a methodology report.

  • Pluralism
  • Media independence
  • Environment and self-censorship
  • Legislative framework
  • Transparency
  • Infrastructure

9.      Politics for People Not Profit Map

This is an interactive map depicting pledges by nationally elected political officials to the European Parliament, who are asked to commit to “stand-up for citizens and democracy against the excessive lobbying influence of banks and big business in the EU”.  Mousing over a country triggers a pop-up menu which lists which parties have made the commitment, while clicking on the map directs users to the page below, a tool whereby citizens can fill in a form letter and have it sent to their elected officials to solicit them to pledge.

10.      The OGP Civil Society Hub Map

This is a partially curated and partially crowdsourced map on the OGP Civil Society Hub website.  The starred drops represent countries that are official OGP members, while the red drops represent any number of civil society organizations that have in some way engaged with the OGP, some of which are transnational while others are national or sub-national entities.  Once a location is selected, a pop-up menu appears that includes a national flag and a link to that nation state’s official OGP member site, and gives users the option to pick from who is involved, a selection of topic areas or a list of information resources.  The map is a means to find people and activities, a means by which people can self-identify and be recognized as civil society actors, and a means to connect people.

Unfortunately, the very popular and often discussed Open Knowledge Foundation Open Data Index, the Open Corporate Data indices and the Open Data Study by the Open Society Foundation have not been mapped, even though the latter includes quite a lovely world map on the cover of its report.

Workshop: Code and the City, 3-4 September

In early September the Programmable City project at NUI Maynooth will be hosting a number of the foremost thinkers on the intersection of software, ubiquitous computing and the city for a two day workshop entitled ‘Code and the City’.

We’re really excited to be gathering together these scholars to discuss their ideas and research.  We’ve structured the programme so that each session lasts for two hours, with c. an hour for presentations, followed by an hour of discussion and debate.  Full draft written papers will be circulated in advance to attendees.

To try and make sure the event operates as a workshop we are limiting the numbers attending to the speakers, plus our team, plus a handful of open slots.  If you are interested in attending then please email Sung-Yueh.Perng@nuim.ie with your request by June 6th, setting out why you would like to attend.  We will then allocate the additional places by June 13th.

Introduction

Code and the City
Rob Kitchin, NIRSA, National University of Ireland Maynooth

Session 1: Automation/algorithms

Cities in code: how software repositories express urban life
Adrian Mackenzie, Sociology, Lancaster University

Autonomy and automation in the coded city
Sam Kinsley, Geography, University of Exeter

Interfacing Urban Intelligence
Shannon Mattern, Media Studies, New School NY

Session 2: Abstraction and urbanisation

Encountering the city at hackathons
Sophia Maalsen and Sung-Yueh Perng, National University of Ireland, Maynooth

Disclosing Disaster? A Study of Ethics, Praxeology and Phenomenology in a Mobile World
Monika Büscher, With Michael Liegl, Katrina Petersen, Mobilities.Lab, Lancaster University, UK

Riot’s Ratio, on the genealogy of agent-based modeling and the cities of civil war
Matthew Fuller and Graham Harwood, Cultural Studies, Goldsmiths

Session 3: Social/locative media

Digital social interactions in the city: Reflecting on location-based social media
Luigina Ciolfi, Human-Centred Computing, Sheffield Hallam University

A Window, a Message, or a Medium? Learning about cities from Instagram
Lev Manovich, Computer Science, The Graduate Center, City University of New York

Feeling place in the city: strange ontologies, Foursquare and location-based social media
Leighton Evans, National University of Ireland Maynooth

Mobility in the actually existing smart city: Developing a multilayered model for the mobile computing dispositif
Jim Merricks White, National University of Ireland, Maynooth

Session 4: Knowledge classification and ontology

Cities and Context: The Codification of Small Areas through Geodemographic Classification
Alex Singleton, Geography, University of Liverpool

The city and the Feudal Internet: Examining Institutional Materialities
Paul Dourish, Informatics, UC Irvine

From Jerusalem to Kansas City: New geopolitics and the Semantic Web
Heather Ford and Mark Graham, Oxford Internet Institute, University of Oxford

Session 5: Governance

From community access to community calculation: exploring alternative urban governance through code
Alison Powell, Media & Communications, LSE

Code and the socio-spatial stratification of the city
Agnieszka Leszczynski, Geography, University of Birmingham

The Cryptographic City
David M. Berry, Media & Communication, University of Sussex

 

Interactive city benchmarking sites

Over the past few years there has been a proliferation of city benchmarking indexes and data tools that enable the comparison of different phenomena across cities.  A recent Jones Lang LaSalle report details 150 of them.  Such indexes are composed of composites of key indicators and purport to give an indication of city performance vis-a-vis other locales and to judge how city administrations and policies are faring.  Below are links to some interactive city benchmarking sites that allow the comparison of selected cities.  Our interest in such benchmarking is in the politics of indicator selection and the formulation of indices, and in how the data are employed – a topic we’ve just started to examine on the ProgCity project.
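Why indicator selection and weighting are political can be shown with a toy composite index (all data invented, not drawn from any of the sites below): each indicator is min-max normalised across cities and combined in a weighted sum, so changing the weights can reorder which city comes out “best”.

```python
def normalise(values):
    """Min-max normalise a {city: value} mapping onto the 0-1 range."""
    lo, hi = min(values.values()), max(values.values())
    return {c: (v - lo) / (hi - lo) if hi > lo else 0.0
            for c, v in values.items()}

def composite_index(raw, weights):
    """raw: {indicator: {city: value}}; weights: {indicator: weight}.
    Returns a weighted sum of normalised indicators per city."""
    normed = {ind: normalise(vals) for ind, vals in raw.items()}
    cities = next(iter(normed.values()))
    return {c: sum(w * normed[ind][c] for ind, w in weights.items())
            for c in cities}

# Invented indicator values for three cities.
raw = {
    "green_space": {"Dublin": 30, "Lyon": 50, "Graz": 40},
    "transit_use": {"Dublin": 60, "Lyon": 40, "Graz": 55},
}
equal = composite_index(raw, {"green_space": 0.5, "transit_use": 0.5})
skewed = composite_index(raw, {"green_space": 0.8, "transit_use": 0.2})
```

With equal weights Graz tops this invented table; weight green space more heavily and Lyon does. The ranking is an artefact of choices the index-makers rarely foreground.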

NYC Global Innovation Exchange: http://www.nyc.gov/html/ia/gprb/html/global/global.shtml


OPENCities Monitor: http://www.opencities.eu/web/index.php?monitor_en

Siemens Green City Index: http://www.siemens.com/entry/cc/en/greencityindex.htm
Intercultural City Index: http://www.culturalpolicies.net/web/intercultural-cities-charts.php

Smart cities index (mid-size cities, 100-500K population): http://www.smart-cities.eu/benchmarking.html

Brookings GlobalMonitor: http://www.brookings.edu/research/interactives/global-metro-monitor-3

McKinsey Urban World: http://www.mckinsey.com/insights/urbanization/urban_world


LSE European Metromonitor: http://labs.lsecities.net/eumm/m/metromonitor

Code/Space book reviews

Rob Kitchin and Martin Dodge’s book ‘Code/Space’, published by MIT Press, is the focus of a book review round-table in the journal Dialogues in Human Geography.  The reviews are by Paul Adams, Aharon Kellerman, Sam Kinsley and Mark Wilson.  The authors then respond and reflect on the book in a short piece, ‘Code/space and the nature, production and enrolment of software‘.

Other reviews of the book include:

  • Taylor Shelton’s review in the Annals of the Association of American Geographers, 102(1): 247-49.
  • Mike Batty’s review in Computational Culture, 1 December 2011.
  • Gwilym Eades’s review in Cartographica, 47(2): 140-1.
  • Francis Harvey’s review in IJGIS, March 2012.
  • Matthew Wilson’s review in Cultural Geographies, 19(3): 418-19.
  • Matthew Zook’s review in Regional Studies, 46(8): 1105-06.
  • Peter Adey’s review in the Journal of Transport Geography, 26: 177-76.

Big data and human geography forum

A forum on big data and human geography has just been published in Dialogues in Human Geography 3(3), November 2013.  It includes a paper by Rob Kitchin on the opportunities, challenges and risks of big data to geographic scholarship.  Here’s a full list of contributions:

Mark Graham and Taylor Shelton: Geography and the future of big data, big data and the future of geography, pp. 255-261,

Rob Kitchin: Big data and human geography: Opportunities, challenges and risks, pp. 262-267

Evelyn Ruppert: Rethinking empirical social sciences, pp. 268-273

Michael Batty: Big data, smart cities and city planning, pp. 274-279

Michael F Goodchild: The quality of big (geo)data, pp. 280-284

Sean P Gorman: The danger of a big data episteme and the need to evolve geographic information systems, pp. 285-291

Sandra González-Bailón: Big data and the fabric of human geography, pp. 292-296

Trevor J Barnes: Big data, little history, pp. 297-302