Mapping of Suspected Burial Sites as an Aid for the Search of the Missing

Mass grave discovered July–August 2014 at Estépar (Burgos), dating from the start of the Spanish Civil War. The grave contains twenty-six republicans who were killed by nationalists in August–September 1936. Photo by Mario Modesto Mata, own work, CC BY-SA 4.0.

While techniques relying on Geographic Information Science (GIScience) have been applied to a number of fields over recent decades, there are still many fields wherein experimental work on applied spatial modeling is just now opening up opportunities for advancing scientific knowledge. For the last six years, I have had the honor of working with some ground-breaking forensic anthropologists to advance scientific knowledge surrounding humanitarian aid issues. Specifically, we examine how to use GIScience and spatial statistics to model wartime killer behaviors. Accurate, precise models of these behaviors may help identify lost burial sites and eventually allow families to recover the remains of loved ones – the missing civilians and soldiers. This work has involved research on acts of violence committed in political emergencies in many complex scenarios such as the former Yugoslavia and in enduring conflicts like Nagorno-Karabakh. I have had the chance to personally work with some amazing collaborators in this field, including Dr. Derek Congram, Hugh Tuller, Matt Vennemeyer, Michael Kenyhercz, and several current and former staff at the ICRC (which financially supported this work). In fact, many of the authors in this edited collection have inspired and informed our research.

The article below represents a small sample of lessons learned from these collaborations and our experimental work. This article is an accepted manuscript for the Forensic Science International special issue on “Humanitarian Forensic Action”. Accepted manuscripts are under a 12-month embargo for many academic sharing sites, but can be immediately shared on the author’s personal website. Links will be updated (DOI, journal publication link, etc.) as available. Additional licensing details for accepted manuscripts to FSI can be found here. © 2017, this manuscript version is made available under the CC-BY-NC-ND 4.0 license.

Recommended Citation (to be updated as additional details become available):
Congram, D., Kenyhercz, M., and Green, A.G. In press. Mapping of suspected burial sites as an aid for the search of the missing. Forensic Science International, Special Issue: Humanitarian Forensic Action.

Commenting and annotation are enabled on this post; you can set up an account and learn more about how to use them here.


We review the current and potential uses of Geographic Information Systems (GIS) and “spatial thinking” for understanding body disposal behaviour in times of mass fatalities, particularly armed conflict contexts. The review includes observations made by the authors during the course of their academic research and professional consulting on the use of spatial analysis and GIS to support Humanitarian Forensic Action (HFA) to search for the dead, theoretical and statistical considerations in modelling grave site locations, and suggestions on how this work may be advanced further.


Forensic Anthropology; Forensic Archaeology; Spatial Analysis; Geographic Information Science; Humanitarian Forensic Action


Rodrigo Guerrero Velasco, the mayor of Cali, Colombia from 1992 to 1994 (re-elected in 2011), is an epidemiologist. As mayor of a city that was at the time plagued with a homicide rate of 124 per 100,000 residents, he adopted an approach to fighting crime that the local press labeled “urban acupuncture” — sticking pins in a map to mark crimes, particularly homicides. This “hot spot” mapping (which is now routinely digital and GIS-based) allowed the municipal authorities to focus resources on the “sick” areas of the city [1]. Guerrero Velasco understood the value of visualisation and of data-driven inquiry. During his tenure as mayor and for two years following, the homicide rate in Cali dropped significantly [2].

One of several maps that were exhibits in the genocide trial of General Ratko Mladic at the United Nations International Tribunal for the former Yugoslavia highlighted the spatial relationship of schools with mass execution sites (Figure 1). The relationship might seem contrived unless you know that up to 8,000 men and boys from in and around Srebrenica were detained for several days before being executed. Because most of the buildings in the area had limited capacity, the detentions were mostly in schools, an agricultural warehouse, and a cultural centre. Some of those who were held prisoner and survived the killings were able to testify about the mass executions at and around these detention centres. In most cases, victim bodies were transported in trucks from execution sites to nearby burial sites. Understanding the spatial dynamics and logistics of detentions, which includes knowing the boundaries of the area under control of those responsible for the subsequent killings (the area marked “RS” in Figure 1), was important in the eventual discovery of victim burial sites [3].

Figure 1. Map submitted as evidence in the trial of Ratko Mladic, Bosnian-Serb General, showing territorial boundaries, schools used as detention sites and mass execution sites. Evidence Reference Number 0706-7941 in Prosecutor v. Mladić, IT-09-82.

When mass fatalities occur due to natural disasters or armed conflict, official resources for interring the dead and investigating the missing often become overwhelmed, forcing improvisation in both the statutory and customary treatment of the dead and missing. In cases including illegal killings, burial customs may be deliberately violated either as a means of concealing evidence (i.e., victim bodies) of crimes or as a means of disrespecting the victims and their communities. In these scenarios, the bodies of the dead are often buried anonymously, transforming them into “missing persons”. For those seeking the missing, understanding situational variability in burials and deviations from customary and statutory burial practices is paramount. Knowing the circumstances of disappearance and death as well as those responsible can help us deduce where we ought to be looking for the bodies of those who are missing.

In this article, we emphasize the utility of spatial thinking and analysis, approaches typically eschewed in favour of oral testimony and written documentation. Spatial analysis, in this context, involves visualizing an area of investigation and assessing spatial relationships among variables that influence how and where bodies are buried (or otherwise managed). We introduce some Geographic Information Science (GIScience) tools that enable more effective investigation of the missing, presumed dead. We illustrate these concepts and methods with several cases from our applied research. The aims of our research are to: (1) supplement traditional investigative efforts and (2) explore new means of investigation using GIScience. More than simply introducing concepts and tools, however, we advocate spatial analysis as a more informed way of preparing for disaster to mitigate the social, psychological, and material cost of not knowing the whereabouts of those who have disappeared and are believed to have died.


Typically, those who investigate missing persons cases seek out witnesses. Witnesses describe what they saw, turning memories, in the form of mental images, into words, which are documented and used to guide investigations. In some cases, witness statements lead investigators to a specific place (e.g., a detention, execution, or burial site) and, depending on the perceived reliability of the information, might lead to an excavation in search of a grave (a prospection). Ideally, the information is accurate and precise so that bodies can be found, exhumed, identified and returned to family for culturally appropriate treatment. Yet, sometimes the information provided by witnesses or informants is not reliable and no burial sites are located. New witnesses will then be sought in the hope that they, in turn, will be able to identify a burial location. This methodological loop tends to have diminishing returns until there are no more witnesses with new information and investigators simply stop searching for the missing.

When the search for unmarked burial sites fails, we seldom know what went wrong. Did the witness of a possibly traumatic event simply not remember accurately? Was their information generally correct but imprecise? For example, a grave prospection might have stopped only 30 meters away from the actual burial site. Or perhaps the witness was, out of a misguided desire to help, subconsciously and inaccurately embellishing the facts. Worse still, a witness might have been deliberately deceiving investigators by providing false or inaccurate information. Those of us engaged in these failed searches know how disheartening they can be. Most of us can only imagine what impact failed prospections have on the families who are seeking the missing. Driven by these failures, we seek to develop an alternative search method: one that does not exclusively rely on witnesses who can draw an “x” on a map to indicate a burial site or describe its location. Instead, we examine common patterns in the characteristics of known body disposal locations during times of mass fatalities, when the capacity to register and mark the burial place of the dead is overwhelmed, and in cases of criminal disappearance and death.

Since the early 2000s, geospatial technologies have become embedded in a wide range of academic disciplines, innovative business models, international humanitarian organizations, and civil society groups conducting applied research. Geographic Information Science (GIScience) refers to research that both develops geospatial technology and uses it to create analytical models for scientific research. Geospatial technology refers to a wide array of data-gathering instruments that produce data for Geographic Information Systems (GIS). For example, Convergne and Snyder discuss how geospatial technology has become a strategic and tactical tool for United Nations peacekeeping operations [4]. Other international organizations have explored GIS and other geospatial technology for mapping in humanitarian contexts, for human rights monitoring [5], and in some rare instances for detecting suspected mass grave sites [6, 7]. The non-profit Ushahidi began in 2008 to crowd-source volunteered geographic information (VGI) to map electoral violence in Kenya. In 2010 it managed volunteer efforts to use GIS and crowd-sourced data to map the streets of Port-au-Prince to facilitate the delivery of humanitarian aid in post-earthquake Haiti [8]. Given the wide application of GIS in such scenarios, mapping the victims of such disasters (places last seen alive or seen dead, morgues, hospitals, and body disposal sites) seems a short next step. Identifying the dead is easier when multiple locations can be linked to triangulate data, and there has been a profusion of open government data and of public data generated by near-ubiquitous mobile phones and GPS-enabled cameras. Yet, considerable procedural issues need to be resolved to link such spatial data sets to the expertise and spatial analysis required to interpret them and turn the confluence of data into actionable information.
We are now light years beyond pins on a map, and linking these big data, open data, VGI, and other data sets to the expertise required to handle and interpret them could lead to dramatic changes in the way the authorities (or others) handle the dead in times of disaster and war.

GIS is an effective tool for visually combining multiple sources of data into one coherent image, often depicting latent trends not directly observable from individual data sources. Moreover, witness testimony does not need to be abandoned: it can be coded and incorporated into a geospatial database that maintains the integrity of the accounts as attributes of spatial locations. Mapping burial sites can serve multiple purposes. Sometimes exhumations are not feasible (e.g., ongoing conflict, lack of resources or expertise) or are not desired (e.g., for religious reasons or political sensitivity [9]). Simply plotting grave locations on a map can be critical to returning to that location if conditions change in the future (e.g., an end to the war and the desire for humanitarian exhumation, identification, and repatriation). In addition to burial sites, mapping places of disappearance and death allows the recording of related geographic features and proxy variables. Proxy variables can be generated through spatial analysis of information either from available datasets or from carefully coded witness testimony (distance from road, location last seen alive, polygons of known battle sites, to name a few). In this context, there are two primary purposes of mapping: visual representation and spatial analysis. Sometimes analysis is simply intuitive, as when Guerrero Velasco’s classic crime map of Cali identified clusters of crimes in particular neighbourhoods so that investigative efforts could be focused there. However, GIS and spatial statistics enable much more powerful analysis.

Materials and Methods

Geospatial Tools and Data

At its core, GIS is a sophisticated database that allows for the acquisition, management, analysis, and display of spatial data. Data within GIS are categorized as either vector (points, lines, and polygons) or raster (any image based on pixels, such as a .tiff or .jpeg). Vector data represent discrete features, such as the coordinate point of a grave, a cemetery, a road, or a territorial boundary. A raster dataset is composed of pixels; in this instance, pixels represent grid cells of varying sizes (resolution), each of which has a spatial component and an attribute component. A commonly employed raster dataset is a digital elevation model (DEM), in which each cell’s attribute value is the elevation at that location. A more familiar raster format is .jpeg, where the frame is divided into millions of cells (measured as pixels per inch) and the attribute value for each cell is a colour, thus composing a picture. Unlike traditional databases, the organizing principle for all data within a geospatial database is spatial location.
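The raster concept above can be sketched in a few lines of code: a grid of attribute values plus a "geotransform" that maps cell indices to geographic coordinates, much as GIS raster formats do. This is an illustrative sketch only; the origin, cell size, and elevation values are invented.

```python
# Minimal sketch of a raster "DEM": a grid of elevation values plus a
# geotransform mapping cell indices to geographic coordinates.
# All coordinates and elevations here are hypothetical.

ORIGIN_LON, ORIGIN_LAT = 19.00, 44.50   # upper-left corner of the raster
CELL_SIZE = 0.01                        # cell size in degrees

# A 3x4 raster: each cell's attribute value is an elevation in metres.
dem = [
    [410, 422, 431, 440],
    [405, 418, 427, 436],
    [398, 409, 421, 430],
]

def cell_to_coords(row, col):
    """Return the (lon, lat) of a cell's upper-left corner."""
    return (ORIGIN_LON + col * CELL_SIZE, ORIGIN_LAT - row * CELL_SIZE)

def elevation_at(row, col):
    """Look up the attribute value (elevation) for a grid cell."""
    return dem[row][col]

print(cell_to_coords(2, 3))
print(elevation_at(2, 3))  # 430
```

The same two-part structure (grid of values plus geotransform) underlies real raster formats such as GeoTIFF, where the attribute per cell might equally be slope, land use, or distance to a feature.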

For the purposes of mapping the missing and deceased, points can be plotted as vector objects into a geographic reference system (often latitude and longitude) that is linked to other layers such as political boundaries. Ideally, each point corresponds to an individual or case. For each individual’s point, several non-spatial feature attributes can be recorded to construct a database of the missing. Attributes can include information such as field identification numbers for each missing person, their location last seen alive, political affiliation, civilian status, age, sex, nationality, stature, and identifying markings or characteristics (such as tattoos, dental augmentation, etc.). Attribute tables can be entered into GIS without having associated coordinates. This is particularly important in the event of a grave excavation when a body is identified because the coordinates can then be added to the attribute table.
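The idea of attribute records that exist before coordinates are known can be sketched as follows. The field names and values are purely illustrative, not drawn from any real case database.

```python
# Hypothetical sketch of a missing-persons attribute record in which a case
# can be registered without coordinates, and coordinates are attached later
# when a body is identified during excavation. Field names are illustrative.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class MissingPersonRecord:
    case_id: str
    last_seen_alive: Optional[str] = None              # place, coded from testimony
    sex: Optional[str] = None
    age: Optional[int] = None
    identifying_marks: List[str] = field(default_factory=list)
    coordinates: Optional[Tuple[float, float]] = None  # (lat, lon), unknown at first

rec = MissingPersonRecord(case_id="MP-0042", last_seen_alive="(place name)",
                          sex="M", age=34,
                          identifying_marks=["dental augmentation"])
assert rec.coordinates is None   # record exists before any grave is found

# After excavation and identification, the burial coordinates are added
# and the record becomes a mappable point feature:
rec.coordinates = (44.184, 19.331)
```

In a GIS this corresponds to an attribute table row whose geometry column is filled in only once the burial location is confirmed.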

Geoprocessing steps such as buffering, clipping, and measuring can be conducted on selected or all spatial layers (e.g., topographic maps, road and railway networks, land use, hospitals, cemeteries, military facilities, morgues, military field maps) entered into a GIS for analysis. Such steps allow us to measure the relationships between places and combine data sets. Defining all possible locations of victim remains requires pooling different types of information from various sources. The relevance of relating locations from different types of spatial data layers can be seen in the investigation of Malaysia Airlines Flight MH17. The commercial aircraft, carrying 298 passengers and crew, is believed to have been shot down over eastern Ukraine in 2014. Bodies recovered at the crash site were moved via train by rebel authorities to a nearby town [10]. The spatial relationships of the crash site, rivers, the political boundaries and military-controlled territory within which the crash occurred, a missile launcher position, and the road and train networks all influenced how victim bodies were handled following impact (at multiple locations because of the mid-air explosion and fragmentation).
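The "measuring" step mentioned above can be illustrated with a basic great-circle (haversine) computation: the distance from a suspected burial point to the nearest vertex sampled along a road. The coordinates below are invented for illustration; a production GIS would use proper geodesic libraries and full road geometries rather than sampled vertices.

```python
# Illustrative sketch of a geoprocessing measurement: haversine distance
# from a suspected burial point to points sampled along a road.
# All coordinates are hypothetical.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points on a spherical Earth."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

grave = (44.1840, 19.3310)
road_vertices = [(44.1835, 19.3300), (44.1850, 19.3320), (44.1860, 19.3340)]

# "Distance to road" proxy: minimum distance to any sampled road vertex.
dist_to_road = min(haversine_m(*grave, *v) for v in road_vertices)
print(round(dist_to_road), "metres to nearest sampled road vertex")
```

Buffering is the inverse operation: rather than measuring a distance for one point, a fixed distance (say 20 m) is swept along the whole road layer to define a zone of membership.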

The introduction of spatial data into geodatabases can also enhance our understanding of errors in data collection and processing that might escape attention in conventional non-spatial databases. In the former Yugoslavia, particularly in Bosnia-Herzegovina from 1996 to 2001 and Kosovo in 1999 and 2000, multiple investigative entities operated concurrently [11]. These organizations used different standards of recording information, which caused problems coordinating knowledge and action on the ground. Gathering data on the deaths at Srebrenica in Bosnia was a particular focus of the Office of the Prosecutor (OTP) of the International Criminal Tribunal for the former Yugoslavia, which dedicated resources and expertise to document killings in a very thorough manner. The primary interest of the OTP was to investigate grave violations of international criminal law, including crimes against humanity and genocide. Both of these crimes are demonstrated by systematic killings, which can require viewing victim graves at a smaller (i.e., “zoomed out”) scale, rather than on an individual basis. Other organizations recovering bodies in the Balkans, smaller in personnel and budget, generally operated on a site-by-site basis; their goals were primarily humanitarian and so less concerned with reconstructing the larger temporal and geographic narrative important to a criminal investigation of genocide. As such, far less detail is available from their work that can now be used to analyse patterns and scale of deaths and burial during and following the wars. If we rely on this available geographic data, the more thorough and numerous cases documented in Srebrenica can introduce sampling bias into model design and into the actual analysis of data (i.e., the sample will be skewed).

There are several other data entry inconsistencies that can cause errors in analysis. For example, places might have names that are spelled differently according to the language being spoken or written during the documentation process (e.g., Kosovo or Kosova, Table 1). By giving priority to spatial coordinates, a spatially enabled geodatabase resolves such naming differences.

Albanian        Serbo-Croat
Drenas          Glogovac
Ferizaj         Uroševac
Fushë Kosovë    Kosovo Polje
Kamenicë        Kosovska Kamenica
Rahovec         Orahovac
Skënderaj       Srbica

Table 1. Distinct place names for the same districts according to language in Kosovo/Kosova.
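The coordinate-first principle described above can be sketched as a small gazetteer keyed on rounded coordinates rather than on names, so that records in different languages collapse into one spatial entity. The coordinates below are approximate and the rounding precision is an arbitrary illustrative choice.

```python
# Hypothetical sketch of resolving multilingual place names by keying a
# gazetteer on coordinates rather than names. Coordinates are approximate
# and the rounding precision is an illustrative choice.
def coord_key(lat, lon, precision=2):
    """Round coordinates to build a stable spatial key."""
    return (round(lat, precision), round(lon, precision))

gazetteer = {}

def add_name(lat, lon, name):
    """Attach a place name to the spatial record at these coordinates."""
    gazetteer.setdefault(coord_key(lat, lon), set()).add(name)

# The same district recorded by two teams using different languages:
add_name(42.6230, 20.8940, "Skënderaj")   # Albanian record
add_name(42.6228, 20.8941, "Srbica")      # Serbo-Croat record

# Both names resolve to a single spatial record:
assert gazetteer[coord_key(42.623, 20.894)] == {"Skënderaj", "Srbica"}
```

A real geodatabase achieves the same effect with spatial joins and tolerance distances instead of rounded keys, but the principle is identical: location, not spelling, is the identifier.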

A second data problem relates to quality. Although many free or inexpensive digital maps are available from online repositories, governments, or other organizations, their precision and accuracy can differ greatly. Figure 2 shows a map with road networks from three different sources. One source (with roads drawn using a burgundy line) is much more detailed than the others, but the others do mark some roads that the first does not. These maps can be merged, but doing so takes time and the completeness of the final map might still be lacking.

Figure 2. Three road network maps for the same location, each differing in precision and completeness. One public-source map (“OSM”, which stands for OpenStreetMap, above) is more complete and precise than the others from private sources, but each of the three has roads marked that are absent in the others. Credit: © OpenStreetMap contributors, data available under the Open Database License.

Another benefit of using geospatial tools such as GIS is that they can be deployed at several scales. GIS can support spatial analyses ranging from bone microstructure [12], to mapping body positions within a mass grave to assist with commingling resolution [13], to mapping deposits of bodies within a grave [14], to mapping a site within a geographical context and mapping sites relative to each other [15], as well as graves relative to other points of interest such as schools, as illustrated above.

Ultimately, exploring spatial relationships within particular armed conflict contexts gives a more quantifiable understanding of the dynamics that caused people to go missing. While this can help find the remains of those who have died, it may also support comparisons of the spatial variation and relationships of similar events across distinct countries. A greater understanding of the variation in human behaviour regarding body disposal will help to develop theories that guide further analysis and improve the effectiveness of searches for the missing, identification of remains, and return of the missing to their families for dignified burial.

Models for Locating the Missing

A fundamental concept in modelling human spatial behaviour is that humans interact with their environments in patterned, non-random ways [16-22]. The ways in which humans have predictably exploited their environments have been the basis of archaeological site prediction, employed by Cultural Resource Management firms for decades [23]. Congram and Kenyhercz have argued that human behavioural ecology, specifically optimal foraging theory (OFT), can be used as a theoretical framework for applying site prediction modeling to aid in locating the missing [24]. Briefly, OFT hypothesizes that natural selection favours animals whose behavioural strategy maximizes their energy intake per unit of time spent foraging for resources. Congram and Kenyhercz extended OFT to understand the nature of clandestine body disposal by positing that clandestine burial location is a function of time spent with remains (analogous to foraging time) and avoiding detection (selection), which is bounded by culture, local environments, or both [24].

As mentioned above, there are many spatial variables available to model the location of potential sites; however, few, if any, are going to be explicitly related to the death event or subsequent burial activity. Instead, spatial variables are often used as proxies for human behaviour, or cognitive decision-making. Whitley identified three different classes of proxy variables that inform cognitive decision-making: 1) direct causal reference; 2) indirect causal reference; and 3) non-causal reference [25]. Direct causal references tie an environmental, or spatial, variable to some sort of cognitive behaviour. For example, a viewshed is a map created with GIS that shows the area visible from a fixed location. The visible area can directly impact decision-making, which, in the current context, might relate to avoiding detection while disposing of remains. Indirect causal references do not explicitly influence cognition, but will affect the way that latent variables may influence decision-making. To illustrate, distance maps, or cost-distance maps (distance maps that are bounded by other variables such as slope, natural or cultural barriers, etc.), could be used as a proxy for familiarity with a region, on the assumption that people are more familiar with their immediate vicinity than with regions further away, thus unconsciously bounding their potential disposal sites. Lastly, non-causal references use proxies for some sort of behaviour even though there is no direct relationship. Take the viewshed example again. The line of sight from a particular vantage point has a direct, measurable relationship to what an individual at that point can see. However, the same vantage point might also allow an individual to hear better or worse, which cannot be directly measured in GIS.
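A distance map, the simplest of the indirect proxies above, can be sketched by brute force over a tiny grid: for each cell, compute the straight-line distance to the nearest road cell. The grid is invented, and a real cost-distance surface would additionally weight movement by slope, barriers, and other variables rather than using plain Euclidean distance.

```python
# Simplified sketch of an "indirect causal reference" proxy: a
# distance-to-road raster computed by brute force over a tiny grid.
# 1 = road cell, 0 = non-road cell (hypothetical grid, unit cell size).
import math

roads = [
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 0, 0, 0],
]

road_cells = [(r, c) for r, row in enumerate(roads)
              for c, v in enumerate(row) if v == 1]

def distance_raster(grid):
    """Euclidean distance from each cell to the nearest road cell."""
    return [[min(math.hypot(r - rr, c - cc) for rr, cc in road_cells)
             for c in range(len(grid[0]))]
            for r in range(len(grid))]

dist = distance_raster(roads)
assert dist[0][2] == 0.0   # a road cell is at distance zero
assert dist[0][0] == 2.0   # two cells west of the road
```

GIS packages compute the same surface far more efficiently with distance transforms, but the output is identical in kind: a raster whose cell values can serve as a proxy variable in later modelling.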

Predictive modelling of site locations is both an inductive and deductive process. The inductive aspect starts with compiling a database of known site locations and plotting them in a common coordinate system. Simply displaying spatial data layers in an interactive GIS facilitates the identification of spatial patterns using the human eye and expert knowledge of events and geographic areas, just as Guerrero Velasco used pins on a paper map to elucidate homicide clusters in Cali. Spatial data analysis recognizes some unique barriers to analysis and accounts for problems such as the Modifiable Areal Unit Problem (MAUP). The MAUP occurs when point-based data are aggregated into new levels of analysis (often polygons), as is often done when district reports of criminal events are aggregated into table formats. The aggregation of points (such as burglaries) and subsequent averaging of attribute data (such as the cost of items stolen) obscures spatial patterns of behaviour, especially when clusters of events cross over and are divided into neighbouring areas (such as police districts). In other words, burglaries that straddle a boundary might not be seen as related if they are only examined within individual police district boundaries. The spatial display of data in GIS, even without statistical modeling, allows a more effective visualization and analysis of both inductive and expert knowledge. Identified patterns might then be investigated through fieldwork or through statistical modelling.
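The MAUP can be demonstrated with a toy example: a tight cluster of events straddling a district boundary disappears once the points are aggregated into per-district counts. All of the event coordinates below are invented.

```python
# Toy illustration of the Modifiable Areal Unit Problem (MAUP): a tight
# cluster of events straddling the district boundary at x = 5 looks
# unremarkable after aggregation into district counts. Data are invented.
events = [(4.8, 2.0), (4.9, 2.1), (5.1, 2.0), (5.2, 1.9),   # one real cluster
          (1.0, 8.0), (8.5, 7.0)]                           # scattered noise

def district(x):
    """Assign each event to a district by the boundary at x = 5."""
    return "West" if x < 5 else "East"

counts = {"West": 0, "East": 0}
for x, y in events:
    counts[district(x)] += 1

# Aggregated view: three events per district, so no "hot" district stands
# out, even though four of the six events lie in a tiny boundary-straddling
# area that point-based analysis would immediately flag.
assert counts == {"West": 3, "East": 3}
```

Working with the original points (or with boundaries drawn differently) preserves the cluster that the district-level table hides.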

Statistically significant clusters can be identified through a variety of analyses, including Getis-Ord Gi* (Hot Spot) analysis, Ripley’s K-function cluster analysis, and Average Nearest Neighbour tests. These analyses examine spatio-temporal relationships of event observations, as well as the spatio-temporal clustering of observation attributes (for example, crime rates or event magnitude), for statistically significant clusters. After significant clusters have been identified, the spatial relationship between environmental and cultural variables can be tested: distance-to-road (or water, railways, battlefields, site-last-seen, etc.), slope, viewsheds, elevation, surface geology, land use, population density, and demographic and economic distribution maps, to name a few. Using GIS, variable values can be assigned to each of the significantly clustered points. Further, to test for significance, one can compare the variable values at each known site location to those at random site locations. This will show whether the environmental and cultural variables exhibit any significant pattern particular to the known grave sites when compared to a random distribution of places where there are no sites. Using GIS, it is possible to set a study area boundary and populate it with a random distribution of points. Environmental and cultural variable values can then be measured for each of the random “non-sites”. To test for significance of continuously distributed values (e.g., distance to road, elevation), a simple two-tailed t-test can be used when the assumption of a normal distribution is met. When normal distributions cannot be assumed or are violated, Monte Carlo simulations (or random permutations) are often used to establish a p value for hypothesis testing involving continuous data. For non-continuous data (land use, surface geology, demography, etc.), it is necessary to use nonparametric significance tests such as the Chi-Square Goodness of Fit test and the Mann-Whitney U test. In both the parametric and nonparametric cases, a p value is calculated. Typically, if the p value is greater than 0.05, the distribution of the variable between sites and non-sites is considered not significant. Put another way, if p is greater than 0.05, the distribution of a particular variable is not significantly different between actual grave sites and randomly distributed non-sites, which means that that particular variable alone has little analytical value in predicting grave site locations.
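The Monte Carlo permutation approach mentioned above can be sketched as follows: shuffle the site/non-site labels many times and ask how often a difference at least as extreme as the observed one arises by chance. The distance values are invented for illustration and deliberately well separated.

```python
# Sketch of a Monte Carlo permutation test on one continuous proxy variable
# (distance-to-road, in metres) measured at known sites and at random
# non-sites. All values are invented for illustration.
import random

sites     = [12.0, 18.5, 9.0, 25.0, 15.5, 11.0]       # graves hug the roads
non_sites = [140.0, 95.0, 210.0, 60.0, 185.0, 120.0]  # random background

def mean(xs):
    return sum(xs) / len(xs)

observed = abs(mean(sites) - mean(non_sites))

random.seed(42)  # fixed seed so the sketch is reproducible
pooled = sites + non_sites
n, extreme, n_perm = len(sites), 0, 9999
for _ in range(n_perm):
    random.shuffle(pooled)  # randomly relabel the pooled measurements
    diff = abs(mean(pooled[:n]) - mean(pooled[n:]))
    if diff >= observed:
        extreme += 1

# p value: proportion of random labelings at least as extreme as observed.
p_value = (extreme + 1) / (n_perm + 1)
print(p_value)  # a small p suggests the variable separates sites from non-sites
```

Because the permutation test makes no normality assumption, it is a natural fallback when the t-test's assumptions are violated.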

Once equipped with the environmental and cultural variables that significantly contribute to site location, model building relies on deductive reasoning. Consider the following hypothetical scenario:

A battle takes place between warring factions in a town that is bordered by a river on the north side. Three kilometers north of the town, the main road rises to cross a high-elevation ridge. During the battle, one party suffers dozens of fatal casualties and, with few intact vehicles, makes a harried retreat across the river, destroying the only bridge as they go. You are part of a post-conflict commission helping to facilitate the location and repatriation of the dead and decide to create site prediction maps. You have compiled your datasets of known burial locations and all available spatial proxy variables. You have noticed that elevation, particularly high elevation, is significantly associated with grave locations, that 50% of casualties have been buried in cemeteries, and that graves are almost always located within 10 kilometers of the place where a person was killed. Would you include the ridge in your analysis?

Inductively, the ridge is very attractive for burial locations: it is restricted geographically (narrow), has limited visibility as shown by a viewshed analysis, is at high elevation, a town at the southern edge of the ridge has a cemetery, and it is within the ten-kilometer range of most burials relative to fatality locations. However, it is here that deductive reasoning is paramount in creating a location model. The validity of a model rests very much on the context of the battle: the only bridge by which the dozens of dead could have been transported toward the ridge was blown up by the survivors during the retreat. It is extremely unlikely that the dead could have been taken north across the river for burial. As Kvamme pointed out, archaeological sites (particularly Native American) can be accurately predicted based on distance to water (rivers and streams) and surface geology (fertile loess), but so can the distribution of elm trees [23]. Thus, the difference between spurious correlation and causation rests entirely on context-driven deductive reasoning.

To produce a site prediction map, all of the variables must be considered together. The end product of site prediction modelling is not a map that explicitly shows the location of individual graves; rather, it identifies high- and low-probability areas where graves are likely to be, given the commonalities amongst known site locations (inductive) and the context of the conflict (deductive). There are several methods available to produce site location prediction maps: weighted map-layer, binary logistic regression, maximum entropy modelling, and agent-based modelling, to name a few. The practical applicability of each modeling method will largely be site, scale, and context dependent. Site prediction modelling is based on the concept of raster math. As mentioned before, raster data is one of the two data types available in GIS, wherein a grid is created that describes location and some other attribute (elevation, colour, distance, slope, demography, etc.). To create the model, all explanatory, or predictor, variables must be converted into raster datasets.

The weighted map-layer approach entails reweighting each variable within each raster dataset, that is, changing the values of each grid cell so that values significantly related to grave location receive a high positive value and those that are not receive a low, or even negative, value. For example, if 80% of all known sites are within 20 m of a major road, a 20 m buffer can be created around all major roads and assigned a positive value to reflect a greater likelihood of containing a burial, whereas greater distances receive a lesser value, or even a negative value (say, in the middle of a desert that is impractical to access). Each raster dataset is reweighted with this sort of criterion and then all of the variable layers are “added” together (a process included in raster math). This process can be visualized by thinking of each raster dataset as a simple arithmetic table whose rows are organized by explicit spatial location (coordinates). The resulting numbers are then assigned a colour scheme to visualize the continuous distribution of low and high values. The end result is a new raster that shows high- and low-probability locations in continuous form. Remember, each of the original variables was reweighted so that higher positive numbers mark locations sharing many variables in common with known sites. To return to our hypothetical example with the destroyed bridge, it is during this reweighting step that the area of the high-elevation ridge is excluded from analysis, or assigned negative values, because, given the context of the conflict, it was unreachable.
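The weighted map-layer combination can be sketched in a few lines of "raster math": each reweighted proxy raster is summed cell by cell to give a relative suitability surface. The grids and weights below are hypothetical, chosen only to echo the road-buffer and destroyed-bridge examples above.

```python
# Minimal sketch of weighted map-layer combination ("raster math"): each
# reweighted proxy raster is summed cell by cell to produce a relative
# probability surface. Grids and weights are hypothetical.

# Reweighted "within 20 m of a major road" raster: +2 inside the buffer,
# -1 outside it.
road_buffer = [
    [ 2,  2, -1],
    [ 2, -1, -1],
    [-1, -1, -1],
]

# Reweighted elevation raster: +1 at favourable elevations, -3 where the
# deductive context (e.g., an unreachable ridge) rules the area out.
elevation = [
    [ 1,  1,  1],
    [ 1,  1, -3],
    [ 1, -3, -3],
]

def raster_add(*rasters):
    """Cell-by-cell sum of same-shaped rasters."""
    rows, cols = len(rasters[0]), len(rasters[0][0])
    return [[sum(r[i][j] for r in rasters) for j in range(cols)]
            for i in range(rows)]

suitability = raster_add(road_buffer, elevation)
# Highest values mark cells sharing the most favourable variables.
assert suitability[0][0] == 3    # in the buffer and at favourable elevation
assert suitability[2][2] == -4   # excluded by context and far from roads
```

Colour-coding the resulting grid from low to high values yields exactly the continuous probability surface described above.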

A less subjective way to create a site prediction map is through binary logistic regression. Logistic regression looks for the optimal split between two classes of the response variable (the one being predicted), which in the current case are known grave sites and randomly generated non-sites. The values of each variable included in the analysis are subjected to the logistic regression, wherein coefficients for the explanatory (predictor) variables are automatically generated to best separate the two responses (grave site vs. non-site). Theoretically, logistic regression scores can range from negative to positive infinity, with a cut-off typically designated at 0. Put simply, the value of each variable is multiplied by its coefficient and the products are summed to produce a logistic regression score – observations below the cut-off are classified as non-sites, and those above it as sites. As a product, the logistic regression can tell you how accurate the model is at distinguishing grave sites from non-sites as a total correct classification, and also which variables are significant in the analysis and which are not. The model can then be fitted to the known data to show each individual site’s logistic regression score. These scores can be reloaded into GIS and displayed with a colour gradient to show high and low probability areas as well as demarcate the statistical cut-off between known sites and non-sites.
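The fitting and classification steps can be illustrated with a small sketch using scikit-learn (assumed available), on synthetic site/non-site data; all values are hypothetical and not drawn from the authors' cases:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 100

# Synthetic training data: 100 known grave sites and 100 randomly
# generated non-sites, each described by two predictors, e.g.
# distance to road (m) and slope (degrees). Sites tend to sit near
# roads on gentle slopes; all numbers are hypothetical.
sites = np.column_stack([rng.normal(15, 10, n), rng.normal(5, 3, n)])
non_sites = np.column_stack([rng.normal(200, 80, n), rng.normal(18, 8, n)])
X = np.vstack([sites, non_sites])
y = np.array([1] * n + [0] * n)  # 1 = grave site, 0 = non-site

# Coefficients are generated automatically to best separate the two
# responses; score() reports the total correct classification.
model = LogisticRegression(max_iter=1000).fit(X, y)
accuracy = model.score(X, y)
print(accuracy, model.coef_)

# The fitted model can then score any raster cell; those scores are
# what would be reloaded into GIS as a colour gradient.
cell = np.array([[20.0, 4.0]])  # near a road, gentle slope
print(model.predict_proba(cell)[0, 1])  # P(grave site) for that cell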

Maximum entropy modeling (Maxent) was developed to make inferences from presence-only data, which in the current case means the known distribution of actual grave sites. A target probability distribution is estimated by locating the maximum entropy probability distribution that is bounded by a set of constraints [26]. The sample points in this case are the actual grave sites, and the environmental, spatial, and cultural proxy variables are the constraining features. Using Maxent, it is unnecessary to generate a distribution of non-sites because the probability distribution is generated from the set of features that all of the known sites have in common. The end result is the same – a map set to some colour scheme depicting areas of high and low probability.
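The underlying idea, the maximum-entropy distribution subject to feature-mean constraints [26], can be sketched on a toy grid. This assumes numpy/scipy and is only a conceptual illustration, not the Maxent software itself; the cells and features are hypothetical:

```python
import numpy as np
from scipy.special import logsumexp
from scipy.optimize import minimize

# Toy landscape: 50 candidate cells, each with two standardised
# (hypothetical) features, e.g. distance to road and slope.
rng = np.random.default_rng(1)
F = rng.normal(size=(50, 2))

# Presence-only data: the mean feature values over the known grave
# sites (here the first five cells play that role) act as constraints.
f_bar = F[:5].mean(axis=0)

# Maxent dual problem: minimise log Z(lam) - lam . f_bar. Its solution
# gives the Gibbs distribution p_i ~ exp(lam . f_i), the maximum
# entropy distribution whose expected features match the presence data.
def dual(lam):
    return logsumexp(F @ lam) - lam @ f_bar

lam = minimize(dual, x0=np.zeros(2)).x
p = np.exp(F @ lam - logsumexp(F @ lam))  # normalised probabilities

# Expected features under p now match the presence feature means, and
# p can be mapped back onto the grid as a probability surface.
print(p @ F, f_bar)
```

No non-sites are ever generated: the distribution is pinned down entirely by what the presence cells have in common, which is the practical appeal of Maxent described above.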

Agent-based modeling (ABM) is another approach to modeling probable grave site locations. Also sometimes called individual-based modeling, ABM models contain agents, decision-making heuristics, adaptive learning rules, a topology, and environmental objects (often serving as barriers). The possible spatial behaviours of agents (individuals) are constrained or enhanced by these factors in models run as GIS-based computer simulations. So, for example, if a researcher knows that movement is largely reliant on mechanized vehicles, roads can be created as an important topology vector. If most burials are conducted in clandestine locations, the researcher can attempt to operationalize those clandestine areas as spatial goals by creating limits to movement in non-clandestine (e.g., highly visible and populated) areas. Running these spatially aware models over many simulated agent actions, with randomness introduced through Monte Carlo simulation, produces probability maps of possible grave locations.
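A highly simplified ABM sketch follows, with toy parameters throughout: a hypothetical road corridor constrains agent movement, and Monte Carlo repetition of agent runs accumulates into a probability surface:

```python
import numpy as np

rng = np.random.default_rng(2)
GRID = 20  # toy 20x20 landscape

# Topology: a hypothetical road corridor running down column 10.
# Agents relying on mechanized transport prefer to stay near it.
near_road = np.tile(np.abs(np.arange(GRID) - 10) <= 1, (GRID, 1))

def run_agent(steps=30):
    """One agent: a random walk from the road into the landscape.
    Returns the cell where the agent stops (a candidate burial cell)."""
    r, c = 0, 10  # start on the road at the grid edge
    for _ in range(steps):
        dr, dc = rng.choice([-1, 0, 1]), rng.choice([-1, 0, 1])
        nr = min(max(r + dr, 0), GRID - 1)
        nc = min(max(c + dc, 0), GRID - 1)
        # Decision heuristic: moves off the road corridor succeed only
        # occasionally (clandestine detours are slow and costly).
        if near_road[nr, nc] or rng.random() < 0.3:
            r, c = nr, nc
    return r, c

# Monte Carlo simulation: many independent agent runs accumulate into
# a probability surface over possible grave locations.
counts = np.zeros((GRID, GRID))
for _ in range(2000):
    counts[run_agent()] += 1
prob = counts / counts.sum()
print(prob.sum())  # ~1.0: a probability map over the grid
```

Real ABM work would use vector road networks, terrain costs, and empirically calibrated heuristics, but the structure is the same: agents, movement rules, a topology, and repeated stochastic runs.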


In our preliminary work and research using GIS to study geographic relationships related to conflict graves in several countries, we have made several useful observations. For this paper, we will list some of the most important and common considerations that are necessary for effective mapping and analysis, with anonymized data to illustrate these.

  1. Some missing persons’ “last seen” locations coincide with conflict event locations (e.g., shellings or battles) and/or grave locations. This often reflects soldiers who died in battle and were buried at that same location; in our studies across different countries, this is a fairly common practice. On other occasions, depending on the resources available to move victims’ bodies, the dead are buried at the nearest local cemetery. Other instances of this location overlap result from killings of individuals, usually non-combatants, in homes. It is not uncommon for victims to be killed at a house and for those responsible for the killing to leave the site without burying the victim. In these instances, friends, family, or neighbours of the victims often bury the victim on the same property. In both of these cases we see a coincidence of event locations, but the circumstances of death are quite distinct. Nevertheless, from the perspective of victim body recovery and identification, this co-location is very important, and can be illustrated very clearly using GIS. We have observed both of these circumstances in our work in various countries in Central Africa, East Asia, and Europe.
  2. Some missing persons’ “last seen” locations are reported as being distant from any known conflict event, killing, or burial locations. The two most plausible explanations for this spatial discrepancy both involve incomplete data. The first is that people were reported last seen at specific locations, but there have been no related reports of their nearby death and burial. In this instance the “last seen” and “death/burial” locations are near one another, but there is information only on the first. The second is that the “last seen” information about a missing person comes from someone who last saw them some time (and distance) from their death. This can happen when, for example, someone is mobilized for battle and their family members report them leaving their home on a particular date. Several days might pass before the missing person is actually engaged in battle, and the distance travelled between their home and the place of battle/death/burial may be great. Linking these locations is not easily done with maps or spatial analysis alone. Instead, more information is needed, along with logical (deductive) inferences, to plot a more accurate place of disappearance. In this example, useful information about a person’s actual place of disappearance – to distinguish it from a “last seen” location – would most likely come from their military unit and places of combat engagement that post-date their last seen location as reported by family members. Although the most useful information related to the eventual identification of the missing person will often come from family members (e.g., ante-mortem data and comparative DNA samples), information on the missing person’s burial location is more likely to come from another source (e.g., co-combatants, military records). Coordinating these complementary types of data is something easily done with GIS.
  3. Local expertise, specifically language and cultural expertise, is very important at the stages of data collection and interpretation of analytical results. The biggest element of this with respect to mapping the missing is probably language. Territories with multiple languages, data collected from sources in different languages, and maps with distinct nomenclature are all common problems. An analysis of closest roads to known burial sites in two neighbouring countries showed the greatest number to be tertiary roads in one country and residential roads in the other (Figure 3). Looking over maps of the respective countries, it appears that the discrepancy is attributable, at least in part, to labelling conventions, rather than different burial patterns. In other words, roads labelled as “residential” in one country are labelled as “tertiary” in the other. The number of road categories is also greater in the one country than in the other, which will impact one’s perception of existing burial site location patterns. Having local personnel to interpret why certain patterns exist can be critical in the development of site location models. In one country, we noted that disappearance/death locations were hundreds of kilometers from the burial locations. It did not take an expert to detect that this was highly unusual, but a person who had local knowledge was able to tell us that during a ceasefire, those bodies had been repatriated to families far from the front lines. Identifying these anomalous cases is important because they skew models and interpretations.

Figure 3. A table showing the classifications and proportions of the road types nearest to known graves in two neighbouring countries.

Discussion and Recommendations

At times of mass fatalities, government resources are often overwhelmed, particularly during armed conflict. Traditional roles and customs change, often affecting the disposal of the dead. Record keeping might be ad hoc, unsystematic, or even deliberately avoided. Non-governmental and intergovernmental organizations might be active in these places and work to support government efforts to record the dead. However, a lack of standard operating procedures, disparate resources and mandates, and the involvement of multiple organizations often result in a chaotic, incomplete corpus of information that complicates the recovery, identification, and return of the dead to their families. In response to these challenges, the ICRC has acquired forensic capacity and developed HFA to assist in the proper recovery, documentation, and identification of the dead in armed conflicts and catastrophes.

The identification and return of the dead to their families is universally important for many reasons. Symbolic memorialization of the dead is part of this process and prevalent across time, regions, and cultures. As discussed by Barceló and Pallarés, “production, distribution and consumption take place in a physical space, and as a consequence, this physical space becomes transformed, socialized” [27]. In other words, these places take on societal significance. Different studies debate how far back deliberate burial goes in human evolutionary history, with recent discoveries suggesting the practice may be as old as two million years [28]. Some of the debate considers whether burial was in fact ritual or simply a way of discouraging dangerous scavengers from discovering a dead body, thus jeopardizing the living in the area [29, 30]. Modern humans have made the ritual treatment of the dead, most commonly burial, almost a universal practice.

The advancement of technology has made GPS-enabled mobile phones and cameras easily accessible to laypeople. Their widespread use, in conjunction with satellite imagery, allows for the recording of death scenes and burial sites in real time, even (or especially) in times of war. The ubiquity of GPS-enabled technology can potentially alleviate many of the issues encountered in our experience – such as inconsistent, vague, or unreliable witness testimony about times and places. Now, instead of relying solely on post hoc interviews, images can be taken with date, time, and location all stored in the metadata. This by no means will replace witness accounts. While photographs and videos are useful tools, they are only part of the overall picture.

The other side of this increasingly available technology and ease of distribution is the problem of data protection. This has been a principal concern and significant obstacle in our recent consulting on grave site location analysis. Donating GPS-enabled phones or tablets to investigators of missing persons (or even to soldiers during conflict or post-battle surveys) to document disposal locations of the dead is easy and relatively inexpensive. Information related to the dead, however, can be extremely sensitive. Recorded data could include the faces and names of witnesses (including those responsible for deaths), videos of senior officers ordering executions, vehicle license plates (possibly belonging to civilians whose vehicles were sequestered by the military for the transportation of victims), images of victims’ bodies, families of victims, and so on. The ease of electronic data storage and transmission makes such data more difficult to protect, as massive data leaks in recent years clearly demonstrate.

Recognizing that we can now better record and understand how the dead are treated, and how this treatment often contravenes states’ legal obligations towards the dead, we make two simple recommendations, which we believe will improve the resolution of cases of missing persons, presumed dead: 1) think spatially; and 2) map the dead. The first suggestion is conceptual and the second is practical. Organizations tasked with the humanitarian or judicial investigation of the dead should think spatially. They should consider how people understand and use space with respect to the treatment of dead bodies. These organizations should also equip themselves with very basic tools such as GPS-enabled mobile phones or tablets in order to improve the recording of actions related to the dead. This can extend from the individual death scene during routine investigation to mass fatality incidents. Further, keeping a repository of spatial data in conflict zones is useful – particularly road network maps, aerial images, places of interest (e.g., cemeteries), and DEMs. Visually mapping sites related to death is not a new concept – we do it all the time when we mark cemeteries. However, standard expectations, norms, and resources change at times of mass fatalities, and the cartographic documentation of the dead is often neglected.

We have discussed methods for conducting spatial analysis and generating site prediction models of the dead in disasters and armed conflicts. However, in conclusion, the following points must be stressed:

  1. Context is key. There will never be one model that can adequately encapsulate the intricacies of clandestine body disposal. A thorough understanding of the conflict and culture are necessary to draw the most meaningful conclusions from a model. 
  2. Data quality. Accurate geographic coordinates are important, but meaningful interview questions that can aid in the search for the missing are just as important as, if not more important than, extremely accurate geospatial locations.
  3. Site prediction maps do not depict the absolute locations of graves. Rather, they show where graves are likely to be, given the commonalities those locations share with known grave sites. These maps are not to be taken as gospel, but as guides to help focus search and recovery efforts.

Using GIS and spatial scientific principles to analyse spatial patterns in the post-hoc search for missing persons shows great potential. Although this work is in its early stages, we are already able to identify some lessons learned that can inform best practices. We hope that sharing such early work will contribute to the further development of these methods as a novel and useful tool for HFA.


[1] R. Guerrero Velasco, Big data are reducing homicides in cities across the Americas. Sci Am (2015) (accessed 14.10.16).

[2] R. Guerrero, A. Concha-Eastman, An epidemiological approach for the prevention of violence. The DESAPAZ program in Cali, Colombia, J Health Pop in Dev Countries 4(1) (2001).

[3] Srebrenica; Genocide in eight acts. (accessed 14.10.16).

[4] E. Convergne, M.R. Snyder, Making maps to make peace: geospatial technology as a tool for UN peacekeeping, Int Peacekeeping 22(5) (2015) 1-22.

[5] Human rights applications of remote sensing, The Geospatial Technologies and Human Rights Project, American Association for the Advancement of Science, Scientific Responsibility, Human Rights and Law Program (2013) 31-33.

[6] Provincial Killing Fields Maps: Genocide Sites (1975-1979). Cambodian Genocide Program, Yale University. (accessed 14.10.16).

[7] M. Madden, A. Ross, Genocide and GIScience: Integrating personal narratives and Geographic Information Science to study human rights, The Pro Geog 61(4) (2009) 508-526.

[8] M. Zook, M. Graham, T. Shelton, S. Gorman, Volunteered geographic information and crowdsourcing disaster relief: A case study of the haitian earthquake. World Med Health Pol 2(2) (2010) 7-33.

[9] A. Rosenblatt, Digging for the Disappeared. Stanford University Press, Palo Alto, CA, 2015.

[10] BBC, MH17 plane crash: Train with bodies leaves Ukraine station, 2014 (accessed 14.10.2016).

[11] J.P. Baraybar, V. Brasey, A. Zadel, The need for a centralised and humanitarian-based approach to missing persons in Iraq: An example from Kosovo, Int J Hum Rights 11(3) (2007) 265-274.

[12] D.C. Rose, A.M. Agnew, T.P. Gocha, S.D. Stout, J.S. Field, Technical note: The use of geographical information systems software for the spatial analysis of bone microstructure, Am J Phys Anth 148 (2012) 648-654.

[13] H. Tuller, U. Hofmeister, Spatial analysis of mass grave mapping data to assist in the reassociation of disarticulated and commingled human remains, in: B.Adams, J.Byrd (Eds.), Commingled Human Remains: Methods in Recovery, Analysis, and Identification, Academic Press, San Diego, CA, 2014, pp. 7-31.

[14] H.H. Tuller, Mass graves and human rights: Latest developments, methods, and lessons learned, in: D.C. Dirkmaat (Ed.), A Companion to Forensic Anthropology, First Ed., Wiley-Blackwell, Hoboken, NJ, 2012, pp. 157-74.

[15] D.Congram, A.G. Green, H. Tuller, Mapping the missing: A new approach to locating missing persons burial locations in armed conflict contexts, in: D.Congram (Ed.), Missing Persons; Multidisciplinary Perspectives on the Disappeared, Canadian Scholars’ Press, Inc., Toronto, Canada, 2016, pp. 207-223.

[16] R. Brandt, B.J. Groenewoudt, K.L. Kvamme, An experiment in archaeological site location: Modeling in the Netherlands using GIS techniques, World Arch 24(2) (1992) 268-282.

[17] J. Connolly, M. Lake, Geographical Information Systems in Archaeology. Cambridge University Press, Cambridge, UK, 2006.

[18] D.C. Kellogg, Statistical relevance and site locational data, Am Antiquity 52(1) (1987) 143-150.

[19] M.W. Mehrer, K.L. Wescott, GIS and Archaeological Site Location Modeling, CRC Press, Boca Raton, FL., 2006.

[20] S.J. Shermer, J.A. Tiffany, Environmental variables as factors in site location: an example from the upper Midwest, Midcontinental J Arch 10(2) (1985) 215-240.

[21] K.L. Wescott, R.J. Brandon, Practical Applications of GIS for Archaeologists, Taylor & Francis, Philadelphia, PA, 2000.

[22] P.E. Woodman, M. Woodward, The use and abuse of statistical methods in archaeological site modeling, in: D.G.J. Wheatley, S. Poppy (Eds.), Contemporary Themes in Archaeological Computing, Oxbow Books, Oxford, UK, 2002.

[23] K.L. Kvamme, The fundamental principles and practice of predictive archaeological modeling, in: A. Voorips (Ed.) Mathematics and Information Science in Archaeology: A Flexible Framework, Holos-Verlag, Bonn, Germany, 1990, pp.  257-295.

[24] D.R. Congram and M.K. Kenyhercz, Thinking spatially: human behavioral ecology and forensic anthropology, poster presentation, the 85th Annual Meeting of the American Association of Physical Anthropologists, Atlanta, GA, 2016.

[25] T.G. Whitley, Spatial variables as proxies for modeling cognition and decision-making in archaeological settings: a theoretical perspective. Internet Arch 16(2) (2004).

[26] S.J. Phillips, R.P. Anderson, R.E. Schapire, Maximum entropy modeling of species geographic distributions, Ecol Modeling 190 (2006) 231-259.

[27] J.A. Barceló, M. Pallares, Beyond GIS: The archaeology of social spaces, Archeologia e Calcolatori 9 (1998) 47-80.

[28] P.S. Randolph-Quinney, A new star rising: Biology and mortuary behaviour of Homo naledi. S Afr J Sci. 111 (2015) 1-4.

[29] R.H. Gargett, Middle Palaeolithic burial is not a dead issue: the view from Qafzeh, Saint-Césaire, Kebara, Amud, and Dederiyeh, J Hum Evol 37 (1999) 27-90.

[30] H.L. Dibble, V. Aldeias, P. Goldberg, S.P. McPherron, D. Sandgathe, T.E. Steele, A critical look at evidence from La Chapelle-aux-Saints supporting an intentional Neandertal burial, J Arch Sci 53 (2015) 649-657.

What is Open Pedagogy?

I was invited to contribute to the Year of Open’s April Open Perspective: What is Open Pedagogy? published by the Open Education Consortium under a CC-BY license (10 April 2017). Below, you can read my contribution; you can also find it in context with the other diverse, fantastic pieces from David Wiley, Maha Bali, and Robert Schuwer on the Year of Open website.

What is Open Pedagogy?

At its core the term “open pedagogy” expresses the aspiration to improve learning processes through more open teaching practices. So, I believe open pedagogy encapsulates the theories and the innovative, applied strategies that support that aspiration.

That being said, I am not sure that open pedagogy can be neatly defined. There are, for example, at least two contemporary understandings of open pedagogy. One contemporary definition focuses on the use of openly-licensed content in tandem with open, effective teaching strategies, while another focuses on a more general philosophy of openness in all elements of the teaching process, including open planning, open products, and open post hoc reflection. As well, in the 1960s and 1970s the term open pedagogy was also used to refer (interchangeably with “open education” and “open classrooms”) to learner-centered teaching approaches that were inspired by theorists such as John Dewey and Jean Piaget.

While each of the above definitions of open pedagogy has radical value in that they each advance the core aspiration of open pedagogy, I find the greatest fidelity with and utility in David Wiley’s definition of open pedagogy as the use of open education resources (OER as defined by the 5Rs) in tandem with open, effective teaching strategies.

Why is it important?

Open pedagogy is the present and future of teaching and learning. Open pedagogy is the natural progression of integrating socially just principles of human relations and the potential of current technology into the educational system. If we believe education leads to human flourishing and that education is a right, then the use and creation of OER in tandem with effective teaching and learning strategies (that is, open pedagogy) is required to establish and protect that right. Open pedagogy fulfills one of the core commitments to a democratic system by cultivating an informed, educated, and engaged electorate.

On a more personal level, open pedagogy has become not just important, but fundamental to my own approach to teaching. My engagement with open pedagogy focuses on revolutionizing the pedagogical relations between learners, learning facilitators, the production of knowledge, and the societal contexts in which we learn, teach, and live. In fact, my theoretical approach to teaching draws directly from critical pedagogy which emphasizes the awakening of a critical consciousness. Critical pedagogy questions the institutions and practices of education by supporting an approach that emphasizes teaching as a political act, learner-centered practices, praxis, the co-production of knowledge, and the educator as a facilitator. I believe practicing contemporary critical pedagogy requires engaging with OER and therefore leads naturally to experiments in open pedagogy.

What changes do you hope it will bring (for your country/region)?

I think open pedagogy has ripple effects. So, what I outline below are what I would describe as direct impacts of open pedagogy that I can see in the first couple of “ripples”, though I am sure other positive, possibly more indirect impacts (gender opportunities, environmental stewardship, etc.) might occur. When open pedagogy is more widely adopted, I believe we will see the following changes within our regional education systems:

  1. Students and faculty working on creating and updating openly-licensed educational materials that are locally adapted.
  2. Learners contributing novel ideas and original research to pressing contemporary problems.
  3. Better retention and completion rates in post-secondary education due to lower costs and more engaging and efficient teaching strategies.
  4. A more critical, informed, and engaged electorate.
  5. Higher human capital and performance in creative applications of principles from natural and social sciences, technology, engineering, and mathematics.

What is the future of Open Pedagogy?

The future of open pedagogy is experimentation and adaptive management.

I see parallels between open pedagogy and the evolution of thought among environmental scientists regarding uncertainty and adaptive management. When my colleagues and I teach about environmental management and complex environmental issues, we believe one of the first things that students must grasp is that there is not a solution to every problem. Many complex environmental problems require adaptive management – that is, constantly using scientific principles and experimentation to find and adjust optimum conditions in the face of complex uncertainty. These problems require not only expertise in environmental science, but an understanding of the constraints and opportunities of human institutions that mediate our relationship to our environment. As well, engaging in adaptive management sometimes means a change of paradigm to re-conceptualize stubborn problems as possible opportunities.

Open pedagogy describes that same approach for the future of education. Education poses innumerable complex problems. If education is a right, then providing access to human knowledge becomes a societal obligation. There is not one teaching style that can magically address that obligation or all the complex issues involved in education. However, open pedagogy provides the tools, resources, and framework to adaptively manage and find the optimum conditions for education in many different contexts. If accessing the educational materials that everyone needs to succeed is a problem due to costs or other constraints, then it is also an opportunity to teach people about the systems that produce knowledge and how to create and share educational materials. If our institutional and professional practices in education encourage locking knowledge behind paywalls, then it is an opportunity to create communities of co-production of knowledge wherein data is shared and the learning process happens in the open, accessible to many people at many levels. Our growing understanding of the science of how people learn, creative legal innovations such as Creative Commons, and the vast affordances of the internet all provide new, exciting opportunities to manage and turn enduring challenges into opportunities within our educational systems. We live, perhaps unaware, in the playground of open pedagogy.

I believe that disciplines, institutions, and departments are beginning to awaken to the potential of open pedagogy. I see exploration of how to encourage and support open pedagogy practices through new policies at all these different levels. So, I sincerely hope that effective learning strategies that involve using and making OER become the default rather than an afterthought or secondary option.

What cannot be denied is that the future of open pedagogy entails the creation of relationships, tools, and processes that allow us to improve learning processes through more open teaching practices.

Reading the Cape Town Open Education Declaration

I have mentioned the Cape Town Open Education Declaration in many of my public presentations, including a recent keynote for the British Columbia Open Education Librarians, Beyond Reading Our Rights: The Changing Paradigms of Open Education. Many of the people I interact with have never been exposed to this text, so I want to summarize its importance to my own work in this brief post.

This declaration’s text, much like Paulo Freire’s Pedagogy of the Oppressed, is one of the foundational, reflective pieces I use to ground my work as an educator and open education advocate. The text was drafted in 2007 at a meeting in Cape Town by the Open Society Institute (OSI) and the Shuttleworth Foundation. Since then it has been signed by several thousand people across the world. I have copied the original text of the declaration below. In keeping with the ethos of open, the original text is licensed under a Creative Commons Attribution 3.0 License, and the drafters encourage adaptation of this text to fit each of our unique settings – be that drafting institutional policies, regional declarations, or simply personal teaching statements. I encourage you to read it and sign it if you find some resonance with your work.

One of the most useful parts of this text is that it clearly communicates an important idea to people who are new to open education. Specifically, that open education is not just online textbooks. While online collaboration enables the open education movement, the text outlines core principles and strategies that reveal that the open education movement is not simply about re-purposing online, openly-licensed resources.

This text does provide several inspirational quotes that I use in my presentations as they get at the heart of what I try to do in open education. For example, that we are “planting the seeds of a new pedagogy where educators and learners create, shape and evolve knowledge together, deepening their skills and understanding as they go.” Yet, as often said, slogans are not solutions – quotes can inspire but we need a framework for action. The three open education strategies described in the declaration provide part of that framework and help me organize my thoughts about how far we have come, current strategic actions, and future challenges for open education. The three open education strategies in the declaration are (1) supporting educators and learners (open educational practices and open pedagogy), (2) creating and distributing OERs, and (3) implementing open education policies.

Reflecting on these different strategies makes me thankful for many of the people with whom I get to work on a daily basis. For example, theory and research on OERs have experienced dramatic advances since 2007 thanks to groups like the OER Research Hub, the Hewlett Foundation-funded Open Education Group, and the BCcampus OpenEd group – check out the leading research of opencontent and johnhiltoniii who established the COUP research framework. Meanwhile, research and theory on open pedagogy and transformative open educational practices are currently experiencing an incredibly fecund moment through the diverse contributions of many people. For example, check out the work of MillerJamison, @mctoonish, and some of the exciting projects my colleague Loch Brown, myself, and other geographers are advancing at UBC Geography. Open policy advances from SPARC and the collaborative projects spearheaded by leaders like acoolidge and @dendroglyph show some amazing and enabling tools that we are just starting to leverage in our institutions. For example, check out the Open Policy Toolkit released in 2016.

These three strategic themes are useful, yet they do not include or even suggest all the important questions about open education that we must discuss. Nor do they necessarily provide the answers we need for the constant improvement of open education. In fact, part of the utility of organized reflection through a specific framework such as this declaration is that we must always ask and be aware of what we are missing by engaging with the limited framework. There are important ethical considerations and necessary discussions that do not fall within these three strategies. For example, what does it mean to even have an open education movement or community – what does inclusivity look like, feel like, act like? How can we systemically integrate and further support student leadership in open education (like the amazing work I have seen first hand from UBC AMS)?  How to recognize contributions of staff and faculty through new institutional commitments to openness in tenure and hiring practices? What are the 12 leverage points for open education as a complex system? How do open education advocates define the universal right to education and what does that mean for our strategic actions? These are some of the enriching questions and discussions that can facilitate thoughts about the future of the movement. In fact, working with a dynamic group of colleagues, I recently contributed to an OpenEd 2017 proposal that would create a session to explore these types of questions by turning them around (posing questions in counter-intuitive ways that are meant to stimulate innovative discussions) and working at these questions from the ground up (everyone involved, no expert panelists raining ideas down on us). So, the declaration is a great starting point for those of us just discovering the open education movement as well as those of us that continually seek to reflect upon, expand, and improve the movement.

Cape Town Open Education Declaration:
Unlocking the promise of open educational resources

We are on the cusp of a global revolution in teaching and learning. Educators worldwide are developing a vast pool of educational resources on the Internet, open and free for all to use. These educators are creating a world where each and every person on earth can access and contribute to the sum of all human knowledge. They are also planting the seeds of a new pedagogy where educators and learners create, shape and evolve knowledge together, deepening their skills and understanding as they go.

This emerging open education movement combines the established tradition of sharing good ideas with fellow educators and the collaborative, interactive culture of the Internet. It is built on the belief that everyone should have the freedom to use, customize, improve and redistribute educational resources without constraint. Educators, learners and others who share this belief are gathering together as part of a worldwide effort to make education both more accessible and more effective.

The expanding global collection of open educational resources has created fertile ground for this effort. These resources include openly licensed course materials, lesson plans, textbooks, games, software and other materials that support teaching and learning. They contribute to making education more accessible, especially where money for learning materials is scarce. They also nourish the kind of participatory culture of learning, creating, sharing and cooperation that rapidly changing knowledge societies need.

However, open education is not limited to just open educational resources. It also draws upon open technologies that facilitate collaborative, flexible learning and the open sharing of teaching practices that empower educators to benefit from the best ideas of their colleagues. It may also grow to include new approaches to assessment, accreditation and collaborative learning. Understanding and embracing innovations like these is critical to the long term vision of this movement.

There are many barriers to realizing this vision. Most educators remain unaware of the growing pool of open educational resources. Many governments and educational institutions are either unaware or unconvinced of the benefits of open education. Differences among licensing schemes for open resources create confusion and incompatibility. And, of course, the majority of the world does not yet have access to the computers and networks that are integral to most current open education efforts.

These barriers can be overcome, but only by working together. We invite learners, educators, trainers, authors, schools, colleges, universities, publishers, unions, professional societies, policymakers, governments, foundations and others who share our vision to commit to the pursuit and promotion of open education and, in particular, to these three strategies to increase the reach and impact of open educational resources:

1. Educators and learners: First, we encourage educators and learners to actively participate in the emerging open education movement. Participating includes: creating, using, adapting and improving open educational resources; embracing educational practices built around collaboration, discovery and the creation of knowledge; and inviting peers and colleagues to get involved. Creating and using open resources should be considered integral to education and should be supported and rewarded accordingly.

2. Open educational resources: Second, we call on educators, authors, publishers and institutions to release their resources openly. These open educational resources should be freely shared through open licences which facilitate use, revision, translation, improvement and sharing by anyone. Resources should be published in formats that facilitate both use and editing, and that accommodate a diversity of technical platforms. Whenever possible, they should also be available in formats that are accessible to people with disabilities and people who do not yet have access to the Internet.

3. Open education policy: Third, governments, school boards, colleges and universities should make open education a high priority. Ideally, taxpayer-funded educational resources should be open educational resources. Accreditation and adoption processes should give preference to open educational resources. Educational resource repositories should actively include and highlight open educational resources within their collections.

These strategies represent more than just the right thing to do. They constitute a wise investment in teaching and learning for the 21st century. They will make it possible to redirect funds from expensive textbooks towards better learning. They will help teachers excel in their work and provide new opportunities for visibility and global impact. They will accelerate innovation in teaching. They will give more control over learning to the learners themselves. These are strategies that make sense for everyone.

Thousands of educators, learners, authors, administrators and policymakers are already involved in open education initiatives. We now have the opportunity to grow this movement to include millions of educators and institutions from all corners of the earth, richer and poorer. We have the chance to reach out to policymakers, working together to seize the opportunities ahead. We have the opportunity to engage entrepreneurs and publishers who are developing innovative open business models. We have a chance to nurture a new generation of learners who engage with open educational materials, are empowered by their learning and share their new knowledge and insights with others. Most importantly, we have an opportunity to dramatically improve the lives of hundreds of millions of people around the world through freely available, high-quality, locally relevant educational and learning opportunities.

The above work is licensed under a Creative Commons License by the original drafters. The original text is here.

Open Pedagogy Workshop for Open Access Week 2016

These workshop slides on open pedagogy and open science are openly licensed as CC BY 4.0. Download the slides here.

For the mind does not require filling like a bottle, but rather, like wood, it only requires kindling to create in it an impulse to think independently and an ardent desire for the truth. ~Plutarch? 

One of the most exciting evolutions in pedagogy over the last few years is the integration of open educational resources (OERs) and open practices into teaching and learning. During Open Access Week 2016, I had the pleasure and opportunity to lead a workshop on open pedagogy with my BCcampus Faculty Fellow colleagues at Kwantlen Polytechnic University (KPU). I think I can speak for all of us when I say it was truly inspiring to see the administrative support of, faculty enthusiasm for, and student participation in the open education movement at KPU. We planned the workshop as a hands-on, create-your-own open pedagogy project using the liberating structures activity Troika consulting. That rapidly turned into an illuminating group discussion about experiences integrating and developing OER. KPU is truly in the open.

Before we started the hands-on workshop, we presented an overview of open education, open science, and several lessons learned from our work integrating OER into new pedagogical approaches. Many of the examples came from work with my colleagues on Open Geography at UBC and from the authentic learning projects presented on our students’ open scholarship website.

We share these slides above in the hopes that they can be a resource for those of you interested in taking next steps in open pedagogy and stimulating discussion on open education. Many thanks to KPU Open Education for the invitation and special thanks to Caroline Daniels (KPU Library) and Rajiv Jhangiani for being such gracious hosts.


Call for Papers “Researching Conflict”

Please forward this call for papers for “Researching Conflict”, a special section of ACME or a special issue of another major geographical journal, to interested scholars.

***Call For Papers Researching Conflict***
Research on geographies of conflict is inherently messy and difficult: muddled with power relations, riddled with foggy recollections, and often enabled with the help of “fixers” with a stake in the conflict. Recognizing this, we invite papers that explore what it means to ‘do research in violent settings’ in a variety of geographic contexts. We especially welcome papers which recognize the inability of the researcher to be objectively separate from the conflict, and consider instead how the research as well as the researcher are implicated and embedded in the violent conditions being studied. While this special issue builds on existing methodological texts on fieldwork in conflict zones, it contributes to wider geographic debates from post-colonial, feminist, and political ecology perspectives, which emphasize the explicitly normative and unequal positions of researchers in the field in relation to their subjects and settings of investigation. Specifically, we hope that the special issue will provide guidance on research in violent spaces beyond what traditional methodological texts provide and critically interrogate violence as done to and done through the research process itself.

This special issue seeks to present the experiences of scholars working from a range of different fields and spanning experiences across the globe. We are interested in balancing contributions from established as well as emerging scholars. In addition to traditionally structured research manuscripts, this special issue will include interventions and creative works. We encourage interested scholars to consider a wide playing field for these creative submissions. We especially welcome pieces which extend beyond the boundaries of traditional printed page such as interactive works of art, technology, video, and sound.

If you are interested in contributing to this special issue, please submit an abstract of no more than 450 words by 31 August 2016. In addition, we encourage authors to include a few keywords that capture the central themes of the intended paper. An example of keywords (not intended to be used as parameters or limits to possible entries) is provided in the attached word cloud. Abstract submissions should be made via e-mail; please include “Researching Conflict” in the subject line. Accepted authors will be notified by mid-September 2016 and provided with a detailed timeline for submission and publication as well as other potential opportunities connected to the special issue.

Editors: Ann Laudati (UC Berkeley), Stephen Aldrich (Indiana University), and Arthur Gill Green (University of British Columbia).

More details can be found here:

Photo credit: By MONUSCO Photos – Aerial view Lusenda Burundi refugee camp., CC BY-SA 2.0,

Tossing Out the Disposable Teaching Philosophy Statement

If you are reading this post, you may have been recently asked by a potential employer to provide a teaching philosophy statement. So, heads up! I am not a job application guru and this post is not about helping you craft the perfect job application adapted to a specific employer. This post is about getting away from the teaching philosophy statement ‘for hire’ model and rethinking why we write teaching philosophy statements at all. Can we agree that it is unfortunate that many of us do not revisit our teaching statements often enough or even share them with anyone but hiring committees?

While I originally wrote my own teaching philosophy statement under job search circumstances, I have come to believe that we have been writing our statements for the wrong audience. While working on open education and authentic learning approaches over the last several years, I have tried my best to dispose of ‘the disposable assignment’. This has made me question many of the disposable products that we ‘must’ create as academics – including the disposable teaching philosophy statement. After all, who is the actual audience for a teaching philosophy statement? Who benefits from reading it? Who benefits from writing it?

I no longer think that the potential employer is the correct audience. Ideally, the audience of these statements is the writer and the writer’s community. It is us. Reflecting on our approaches to teaching and learning should inspire us to refocus on and evolve our core principles. Such reflection could benefit immensely from being shared within a community (the ethos of open).

So, I am putting mine here on this website to open up my process. These are the values by which I teach and learn. They may not be perfect and they will certainly evolve, but they are a constant reminder of my core principles and sources of inspiration.

It would be great to hear others’ thoughts on this… Is it time to get rid of the disposable teaching philosophy statement? Should educators share their teaching statements with students and have students create learning philosophy statements? Is there a different, better way to approach our teaching philosophy statements?

Photogrammetry and OER Field Trips

We’ve embarked on a project to create virtual reality destinations for education (shout out to BCcampus for funding our team’s work on this project via their OER grants program). As part of our work to create these OER virtual destinations around Vancouver, we have had to dive (read ‘belly flop’) into photogrammetry and some other amazing modeling techniques in which I never imagined I would be involved. We are learning to create 3D models of landscapes that can then be imported as virtual locations. In fact, I just started to experiment with building out virtual destinations using photogrammetry and the Destinations Workshop Toolkit (beta) for VR headsets. The toolkit contains an amazing suite of 3D modeling tools that we are just beginning to grasp.

I’ll document my photogrammetry experiments and catalogue the materials (the good, the bad, and the ugly… hopefully some fantastic too) that I create for these education-focused virtual field trip locations. Everything we produce is OER and openly licensed. I am using Sketchfab to embed the interactive content below (you can even view the models in Google Cardboard). Sketchfab is a pretty amazing repository of 3D models that allows authors to apply a CC license to their work.

Experiment 01. This was our first experiment in making a 3D model using photogrammetry, created from 12 pictures taken with a Nexus 5 and processed in Autodesk ReMake.

Keys (raw 3D model from AutoDesk)

FieldPress Field Trip Plugin

I am excited to announce the beta release of a WordPress plugin that we have been developing at UBC Geography for the last six months. FieldPress is a WordPress plugin that allows instructors to create and manage field trips online. This plugin provides instructors with a user-friendly environment to build field trips, add multimedia content, create assessments and manage student activity.

FieldPress is open source and is an open educational resource (OER). We are offering access to the beta version of the plugin via GitHub. If you would like to learn more about managing field trips and how to install the plugin on your own installation of WordPress, you can use the below links. This plugin will work only on your own installation of WordPress, not on blogs maintained on WordPress.com. Once we move out of beta, our plugin will be available in the WordPress plugin repository.

Prepackaged beta plugin:

Latest version of the plugin code:

User manual:

Demo Website: a demonstration website for displaying and testing the capabilities of the FieldPress plugin for WordPress.

If you would like to experience FieldPress as a student user, please follow these three easy steps:

  1. Sign up for an account here:
  2. Confirm your account. After signing up you should get an email confirmation. Make sure to check your spam folder.
  3. Sign up for any of our demo field trips here:

If you want to see the backend (as an instructor), you will need to have a WordPress administrator role on a website. At this time we are not providing public instructor/administrator access to our demo site, but you can install the plugin on your own WordPress installation.

Just one last note: FieldPress is an exciting example of students as creators. It is primarily the result of the work of Kimi Shen, a talented, recently graduated student, who adapted code from CoursePress. The plugin is open source, so we are looking for feedback from and collaboration with early adopters. We were excited to see Professor David Wright implement the plugin so quickly! Hopefully, he is the first of many!

See more at:

Teaching in the Open: Open Pedagogy and Responsible Pedagogy

It was an honor to share my work on open pedagogy as a teaching strategy with the University of British Columbia Centre for Teaching, Learning, and Technology (@UBC_CTLT). They put together a little video vignette as part of their Open Dialogues that allowed me to (1) explain the experiences that led me to become an advocate for Open Educational Resources (OER), (2) talk about the role of BCcampus in promoting OER in British Columbia, (3) define why OER are critical to responsible pedagogy, and (4) reflect on what open pedagogy means to learners (and I include both “students” and myself as a “professor” in the category of learners).

Here is the CTLT post and video:

I have a long-term itch to write about why open pedagogy matters for geography, environmental studies, and environmental sciences… but that will have to wait until after the Association of American Geographers meeting in San Francisco and our presentation on using open science approaches for teaching Geographic Information Science.

By the way, I should also point out the presentations by my colleagues on our collaborative research on understanding the neoliberalization of pedagogy and the geography of teaching and learning (Turner) and on developing virtual reality and augmented reality field trips (Brown)! Here’s a taste of the virtual reality Sea-to-Sky field trip that we developed as an experiment using 360 cameras, Holobuilder, and Google Cardboard to increase field trip accessibility!


Watershed Delineation GIS – Open Education Resource


This is a histogram-equalized image of the Shuttle Radar Topographic Mission (SRTM) 1 Arc-Second (30 meter) data for the border of Nigeria and Cameroon (Faro River Basin). The same data is used in the below tutorial on watershed delineation. Arthur G. Green (CC BY SA).

The above slides are a tutorial for people who want to learn how to do watershed delineation using Shuttle Radar Topographic Mission (SRTM) data. The tutorial is free and licensed as CC BY SA. It can be downloaded here.

The tutorial uses SRTM 1 Arc-Second (30 meter resolution) data to map the Faro River basin near the border between Cameroon and Nigeria. The methods can be applied to any region, as the data for the tutorial is free, open data from USGS that can be downloaded here.

I created this tutorial for my Advanced Geographic Information Science students at UBC in March 2016. The tutorial uses ArcMap 10.3, so you will need access to that software and the software’s Spatial Analyst license.
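For those curious about what the Spatial Analyst tools are doing conceptually, here is a minimal, illustrative sketch in plain Python of the D8 (steepest-descent) logic that underlies flow direction and watershed delineation. The toy elevation grid, pour point, and function names here are invented for illustration and are not part of the tutorial; real SRTM processing should be done in ArcMap (or another GIS) as described above.

```python
# Conceptual sketch of the D8 steps behind ArcMap's Spatial Analyst
# workflow (Flow Direction -> Watershed), on a toy elevation grid.

# Toy DEM: elevations on a 4x4 grid, indexed (row, col)
dem = [
    [9, 8, 7, 6],
    [8, 7, 5, 4],
    [9, 6, 4, 3],
    [8, 5, 3, 1],
]
rows, cols = len(dem), len(dem[0])

def d8_downslope(r, c):
    """Return the neighbor this cell drains to (steepest descent), or None."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                dist = (dr * dr + dc * dc) ** 0.5  # diagonals are farther
                drop = (dem[r][c] - dem[nr][nc]) / dist
                if drop > best_drop:
                    best, best_drop = (nr, nc), drop
    return best

def drains_to(cell, target):
    """Follow the flow path downslope and test whether it reaches target."""
    seen = set()
    while cell is not None and cell not in seen:
        if cell == target:
            return True
        seen.add(cell)
        cell = d8_downslope(*cell)
    return False

# Watershed of a pour point = all cells whose flow paths reach it
pour_point = (3, 3)  # the lowest cell in the toy grid
watershed = [(r, c) for r in range(rows) for c in range(cols)
             if drains_to((r, c), pour_point)]
print(len(watershed))  # → 16
```

On this toy grid every cell ultimately drains to the bottom-right pour point, so the delineated watershed covers all 16 cells; on real terrain, only the cells inside the basin would qualify, and a Fill step would first remove spurious sinks in the DEM.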

Let me know if you find it useful or see something that could be improved!

Download Options

  1. You can download the slides from SlideShare.
  2. The data is free from USGS Earth Explorer.

Where are Tchabal Mbabo and the Faro River?

Tchabal Mbabo cliffs looking out on the Faro River Basin. Arthur G. Green (CC BY SA).



Creative Commons License
Watershed Delineation by Arthur Gill Green is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.