Workshop Proceedings
2nd Workshop on Value of Geoinformation
Sep 30 - Oct 2, 2010 | Hamburg, Germany
Editors
Alenka Poplin
Max Craglia
Stéphane Roche
HCU
HafenCity University
Hamburg
Proceedings of the 2nd GeoValue Workshop
Sept 30 – Oct 2, 2010, Hamburg, Germany
© 2010, First volume
HafenCity University Press
Design by blueBox / HCU
Germany - 22085 Hamburg
FOREWORD
Dear GeoValue proceedings reader:
On behalf of the GeoValue program and organizing committee, the hosting organization HafenCity
University Hamburg, and the sponsors AGILE and HafenCity University Hamburg, it is a pleasure to
welcome the participants to the 2nd GeoValue workshop on Value of Geoinformation.
The workshop is addressed to experts in the fields of geoinformation science, economics, social
and environmental sciences, urban planning and spatial data quality. It provides an opportunity to
exchange experiences and methodologies across continents, and represents an expansion of the
discussions held last year at GeoValue '09, organised in Hannover, Germany, as one of the
AGILE conference workshops.
For the 2010 program, 10 extended abstracts were accepted for presentation. Our goal was to stay
strictly focused on the topics selected for the workshop:
• Value, cost, and pricing of geoinformation;
• Assessment of spatial data infrastructures;
• Geoinformation value chain;
• Socio-economic impacts of geoinformation.
Only the best articles related to these topics were selected for these proceedings and for the oral
presentations at the GeoValue 2010 workshop. The best extended abstracts and their authors will be
invited to extend their submission and contribute an article for the special issue of The International
Journal of Spatial Data Infrastructures Research http://ijsdir.jrc.ec.europa.eu/.
We wish you a very inspiring and productive conference! Enjoy the discussions, the camaraderie and
the city of Hamburg.
Editors of the proceedings:
Alenka Poplin
Max Craglia
Stéphane Roche
COMMITTEES
Program Committee
The reviews were contributed by:
* Michele Campagna, Università di Cagliari, Cagliari, Italy
* Max Craglia, European Commission Joint Research Centre, Ispra, Italy
* Andrew Frank, Technical University of Vienna, Vienna, Austria
* Bastiaan van Loenen, Delft University of Technology, Delft, The Netherlands
* Alenka Poplin, HafenCity University, Hamburg, Germany
* Stéphane Roche, Université Laval, Québec, Canada
Thank you for your good work!
Organisation Committee
* Max Craglia, Institute for Environment and Sustainability, European Commission
* Alenka Poplin, HafenCity University Hamburg
* Stéphane Roche, Département des Sciences géomatiques, Université Laval
TABLE OF CONTENTS
Papers
• From the Assessment of Spatial Data Infrastructure to the Assessment of Community of Practice: Advocating an Approach by Uses, Matthieu Noucher, François Golay
• An approach for the estimation of SDI services demand: lessons from the transportation sector, Maria Teresa Borzacchiello, Max Craglia
• Assessing geoportals from a user perspective, Bastiaan van Loenen, Joep Crompvoets, Alenka Poplin
• The narrative anchor as the decisive element of geoinformation infrastructures, Henk Koerten
• It's not just a matter of numbers – the value of spatial data reflected in the regulatory framework, Katleen Janssen, Joep Crompvoets
• Experimenting GeoInformation Prices, Gianluca Miscione, Walter DeVries, Jaap Zevenbergen, Bastiaan van Loenen
• How Much does Free Geoinformation Cost? Alenka Poplin
• Valuing volunteered geographic information (VGI): Opportunities and challenges arising from a new mode of GI use and production, Rob Feick, Stéphane Roche
• Re-use of Public Sector Hydrographic Information: Is the maritime sector a guiding beacon or a slow turning ship? Frederika Welle Donker
• Statistical analysis of routing processes using OpenStreetMap road data of the inner city of Hamburg with different completeness of information about one-way streets, Matthias Fessele, Alenka Poplin
PAPERS
GeoValue - Hamburg 2010
From the Assessment of Spatial Data Infrastructure
to the Assessment of Community of Practice:
Advocating an Approach by Uses.
Matthieu Noucher¹, François Golay²
¹ IETI Consultants, France, [email protected]
² EPFL / IIE / LASIG, Switzerland, [email protected]
ABSTRACT
Spatial data sharing mechanisms are an important asset to territorial communities: they help them
understand and control their long-term development. In this perspective, this paper suggests a novel
approach to geodata appropriation processes based on diverse socio-cognitive theories. This approach
suggests that the evolution of spatial data infrastructures from rough data exchange platforms towards
geospatial learning networks, also termed "communities of practice", and towards geo-collaboration
platforms supporting co-decision may be a significant driver of added value. It is therefore important
to consider these new perspectives in the evaluation criteria and processes of spatial data
infrastructures.
Keywords: assessment, spatial data infrastructure, community of practice, learning network, appropriation.
1. CONTEXT
Spatial Data Infrastructures (SDI) are transcending national levels and are also developing at
departmental, local and regional scales across a varied backdrop of legal statutes (simple partnership
agreement, association, public interest group, etc.). These statutes offer no guarantee that their
financial backing will last; the implementation of regional SDIs is often funded by grants that are
unstable by nature. To facilitate their renewal, it is then necessary to assess SDIs in order to
take stock of actions already performed as well as to plan for future needs. Assessment-related
issues of SDIs have thus already attracted much attention from the scientific community over the last
decade (see for example Georgiadou et al., 2006; Crompvoets et al., 2008; or Craglia and Campagna,
2010).
Most of the early studies have focused on efficiency and effectiveness criteria, often applying
cost-benefit analysis principles. Beyond the economic aspects, however, SDI assessment measures
must allow a better understanding of the motivations of those participating and of the ways in which
the operators' individual expectations match the overall endeavour of the considered SDI. Evaluations
are also a vector of recognition and motivation for the actors involved in the process, and may foster
the development of the infrastructure at hand.
Indeed, the tangible benefits of geographical information systems are already difficult to identify
and measure, and the assessment of spatial data infrastructures accentuates this difficulty because,
from an initially data-focused point of view, SDIs are becoming more and more oriented toward the
implementation of services. These mostly recent, composite-style information-sharing platforms do
not yet have standardized assessment tools (Crompvoets et al., 2008).
In addition, beyond assessment efforts that often concentrate on spatial data or even spatial
services, classical assessments (especially those based on technical and economic performance)
quickly reveal gaps in terms of assessing the organizational aspect of the collaborative systems
emerging around spatial data infrastructures (Georgiadou et al., 2006).
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
2. FROM SPATIAL DATA INFRASTRUCTURES TO COMMUNITIES OF PRACTICE
Recent developments of SDIs show a progressive shift beyond their original function of base-map
diffusion toward the actual coproduction of thematic data within thematic communities. A number of
local, departmental and regional SDIs are trying to encourage a sense of belonging to these user
communities and actor networks in order to develop a geographic information culture (Noucher, 2006).
The communities of practice that are forming around the sharing of knowledge and know-how, especially
via the harmonization, generalization or coproduction of spatial data, must also be taken into
account in the evaluation measures of the SDIs that support them.
Recall that a community of practice is defined as a group of individuals linked informally, working in
a network united by common interests and similar projects, cooperating and exchanging their knowledge
(Wenger, 1998):
- to create collective value useful to each individual,
- to share common resources (knowledge, experience, etc.),
- to work together in a collective learning process,
- to combine a common culture and a cohesive system of individual interests at the same time.
In France we can cite the CRIGE PACA (Centre régional pour l'information géographique de
Provence-Alpes-Côte d'Azur) as an example: in parallel with its geoportal, it leads ten "pôles
métiers", thematic groups (urban planning, seashores, forests, public safety, agriculture, etc.)
whose objective is to encourage the harmonization and coproduction of thematic spatial data. In
Canada, the CGDI (Canadian Geospatial Data Infrastructure) has been seeking since 2005 to develop
"communities of practice" related to issues as diverse as public safety or indigenous peoples. In
Switzerland, the INDG (Infrastructure Nationale de Données Géospatiales) has been calling for
participation in "thematic interest communities" since the beginning of 2010.
Thus, although the names may vary, the issue remains the same: to go beyond the simple goal of
sharing base maps and to encourage the development and use of spatial data matching the regional
players' business practices or e-governance practices, as addressed by Georgiadou et al. (2006).
3. TOWARD AN ASSESSMENT FRAMEWORK OF COMMUNITIES OF PRACTICE
From a sustainability perspective, the assessment of Spatial Data Infrastructures must therefore
include the observation and evaluation of the communities of practice that emerge from the sharing
processes. An ambitious evaluation cannot merely be limited to counting the number of spatial data
downloads, or even to a return-on-investment (ROI) study; it must consider the dynamics of the user
networks which, at the boundaries of traditional organizations, are investing in the development of
new uses of geographical information.
Millen, Fontaine and Muller (2002) group the impacts of communities of practice into three categories:
- Individual benefits: better understanding of the work of other operators, enhanced professional
reputation, higher confidence level, learning of new techniques and new tools, etc.
- Community benefits: increased problem-solving capacity, idea generation, vigorousness of the
exchanges, etc.
- Benefits for the organization: openness toward new activity areas and new techniques, reduction in
the time needed to find information, reuse of existing items, use of previously proven solutions, etc.
To this we add one other dimension inherent in the context of spatial data infrastructures:
- Territorial benefits: pooling of resources, knowledge, know-how and institutional connections,
harmonization of regional management policies, increased capacity to "decide together", etc.
In order to assess the added value of those impacts for the communities of practice, various types of
evaluation relating to existing or emerging communities can be conducted by independent experts. We
can cite, for example:
- Cross-sectional assessments of the value added by a community of practice in fostering the
networking, awareness and competences of its members and staff (for example: going from professional
isolation, to experience sharing, and to collective learning approaches) – do we have something to
share?
- Assessments of community work results. The objective is to take stock of the group's effectiveness
in terms of participant satisfaction, achievement of goals, learning and usefulness for the
organization after several months of existence – do we get substantial results?
- Assessments of the community work process. We analyze the exchange dynamics that have developed
within the work group, relying especially on the group members' use of the available communication
channels – do we support sound processes?
- Assessments of the community members' commitment to contribute effectively to the common endeavour
(involvement of every member in the common tasks, provision of supporting and skilled human
resources, etc.) – does the community get the necessary backup from its members?
- Prospective assessments identifying the consolidation processes to be developed to strengthen the
community – what to do next?
Such criteria no longer assess the productivity or the usefulness of an SDI, but its ability to
support sense-making processes ("zone de la compréhension" in Rodriguez, 2005) and eventually to
build upon those processes in order to develop and strengthen itself. Research on communities of
practice is therefore more about revealing the whole complexity of the emergence and stabilisation
of the production processes considered (the practices) than about assessing the usefulness of SDIs.
Moreover, according to Breton and Proulx (2002), the appropriation of such processes by the community
members seems to rely more on their sense-making value for the users than on the quality of the
underlying technologies.
4. AN APPROACH BY OBSERVATION OF COMMON USE PRACTICES
Since the communities we want to assess are focused on practices, it seems necessary to link the
assessment to the observation and analysis of the related appropriation mechanisms. Proulx (1988)
emphasizes the creative dimension of appropriation: it is effective only to the extent that the
individual integrates the acquired object of knowledge in a meaningful and creative way into his
daily tasks. Therein lies all the difference between the ideas of consumption and appropriation: one
can buy data off the shelf (a highway map background, socioeconomic statistics, land use, for
example) but not use them, or make a counter-use of them, and therefore never appropriate them.
Appropriation, for a given operator, means a change in his capacity to complete a task, and the
evolution of use practices is the evidence of this. The study of the uses of spatial data generated
by the related communities of practice can thus allow us to evaluate the effects of SDIs. As
Millerand (2002) points out: "it is by focusing precisely on the mechanisms of appropriation of
technical objects that the research has shown itself to be most fruitful for getting to grips with
the question of the development of use practices".
We have therefore endeavoured to study the actual practices of SDI users (Noucher, 2009), i.e. what
they do with spatial data, what uses they make of it and how they appropriate it. To this end, we
observed interactions between users within communities of practice, and then studied individual uses
of spatial data from SDIs, in order to grasp their complexity and better determine the consequences
of their dissemination. These exchanges between collective (group
meeting) and individual (routine use of SDI) settings allowed us to better understand the mechanics
of practice formation. Relying mainly on the theories of Paul Ricoeur (individual projection),
Etienne Wenger (collective negotiation) and Edwin Hutchins (social cognition), we have shown that
individual and collective involvement in spatial data appropriation processes relies on two different
dialectic processes: individual projection, based on an expectation-and-experience process, and
collective negotiation, based on a participation-and-reification process, as illustrated in figure 1
(Noucher, 2009). This overall appropriation process applies to facts and artefacts of our
geographical space that can be seen as boundary objects in the sense of Harvey and Chrisman (1998).
Figure 1. Individual and collective appropriation processes of geographic information (Noucher 2009)
To better delineate the range of uses framing the appropriation process, Michel de Certeau's thinking
(de Certeau, 1980) appears essential. His work has shown, notably, the "own share" that falls to each
individual in the construction of use practices. With his subtle descriptions of users' "art of
doing" and "ways of doing", de Certeau demonstrates how user practices diverge from the program that
technocrats and cultural industries seek to impose. Ordinary people, he claims, show creative
abilities unsuspected by manufacturers: through tricks, tinkering or diversions – which de Certeau
gathers under the term "poaching" – they are capable of inventing for themselves a unique way of
travelling through prefabricated universes. This work allowed the viewpoint to be expanded beyond
individuals' interaction with technological interfaces, introducing time and learning dynamics as key
factors in the appropriation and stabilization of practices into habits or routines.
If de Certeau's contribution to the debate on uses is significant, it is, according to Florence
Millerand, because he allowed us to investigate the independence of users facing technical systems:
"On one hand, he refused to consider their use as being self-evident; on the contrary, he questioned
it. On the other hand, he focused his study on 'actual practices' rather than on structuring
dimensions of a social, political, or economic nature" (Millerand, 2003). In doing so, the creativity
of user practices that had been ignored until then eventually came to be understood as the very
process of appropriation (Perriault, 1989).
5. EARLY EXPERIMENTS: OBSERVATION AND EVALUATION OF APPROPRIATION TRAJECTORIES
After describing and analyzing actual uses of spatial data in different communities of practice
(8 case studies, 20 observations and 80 interviews – 3 examples are synthesized in table 1), we
turned to the related appropriation processes. We applied the analysis framework presented in the
previous section to identify and characterize the appropriation trajectories of spatial datasets
(Noucher, 2009).
We saw that the interaction between the projection and negotiation processes of figure 1 is
controlled by an overarching cognitive adaptation process as defined in Piaget's (1975) theory. Cognitive
adaptation emerges from the attempts of individuals to match their cognitive structures to their
environment: individuals change their environment, and conversely the environment frames the
individuals' development. Cognitive adaptation relies on two main mechanisms:
Case study: Système d'Information du Territoire Lémanique (SITL)
- Territory: boundary between Switzerland and France
- Geoportal: http://www.sitl.org
- Community of practice production: spatial data harmonization (public equipment, road network, administrative boundaries…)

Case study: Centre Régional de l'Information Géographique de PACA (CRIGE PACA)
- Territory: Provence-Alpes-Côte d'Azur (France)
- Geoportal: http://www.crige-paca.org
- Community of practice production: spatial data co-production (public equipment, roads, land uses, urban sprawl, forestry, school equipment…)

Case study: Assemblée Pyrénéenne d'Economie Montagnarde (APEM)
- Territory: Pyrenees (Spain, France, Andorra)
- Geoportal: http://www.apem.asso.fr
- Community of practice production: spatial data analysis (observatories of climate, forest, craft activities, pastoralism…)

Spatial Data Infrastructure components (technical and organizational) across the three case studies include: geospatial data catalogues (by GeoSource or EasySDI, ISO 19115); geospatial data access by online mapping (EasySDI; in one case under construction); geospatial data visualization by Web Services (WMS/WFS); base map acquisition (orthophotos or SCAN IGN, for example); partners' conventions; and governance bodies (technical and decisional committees, technical secretary or technical team, executive and supervisory boards, annual general meetings).

Table 1. Overview of 3 among the 8 case studies selected for our research
- assimilation, by which an individual incorporates external information into his cognitive
structures, without necessarily modifying those structures, but possibly with a restructuring of the
external information;
- accommodation, by which an individual adapts his cognitive structures in order to incorporate
external information.
If a first assimilation attempt fails because of irreducible differences between the considered
information and the individual's cognitive structures, the individual faces a cognitive conflict that
will eventually lead to a socio-cognitive conflict as he enters a negotiation process with other
actors to reduce the differences. Cognitive decentration is then required from the participating
actors, so that they accept the legitimacy of other actors' points of view and accommodate their own
cognitive structures to a consensual, new definition of the information at hand.
Many events along the individual and collective appropriation process may foster or hinder the
resolution of emerging cognitive conflicts. By observing the information appropriation processes in
our case studies, we could synthesize four typical appropriation trajectories of new spatial datasets
(figure 2):
1. A direct consumption of the considered dataset, taking place especially when the considered
geodata are already known to the users. No cognitive conflict occurs and no accommodation is needed;
the added value of the new data is immediately available to the user.
2. A rejection of the considered dataset, implying no use at all. This often occurs when users are
not ready for any cognitive decentration: the occurring socio-cognitive conflict cannot be solved,
and the dataset is rejected, or it is used with downgraded semantics (i.e. as a cartographic
background). We encountered many examples of such rejection in our case studies, for example by two
utility management companies that could not agree on a common model of pipe sections, apparently
worried about a possible loss of control over their infrastructure.
3. A consumption of the highest common divisor, ending up in an individual and partial appropriation
of the considered dataset. In one of our case studies, the users were city planners working in a
thematic group aiming at a common definition of urban land cover categories. They succeeded in
adopting common definitions, but accommodated their own semantics according to their individual
needs.
4. A collective appropriation, leading to a modified, value-added use of the considered information.
The actors look for common meanings of the identified boundary objects, making new uses of the data
possible, especially around common goals. For example, the trans-boundary partners of the Lake Geneva
Land Information System (SITL) ended up with integrated definitions of land planning zones, allowing
for improved land planning capabilities.
The latter trajectory type can be further understood as the most identity-building case for
communities of practice, bringing all partners toward a consolidated perception of common ends. Thus,
communities of practice could emerge as a step forward from the paradigm of the SDI, which relies on
the individual re-use of geodata produced by others, toward a true case of geo-collaboration
supporting collaborative decision-making activities (Noucher, 2009).
Figure 2. Typical information appropriation trajectories
6. CONCLUSION
This research offers a different vantage point on spatial data sharing issues. Systemic and
socio-cognitive approaches suggest a new integration of knowledge and information in the context of
rapidly spreading geographical information technologies. It advocates a progressive evolution of
spatial data infrastructures toward geomatics-oriented learning networks, also termed "communities of
practice" (Wenger, 1998), and toward geo-collaboration platforms.
Let us finally suggest that the observation and analysis of the presented appropriation mechanisms
and trajectories could constitute a novel step toward the implementation of indicators of the use and
value of spatial data infrastructures. It seems to us that it is in the interest of national and
regional SDIs to foster assessment methods and tools able to express formalized measures of their
added value for government agencies, private companies and citizens. That should provide them with a
solid asset when claiming the necessary resources for their sustainable development.
REFERENCES
Breton P. and S. Proulx (2002). L’explosion de la communication à l’aube du XXIème siècle, La Découverte/Boréal, Paris/Montréal.
Buogo A. (2004). Analyse des politiques cantonales romandes en matière d'information géographique et
conséquences pour l'infrastructure nationale de données géographiques, Mémoire pour l'obtention du
Master of Public Administration MPA, Institut de hautes études en administration publique (IDHEAP),
Lausanne.
Certeau de M. (1980). L’invention du quotidien, tome 1 : Arts de faire. Gallimard, Paris.
Craglia M. and M. Campagna (2010). Advanced Regional SDI in Europe: Comparative cost-benefit
evaluation and impact assessment perspectives, International Journal of Spatial Data Infrastructures
Research, Vol. 5, 145-167.
Crompvoets J., de Bree F., van Oort P., Bregt A., Wachowicz M., Rajabifard A., and I. Williamson
(2007). Worldwide Impact Assessment of Spatial Data Clearinghouses, URISA Journal, Vol. 19, No.
1.
Crompvoets J., Rajabifard A., van Loenen B. and T. Delgado Fernández (Ed). (2008). A Multi-View
Framework to Assess Spatial Data Infrastructures, The University of Melbourne.
Georgiadou, Y., Rodriguez-Pabón, O. and K.T. Lance. (2006). Spatial Data Infrastructure (SDI) and
E-governance: A Quest For Appropriate Evaluation Approaches, Journal of the Urban and Regional
Information Systems Association, 18(2).
Georgiadou Y. and J. Stoter (2010). Studying the use of geo-information in government – A conceptual framework, Computers, Environment and Urban Systems, Volume 34, Issue 1, pp. 70-78.
Harvey F. and N. Chrisman (1998). Boundary objects and the social construction of GIS technology,
Environment and Planning A, 30.
Lelong B. and F. Thomas (2001). L’apprentissage de l’internaute: socialisation et autonomisation,
Actes du 3e colloque international ICUST, Paris.
Millen D.R., Fontaine M.A. and M.J. Muller (2002). Understanding the Benefits and Costs of Communities of Practice, Communications of the ACM, vol. 45, no. 4.
Millerand F. (2002). La dimension cognitive de l‘appropriation des artefacts communicationnels. In F.
Jauréguiberry, S. Proulx (Ed), Internet : nouvel espace citoyen, Ed. l’Harmattan.
Millerand F. (2003). L’appropriation du courrier électronique en tant que technologie cognitive chez
les enseignants chercheurs universitaires. Vers l’émergence d’une culture numérique ?, Thèse de
doctorat de l’Université de Montréal.
Noucher M. (2009). La donnée géographique aux frontières des organisations : approche sociocognitive et systémique de son appropriation, Thèse de Doctorat de l’Ecole Polytechnique Fédérale
de Lausanne.
Noucher M. (2006). Mutualisation de l’information géographique : infrastructure de données spatiales ou communauté de pratique ?, GéoEvénement, Paris.
Perriault J. (1989). La logique de l'usage. Essai sur les machines à communiquer, Flammarion.
Piaget J. (1975). L’équilibration des structures cognitives, PUF.
Proulx S., Giroux L. and F. Millerand (2001). La « culture technique » dans l’appropriation cognitive des TIC. Une étude des usages du courrier électronique. In Actes du 3e colloque international
ICUST, Paris.
Proulx S. (1988). Vivre avec l’ordinateur: les usagers de la micro-informatique, Editions G. Vermette,
Boucherville, Québec.
Rodriguez-Pabón O. (2006). Cadre théorique pour l'évaluation des infrastructures d'information
géospatiale. Thèse de doctorat, Centre de recherche en géomatique, Département des sciences
géomatiques, Université Laval.
Wenger E. (1998). Communities of Practice: Learning, Meaning and Identity, Cambridge University
Press.
An approach for the estimation of SDI services demand: lessons from the
transportation sector
Maria Teresa Borzacchiello, Max Craglia
European Commission - Joint Research Centre,
Institute for Environment and Sustainability
Spatial Data Infrastructure Unit
[email protected]
[email protected]
ABSTRACT
Quantitative and qualitative estimation of the demand for spatial data infrastructures (SDI)
contributes to a better understanding of their impact on policy and on society as a whole. In this
article we concentrate on the transportation sector and compare it with SDIs in terms of user
classification and demand estimation. A formal model for the estimation of travel demand and an
example of demand estimation for SDIs are presented.
Keywords: spatial data infrastructures, user classification, estimation of demand, transportation
1. INTRODUCTION
The adoption of the INSPIRE Directive (European Commission, 2007), establishing a decentralised
spatial data infrastructure (SDI) in Europe, has spawned a large number of SDI implementations at
both national and sub-national levels. This phenomenon has in turn heightened the importance of
researching and measuring the impacts of these infrastructures on policy and on society as a whole.
Although the research field has grown considerably in the last few years (see for example: van
Loenen, 2008; Grus et al., 2007; Crompvoets et al., 2008; Castelein et al., 2010; Nedovic-Budic et
al., 2004), one area that has still not received sufficient attention is the estimation of the demand
for SDIs. This research question was already posed by the Centre for International Economics (2000),
but their research does not seem to have had any follow-up in the subsequent years.
The lack of demand estimation has both a quantitative and a qualitative component. In terms of the
quantity of demand, one needs an estimate in order to size the technological infrastructure and so
avoid malfunctions or crashes such as that of 2006, when the French "Géoportail" crashed under a
huge, unexpected number of requests1. From a qualitative point of view, estimating the different
segments of the demand (i.e. the characteristics of different user groups) is a prerequisite for
designing services differently for each group and maximising user satisfaction.
Drawing on these considerations, this paper looks at the lessons that can be drawn from the
well-developed research field of traffic demand estimation in the transportation sector, and proposes
a methodology to estimate the demand for SDIs.
Firstly, in section 2, it will be shown that the two fields have several common points in terms of definition. Then a more detailed analysis of how the field of SDI research could benefit from the comparison,
in terms of users' classification and demand estimation, is presented. In the final section, the methods
used to estimate travel demand are described, and an example of demand estimation for SDI is
presented as a case study.
2. A COMPARISON BETWEEN THE TWO FIELDS
There are many definitions that try to synthesize the essence of transportation systems (TS) as
well as of Spatial Data Infrastructures (SDI), although the latter belong to a more recent discipline.
“Transportation systems consist not only of the physical and organizational elements that interact
1 http://www.lemonde.fr/cgi-bin/ACHATS/acheter.cgi?offre=ARCHIVES&type_item=ART_ARCH_30J&objet_id=951125
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
GeoValue - Hamburg 2010
with each other to produce transportation opportunities, but also of the demand that takes advantage
of such opportunities to travel from one place to another” (Cascetta, 2008).
“Infrastructure for spatial information means metadata, spatial data sets and spatial data services;
network services and technologies; agreements on sharing, access and use; and coordination and
monitoring mechanisms, processes and procedures, established, operated or made available” (European Commission, 2007).
Several similarities emerge from these two definitions. Indeed, both refer to:
- a physical component
  - in the case of TS, this is the infrastructure itself, namely the roads, the railways, the
    terminals, and the vehicles;
  - in the case of SDI, it is composed of the Internet (railways, terminals) and data (vehicles),
    along with “network services and technologies” and “metadata, spatial data sets and
    spatial data services”;
- an organizational component
  - in the case of TS, this is the layer of rules and agreements that make the transportation
    service available and usable by users;
  - in the case of SDI, these are the agreements, standards and rules that define the
    data-sharing policy, and the coordination and monitoring mechanisms;
- the interaction between these two components
  - in the TS case, the interaction between the physical and organisational components
    makes it possible to deliver the transportation service;
  - in the SDI case, the interaction between the two previous components allows the
    delivery of the data and services embedded in the SDI;
- opportunities
  - in the TS case, they are strictly connected with the possibility to “travel from one place to
    another”;
  - in the SDI case, the gained opportunities are represented by the possibility to share,
    access and use spatial information.
The final portion of the first definition of TS refers to the users' need for transportation services: a
transportation system is composed not only of the “hardware” or infrastructural part, but is complete only when the demand side is taken into account as well. Moreover, given the multiplicity of its
internal components, their interactions with elements of external systems, and their high degree of
change over time, a transportation system is viewed as a complex and evolving system.
This is not dissimilar to what has been argued by Giff & Crompvoets (2008) and Vandenbroucke et
al. (2009) in relation to SDIs. There are therefore many similarities between a TS and an SDI, but
a key difference seems to be the lack of studies attempting to estimate the demand for the services
that an SDI has to deliver.
3. THE PROBLEM OF USERS' CLASSIFICATION
One of the first steps in the process of demand estimation is the classification of the users of the infrastructure. Some efforts are already present in the literature on SDIs, most of them dealing with the definition of users of SDIs rather than users of SDI services. For instance, Geudens et al. (2009) suggest
that some of the “stakeholders” (e.g. people interested in, and influenced by, the SDI implementation,
or directly involved in its creation and maintenance) may be considered as SDI users. Along the lines
of the above-cited study, another interesting input comes from Giff & Crompvoets (2008), in which SDIs
are considered “as a whole”, that is, including stakeholders, while users are not mentioned.
An approach for the estimation of SDI services demand: lessons from the transportation sector
Chan et al. (2005) distinguish between novice users and specialist users, while from a survey of
the Dutch geo-information sector Castelein et al. (2010) identify three main interested actors: the
private sector, the governmental sector and the research sector.
In general, users of SDIs might cover several typologies, depending on the purpose of the use itself.
The user community expected by INSPIRE (DPLI Working Group, 2002) is subdivided as follows (the categories from Geudens et al. (2009) are very similar):
1. Governments and Administrations
2. Utility and Public Services
3. Research and Development
4. Commercial and Professional End Users
5. Non-Governmental Organisations and not-for-profit organisations
6. Citizens
Depending on the typology of service that the SDI is able to deliver, there can be very different rates of usage. For example, in Navarra (one of the Spanish regions) the SITNA web portal, belonging
to the regional SDI, received 3 million requests over the whole of 2009 (Valentín et al., 2009).
According to Eurostat estimates, in 2009 on average 58% of individuals in the Region of Navarra
accessed the Internet at least once a week, meaning that more than 270,000² people accessed the Internet at least
once a week, and therefore more than 14 million accesses per year: this gives a ratio
between SDI requests and Internet accesses of about 0.21.
From the national point of view, the Spanish Cadastre has different statistics, updated monthly and available on the Cadastre
website (www.catastro.meh.es). The number of visits to the Cadastre website in 2009 was
20,802,745. It is unfortunately not clear from the published information whether this
figure represents the number of unique visits or the overall number of visits to the website.
Eurostat estimates that in 2009 on average 63% of all individuals in Spain had used the Internet.
This means that the number of users, on average, reaches almost 29 million³. At least in Spain, then,
the annual number of Internet users and of users of the cadastral SDI are comparable, with a
ratio of 0.71 for 2009.
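These ratios can be checked with a short computation. This is only a sketch: the 52-week annualisation of the Navarra figure is our assumption about how the 14 million figure was derived, and the recomputed Spanish ratio comes out near 0.72, close to the 0.71 reported above.

```python
# Sanity check of the usage ratios reported above.
# All figures are taken from the text and its footnotes; the 52-week
# annualisation is an assumption, not stated in the source.

# Navarra: SITNA portal requests vs. estimated annual Internet accesses
navarra_pop_16_74 = 465_906                       # footnote 2
weekly_internet_users = 0.58 * navarra_pop_16_74  # Eurostat share
annual_internet_accesses = weekly_internet_users * 52
sitna_requests_2009 = 3_000_000
ratio_navarra = sitna_requests_2009 / annual_internet_accesses

# Spain: Cadastre website visits vs. estimated annual Internet users
spain_pop = 45_828_172                            # footnote 3
internet_users_spain = 0.63 * spain_pop
cadastre_visits_2009 = 20_802_745
ratio_spain = cadastre_visits_2009 / internet_users_spain

print(round(ratio_navarra, 2), round(ratio_spain, 2))
```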
We have focused here on cadastral applications in an SDI because they are one or two orders of
magnitude bigger than “normal” SDI use. This was clear in the analysis of the key case studies of
advanced regional SDIs (Craglia and Campagna, 2010), in which normal SDIs had users in the tens
of thousands, while SDIs that also delivered cadastral applications reached millions of users.
These considerations confirm the need to shift the problem from the estimation of the demand for a Spatial Data Infrastructure to the estimation of the demand for the services that an SDI has to deliver.
Knowledge of these numbers is extremely important in the design phase of a Spatial Data Infrastructure, because it allows the SDI developers to size the machines and services that will host
the SDI for the expected volume of users. In addition, the typology of users is a key input for
designing the SDI's content and performance.
This is one of the reasons why it is important to monitor and record the experiences of the SDIs
currently in place.
² The population of Navarra in 2009 aged 16 to 74 (to which the percentage shown refers) was 465,906 individuals.
³ The population of Spain in 2009 was 45,828,172 individuals.
4. THE PROBLEM OF DEMAND ESTIMATION
Once the user categories are chosen, there are the following possibilities to estimate and/or model the
demand:
- models analysing usage behaviour, which could be useful to understand market penetration;
- demand estimation, by taking direct surveys or using models. In the latter case, surveys
  are in any case needed to calibrate and validate the models.
In transportation practice, the demand modelling phase, under certain assumptions, follows a
formal analytical formulation, both for the supply model, which is a representation of the physical
and functional features of the transportation network (e.g. length, capacity, link costs, modelled using graph theory and GIS software), and for the demand model, specified with either a behavioural or a descriptive
approach.
The behavioural approach has its rationale in random utility theory, which was first used in economics and then applied in a wide range of other fields. It makes explicit assumptions about the choice
behaviour of users (Domencich & McFadden, 1975). Behavioural approaches refer to discrete
choice theory, according to which individuals, in choosing among different economic alternatives,
behave rationally, aiming to maximise their individual utility function.
As opposed to the behavioural approach, the descriptive approach requires the specification of a demand function, representing the number of users undertaking trips as a function of the generalised trip cost, i.e., a linear combination of costs and times weighted by parameters depending on the
users' category and trip purpose. As such, the descriptive approach is exclusively based on experimental premises, which drive the choice of the demand function; moreover, it is more
aggregated and does not allow an understanding of the detailed behaviour of the users: this is the reason
why the first approach will be preferred in the following.
These two established approaches in the transportation field contrast with the almost complete
absence of any consideration of the demand side in the SDI domain.
Therefore, the aim of this paper is to present a model to estimate the demand volume for the services
of a newly formed SDI, or to forecast it for an existing one, using the behavioural approach and choosing
as an applicative example the case of an SDI delivering a cadastral service. This practical example
will allow each choice dimension to be specified, and the phenomenon and the detailed
motivations of the users' behaviour to be understood, starting from a theoretical framework.
5. THEORETICAL BACKGROUND AND CASE STUDY
Random utility theory is based on the so-called “rational user” assumptions, which are the
following:
• the set of possible choices is finite and known;
• the decision maker assigns a perceived utility to each alternative and chooses the
  alternative that maximises her perceived utility;
• utility depends on a particular set of quantifiable attributes regarding the decision maker
  and the alternative;
• the analyst does not know with certainty the utility assigned to a particular alternative by
  the decision maker.
The random utility function Ui may be expressed as the sum of a deterministic component Vi (the
so-called systematic utility) and a random variable εi with zero mean and variance σi². With
respect to the generic choice alternative, it holds:

Ui = Vi + εi = Σk Vik(δik, Xik) + εi
In the previous equation, the δ are parameters to be calibrated and the X are the attributes of each
alternative. Depending on the random distribution of the ε terms, it is possible to calculate the
probability of choosing an alternative in closed form or numerically.
The simplest case is that in which the random residuals εi are independently and identically distributed
(i.i.d.) as Gumbel random variables with zero mean and parameter θ, that is, the covariances σij
(i ≠ j) are zero while the variances σii are equal and non-zero. In this case, the probability of
choosing the alternative Oi equals:

p(Oi) = exp(Vi/θ) / Σj=1..n exp(Vj/θ)
This equation leads to the well-known Multinomial Logit model (Cascetta, 2008). In order to illustrate
the potential of this model, a classical application in the transportation domain is shown in the
following.
The attributes' specification depends on the set of alternatives and on the users' category. For example, to
specify the systematic utility for shopping trips, this simple equation could be used (Cascetta, 2009):

V_odmk(X) = δ1·NOTRIP + δ2·SHP_d − δ3·t_odmk − δ4·mc_odmk
where
• NOTRIP = specific variable for the choice of not making a trip;
• SHP_d = number of shops in the destination zone d concerned (a proxy for the attractiveness
  of the zone);
• t_odmk = travel time to go from origin o to destination d with mode m along path k;
• mc_odmk = monetary cost to go from o to d with mode m along path k;
• the δ are parameters to be calibrated (from sample surveys, traffic counts).
Users in this kind of case study are considered to be, for example, families, and it is important to
note that every quantity refers to a certain reference period h.
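The shopping-trip utility and the logit formula above can be combined into a short numerical sketch. All parameter values δ and attribute values below are invented for illustration only, not calibrated values from the literature:

```python
import math

def logit_probabilities(V, theta=1.0):
    """Multinomial Logit: p_i = exp(V_i/theta) / sum_j exp(V_j/theta)."""
    expV = [math.exp(v / theta) for v in V]
    total = sum(expV)
    return [e / total for e in expV]

# Illustrative shopping-trip alternatives, V = d1*NOTRIP + d2*SHP_d - d3*t - d4*mc
d1, d2, d3, d4 = 1.5, 0.02, 3.0, 0.4  # hypothetical parameters
alternatives = [
    # (NOTRIP, shops in zone d, travel time [h], monetary cost [EUR])
    (1, 0,   0.0, 0.0),   # not making the trip
    (0, 120, 0.3, 2.0),   # nearby centre, by bus
    (0, 400, 0.7, 5.0),   # large mall, by car
]
V = [d1 * nt + d2 * shp - d3 * t - d4 * mc for nt, shp, t, mc in alternatives]
probs = logit_probabilities(V)
for (nt, shp, t, mc), p in zip(alternatives, probs):
    print(f"shops={shp:3d} t={t:.1f}h cost={mc:.1f} -> p={p:.3f}")
```

The probabilities sum to one, and alternatives with higher systematic utility (here, the attractive mall despite its higher time and cost) receive the larger share, as the model intends.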
The steps to be undertaken to set up the model include:
i) model specification
ii) model calibration
iii) model validation
The model specification includes the following phases:
i) identify the choice alternatives
ii) define the functional form of the model
iii) define the hierarchy of the subsequent choices (in the case of a hierarchical model)
iv) define the choice attributes explaining the systematic utility
The model calibration consists in finding the values of the model parameters, while the model validation aims at verifying that the model is effectively able to reproduce reality.
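As a sketch of the calibration step, a common approach (not spelled out in the paper) is maximum likelihood: choose the parameter values under which the observed choices are most probable. The binary logit example below uses synthetic observations and a crude grid search; a real calibration would use survey data and a proper optimiser:

```python
import math

# Calibration sketch: find the cost coefficient "beta" that maximises the
# log-likelihood of observed choices under a binary logit model.
# The observations and the model V = -beta * cost are synthetic/hypothetical.

# Each observation: (cost of alternative A, cost of alternative B, chosen index)
observations = [
    (2.0, 5.0, 0), (1.0, 4.0, 0), (3.0, 2.5, 1),
    (4.0, 1.0, 1), (2.5, 2.0, 1), (1.5, 3.0, 0),
    (3.0, 2.0, 0),  # "noisy" observation: the dearer alternative was chosen
]

def log_likelihood(beta):
    ll = 0.0
    for ca, cb, chosen in observations:
        va, vb = -beta * ca, -beta * cb          # systematic utilities
        pa = math.exp(va) / (math.exp(va) + math.exp(vb))
        p = pa if chosen == 0 else 1 - pa
        ll += math.log(p)
    return ll

# Crude grid search over candidate beta values (the calibration step);
# validation would then check reproduction of choices on held-out data.
betas = [i / 100 for i in range(1, 301)]
beta_hat = max(betas, key=log_likelihood)
print(f"calibrated beta = {beta_hat:.2f}")
```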
The identification of the choice alternatives is not a trivial process. They have to satisfy three hypotheses (Train, 2009): (i) the alternatives should be mutually exclusive; (ii)
the choice set must be exhaustive, including all the possible choices that the user could make; (iii)
the number of alternatives must be finite.
In order to fulfil these requirements, the purpose behind the user's choice among the different alternatives
should first be identified. Once this purpose is defined, the possible alternatives for reaching that objective may
be identified.
In our case, the users do not make use of the SDI per se, but use it to address a particular aim,
that is, to obtain a particular service.
It is useful to start with an applicative example, using for instance an SDI that delivers
cadastral services, which, as shown in the previous section for the Spanish case, has a high usage
rate.
Let us suppose that a group of users needs to use data from a regional cadastre. The same
service could be delivered by different institutions:
i) the public office that has always had the duty of making cadastral maps available to
   citizens;
ii) an independent digital information system delivering a cadastral service;
iii) a Spatial Data Infrastructure that, besides giving access to discovery and view
   services by means of its catalogues, allows integration with information from
   other sectors.
The choice of one alternative or another to reach the common objective of obtaining information
from the regional cadastre depends on at least two things: the characteristics of each option (is it
accessible and easy to use? Is the service delivered on time?) and the category of users (are they able
to use the Internet? Are they familiar with maps? Are they aware of the existence of digital services or of
the SDI? Are they experienced users or GIS professionals? In other words, what is the penetration rate
of the technology?).
Once the alternatives' choice set is defined, for each alternative the average systematic utility function Vik,
introduced above, must be specified, regardless of the particular functional form of the random utility
model (Logit, Nested Logit, Probit and so on).
Vik has the form of a linear combination of the attributes related to each alternative and the coefficients
of those attributes (respectively, the X and the δ). The attributes X of the alternatives are typically costs and benefits,
expressed not only in monetary terms, that could lead users to choose one alternative or the other.
The attributes of the alternatives are always related to the perceptions of users: sometimes it is
possible to find proper quantitative indicators, while at other times the analyst has to resort to
“proxy variables”, which give an approximate idea of a real phenomenon that would otherwise
not be measurable.
The attributes of the utility functions are therefore related to the users' category and to the specific
alternative being analysed.
In more detail, the variables that could be useful as attributes, in the case of the selected alternatives, are:
1. Time spent by the user in accessing the service (hours)
2. Time spent in requesting the service (hours or days)
3. Time spent to obtain the service (days)
4. Price of the service (€)
5. Period of availability of the service (continuous (24/7) / discontinuous)
6. The level of popularity (or user’s awareness rate) of the alternative
7. Privacy and security perceptions (trust in the digital tools)
8. Accessibility to data and information from different administrative sectors
9. Interoperability of data and information from different administrative sectors
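As an illustration only, the three alternatives of this case study can be scored on a subset of these attributes and fed through the Multinomial Logit model introduced above. Every attribute value and weight below is hypothetical, standing in for one user category; in the proposed methodology they would come from surveys and calibration:

```python
import math

# Hypothetical systematic utilities for the three alternatives of Section 5
# (public office, independent digital system, SDI geoportal), for a single
# user category. Attribute values and weights are invented for illustration.

# attributes: access time [h], waiting time [days], price [EUR],
#             24/7 availability (1/0), awareness rate [0-1], interoperability (1/0)
alternatives = {
    "public office":  (1.5, 5.0, 20.0, 0, 0.9, 0),
    "digital system": (0.2, 0.0,  5.0, 1, 0.5, 0),
    "SDI geoportal":  (0.3, 0.0,  0.0, 1, 0.3, 1),
}
w = (-1.0, -0.5, -0.05, 1.0, 2.0, 1.5)  # hypothetical weights (the deltas)

# Linear-in-attributes systematic utility, then Multinomial Logit shares
V = {name: sum(wi * xi for wi, xi in zip(w, x))
     for name, x in alternatives.items()}
theta = 1.0
denom = sum(math.exp(v / theta) for v in V.values())
probs = {name: math.exp(v / theta) / denom for name, v in V.items()}
for name, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"{name:14s} p = {p:.3f}")
```

With these invented numbers the geoportal dominates thanks to its price, availability and interoperability, despite its low awareness rate; changing the weights per user category would shift the shares accordingly.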
Once the attributes are defined, the corresponding coefficients in the systematic utility are unknown
quantities. These are the parameters of the model to be calibrated. Physically, they represent the value that each user category attaches to those attributes when considering the different alternatives.
These values can be estimated by making use of surveys and interviews with various user categories,
or of already available data, to be included in the calibration process, which consists in finding the
values of the coefficients that minimize the distance from the hypothesized ones.
If it is not possible to organise surveys or interviews, data collection becomes very important for
calibration purposes. The following data could then be useful:
• Socioeconomic data for the region chosen as a pilot area for the methodology
  (segmentation by age, job, technology penetration and so on)
• Data regarding the effective use of the SDI and of the different alternatives for achieving the same
  goal (in this case, obtaining an Urban Cadastre service): if possible, data regarding not
  only the typology of services delivered and the overall number of periodic users,
  but also information about users' profiles, which services are most used and by whom, and so on.
The study is currently in the data-collection phase, and it will be useful to get feedback at the GeoValue Workshop on the approach chosen and the variables selected for the modelling.
REFERENCES
Cascetta E. (2008). Transportation systems analysis: models and applications 2nd ed., New York,
London: Springer.
Castelein W.T., Bregt A.K. and Y. Pluijmers (2010). The economic value of the Dutch geo-information
sector. International Journal of Spatial Data Infrastructures Research, 5, 58-76.
Centre for International Economics (2000). Scoping the business case for SDI development. Available at: http://www.gsdidocs.org/docs2000/capetown/businesscase/scoping.pdf [Accessed March 10,
2010].
Chan T.O., Thomas E. and B. Thompson (2005). Beyond SDI: the case of Victoria. In ISPRS Workshop on Service and Application of Spatial Data Infrastructure, XXXVI (4/W6), Hangzhou, China.
Craglia M. and M. Campagna (2010). Advanced Regional SDIs in Europe: comparative cost-benefit
evaluation and impact assessment perspectives. International Journal of Spatial Data Infrastructures
Research, 5, 145-167.
Crompvoets J., Rajabifard A., van Loenen B., and T. Delgado Fernández (Eds.) (2008), A Multi-View
Framework to Assess Spatial Data Infrastructures, The University of Melbourne. http://www.csdila.
unimelb.edu.au/publication/books/mvfasdi.html [accessed 2010-04-22].
Domencich T.A. & D. McFadden (1975). Urban travel demand, North-Holland.
DPLI Working Group (2002). Data Policy & Legal Issues Position Paper, Environment Agency for
England and Wales.
European Commission (2007). Directive 2007/2/EC of the European Parliament and of the Council
of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE). Available at: http://eur-lex.europa.eu/JOHtml.do?uri=OJ:L:2007:108:SOM:EN:HTML
[Accessed March 10, 2010].
Geudens T., Macharis C., Crompvoets J. and F. Plastria (2009). Assessing Spatial Data Infrastructure Policy Strategies Using the Multi-Actor Multi-Criteria Analysis. International Journal of Spatial
Data Infrastructures Research, 4, 265-297.
Giff G.A. & J. Crompvoets (2008). Performance Indicators: a tool to support Spatial Data Infrastructure assessment. Computers, Environment and Urban Systems, 32, 365-376.
Grus L., Crompvoets J. and A.K. Bregt (2007). Multi-view SDI assessment framework, International
Journal of Spatial Data Infrastructures Research, Vol. 2, 33-53, at http://ijsdir.jrc.ec.europa.eu/index.
php/ijsdir/article/viewFile/27/21 , [accessed 31-10-2009].
Nedovic-Budic Z., Feeney M. Rajabifard A. and I. Williamson (2004). Are SDIs serving the needs
of local planning? Case study of Victoria, Australia and Illinois, USA. Computers, Environment and
Urban Systems, 28(4), 329-351.
Train K. (2009). Discrete choice methods with simulation 2nd ed., Cambridge; New York: Cambridge
University Press.
Valentín A., Cabello M. and P. Echamendi (2009). SITNA Geoportal: towards the integration of territorial information. In 24th International Cartographic Conference, Santiago de Chile. Available at: http://
www.icaci.org/documents/ICC_proceedings/ICC2009/.
Vandenbroucke D., Crompvoets J., Vancauwenberghe G., Dessers E. and J. Van Orshoven (2009).
A Network Perspective on Spatial Data Infrastructures: Application to the Sub-national SDI of Flanders (Belgium). Transactions in GIS, 13(s1), 105-122.
Van Loenen B. (2008). Assessment and socio-economic aspects of geographic information infrastructures, Proceedings of the Workshop on Assessment and Socio-economic Aspects of Spatial
Data Infrastructures, Delft: Nederlandse Commissie voor Geodesie 46. http://www.ncg.knaw.nl/Publicaties/Groen/46VanLoenen.html [accessed 2010-04-22].
Assessing geoportals from a user perspective
Bastiaan van Loenen, Delft University of Technology, The Netherlands
Joep Crompvoets, Katholieke Universiteit Leuven, Belgium
Alenka Poplin1 , HafenCity University Hamburg, Germany
[email protected]
[email protected]
[email protected]
ABSTRACT
Spatial data infrastructures (SDIs) are network-based solutions which enable easy, consistent and
effective access to geoinformation and services. They aim to enable users to save resources, time
and effort when trying to acquire, store and use new data sets. Billions of euros are and have
been invested in SDI initiatives. Although the success of an SDI is closely related to the extent to
which geoinformation is being used, currently developed and emerging assessment methods mostly
ignore the user of geoinformation. Decision makers remain uncertain about the success of their
SDI, the barriers experienced by users and the strategy to follow to accommodate the users' needs.
Transaction cost theory is a promising new method for assessing the performance of an SDI from a user
perspective. Geoinformation transaction costs can be measured by the resources a user needs to
find, assess, access and use geoinformation. This article focuses on the assessment of one important component of an SDI: geoportals. It presents the preliminary results of research performed by
MSc students in the Netherlands, who assessed from a transaction cost perspective how much a
user might benefit from using a geoportal compared to a general web search.
Keywords: geoportals, assessment, transaction cost theory, spatial data infrastructure
1. INTRODUCTION
The information economy is a powerful engine for growth, improving competitiveness and enabling jobs
(The Lisbon Special European Council, 2000). It aims at improving citizens' quality of life and the environment. New digital goods and services are vital to developing information economies (The Lisbon
Special European Council, 2000; see also The European Parliament, 2005). Information infrastructures are considered to be the backbone of information economies (Castells and Himanen, 2002).
Within information infrastructures, geoinformation may be considered a special type of information.
Geoinformation refers to all information that is somehow linked to the surface of the Earth.
The special nature of geoinformation has resulted in the emergence of Spatial Data Infrastructures (SDIs),
or Geographic Information Infrastructures (GIIs) (see Masser, 1999; 2007). SDIs are network-based
solutions which enable easy, consistent and effective access to geo-information and services offered
by public agencies and others (see Van Loenen, 2006). As a result, they enable users to save resources, time and effort when trying to acquire new data sets (Rajabifard and Williamson, 2002). SDIs
therefore play a crucial role in the management of geo-information, including that pertaining to the administration of our societies. In the European Union, two Directives address access to and re-use of public
sector information: the PSI Directive (2003/98/EC), promoting the re-use of Public Sector Information,
and the INSPIRE Directive (2007/2/EC), aiming to establish a European SDI by promoting the exchange,
sharing, access and use of (environmental) geoinformation and services across the various levels of
public authority and across different sectors.
Large sums of money are and have been invested in SDI initiatives. Rhind (2000) estimated an expenditure of approximately $10 billion for the US SDI and $2 billion for the SDI of the UK. Worldwide,
around €120 million each year is spent just on the management of national-level online portals providing access to geoinformation (Crompvoets, 2006). Given this expenditure and society's interest in
the effective and efficient use of public funds, it is imperative that these SDI services and initiatives
1 Alenka Poplin published earlier under her maiden name Alenka Krek.
should be assessed on their effectiveness and efficiency. Although the value of geoinformation comes from its use (Onsrud and Rushton, 1995), SDI assessment from a user perspective has been
scant (see Crompvoets et al., 2008; Grus, 2010).
This article focuses on the assessment of one critical component of an SDI: geoportals. How much
does a user benefit from the existence of a geoportal, and is there any empirical evidence of the added value of a geoportal for the user? Students of Delft University of Technology (The Netherlands)
following the Master course ‘Geomatics’ were tasked with assessing geoportals from a user perspective. Both
geoportal theory and transaction cost theory were applied in this assessment research. This article
presents the role of geoportals in SDIs (Section 2) and the relation between the Web and SDIs (Section
3), and introduces transaction cost theory (Section 4), in order to provide the context behind the
assessment research. Finally, the case study research is presented (Section 5).
2. THE ROLE OF GEOPORTALS IN SDI
Spatial data infrastructures aim at enabling easy search for, access to and use of geoinformation.
According to Williamson et al. (2003), an SDI consists of three key components that link the user to
the data: the access network, the policies, and the standards (see Figure 1).
Figure 1. Nature and relations between SDI components (Williamson et al. 2003)
A geoportal for geoinformation focuses on facilitating geoinformation discovery and access to
data and related services (Crompvoets, 2006). It can be seen as a one-stop shop for geoinformation
(Crompvoets et al., 2004). Through the provision of a one-stop shop, significant cost reductions related to searching, assessing and accessing geo-information can be achieved (see Groot and Sharifi,
1994; Askew et al., 2005; Maguire and Longley, 2005; Beaumont et al., 2005).
The importance of geoportals in SDIs can be appreciated if we consider geoportals as the medium
through which users access the available information. We can imagine them as shopping malls
(Crompvoets, 2006) in which spatial data from government agencies and private bodies are offered
in such a complete way that the user does not have to visit different “shops”. As a result, geoportals have
to be complete systems that not only offer information but also support the ways in which users
can use it efficiently. Their aim is to significantly reduce the time necessary to find, access
and assess data. A geoportal may also facilitate the exchange of data between public authorities,
companies, and commercial and professional users by clarifying the transaction conditions.
However, little is known about users, their experiences with geoportals and the time saved
when using these services. So far, the assessment of geoportals has mainly focused on the supplier
or geoportal-coordinator side (see, for example, Crompvoets, 2006; Crompvoets, 2007). In the research presented in this article we focus on the user's experience, using the methodology for measuring this experience on the basis of transaction costs suggested by Poplin (2010).
3. THE WEB AND SDI
Advocates of SDI 2.0 argue that the SDI concept and objectives should consider not only SDI 1.0
activities, as geoportal development may be categorised, but also the role the Web 2.0 may play in geoinformation exchange. In an extreme position, one may argue that an SDI can be implemented without
geoportals, relying fully on existing search facilities on the Internet. However, according to Ellen Bates (2002), American companies spend 107 billion dollars a year on employees' time spent trying to find the required information on the internet. This is
mainly because the impulsive growth of the internet did not allow for standardization of the information put on the web, resulting in a lack of regulation and almost no consistency between
different websites. The internet has become a whirlpool of all kinds of information, both relevant
and irrelevant, with an enormous network of links between (sometimes totally irrelevant) websites,
from which the user has to filter the needed information. As Ellen Bates states, the internet gives the
“illusion of easy access”: it might seem easy to find information and access a website, while in fact
it is quite the opposite. Such complexity can discourage potential users or buyers of a particular type
of product. This directly influences providers of products; i.e., the more difficult it is for users to find
their products, the less willing providers are to invest in the product assortment (for instance
product quality) (Van Oort et al., 2009).
However, in the context of geoinformation, not much empirical evidence is available to support the
claims of Ellen Bates or of the SDI 2.0 advocates. Transaction cost theory, applied to geoinformation, might contribute to an assessment from a user perspective that provides empirical data on whether the use of geoportals and the internet supports the objectives of an SDI.
4. GEOINFORMATION TRANSACTION COST
Transaction cost theory deals with the cost of transacting, called the transaction cost. Every trade, every
exchange of a product, is a transaction and entails costs that result from both parties attempting to
determine the valued characteristics of the product or service that is the subject of the exchange (North,
1990). It takes resources to measure these characteristics and to define and measure the rights
that are transferred to the user with the exchange of the products. The cost associated with these
efforts is considered to be the transaction cost (Williamson, 1985; North, 1990; Williamson and
Masten, 1995; Sholtz, 2001). Coase (1937) was one of the first authors to realise the importance of
transaction costs. North (1990) received a Nobel Prize for his work on transaction cost theory. Today this
research is part of so-called institutional economics.
Measuring and quantifying the transaction cost of geoinformation is a novel idea. The first experiments were conducted in 2009 (Krek, 2009a; Krek, 2009b; Poplin, 2010). The main idea is based on transaction cost theory applied to geoinformation. Geoinformation trade is a transaction which involves data and service providers on the one hand and geoinformation and data users on the other. In the process of exchanging a geoinformation product, the potential users and providers have to agree on the characteristics of the geoinformation product which is the subject of trade, and on the conditions of exchange. In this process of communication, geoinformation transaction costs are incurred on both sides: on the geoinformation provider's side and on the potential user's side (Krek 2003, 2004, 2009a, 2009b; Poplin 2010). In this research we concentrate on the potential user and the cost of transacting, which we call Demand Geoinformation Transaction Cost (DGTC). The DGTC is the cost borne by the potential user in the exchange of geoinformation. It is primarily the cost of the time spent on searching for the geoinformation provider, contacting the organisation, inquiring about the characteristics of a specific dataset, acquiring it, and testing its fitness for use in a specific application. We summarised the potential user's activities as follows:
• Activity 1: Searching for the geoinformation provider: a. searching for the providing organisation and b. searching for the responsible contact person
• Activity 2: Inquiring about the general conditions of the exchange
• Activity 3: Inquiring about the specific conditions of the exchange (by phone or e-mail): a. inquiring about the pricing policy and b. inquiring about the availability of the dataset
• Activity 4: Defining the exact characteristics of the geoinformation product: defining the features of the dataset, understanding the offer and explaining the need
• Activity 5: Acquiring and testing the geoinformation product: free sample data acquisition and storage, testing the "fitness for use"
• Activity 6: Reading the documentation about the trade conditions and pricing: reading and understanding the conditions of use and pricing policies
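As an illustration (not part of the original study), the DGTC bookkeeping implied by the six activities can be sketched as follows; the activity labels, durations and hourly rate below are hypothetical, and the function names are our own:

```python
# Hypothetical sketch of DGTC accounting: the DGTC is treated as the sum of
# the time a potential user spends on the six activities listed above,
# optionally converted to money at an assumed hourly rate.

ACTIVITIES = [
    "1. searching for the geoinformation provider",
    "2. inquiring about general conditions",
    "3. inquiring about specific conditions",
    "4. defining product characteristics",
    "5. acquiring and testing sample data",
    "6. reading trade conditions and pricing",
]

def dgtc_minutes(minutes_per_activity):
    """Total transaction time in minutes across all activities."""
    return sum(minutes_per_activity.values())

def dgtc_cost(minutes_per_activity, hourly_rate):
    """Express the time cost in money at a given hourly rate."""
    return dgtc_minutes(minutes_per_activity) / 60.0 * hourly_rate

# Example with made-up durations (minutes per activity):
spent = dict(zip(ACTIVITIES, [60, 15, 30, 20, 90, 20]))
print(dgtc_minutes(spent))     # 235
print(dgtc_cost(spent, 60.0))  # 235.0
```

The point of the sketch is only that the DGTC is additive over the activities, so reducing the time of any single activity (e.g. faster provider search via a geoportal) lowers the total cost directly.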
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
All these activities undertaken by the potential user require investment in the process of searching for and acquiring information about the geoinformation and the trade conditions. After the testing phase, the potential user can decide whether she wants to acquire the geoinformation or not. After a dataset has been acquired, it needs to be integrated with the user's software and possibly linked to other datasets. Only after this process has been finalised successfully can the user start using the dataset for the intended task.
Backx (2003) captured the concepts related to geoinformation search and acquisition in Figure 2. He explains that before a data set can be used, the user must pass through the two outer rings shown in Figure 2. A user must first be aware of the existence of a data set (the outer Known ring) in order to be able to obtain it (the middle Attainable ring). Transaction cost theory as applied to geoinformation (Krek, 2009a; 2009b; Poplin, 2010) quantifies all three rings, including the fitness-for-use test. Geoinformation that is difficult to find will result in a thick Known ring, which in terms of transaction cost theory means that the measurement geoinformation transaction cost will rise (a user has to spend more time searching for the data). Measurement geoinformation transaction cost is the cost related to searching for appropriate geoinformation and a geoinformation provider, verifying the geoinformation quality, and possibly transforming it into the needed format. A data set which is easy to find (thin Known ring) but difficult to obtain (thick Attainable ring) will result in a low measurement transaction cost but a high enforcement transaction cost as defined in North (1990). The enforcement cost of geoinformation is the cost related to negotiating the conditions of trade, such as the price of the geoinformation, enforcing agreements, protecting copyright, and defining the right to use and distribute the acquired geoinformation product or service (Poplin, 2010).
Figure 2. The concentric skin model of Backx (2003)
Geoinformation transaction cost also appears on the supplier's side. The supply geoinformation transaction cost (SGTC) is the cost imposed on the geoinformation provider. It is related to explaining the complex rules about the acquisition of geoinformation, the use of the data and its copyright. The communication happens either via e-mail or phone and can be very costly for the providing institution. This article, however, concentrates on measuring the potential user's transaction cost, while being aware that the cost exists on the supplier's side as well.
5. GEOPORTAL ASSESSMENT FROM A USER'S PERSPECTIVE: CASE STUDY RESEARCH
In order to be able to assess the value of geoportals from the user's perspective, we designed a series of experiments. In spring 2010, 13 MSc students of the Geomatics curriculum of Delft University of Technology were assigned to assess geoportals. This was performed through four tasks:
(1) Each student conducted a literature study on both (geo)portal theory and transaction cost theory. Based on the literature study, the requirements for geoportals were developed. These criteria were discussed, and a final list of assessment criteria was agreed.
(2) Each student applied this list of criteria to assess two portals of their choice; a list of existing portals was provided for guidance. The geoportals were assessed on a scale from 1 (very poor) to 5 (excellent).
(3) After the (theoretical) assessment, the students assessed a geoportal from a transaction cost perspective.
(4) Finally, the overall results were discussed and experiences were reflected upon. The selected criteria and results were discussed with a focus on the outcomes and whether they justify investments in geoportals worldwide.
The students were an international group of Master students of Delft University of Technology, well acquainted with geoinformation and GIS. All of them master English, and their native languages were Dutch, Chinese, Persian, Greek or Bulgarian.
Assessing the transaction cost
After the theoretical assessment, the students assessed the geoportals from a transaction cost perspective. They formed groups of two. One of the two would assess a geoportal by finding, assessing and accessing a dataset of their choice (Scenario 1). Since framework datasets are considered to be key in an SDI, the only requirement was that this dataset should be on the list of framework datasets as developed by Onsrud (1998). The other student was tasked to do the same experiment, but without using the geoportal (Scenario 2), which meant that he or she searched for datasets through web search. The only condition was that it should be one of the key datasets underpinning an SDI: a framework dataset. Students were required to keep track of the time spent on each transaction cost stage, i.e. finding, assessing and accessing the dataset. Further, they were asked to act as students who need the data for a special assignment, thus non-commercial use. Table 1 summarizes the scenarios applied and the number of students involved in the experiments.
Scenario 1 (6 students): The students received information about the geoportal through which they should start searching for geoinformation.
Scenario 2 (7 students): The students did not receive any instructions on where to search for geoinformation.
Table 1. The number of experiments planned and the number of students involved in the experiments
The first preliminary results of the assignment are presented in Table 2. Altogether, 13 datasets were acquired by the students, a 50% success rate. Nine of the 13 successfully acquired datasets were found through a geoportal; only four were acquired through a web search. In this respect a geoportal can be considered successful.
However, the average time needed to acquire the datasets through a geoportal (235 minutes) was only slightly shorter than the average time through the web (253 minutes).
Succeeded total: 13 datasets (out of 26 sought)
Succeeded through a portal: 9 (out of 12 datasets sought through a portal); average time of successful acquisitions: 235 minutes
Succeeded without a portal: 4 (out of 14 datasets sought through the web); average time of successful acquisitions: 253 minutes
Table 2. First results of the geoportal and web search task
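The headline figures can be recomputed from the aggregate counts reported in the text; the following sketch is only an illustration of that bookkeeping (the per-student raw data are not available here, and the variable and function names are our own):

```python
# Recomputing the Table 2 summary statistics from the reported counts:
# 12 datasets sought via a portal (9 acquired), 14 via plain web search
# (4 acquired), with the reported average acquisition times in minutes.

def success_rate(acquired, sought):
    """Fraction of sought datasets that were successfully acquired."""
    return acquired / sought

portal = {"sought": 12, "acquired": 9, "avg_minutes": 235}
web    = {"sought": 14, "acquired": 4, "avg_minutes": 253}

overall = success_rate(portal["acquired"] + web["acquired"],
                       portal["sought"] + web["sought"])

print(f"overall:    {overall:.0%}")                                             # 50%
print(f"via portal: {success_rate(portal['acquired'], portal['sought']):.0%}")  # 75%
print(f"via web:    {success_rate(web['acquired'], web['sought']):.0%}")        # 29%
```

The recomputation makes the contrast explicit: the portal route succeeded far more often (75% vs. 29%), while the average time of the successful acquisitions differed only modestly (235 vs. 253 minutes).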
Table 3 shows some of the problems faced by the students while searching for the framework datasets.
Problem faced: Contact with portal manager failed
  Due to: broken links, non-existent e-mail addresses, inactive contact button, redirection to someone else
  Where: Belgium, Denmark, Netherlands (RWS), France, Germany (BKG)

Problem faced: Who is responsible
  Due to: organisation of the SDI
  Where: Belgium, Germany

Problem faced: Language
  Due to: response in the language of the portal manager; no Google Translate possible (images/PDF etc.); pricing policy in local language only; licences in local language only; only homepage in English; English keyword search failed
  Where: Netherlands, Denmark, Norway, Germany, France

Problem faced: Access policy
  Due to: payment required; terms of use unclear or non-existent
  Where: Belgium, Arizona, Germany

Problem faced: No direct access
  Due to: registration required
  Where: Arizona (fax), Denmark, China, Belgium, CORINE in Germany

Problem faced: Limited data sets
  Due to: only small-scale data; no or few datasets; no metadata
  Where: INSPIRE, China, Germany/Hessen, Netherlands (RWS)

Problem faced: No direct access
  Due to: response time of the provider
  Where: Netherlands (RWS): 7 days; Germany: no response; Germany (Rheinland): no response; Netherlands (RWS): no response; Belgium: no response

Problem faced: User friendliness
  Due to: scrolling per institute; endless clicking
  Where: Germany

Problem faced: Technical
  Due to: big file size; Firefox not supported; projections

Problem faced: Limited data sets
  Due to: most relevant data sets not in the portal
  Where: CORINE, G-DEM, OSM, NOAA, NASA, NHD

Table 3. Problems experienced in acquiring geoinformation
6. CONCLUSIONS AND FURTHER WORK
At this stage of the research it is difficult to draw conclusions based on the empirical research, since the results have to be analysed in more depth. In the final paper, the experiences of the students will be cross-referenced and analysed on the basis of several key factors that might be relevant for the assessment of a geoportal: language, access policy, and organisation of the SDI, among other issues. The final paper will reflect on the role of geoportals and web search engines in SDIs and provide recommendations.
REFERENCES
Askew D., Evans S., Matthews R. and P. Swanton (2005). MAGIC: a geoportal for the English countryside. Computers, Environment and Urban Systems, 29: 71-85.
Backx M. (2003). Gebouwgegevens redden levens [Building information saves lives], MSc thesis.
TU Delft.
Beaumont P., Longley P.A. and D.J. Maguire (2005). Geographic information portals – a UK perspective. Computers, Environment and Urban Systems, 29: 49-69.
Castells M. and P. Himanen (2002). The Information Society and the Welfare State; The Finnish
Model, Oxford: Oxford University Press.
Coase R. H. (1937). The Nature of the Firm. Economica 386: 386-405.
Craglia M. and M. Campagna (2009). Advanced Regional SDIs in Europe: comparative cost-benefit evaluation and impact assessment perspectives. To be published in the IJSDIR journal.
Crompvoets J., Rajabifard A., Loenen van B. and T. Delgado (eds.) (2008), Multi-view framework to
assess Spatial Data Infrastructures (SDI), Melbourne: The University of Melbourne, Australia (403
pages).
Crompvoets J., Bree de F., Oort van P.A.J., Bregt A.K., Wachowicz M., Rajabifard A. and I. Williamson (2007). Worldwide impact assessment of spatial data clearinghouses, URISA Journal, 19(1):
23-32.
Crompvoets J. (2006). National spatial data clearinghouses worldwide development and impact.
Dissertation. Wageningen University and Research Centre, Wageningen.
Crompvoets J., Bregt A., Rajabifard A. and I. Williamson (2004). Assessing the worldwide status
of national spatial data clearinghouses. International Journal of Geographical Information Science,
18(7), 665-689.
Ecorys & Grontmij (2009). Kosten-batenanalyse INSPIRE [Cost-benefit analysis INSPIRE], 77 pages, http://www.geonovum.nl/nieuws/inspire/kosten-baten-analyse-online.
European Commission (2009). Communication from the Commission: Re-use of Public Sector Information – Review of Directive 2003/98/EC. Brussels, 7.5.2009, COM(2009) 212 final.
European Commission (2008). Review of the PSI Directive: Results of the online consultation of
stakeholders, http://ec.europa.eu/information_society/policy/psi/docs/pdfs/online_consultation/report_psi_online_consultaion_stakeholders.pdf.
FGDC (2007). 2006 FGDC Annual Report, http://www.fgdc.gov/library/whitepapers-reports/annual%20reports/2006-report/html/index_html.
Geonovum (2009).Voortgangsrapportage uitvoering – monitoring GIDEON. Nummer 02, december
2009 [Progress report execution GIDEON].
Giff G. and J. Crompvoets (2008). Performance indicators a tool to support spatial data infrastructure
assessment. Computers, Environment and Urban Systems 32 (5), pp. 365- 376.
Groot R. and M.A. Shariff (1994). Spatial data infrastructure, essential element in the successful exploitation of GIS technology. EGIS/MARI '94, Fifth European Conference and Exhibition on GIS, Utrecht, the Netherlands.
Grus L. (2010). Assessing Spatial Data Infrastructures, Dissertation Wageningen University, the
Netherlands.
Guala F. (2005). The Methodology of Experimental Economics. Cambridge University Press, New
York.
INSPIRE (Craglia M.) (2003). Contribution to the extended impact assessment of INSPIRE. http://
inspire.jrc.ec.europa.eu/reports/fds_report_sept2003.pdf.
INSPIRE (2007a). Directive 2007/2/EC of The European Parliament and of The Council of 14 March
2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE).
INSPIRE (2007b) Spatial Data Infrastructures in Europe: State of play 2007. http://inspire.jrc.
ec.europa.eu/index.cfm/pageid/6/list/4.
KPMG Consulting (Sears G.) (2001). Canadian Geospatial Data Policy Study, report prepared for
GeoConnections. Available online at: http://www.geoconnections.org/programsCommittees/proCom_policy/keyDocs/KPMG/KPMG_E.pdf.
Krek A. (2009a). Quantifying transaction costs of geoinformation: Experiments in national information structures in Sweden and Germany. In Krek, Rumor, Zlatanova & Fendel (Eds.), paper presented at the 27th Urban and Regional Data Management Symposium (UDMS), June 24-26, Ljubljana, Slovenia. Taylor & Francis Group.
Krek A. (2009b). Measuring Transaction Costs in Spatial Data Infrastructures: Examples of Sweden
and Germany, GEOWS ´09, The International Conference on Advanced Geographic Information
Systems & Web Services, February 1-7, 2009, Cancun, Mexico.
Krek A. (2006). Geoinformation as an Economic Good. In: GIS for Sustainable Development, Editor:
Michele Campagna, CRC Press Taylor & Francis Group.
Krek A. (2004). Cost in GI Product Transaction. In: GIM International, The Worldwide Magazine for
Geomatics, January 2004, Volume 187, Number 1, GITC bv, The Netherlands.
Krek A. (2003). What are transaction costs and why do they matter?. In: M. Gould, R. Laurini and
S. Coulondre (Eds.), 6th AGILE 2003 - Conference on Geographic Information Science, April 24th –
27th, Lyon, France.
Lopez X.R. (1998). The dissemination of spatial data: A North American - European comparative
study on the impact of government information policy (London: Ablex Publishing Corporation).
Masser I. (1999). All shapes and sizes: the first generation of national spatial data infrastructures. International Journal of Geographical Information Science, 13(1), 67-84.
Masser I. (2007). Building European Spatial Data Infrastructures. ESRI Press, Redlands, California.
Maguire D.J. and P.A. Longley (2005). The emergence of geoportals and their role in spatial data
infrastructures. Computers, Environment and Urban Systems, 29: 3-14.
McLaughlin J. and S. Nichols (1994). Developing a National Spatial Data Infrastructure. Journal of
Surveying Engineering, 120(2): 62-76.
MEPSIR (Measuring European Public Sector Information Resources) (2006). Final Report of Study
on Exploitation of public sector information– benchmarking of EU framework conditions.
Micus Gmbh (2008). Assessment of the Re-use of Public Sector Information (PSI) in the Geographical information, Meteorological Information and Legal Information Sectors, http://ec.europa.eu/
information_society/policy/psi/docs/pdfs/micus_report_december2008.pdf.
Niehans J. (1987). Transaction costs. The New Palgrave: A Dictionary of Economics. 4: pp. 677-80.
North D. C. (1990). Institutions, Institutional Change and Economic Performance, Cambridge University Press.
Onsrud H.J. and G. Rushton (eds.) (1995). Sharing Geographic Information. New Brunswick, NJ
(Centre for Urban Policy Research).
Onsrud H.J. (1992a). In support of open access for publicly held geographic information, GIS Law, 1
(1), pp. 3-6. Available online at: http://www.spatial.maine.edu/~onsrud/pubs/In_Support_OA.htm.
Onsrud H.J. (1992b). In support of cost recovery for publicly held geographic information. GIS Law,
1 (2), pp. 1-7. Available online at: http://www.spatial.maine.edu/~onsrud/pubs/Cost_Recovery_for_
GIS.html.
PIRA International Ltd, University of East Anglia, and Knowledge View Ltd. (2000). Commercial exploitation of Europe's public sector information. Final Report for the European Commission Directorate General for the Information Society. Available online at: ftp://ftp.cordis.lu/pub/econtent/docs/commercial_final_report.pdf.
Poplin A. (2010). Methodology for Measuring the Demand Geoinformation Transaction Costs: Based
on Experiments in Berlin, Vienna and Zurich, Journal of Spatial Data Infrastructures Research, Vol
(5), 168-193.
PSI Directive (2003). Directive 2003/98/EC of The European Parliament and of The Council of 17 November 2003 on the re-use of public sector information.
Rhind D. (2000). Funding an NGDI. In: Geospatial Data Infrastructure Concepts, Cases and Good
Practices. Groot. R. and J. McLaughlin (eds.). Oxford University Press, pp. 39-55.
Rajabifard A. and I.P. Williamson (2002). Spatial Data Infrastructures: an initiative to facilitate spatial
data sharing. in: Global Environmental Databases- Present Situation and Future Directions. Volume
2 (International Society for Photogrammetry and Remote Sensing (ISPRS-WG IV/8), GeoCarto International Centre, Hong Kong.
Sholtz (2001). Transaction costs and social costs of online privacy. First Monday 6(5).
The European Parliament (2005). Mid-term review of the Lisbon strategy European Parliament resolution on the mid-term review of the Lisbon Strategy, P6_TA(2005)0069.
The Lisbon Special European Council (2000). Towards a Europe of Innovation and Knowledge,
http://www.europarl.europa.eu/summits/lis1_en.htm.
Van Loenen B. (2009). Developing geographic information infrastructures: the role of access policies, International Journal of Geographical Information Science, 23(2): 195-212.
Van Loenen B. and J. de Jong (2007). Institutions Matter; The impact of institutional choices relative to access policy and data quality on the development of geographic information infrastructures. In: H. Onsrud (Ed.), Research and Theory in Advancing Spatial Data Infrastructure Concepts. ESRI Press, Redlands, CA, USA, pp. 215-229.
Van Loenen, B. (2006). Developing geographic information infrastructures; The role of information
policies (dissertation ed.). DUP Science, Delft.
Van Oort, P.A.J., Kuyper, M.C. Bregt, A.K., and J. Crompvoets, (2009). Geoportals: An Internet
Marketing Perspective. Data Science Journal, 8(2009): 162-181.
Wallis, J. and D. C. North (1986). Measuring the Transaction Sector in the American Economy, 1870-1970. Chicago: University of Chicago Press.
Weiss, P. and Pluijmers, Y. (2002). Borders in Cyberspace: Conflicting Public Sector Information Policies and their Economic Impacts. Available online at: http://www.spatial.maine.edu/GovtRecords/cache/Final%20Papers%20and%20Presentations/bordersII.htm.
Williamson, O. E. (1985). The Economic Institutions of Capitalism, Free Press.
Williamson, O. E. and S. E. Masten (1995). Transaction Cost Economics, Theory and Concepts,
Edward Elgar Publishing Limited.
Yin, R.K. (1994). Case Study Research; Design and Methods, (Sage Publications, Second Edition).
The narrative anchor as the decisive element of geoinformation infrastructures
Henk Koerten
Delft University of Technology
[email protected]
ABSTRACT
Efforts to build a geoinformation infrastructure using technological innovations have not always been successful; at times they can even be problematic. In this paper three Spatial Data Infrastructure (SDI) initiatives are compared in an attempt to explain success and failure through an ethnography of everyday practice regarding their creation, development and fate. The cases are analysed using the method of narrative analysis. It turns out that the way technology is linked to the goals of an SDI is an indicator of success. This research reveals that where technology is directly linked to an infrastructure, the infrastructure is eventually bound to fail. However, where technology and infrastructure share a narrative anchor as a mediating, non-tangible element, an SDI can be successful and sustainable.
Keywords: SDI, organization, ethnography, narrative approach
1. INTRODUCTION
Geoinformation professionals always try to reconcile images of static infrastructures with dynamic cutting-edge technology to serve the domain of public administration (Hanseth, Monteiro et al., 1996; Bowker and Star, 2000). Efforts to build a geoinformation infrastructure using technological innovations have not always been successful, and at times have even been problematic (Koerten and Veenswijk, 2009). We can not only witness how such endeavours get redefined because they do not live up to their initial expectations; it has also been widely acknowledged internationally among geoinformation insiders that knowledge on establishing infrastructures is lacking, that is, on combining and disseminating map-related information among and between organisations (Budhathoki and Nedovic-Budic, 2007; Georgiadou, Harvey et al., 2009; Homburg and Georgiadou, 2009; Lance, Georgiadou et al., 2009; Grus, 2010). The common denominator in the related literature is that these Spatial Data Infrastructures (SDIs) should be more effectively guided by management models (Koerten, 2008).
Therefore, implementers are inclined to value organisational aspects of NGII development using design rules borrowed from political science, economics and management science (Koerten, 2008). Accordingly, NGII researchers focus on best practices, organisation models and planning (Rajabifard and Williamson, 2001; Warnest, McDougall et al., 2003; Masser, 2005; Warnest, Rajabifard et al., 2005; Obermeyer and Pinto, 2008; Box and Rajabifard, 2009). However, some researchers want to focus on these non-technological aspects in a non-prescriptive way to get a better understanding of implementation processes, for which they consider alternative ontologies and epistemologies more appropriate (Harvey, 2001; Georgiadou, Puri et al., 2005; Crompvoets, Rajabifard et al., 2008; Georgiadou, Harvey et al., 2009). Although some research has been conducted in this vein (Martin, 2000; Harvey, 2001; Georgiadou and Homburg, 2008; Lance, Georgiadou et al., 2009), mainstream NGII research remains design oriented (Budhathoki and Nedovic-Budic, 2007; Crompvoets, Rajabifard et al., 2008).
A big gap exists between, on the one hand, the wish to implement SDIs using fashionable management models and, on the other hand, the inability to accomplish that. Establishing information infrastructures based on cutting-edge technologies seems to have more intricacies than ordinary management practices can handle. Additionally, evaluation literature on SDI implementation, that is, reviews of completed SDI implementation projects, is scarce (Koerten, forthcoming).
In this paper, specific SDI implementation projects are evaluated with a 'bigger picture' view (Georgiadou et al., 2009, p. 1638), using an ethnographic, narrative approach. It looks into specific cases with a focus on how SDIs are narrated in meetings, interviews and policy documents in order
to develop our understanding of their conceptualisation and usage. The research question is: how can we understand NGII implementation using narrative analysis? Secondary questions are: how do technological and organisational aspects interact with each other? How are goals and results perceived over time? To answer these questions, I present the Dutch situation, in which two closely related cases, NCGI and Geoportals, were declared unsuccessful and terminated, while another case, GBKN, has been going strong for over 35 years now. These cases are evaluated using a narrative framework guiding ethnographic research.
The remainder of this paper is as follows. First I develop a narrative theory which allows me to focus on technological aspects, discerning narrative conceptualisations about scene, actors and actions, termed narrative setting, narrative space and narrative storyboard respectively. A narrative setting concerns notions about the narrated environment in time, territory and technology. Narrative spaces refer to configurations of actors and how they interact with each other and narrate their world, individually and collectively. Narrative storyboards arise from reflection on practices and are transposed into relatively fixed patterns, which can be regarded as the outcome of the propensity of human beings to consider sense-making itself in terms of fixed concepts. These concepts are then used to analyse the cases, followed by some concluding remarks.
2. A NARRATIVE APPROACH FOR INFORMATION-INFRASTRUCTURE RESEARCH
The theory used here originates from notions on social interaction introduced by Goffman (1959), who argued that human beings are able to look at themselves from another point of view, using the theatrical terms 'front-stage' and 'back-stage' (Goffman, 1959; Blumer, 1969). It is linked to Bourdieu's implicit rejection of the assumption of an objective truth, because organisational structures are in fact socially constructed. Blending these two approaches into one theoretical concept provides useful notions about the life world affecting individual, group and intergroup behaviour; however, these theoretical notions do not address the process of sense-making. Therefore, I use a narrative approach drawing on linguistic, anthropological and social-psychological insights (Gergen, 1994; Boje, 1995; Berendse, Duijnhoven et al., 2006), coming from the realm of less positivistic methods (Polkinghorne, 1988; Hatch and Yanow, 2003), in which a 'linguistic turn' and a 'narrative turn' may be distinguished (Verduijn, 2007).
Stories and narratives are linked (Czarniawska-Joerges, 1998; Yanow, 2000; Boje, 2001; Veenswijk, 2006), giving meaning to experience (Gabriel, 2000). Stories allow us to create interpretations, invoking frames of reference for future stories and actions (Tesselaar, Sabelis et al., 2008). They may start to live a life of their own, becoming narratives loosely connected to the originals or becoming universal, culminating in identity creation (Boje, 2001; Beech and Huxham, 2003). From a manager to a company car, human and non-human identities are created by storytelling, leading to narratives that are continuously reconstructed and therefore subject to change. Narratives are not always visible and recognisable; they can be prominent or unconsciously present, being interpretations of assembled, real or imagined stories, which Boje, after Clair, called 'narratives dressed as theories' (Boje, 2001).
A narrative framework for research
In the framework guiding this research, we discern narrative conceptualisations about scene, actors and actions, in terms of narrative setting, narrative space and narrative storyboard respectively (see Figure 1). A narrative setting concerns notions about the narrated environment in time, territory and technology. Narrative spaces refer to configurations of actors interacting with each other and narrating the world. Narrative storyboards arise from practice and are transposed into relatively fixed patterns, to be regarded as the outcome of human beings letting action be guided by fixed concepts.
Figure 1. Theoretical Focus
The narrative setting conceptualises narratives about the environment: time, territory and technology. These include notions of local and global, of presence and absence, of home and abroad, and of change, stability and institutionalisation, which come together in an enacted location of time, place and technology (Scott, 1995).
One or more narrative spaces may be discerned within a narrative setting. They represent groups of people and are therefore the link to human existence, enacting a department, organisation, profession, religion, subgroup or even a single individual.
Narrative storyboards are the bedrock of human actions, providing predefined scripts for action. In a constant flow of events we enact our world as stable and predictable, requiring fixed recipes for action based on past and possible future events. People adhere to unwritten rules in daily life, allowing them to present themselves as good citizens, and thus feel uncomfortable when rules are not properly applied.
3. THE STORYBOARDS OF UTOPIA AND MYOPIA IN THE DUTCH GEOINFORMATION SECTOR
In this section, ethnographies of the cases of GBKN and NCGI/Geoportals are concisely presented and narratively analysed, using the framework of narrative setting, spaces and storyboards. For that purpose two basic storyboards have been developed, based on a historical analysis of surveying and geodesy in the Netherlands. The narrative storyboards are based on two distinct, old professions in geoinformation, forming a dichotomy of approaches that has been around for ages and can still be traced today (Koerten, forthcoming). These narrative storyboards are discussed below, revealing narratives on action, on how things are done in practice. To enact the things we do in daily life, narrative storyboards tell us what to do in more or less prescribed ways (Garfinkel, 1984; Weick, 1995). The two developed storyboards will be the basis for comparative analysis.
Utopia and myopia: storyboards for analysis
The geoinformation sector used to be a closed community which was able to develop and maintain these storyboards in relative isolation. The rather coarse storyboards of myopia and utopia, which emerged from a historical analysis, have been formed in practice, guiding thoughts and behaviour and influencing what has happened and what is still happening in the geoinformation sector. They have certainly influenced one another, but for the sake of analysis they are regarded here as dichotomous and mutually exclusive (Douglas, 1986; Bowker and Star, 2000). Their primary analytical qualities are recapitulated in table 1, a crosstab relating them to their respective narrative settings and spaces.
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
41
GeoValue - Hamburg 2010
          Narrative setting        Narrative space
Myopia    Utilizing technology     Authoritative orientation
Utopia    Determining technology   Scientific orientation

Table 1. A framework for analysis with storyboards
4. TWO DUTCH GEOINFORMATION INFRASTRUCTURES COMPARED USING NARRATIVE
STORYBOARDS
This paper is based on two cases within the Dutch geoinformation sector, both aimed at sharing and disseminating information that has to be put on a map, i.e. geoinformation.
4.1 GBKN
One case is the Grootschalige Basis Kaart Nederland (GBKN), started in 1975 and still active today. Aimed at building and maintaining a national system of large scale base maps, it was endorsed and is still primarily used by utility companies, municipalities and the Dutch Kadaster for their registering obligations. The other case is a combination of the National Clearinghouse Geo-Information (NCGI) (1995-2006) and its sequel Geoportals (2005-2008), intended to exchange geoinformation held by different nationally operating (semi-)public organisations through a website. Spanning a considerable amount of time with significant technological changes, these cases had a profound impact on the geoinformation sector and society as a whole.
Ethnographic research has been carried out to compare the GBKN case, widely appraised as successful, with the NCGI/Geoportals case, considered an unsuccessful attempt. For a detailed description, see Koerten (Forthcoming). In the analysis both cases will be compared; in order to do that, two contrasting storyboards were developed based on an analysis of the history of the Dutch geoinformation field.
Case analysis
The case reveals a shifting pattern of initiative-taking. Figure 2 shows that the initiative comes from utopia-driven scientists and geodesists, who have to grant the production to the Kadaster in the realisation phase. Stagnation is caused by utopia-driven municipalities who want to define a more dynamic, local form of utopia. After the turnaround, both utopia and myopia combine different interests towards completion; however, in the recognition phase the initiative shifts again to utopian spheres as national government tries to gain influence.
Figure 2. Pattern of shifting initiatives in GBKN
Since unifying large scale map making in The Netherlands was first mentioned as a possibility, a lot has changed. Technology was both an enabler of the production of base maps and decisive in the process of keeping track of all the changes in the built environment. It did not, however, dominate the process of GBKN becoming an infrastructure. Even if a newly emerging technology seemed tempting to one of the participants, it still would not be applied if it could harm the interests of others.
It is remarkable that after the Kadaster was released from its assignment to produce a GBKN, the apparent false start resulted in Public-Private Partnership (PPP) cooperations, tailored to local circumstances. These opened the way for local and regional actors to make GBKN a success; however, rationalization processes forced both an upscaling of organizational arrangements and further standardization, making GBKN fit for use on the national level.
4.2 NCGI / Geoportals: Technology rules
Case analysis
It looks like every time the geo-professional space takes the initiative, technological shifts are announced, and every time the management space is in control, there are arguments about organization, budgets and institutionalization. Like the figures of a weather house, issues are addressed sequentially, impeding integral examination. Figure 3 visualises how the initiative switches alternately from one narrative space to the other.
Figure 3. Pattern of shifting initiatives in NCGI/Geoportals
Technology is the driving force for change, experienced by geo-professionals as a constant pressure to be committed to the latest developments. Consequently, every novel technology has repercussions for the approach of both NCGI and Geoportals. Every new technology knocking at the door is felt as an obligation to apply it, even when the preceding technological innovation has not been properly implemented. This preoccupation with technology is most felt in the Geoportals project. Here a group of geo-professionals launches the idea of setting up a system of geoportals and gets funding for it, but does not feel able to realise it. Having second thoughts, they increasingly started to see Geoportals as a project to boost innovation, which detached it from concrete solutions. The final phase shows a project team full of confidence, believing that creating innovative software applications is now the new project aim.
5. THE NARRATIVE CONSTRUCTION OF GEOSPATIAL INFRASTRUCTURE
With narratives of surveying and geodesy narrowed down to storyboards of myopia and utopia, I am going to analyse the respective cases. With these two storyboards in mind, I will try to come to the essence of base maps and GBKN on the one hand, and the National Clearinghouse Geo-Information (NCGI) and the Geoportals project on the other.
5.1 Large scale mapping becoming an infrastructure through GBKN
It is a storyboard of utopia which guides these urban developments, with a strong relationship between territory and maps, relying on scientific methods. However, in the national arena the lack of large scale maps is tackled with a new system, independent of specific organizations, including these urban municipalities. The two act as two versions of a storyboard of utopia, each exclusively linked to its own territory, and do not become connected to each other.
The Kadaster, commissioned to take up the production of GBKN in 1975, connects the national concern with a myopian storyboard, based on the cadastral means-to-an-end form of infrastructure. This makes the national, unified concern vulnerable to the cadastral mode of conduct, allowing local and regional opportunities to determine where a GBKN mapping initiative will start, which organizations are invited to cooperate, and how standards are applied. The storyboard of myopia guides how GBKN is handled within the cadastral organization, giving the Kadaster the opportunity to seek the most suitable way to combine the GBKN assignment with the eternal quest to improve cadastral mapping, harming the principle of a unified GBKN.
It appears that these storyboards do not entirely align in this phase of GBKN. The Central Mapping Board has certain utopian convictions about what GBKN should look like and how it needs to be implemented; however, it is taken up by the Kadaster in a myopian way. Metropolitan municipalities do not play a role at this stage, as they are inclined to associate with neither the myopian preference of the Kadaster nor the utopian storyboard on the national level.
The municipal version of the utopian storyboard is the driving force for medium-sized municipalities to have their own large scale base maps, with production and upkeep mostly organized on a regional scale, forced by the utilities and the Kadaster, which are essential for funding. While GBKN is still treated as a national unification tool, its character changes towards a national umbrella for initiatives on a regional scale, leaving as much room as possible to individual municipalities to promote their constituting role in large scale map making.
Meanwhile, the Kadaster loses its leading role through financial troubles. Following the myopic means-to-an-end storyboard, the Kadaster sees an opportunity in changing its role from GBKN map-maker to that of a service provider to regional collaboration.
Public-Private Partnerships are established to balance the interests of municipalities, utilities and the Kadaster. The myopian means-to-an-end storyboard of the Kadaster, shared also by the utilities, forces a balancing of interests. Conversely, the municipalities need the other two partners to realize their utopia-inflicted large scale base maps. It is this situation of mutually complementing interests that sweeps GBKN towards national coverage. Standards emerge, reflecting the benefits for all three participants: large scale maps to serve the needs of individual municipalities, as well as effective and cheap mapping on an optimal business scale for the utilities and the Kadaster.

The utopian storyboard of municipal interests makes enforced standardization on a national scale less important. Utility organizations consolidating towards semi-national conglomerates and the Kadaster are in need of standards to exchange data at all levels, making standardization a myopia-driven aim.
5.2 The infrastructural qualities of clearinghouse and geoportals
The start of the Nationaal Clearinghouse Geo-Informatie (NCGI) is a result of the failed attempt to establish a formal relationship between four participating national geoinformation-processing organisations to exchange geoinformation. They form a like-minded, cooperation-seeking constellation of organisations, trying to work out a deal in a myopian way that would be beneficial to all. After this setback, GI-professionals belonging to these organisations get together on an informal basis and start to develop Idéfix, a database with metadata (data that describes the data to be exchanged) for geoinformation exchange, using state-of-the-art technology. They are hardly concerned about the interests of their respective organisations: GI-sharing is seen as the ultimate goal, rather than serving the interests of their own organisations.
Driven by a utopian storyboard to standardize data exchange, GI-professionals see Idéfix as a role model for a national, universal, standardized infrastructure for geoinformation exchange, to be pursued with an almost philanthropic attitude. In their view, Idéfix is the perfect engine for the clearinghouse concept: a kind of central catalogue that describes all geoinformation through the disclosure of its metadata. Such a National Clearinghouse Geo-Information (NCGI) should preferably be implemented at a matching institutional level, approved and managed by an umbrella-like organization on the national level.
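The clearinghouse concept described above, a central catalogue that discloses metadata while the data itself stays with the holding organisations, can be sketched in the abstract as follows. This is a minimal, hypothetical illustration only: the field names, the example records and the query mechanism are assumptions for exposition, not the actual Idéfix or NCGI design.

```python
# Illustrative sketch of the clearinghouse idea: a central catalogue of
# metadata records describing geodatasets that remain with their holders.
# Field names and entries are hypothetical, not the Idefix/NCGI schema.

catalogue = [
    {"title": "Large scale base map", "holder": "Kadaster",
     "theme": "topography", "scale": "1:1000"},
    {"title": "Utility network map", "holder": "Utility company",
     "theme": "infrastructure", "scale": "1:1000"},
]

def search(catalogue, **criteria):
    """Return metadata records matching every given criterion."""
    return [record for record in catalogue
            if all(record.get(key) == value for key, value in criteria.items())]

# A user queries the catalogue, then contacts the holder for the data itself.
hits = search(catalogue, theme="topography")
print(hits[0]["holder"])  # prints: Kadaster
```

In such a sketch the catalogue can tell a user that a dataset exists and who holds it, but nothing more; whether the data actually fits a new use remains for user and holder to establish, which is exactly the limitation of metadata discussed later in this paper.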
Having become NCGI, independently organised at the national level, a myopian storyboard gets hold of the initiative. Board members, representing their own respective organisational interests, are more inclined to protect their own organisation than to promote a common goal. With everyone putting their own interests first, shared attempts to make NCGI into a national infrastructure are bound to fail. Faced with decline, the failing initiative is granted again to utopian professionals who are given a new opportunity: a software-engineering company founded and operated by former Idéfix professionals is invited to take over all operational NCGI activities. With a clean slate, they start to focus again on GI-sharing as a virtue for all, to be developed with cutting-edge technology. The storyboard of utopia sets the scene here, and NCGI is presented as universal and beneficial to everybody. However, the utopian attitude fades again along the way, as the engineering company tries to make a profit by putting its own interests first, increasingly treating NCGI as a commercial billboard.
The myopian storyboard steers the management of individual organizations to join forces on the idea of a national infrastructure by launching the Ruimte voor Geo-informatie (Space for Geo-Information, RGI) programme, which incorporates NCGI. Grants are given to innovative sharing ideas, like the idea of establishing a geoportals project, again an attempt taken up by individual professionals from diverse geoinformation organizations. The focus shifts again to utopia, as professionals focus on the geoinformation sector as a whole, separated from individual organizations. Because RGI preaches innovation, Geoportals professionals feel they have to incorporate cutting-edge technology, which needs to be adapted and developed. As the project carries on, the focus shifts from a tangible system of geoportals towards the development of tools to apply new technologies to enable future infrastructures. An overarching infrastructure to disclose geoinformation is still wanted, but seems further away than ever.
6. BUILDING GEOINFORMATION INFRASTRUCTURES: TWO CONTRASTING APPROACHES
The narrative storyboards of utopia and myopia, applied to describe GBKN on the one hand and NCGI and Geoportals on the other, call for a comparison. Table 2 offers a summary, to be explicated in this section. Geoportals came to life as a result of a failing NCGI, neither delivering the infrastructure initially promised. Because it was seen as successfully boosting innovation, Geoportals was in the end celebrated as a success. GBKN had to face redefinitions, adjustments of organisational arrangements and serious difficulties, but is nevertheless still going strong, celebrating its 35th anniversary in 2010, while NCGI officially existed less than 10 years and Geoportals only three.
                                 GBKN                        NCGI                Geoportals
Duration of completion attempt   25 years (successful)       9 years (failed)    3 years (failed)
Time perspective                 Solving lasting problems    Future oriented     Future oriented
Territory                        Municipal-National          National            National
Technology                       Balanced                    Cutting-edge        Cutting-edge
Organization                     Top down - PPS - top down   Networked project   Networked project
Tangible results                 System of base maps         Website             Software prototypes

Table 2. Comparing the cases
In the next section I will further connect the cases with theory, using the utopia and myopia storyboards to shed light upon time, space and technology with respect to the narrative setting, and upon organisational considerations relating to narrative spaces.
7. CONCLUSION: A NARRATIVE ANCHOR AS DISTINCTIVE FOR INFORMATION INFRASTRUCTURES
In my analysis I have discerned two storyboards: utopia and myopia. Utopia refers to the scientific striving for accuracy, universality and standardization in assessing the earth, while myopia stands for a here-and-now, means-to-an-end attitude, honouring exceptions. In this section I want to go through some conclusions based on this analysis.
The versatile features of geoinformation: the link with original data
In discussions on geoinformation, a lot is expected from the reuse of geoinformation, presuming that if some kind of information is collected at one place, it is quite easy to use it elsewhere. Of course, technical, legal and economic considerations have been acknowledged in that respect (Welle Donker, Van Loenen et al., 2010); however, other aspects regarding geoinformation sharing are still ignored.
One such aspect is topicality: in the GBKN case, it takes until 1992 to realize that this is the essence of base maps. For every user, topicality of base maps is a concern, but first and foremost for municipalities. Additionally, standardization in mapping is not equally important to all participants. Topicality and standardization determine for the greater part whether data collected at one place can be successfully used at another. Within GBKN, these interests seem to be sufficiently balanced with financial motives to give every participant the right share.
Considering the shifts in initiative taking and granting, both the NCGI and the Geoportals case demonstrate an absence of dialogue between stakeholders. There is simply no possibility to discuss the nature of information, let alone to balance the interests of its producers. Within the framework of information exchange, be it a clearinghouse or geoportals, there is no way to establish whether data collected at one place can replace data from another source held elsewhere. Metadata serves that purpose only in a very limited way. Accordingly, a link between data production and data use among different types of organizations cannot be established.
Standardisation is envisioned as top-down law-enforcement
Within the geoinformation sector, standardization is a hot topic. First of all it relates to technical standardization: in order to connect electronic artefacts like databases and software applications of a different nature, some sort of standardization is needed. There are also more abstract conventions to deal with, as in the case of mapping: standards have to be made on how to frame reality into an image.
The case descriptions reveal that standardisation has been conceptualised and taken up as a top-down process, in which some central body issues rules to be followed within the field to which they apply. Whether envisioned as strict and detailed regulations to be followed by all members of a community, or as an understanding of some general preferences, the idea is that they are issued by a central coordinating agency so as to be recognizable by all.
Other forms of standardization are not recognized as such, let alone the idea that standardization is a dynamic process, rather than an abrupt move from a non-standardized to a standardized reality simply by issuing rules. In the Geoportals case a 'light' version of standards is advocated, but it is thought of as a rather static concept to seduce organisations to participate, not as a starting point to transform them towards sophisticated standards. In the GBKN case, sales of random map excerpts stimulate standardisation efforts, to avoid sold excerpts consisting of different parts with differing standards. That differences occur is acknowledged, but not recognized as a tool to improve national unified standardisation.
Mind the time perspective: going back to the future
When a reference is made to an infrastructure as an institution, the image of it has enduring and
lasting qualities (Douglas, 1986; Scott, 1995). If an infrastructure has to be purposeful to society, it
needs to solve existing and pressing problems with solutions that are lasting.
In the case of GBKN, the absence of base maps is regarded as a pressing societal problem that has been around for decades and needs to be solved now to prevent society from further losses. Consequently, GBKN is connected to problems of the past that ought to be solved once and for all to gain a better future. It therefore looks backwards for its problem definition and has neither the intention nor the desire to look ahead for new problems.
Within NCGI, environmental problems are envisioned as lying ahead, acting as a trigger to integrate data from different sources in a way that has not been done before. With a keen eye on the future, new technology is hailed, and in the course of the project the initial problem definition gets pushed to the background and traded for the viewpoint that the application of new technologies is crucial for future problems.
The success factor of geoinformation infrastructures: the art of enabling and inhibiting technology
What strikes the most is that the Dutch geoinformation sector is technology-based, an observation also made by many interviewees. However, while informed insiders see the role of technology within their sector as obvious, straightforward and simple, as a relative outsider I see it as a delicate factor, making its influence either encouraging or disappointing.
GBKN is living proof that technology is not the only decisive factor for an information infrastructure. It is the ability to use innovative technology or to keep technology at a distance that makes GBKN a success in terms of the establishment of an infrastructure. At the start in 1975, cutting-edge technology was declined, as the old-fashioned Kadaster was not ready for it. On the other hand, around 1990, when the upkeep of base maps was seen as a problem, the latest state-of-the-art GIS workstations were used because they were desperately needed to keep track of all the mapped changes. However, only a few years later, object-oriented mapping, which would require additional technological innovations, was kept at a distance. The image that remains is a rhythm of attraction and letting go, either embracing or declining innovations. Such a treatment of technological temptations is absent in the cases of NCGI and Geoportals. There, innovative technology is focal and regarded as essential for having an information infrastructure, diverting attention from the goals and gains of infrastructure towards having cutting-edge technology as an ultimate, yet false, source of success.
A non-tangible information infrastructure-concept: the narrative anchor
The common opinion is that information infrastructures rest heavily upon their constituting technologies and that the relationship between them is rather straightforward (Venkatraman, 1991; Harvey, 2000; Puri, 2006). Technology is then only seen as enabling information infrastructures.
A more developed relationship emerges from the analysis of the GBKN case. Here, the focus from the outset is on base maps, making them the essence of the infrastructure and, as a concept, free-standing with respect to technology. Moreover, the concept is able to establish relationships with different kinds of technology at various levels, depending on specific infrastructure needs, while at the same time users are able to use it as a frame of reference for their applications. The base map concept has acted as a narrative anchor throughout the 35 years of GBKN's existence.
The narrative anchor has a link with history, as it ought to solve lasting problems that have existed for decades. These problems are expected to persist if nothing is done to contain them, so the anchor has to provide solutions for the future as well. Thus the narrative anchor connects past, present and future, offering a device to link them all.
An infrastructure has a relationship to one or more entities of territory, either physical or imaginary. The narrative anchor acts as a device to establish all these relationships. In the GBKN case, these were the municipal entity of upkeep, the jurisdiction of utility firms, and the jurisdiction of GBKN itself, the nation.
Figure 4. A narrative design model of information infrastructures
The case of GBKN has revealed the essence of infrastructure: the narrative anchor. Figure 4 offers a schematic representation. The liaison between possible applications and the constituting factors of an infrastructure is formed by the narrative anchor. Time, Technology and Territory are essential to an information infrastructure and need to become linked. If this is done right, as in the GBKN case, a long-lasting infrastructure will come into existence.
This model derived from the research stands in stark contrast with the cases of NCGI and Geoportals. An attempt to develop a narrative anchor is made: first it was named Idéfix, then metadata, then Clearinghouse, and finally traded in for the Geoportals concept. The narrative anchor(s) of NCGI and Geoportals have trouble establishing a relationship with the three T's: Time, Technology and Territory. The cases are not linked to the past and the present, only to the future. They also fail to keep the temptations of cutting-edge technology at a distance. Finally, the territories of the organisations that are supposed to participate are fully ignored. The image that remains is depicted in figure 5, where applications have a direct link with technology, without a narrative anchor. The result is that technology controls the fate of these infrastructures, lacking a mediating effect.
Figure 5. The direct relationship of applications with technology within
NCGI and Geoportals
FINAL WORDS
The finding of the narrative anchor is the main conclusion of this research, based on an analysis of three ethnographies describing how information infrastructures were conceptualised in the respective cases. The analysis was theory-based, meaning that it was guided by theory.
REFERENCES
Beech N. and C. Huxham (2003). Cycles of Identity Formation in Interorganizational Collaborations.
International Studies of Management & Organization 33(3): 28-52.
Berendse M., Duijnhoven H. and M. Veenswijk (2006). Editing narratives of change. Identity and legitimacy in complex innovative infrastructure organizations. Intervention Research: 73-89.
Blumer H. (1969). Symbolic Interactionism: Perspective and Method. Englewood Cliffs NJ, Prentice Hall.
Boje D. (1995). Stories of the Storytelling Organization: a Postmodern Analysis of Disney as Tamara-Land. Academy of Management Journal 38(4): 997-1035.
Boje D. (2001). Narrative Methods for Organizational and Communication Research. London, Sage
Publications Ltd.
Bowker G. and S. Star (2000). Sorting Things Out: Classification and its Consequences. Cambridge MA, The MIT Press.
Box P. and A. Rajabifard (2009). SDI governance: Bridging the Gap Between People and Geospatial Resources. GSDI 11. Rotterdam NL.
Budhathoki N. and Z. Nedovic-Budic (2007). Expanding the Spatial Data Infrastructure Knowledge
Base. Research and Theory in Advancing Spatial Data Infrastructure Concepts. H. Onsrud. Redlands CA, ESRI Press.
Crompvoets J., Rajabifard A., Loenen van B. and T. Delgado Fernández (Eds.) (2008). A Multi-View
Framework to Assess SDIs. Melbourne, Au, Space for Geo-Information (RGI), Wageningen University and Centre for SDIs and Land Administration, Department of Geomatics, The University of
Melbourne.
Czarniawska-Joerges B. (1998). A Narrative Approach to Organization Studies. Thousand Oaks CA,
Sage Publications Inc.
Douglas M. (1986). How Institutions Think. Syracuse, Syracuse University Press.
Gabriel Y. (2000). Storytelling in Organizations, Facts, Fictions, and Fantasies. Oxford UK, Oxford
University Press.
Garfinkel H. (1984). Studies in Ethnomethodology. Cambridge, Polity Press.
Georgiadou Y., Harvey F. and G. Miscione (2009). A Bigger Picture: Information Systems and Spatial
Data Infrastructure Research Perspectives. GSDI-11. B. Van Loenen, H. Onsrud, A. Rajabifard and
A. Stevens. Rotterdam.
Georgiadou Y. and V. Homburg (2008). The Argumentative Structure of Spatial Data Infrastructure Initiatives in America and Africa. International Federation for Information Processing. C. Avgerou, M. Smith and P. van den Besselaar. Boston MA, Springer. 282: 31-44.
Georgiadou Y., Puri S.K. and S. Sahay (2005). Towards a potential research agenda to guide the
implementation of Spatial Data Infrastructures—A case study from India. International Journal of
Geographical Information Science 19(10): 1113-1130.
Gergen K. (1994). Realities and relationships, Soundings in Social Construction. Cambridge MA,
Harvard University Press.
Goffman E. (1959). The Presentation of Self in Everyday Life. Garden City NY, Doubleday.
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
49
GeoValue - Hamburg 2010
Grus L. (2010). Assessing Spatial Data Infrastructures. Wageningen, PhD thesis Wageningen University.
Hanseth O., Monteiro E. and M. Hatling (1996). Developing Information Infrastructure: The Tension
Between Standardization and Flexibility. Science, Technology & Human Values 21(4): 407-426.
Harvey F. (2000). The social construction of geographic information systems- Editorial introduction.
International Journal of Geographical Information Science 14(8): 711-713.
Harvey F. (2001). Constructing GIS: Actor Networks of Collaboration. URISA Journal 13(1): 29-37.
Hatch M. and D. Yanow (2003). Organization Theory as an interpretive Science. The Oxford Handbook of Organizational Theory. C. Knudsen and H. Tsoukas. Oxford UK, Oxford University Press.
Homburg V. and Y. Georgiadou (2009). A Tale of Two Trajectories: How Spatial Data Infrastructures Travel in Time and Space. The Information Society 25: 303-314.
Koerten H. (2008). Assessing the Organisational Aspects of SDI: Metaphors Matter. A Multi-View
Framework to Assess SDIs. J. Crompvoets, A. Rajabifard, B. Van Loenen and T. Delgado. Melbourne: 235-254.
Koerten H. (Forthcoming). How utopian and myopian storyboards regulate the narrative anchor in geoinformation infrastructures: a narrative comparison of GBKN, NCGI and Geoportals. Delft, IOS Press.
Koerten H. and M. Veenswijk (2009). Building a NGII: Balancing Between Infrastructure and Innovation. GSDI 11. Rotterdam.
Lance K., Georgiadou Y. and A. Bregt (2009). Cross-agency coordination in the shadow of hierarchy: 'joining up' government geospatial information systems. International Journal of Geographical Information Science 23(2): 249-269.
Martin E. (2000). Actor-networks and implementation: examples from conservation GIS in Ecuador. International Journal of Geographical Information Science 14(8): 715-738.
Masser I. (2005). GIS Worlds: Creating Spatial Data Infrastructures. Redlands CA, ESRI Press.
Obermeyer N. and J. Pinto (2008). Managing Geographic Information Systems Second Edition. New
York NY, The Guilford Press.
Polkinghorne D. (1988). Narrative Knowing and the Human Sciences. New York, State University of
New York Press.
Puri S. K. (2006). Technological Frames of Stakeholders Shaping the Implementation: A case Study
from India. Information Technology for Development 12(4): 311-331.
Rajabifard A. and I. Williamson (2001). Spatial Data Infrastructures: Concept, SDI Hierarchy and Future Directions. Geomatics '80, Tehran, Iran.
Scott W. R. (1995). Institutions and organizations, Sage Publications.
Tesselaar S., Sabelis I. and B. Ligtvoet (2008). Digesting stories - about the use of storytelling in a
context of organizational change. 8th International Conference on Organizational Discourse 2008,
London UK.
Veenswijk M. (2006). Surviving the Innovation Paradox: the Case of Megaproject X. The Innovation
Journal: The Public sector Innovation Journal 11(2): article 6, pp. 1-14.
Venkatraman N. (1991). IT-Induced Business Reconfiguration. The Corporation of the 1990s: Information Technology and Organizational Transformation. M. Scott Morton. Oxford, Oxford University Press.
Verduijn K. (2007). Tales of Entrepreneurship, Contributions to understanding entrepreneurial life.
Amsterdam, PhD thesis Vrije Universiteit Amsterdam.
Warnest M., McDougall, Rajabifard A. and I. Williamson (2003). Local & state based collaboration:
the key to unlocking the potential of SDI. Spatial Sciences 2003, Canberra Australia.
Warnest M., Rajabifard A. and Williamson I. (2005). A Collaborative Approach to Building National
SDI in Federated State Systems: Case study of Australia. From Pharaohs to Geoinformatics, Cairo,
Egypt.
Weick K. E. (1995). Sensemaking in organizations. London, Sage Publications.
Welle Donker F., Loenen van L. and J. Zevenbergen (2010). „Geo Shared licences: a base for better
access to public sector geoinformation for value-added resellers in Europe.“ Environment and Planning B: Planning and Design 37: 326-343.
Yanow D. (2000). Conducting interpretive policy research. Thousand Oaks CA, Sage Publications
Inc.
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
GeoValue - Hamburg 2010
It’s not just a matter of numbers –
the value of spatial data reflected in the regulatory framework
Katleen Janssen, Interdisciplinary Centre for Law and ICT, K.U.Leuven - IBBT
Joep Crompvoets, Public Management Institute, K.U.Leuven
[email protected]
[email protected]
ABSTRACT
This paper argues that the value of spatial data not only depends on the individual user or the individual producer, but also on some other, more fundamental values of society. These values are generally translated into international and national legislation or policy for the availability of public sector
spatial data. They should also be taken into account when determining the policy for making public
sector data available and the charging principles under which this should happen. In discussing this
argument, the paper looks at the existing obligations of public bodies to make their data available for
democratic, economic and public task purposes, based on priorities laid down in the international and
European regulatory framework.
Keywords: democratic access, value of spatial data, re-use of data, legislation and policy
1. THE VALUE OF SPATIAL DATA
The value of spatial data is hard to deny. Spatial data underpin governments’ decision making and
public service delivery. They can also be an important factor in increasing public participation in this
decision making and in strengthening the governance and legitimacy of the public sector. They also
stimulate innovation and socio-economic growth and can bring along new business opportunities
(Genovese et al., 2010). Yet, the value of spatial data is a multi-layered concept and difficult to determine. It includes financial and monetary value, but also socio-economic, cultural, democratic and
political value (Longhorn and Blakemore, 2008; Mayo and Steinberg, 2007).
The value of spatial data also depends greatly on the individual. The same data can have a different
value for different people and for different applications (Longley et al., 2001). On each occasion, the data
may be put in a different context and thereby provide different information to the recipient, making
that recipient perceive the world as a different place than it was before (Arrow, 1979; Cahir, 2002;
Mock, 1999; Hugenholtz, 1989). In addition, the value also depends on the position of the user in the
value chain. For instance, the value of spatial data will be different for a private sector provider or user,
whose concern is mostly monetary, than for a civil society organization striving for public participation
and accountability of the government.
2. THE VALUE OF SPATIAL DATA PRODUCED BY THE PUBLIC SECTOR
While value is generally considered to be based on the willingness of the user to pay (“value is what
a damn fool will pay for it”) or on the cost of creating the particular product or service, for data produced by the public sector there are also other interests and considerations that need to be taken
into account. These include safeguarding the democratic rights of the public, providing services
to citizens, increasing the social cohesion of society, and creating opportunities for innovation and growth. Hence, the question of the value of public sector spatial data is set
in an underlying environment of tension between the financial value of the data and their potential
policy, socio-economic or democratic value. On the one hand, increasing attention is given to making
data available as widely as possible in order to encourage participation and accountability, while on
the other hand public bodies are confronted with budget cuts and demands from national treasuries
to create a profitable business model. This encourages them to convert the financial value of their
spatial data into real money and to ‘sell’ their data on the market. This conflict has traditionally been
translated into the debate on open access versus cost recovery policies for making public sector spatial data available (see e.g. Onsrud, 1992a; Onsrud, 1992b; Holland, 1997; Van Loenen, 2006; Kabel
and Van Eechoud, 1998; Litman, 1994; Weiss, 2004; Janssen, 2010).
However, these policies are not only influenced by the value that is attached to the data by the
suppliers, the value-adders or the end users, but also by other, more fundamental values of
society, which are translated into international and national legislation or policy for the availability of
public sector spatial data. While the particular recipients of public sector spatial data may not always
rate their creation or availability as very valuable, the legislator may still consider it important that
these data are produced and/or disseminated for reasons such as the ones mentioned above. In this
respect, public bodies are obliged under different regulations to make their spatial data available to
different parties in society, based on the value and importance that international and national
regulatory frameworks attach to the availability of such information (see Longhorn and Blakemore,
2008). These obligations – stemming from fundamental principles or rights laid down in treaties,
protocols, directives, laws or regulations – should also be taken into account when determining the
value of public sector spatial data, and the charges made for their availability. Hence, value is also
determined by law and policy. This entails that the value of public sector spatial data (and the related
charging policies) differs not just according to the data, but also depends on the underlying
principle that the legislator is trying to achieve by making these data available. In a democratic society where the legislator is chosen through an election process, the value that is attached to
the availability of spatial data will be based on the view of the elected leaders on the development of
the nation, the society and the economy (Wetenschappelijke Raad voor het Regeringsbeleid, 2000;
Janssen, 2010). In turn, this view will be based on the political views of these leaders, entailing
that their estimation of the value of spatial data on the macro-level may be completely contrary to the
value that is attached by some of the individual stakeholders. For instance, a political choice for an
extensive and strong public sector may deny the socio-economic value of spatial data for innovation in the private sector, even though private parties would benefit greatly from their availability.
Conversely, the view of the government that all spatial data held by the public
sector should be freely available can have a great impact on the financial value of the data for their
producers.
This paper attempts to demonstrate this argument with regard to the particular question of the value
of the availability of public sector spatial data implied in international and European law and policy,
and the consequences of this value for the charging for public sector spatial data.
3. THE VALUE OF PUBLIC SECTOR SPATIAL DATA IN THE REGULATORY FRAMEWORK
It is increasingly recognized by the legislator and the policy maker that public sector data, including
spatial data, are valuable to many different actors in society and that they should be made available
for any use (e.g. Visby Declaration, 2009; European Commission, 2010; Malmö Ministerial Declaration, 2009). Three general trends can be identified.
a. Access to public sector spatial data for democratic purposes
First, public bodies are confronted with a growing trend towards openness and transparency of
their activities, including the accessibility of their data to the public. While the first national access
legislation dates from 1766 (in Sweden), the development of the information society has in the last
twenty years given a boost to the development of freedom of information legislation and international
initiatives to make public sector data available. Their rationale is that citizens should be informed at
every stage of the decision-making process, so that they are able to make informed political choices
and can participate in the democratic process (Bovens, 2002; Birkinshaw, 2006; Janssen, 2010). In
addition, public access to public sector data ensures legitimacy and accountability, as the citizen can
check whether the public bodies stay within the limits of the law and of their remit (Schram, 2002;
Mason, 2000; Kranenborg and Voermans, 2005; Beers, 1996; Derclaye, 2008).
One of the main examples of this trend towards transparency and availability of public sector data is
the Aarhus Convention on Access to Information, Public Participation in Decision-making and Access
to Justice in Environmental Matters (United Nations Economic Commission for Europe, 1998). This
Convention had the ambition of being a universal instrument for the democratization of environmental
decision making (Prieur, 1999; Noteboom, 2003). It obliges public bodies to make their environmental
information available to citizens on request, and to disseminate environmental information on their
own initiative. Many other international legislative or policy documents also
emphasize the value of making public sector data available as a tool for increasing democracy, e.g.
Article 15 of the Treaty on the Functioning of the European Union; Directive 2003/4/EC on public access to environmental information (European Parliament and Council, 2003); Regulation 1049/2001
regarding public access to European Parliament, Council and Commission documents (European
Parliament and Council, 2001); and the OECD Recommendation on Environmental Information (Organisation for Economic Cooperation and Development, 1998). An important recent development
is the Council of Europe Convention on Access to Official Documents, adopted in November 2008.
This Convention is a binding instrument for all Members of the Council of Europe and regulates access to all types of data held by the public sector, either on request or by dissemination by the public
bodies.
All these legislative or policy documents apply fully or partially to spatial data created by the public
sector. Hence, it can be stated that the availability of such data to the public, for stimulating their participation in decision making and for increasing accountability and governance, is considered highly
valuable by the legislator. This can be translated into a call for low-cost or free availability of public
sector data for such democratic access by citizens (Van Eechoud, 2006; Love, 1995; Ravi Bedrijvenplatform, 1997).
b. Re-use of public sector spatial data for economic purposes
Next to the requirement of making data publicly available to increase public participation and government accountability, public bodies also face an increasing demand for re-use of their data. Spatial
data held by the public sector are an essential basis for many digital information products and are
important raw material for new services (European Commission, 2001; Economic Development and
Infrastructure Committee of the Parliament of Victoria, 2008). Re-use is important for the information
industry, but also for non-governmental organizations, communities and individual citizens, who play
a growing role in the creation of information services on the Internet by using spatial data in online
communities, blogs and websites, or by making so-called data mash-ups that mix and combine data
to generate valuable new forms of information and new services (Mayo and Steinberg, 2007). While
such use of public sector spatial data may also increase public participation and have a ‘democratic’
effect, the main benefits that are envisaged lie in economic and societal growth.
The importance of re-use has been recognized in international policy and legislative documents,
although to a lesser extent and with less binding force than access for democratic purposes. For
instance, the OECD Recommendation for Enhanced Access and More Effective Use of Public Sector
Information intends to provide a framework for the wider and more effective use of public sector
information, and the generation of new uses from it (OECD, 2008). Next, the European Union has been
paying attention to re-use for over 20 years, leading to the 2003 directive on re-use of public sector
information (European Parliament and Council, 2003), which obliges the Member States to adopt
a national legal framework for the re-use of documents held by public sector bodies. However, this
directive does not oblige the Member States or the public bodies to allow re-use of their documents.
This remains for the Member States to decide. In general, no binding legislation can be found at the
international or European level that imposes the availability of public sector data (including spatial
data) for re-use purposes, and until recently very few initiatives had been taken by governments
to actively stimulate the re-use of public sector data. In addition, many public bodies were reluctant
to make their data available to citizens and to the private sector for the creation of new products
and services. One might wonder if this is an indication that international and European policy
makers do not attach as much value to the re-use of public sector data for economic growth purposes
as to the access to data for democratic participation of the citizen (Janssen, 2010). While providing
access to public sector data has been a binding obligation for public bodies that has been recognised
in many constitutions and may even be considered part of the fundamental right of freedom
of information under Article 10 of the European Convention on Human Rights (Voorhoof, 2006), the
commitment of the public sector towards re-use has been much more recent and much less binding.
If the policymaker indeed attaches less importance to re-use of public sector data as a catalyst
for economic growth, this entails that the free availability of such data is rated less important for
re-use than for access purposes, and that charges should rather be based on economic efficiency
(Van Eechoud, 2006; Love, 1995).
However, a number of EU Member States are seeing the value of the re-use of public sector data
for economic growth and innovation, and for the development of communities in a cohesive society,
and are increasingly promoting free availability of these data. A prime example of this is the United
Kingdom’s data.gov.uk initiative, which was followed by the Ordnance Survey freely making available some of its spatial data sets for any type of use (Cabinet Office, 2009; Ordnance Survey, 2010).
Other Member States have also been making considerable progress in making their public
sector data available for re-use. For instance, the Dutch government recently launched rijksoverheid.nl, a
website where it will make its data available for re-use without restrictions (Rijksoverheid, 2010).
Next, the Swedish Parliament has also decided to make over 165,000 documents available for
re-use (EPSI platform, 2010). The value of public sector data for re-use is also understood outside
the EU. For instance, the United States government launched the data.gov website in 2009, making
available over 270,000 datasets, of which 10% are spatial data. The Australian, Canadian and New
Zealand governments are following suit (data.gov, 2009).
Yet, there are still many EU Member States where re-use is not adopted as broadly, but is still low on
the priority list. In addition, in the Member States where re-use without restrictions is encouraged, this
is mostly based on the democratic discourse that re-use increases accountability and governance,
and that the data belong to the citizens rather than to the public bodies (e.g. Cabinet Office, 2010;
Shadbolt, 2010; The Guardian, 2006; data.gov, 2009). Hence, the question can be asked whether the
free availability of public sector data as campaigned for by these initiatives should not be seen in
the light of democratic access rather than re-use.
c. Sharing public sector spatial data for the performance of public tasks
A third trend that public bodies are confronted with is the growing need for data sharing between
governments at all levels, for the improvement of their policy making and public service delivery.
In contrast to the availability of data for access and re-use, which has mostly received attention with
regard to environmental information or all types of public sector data, the need for data sharing between governments has been recognized particularly for spatial data. Sharing public sector spatial
data is seen as invaluable for better dealing with many social and environmental problems from the
local to the global scale, e.g. environmental policy, national defence, emergency services, transport
coordination, and so on (Onsrud, 1995; Frank, 2003).
The need for sharing spatial and environmental data on a cross-border or even global level has led
many international organisations to take initiatives to improve such data sharing. At the level of the
United Nations (UN), a Spatial Data Infrastructure (UNSDI) is being set up, which should support the
UN’s achievement of the Millennium Development Goals (United Nations General Assembly, 2000;
United Nations Geographic Information Working Group, 2007). Next, the Global Earth Observation
System of Systems (GEOSS) should meet the need for timely, quality, long-term global earth observation data as a basis for sound decision making, and will enhance the delivery of benefits to society
(Group on Earth Observations, 2005).
In the European Union, the 2007 INSPIRE directive (European Parliament and Council, 2007) obliges the Member States to adopt measures for the sharing of spatial data sets and services between
public authorities, enabling them to use the sets and services for the purpose of their public tasks
that may have an impact on the environment. Next, the Shared Environmental Information System
(SEIS) addresses the sharing of environmental data for the purpose of preparing and implementing
environmental policy (European Commission, 2008), and the Global Monitoring for Environment and
Security initiative (GMES) aims to better meet a structured demand for data from European, national, regional and local decision makers with regard to environmental and security issues (European
Commission, 2004).
In each of these initiatives, public sector spatial data can be seen as essential for decision making
with regard to sustainable development, environment, spatial planning, natural resource management, emergency planning and disaster relief, the battle against poverty, and so on (see also Williamson et al., 2007; National Research Council, 2004; Klosterman, 1995). Considering the importance
of these policies for the welfare and well-being of society, once again it could be deduced that such
spatial data should be openly and freely available. They help to ensure the fundamental human right
to life and the protection of family life, as guaranteed in the European Convention on Human Rights
(Council of Europe, 1950) and the International Covenant on Civil and Political Rights (United Nations, 1966). This free and open availability is, for instance, ensured in the main principles that underpin the GEOSS data policy (Group on Earth Observations, 2007, p. 25).
4. CONCLUSION: RE-THINKING CHARGING FOR PUBLIC SECTOR SPATIAL DATA BASED ON
THE VALUE IN LAW AND POLICY
In the previous paragraphs, it was argued that the value of public sector spatial data is not only
determined by economic principles such as the user’s willingness to pay or the cost of the product
or service that is offered, but that this value also depends on the importance the legislator or policy
maker attaches to the creation and availability of these data. As representatives of the electorate,
the legislator and policy maker develop a political opinion on the value of public sector spatial data for
the entire nation that will inherently require the balancing of many different interests in society and
that may therefore not always represent the value of particular data for particular users. The process
of creating laws and regulations necessarily entails that individual interests are sometimes denied to
protect the general interest as it is perceived by a democratically elected government.
On the European and international level, it is clear that access to public sector data, including spatial
data, is seen as fundamental to democracy and accountability of the government, from which one
could deduce that the data should be as openly and freely available as possible (Janssen, 2010).
With regard to re-use, the value of the availability of data is still considered to be high, but the call for
wide and free availability is less strong. In many cases, this call could even be considered to relate
more to the democratic need for data than to true re-use. In the case of the sharing of spatial data
between public bodies, the protection of the life and well-being of the global population comes into
play, and strong arguments for free availability can be found in the protection of human rights by
international treaties.
The priorities set by European and international consensus-based legislation and policy should be
taken into account in any discussion on the value of public sector spatial data, particularly with regard
to the charging for these data (Janssen, 2010). Any arguments that are used to
determine whether or not spatial data should be charged for (e.g. the taxpayer should not pay twice,
open access stimulates economic growth, charging is a better guarantee for good quality data, and
so on) should be held against the basic principles of the purpose for which the data are used: democratic access of the citizen, re-use for economic growth, or sharing for decision making and public
service delivery. This can help ensure that the interests of individual suppliers and users are always
put into the perspective of the fundamental principles that are commonly agreed on in international
and European legislation and policy.
REFERENCES
Arrow K. (1979). The Economics of Information, in Dertouzos, M. and Moses, J. (ed.), The Computer
Age: A Twenty Year Review, Cambridge: MIT Press, 306-317.
Beers A. (1996). Openbaarheid van overheidsinformatie, in Baten, I. and Van der Starre, G. (ed.),
Elektronische toegankelijkheid van overheidsinformatie, The Hague: Rathenau Instituut, 51-77.
Birkinshaw P. (2006). Freedom of Information and Openness: Fundamental Human Rights?, Administrative Law Review Vol. 58, 177-218.
Bovens M. (2002). Information Rights: Citizenship in the Information Society, The Journal of Political
Philosophy Vol. 10, no. 3, 317-341.
Cabinet Office (2009). Pioneer of the world wide web to advise the government on using data,
http://webarchive.nationalarchives.gov.uk/+/http://www.cabinetoffice.gov.uk/newsroom/news_
releases/2009/090610_web.aspx.
Cabinet Office (2010). Cabinet Office Minister opens up corridors of power, http://www.cabinetoffice.
gov.uk/newsroom/news_releases/2010/100531-open.aspx.
Cahir J. (2002). Understanding information laws: a sociological approach, Journal of Information,
Law and Technology 2002, no. 3, http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2002_3/cahir/.
Commission of the European Communities (2001). Communication to the Council, the European
Parliament, the Economic and Social Committee and the Committee of the Regions. eEurope 2002:
creating a EU framework for the exploitation of public sector information, COM (2001) 607 final, 23
October 2001.
Commission of the European Communities (2004). Communication to the European Parliament and
the Council. Global Monitoring for Environment and Security (GMES): Establishing a GMES capacity
by 2008 (Action Plan 2004-2008), COM (2004) 65 final, 3 February 2004.
Commission of the European Communities (2008). Communication to the Council, the European
Parliament, the European Economic and Social Committee and the Committee of the Regions. Towards a Shared Environmental Information System (SEIS), COM (2008) 46 final, 1 February 2008.
Commission of the European Communities (2010). A Digital Agenda for Europe. Communication to
the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, COM (2010) 245 final.
Council of Europe (1950). Convention for the Protection of Human Rights and Fundamental Freedoms, 4 November 1950, http://conventions.coe.int/Treaty/en/Treaties/Html/005.htm.
Council of Europe (2008). Convention on Access to Official Documents, 27 November 2008, https://
wcd.coe.int/ViewDoc.jsp?id=1377737&Site=CM.
Derclaye E. (2008). Does the Directive on the Re-use of Public Sector Information affect the State’s
database sui generis right?, in Gaster, J., Schweighofer, E. and Sint, P. (ed.), Knowledge rights –
Legal, societal and related technological aspects. Knowright Symposium, 18-19 September 2008,
Vienna: Austrian Computer Society, 137-169.
Economic Development and Infrastructure Committee of the Parliament of Victoria (2008). Inquiry
into improving access to Victorian public sector information and data, http://www.parliament.vic.gov.
au/edic/inquiries/access_to_PSI/EDIC_PSI_Discussion_Paper.pdf.
European Parliament and Council (2001). Regulation 1049/2001/EC of 30 May 2001 regarding public access to European Parliament, Council and Commission documents, OJ L 145, 31 May 2001, 43.
European Parliament and Council (2003). Directive 2003/4/EC of 28 January 2003 on public access
to environmental information and repealing Council Directive 90/313/EEC, OJ L 41, 14 February
2003, 26.
European Parliament and Council (2003). Directive 2003/98/EC of 17 November 2003 on the re-use
of public sector information, OJ L 245, 31 December 2003, 90.
European Parliament and Council (2007). Directive 2007/2/EC of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE), OJ L 108, 25 April 2007, 1.
European Public Sector Information (PSI) Platform (2010). Sweden’s Parliament goes OpenData!,
http://www.epsiplus.net/news/news/sweden_s_parliament_goes_opendata.
European Union (2009). Treaty on the Functioning of the European Union, OJ C 83, 30 March 2010.
Frank A. (2003). The surveying activities at the Austrian Federal Office for Metrology and Surveying:
an economic analysis, ftp://ftp.geoinfo.tuwien.ac.at/frank/reportbevfinalv31prn.pdf.
Genovese E., Roche S., Caron C. and R. Feick (2010). The EcoGeo Cookbook for the assessment
of Geographic Information value, International Journal of Spatial Data Infrastructure Research Vol.
5, 120-144.
Group on Earth Observations (2005). The Global Earth Observation System of Systems (GEOSS)
10-Year Implementation Plan, http://www.earthobservations.org/documents/10-Year%20Implementation%20Plan.pdf.
Group on Earth Observations (2007). GEO 2007-2009 Work Plan. Toward Convergence, http://www.
earthobservations.org/documents/wp0709_v6.pdf.
Holland W. (1997). Copyright, Licensing and Cost Recovery for Geographic and Land Information
System Data. A Legal, Economic and Policy Analysis, http://www.geoanalytics.com/library/publications/copyrightAndLicensing.pdf.
Hugenholtz P. B. (1989). Auteursrecht op informatie: auteursrechtelijke bescherming van feitelijke
gegevens en gegevensverzamelingen in Nederland, de Verenigde Staten en West-Duitsland : een
rechtsvergelijkend onderzoek, Deventer: Kluwer.
Janssen K. (2010). The availability of spatial and environmental data in the European Union. At the
crossroads of public and economic interests, Alphen a/d Rijn, Kluwer Law International.
Klosterman R. (1995). The appropriateness of geographic information systems for regional planning
in the developing world, Computers, Environment and Urban Systems Vol. 19, no. 1, 1-13
Kranenborg H. and W. Voermans (2005). A Comparative Analysis of EC and Member State Legislation, Groningen: Europa Law Publishing.
Litman J. (1994). Rights in government-generated data, Proceedings of the Conference on Law and
Information Policy for Spatial Databases, http://www.spatial.maine.edu/~onsrud/tempe/litman.html.
Longhorn R. and M. Blakemore (2008). Geographic Information. Value, Pricing, Production, and
Consumption, Boca Raton: CRC Press.
Longley P., Goodchild M., Maguire D. and D. Rhind (2001). Geographic Information Systems and
Science, Chicester: John Wiley and Sons Ltd.
Love J. (1995). Pricing Government Information, Journal of Government Information Vol. 22, no. 5,
363-387.
Mason A. (2000). The relationship between freedom of expression and freedom of information, in
Beatson, J. and Cripps, Y. (ed.), Freedom of expression and freedom of information. Essays in honour
of Sir David Williams, Oxford: Oxford University Press, 226-238.
Mayo E. and T. Steinberg (2007). The Power of Information: an independent review, http://www.cabinetoffice.gov.uk/media/cabinetoffice/strategy/assets/power_information.pdf.
Ministers of eGovernment of the European Union (2009). Malmö Ministerial Declaration on eGovernment, 18 November 2009, http://www.se2009.eu/polopoly_fs/1.24306!menu/standard/file/Ministerial%20Declaration%20on%20eGovernment.pdf.
Mock W. (1999). The centrality of information law: a rational choice discussion of information law and
transparency, John Marshall Journal of Computer and Information Law Vol. 17, 1069-1100.
National Research Council (2004). Licensing geographic data and services, Washington: National
Academies Press.
Noteboom C. (2003). Addressing the External Effects of Internal Environmental decisions: Public
Access to Environmental Information in the International Law Commission‘s Draft Articles on the Prevention of Transboundary Harm, New York University Environmental Law Journal Vol. 12, 245-285.
OECD (1998). Recommendation of the Council on Environmental Information, 3 April 1998, http://
webdomino1.oecd.org/horizontal/oecdacts.nsf/linkto/C(98)67.
OECD (2008). Recommendation C(2008)36 of the Council for Enhanced Access and More Effective
Use of Public Sector Information, 30 April 2008, http://www.oecd.org/dataoecd/0/27/40826024.pdf.
Onsrud H. (1992a). In Support of Open Access for Publicly Held Geographic Information, GIS Law
Vol. 1, no. 1, 3-6, http://www.spatial.maine.edu/~onsrud/pubs/In_Support_OA.htm.
Onsrud H. (1992b). In Support of Cost Recovery for Publicly Held Geographic Information, GIS Law
Vol. 1, No. 2, 1-7, http://www.spatial.maine.edu/%7eonsrud/pubs/Cost_Recovery_for_GIS.html.
Onsrud H. (1995). Role of Law in Impeding and Facilitating the Sharing of Geographic Information,
in Onsrud, H. and Rushton, G. (ed.), Sharing Geographic Information, New Brunswick: Rutgers
CUPR Press, 292-306.
Ordnance Survey (2010). Ordnance Survey launches OS OpenData in groundbreaking national initiative, http://www.ordnancesurvey.co.uk/oswebsite/media/news/2010/April/OpenData.html.
Prieur M. (1999). La Convention d‘Aarhus, Instrument Universel de la Démocratie Environnementale, Revue Juridique de l‘Environnement, n° spécial, 9-29.
Ravi B. (1997). Commercialisering van geo-informatie. Een drietal rapporten ter ondersteuning van
de Ravi-discussie 1997, Amersfoort: Ravi.
Rijksoverheid (2010). Rijksbrede website Rijksoverheid.nl live, http://www.rijksoverheid.nl/
nieuws/2010/03/31/rijksbrede-website-rijksoverheid-nl-live.html.
Schram F. (2002). Begrenzingen aan de openbaarheid van bestuur : een analyse en vergelijking van
de internrechtelijke openbaarheidsregelingen in het Belgische recht. Doctoral Thesis K..U.Leuven,
unpublished.
Shadbolt N. (2010). data.gov.uk - The Linked Data Revolution, in London School of Economics, Innovating Through Information Lecture Series London: London School of Economics, http://eprints.
ecs.soton.ac.uk/18787/.
Swedish Presidency of the European Union (2009). Creating impact for an eUnion 2015 – The
Visby Declaration, 10 November 2009, http://ec.europa.eu/information_society/eeurope/i2010/docs/
post_i2010/additional_contributions/conclusions_visby.pdf.
The Guardian (2006). Give us back our Crown Jewels, 9 March 2006.
United Nations (2000). United Nations Millennium Declaration, 18 September 2000, http://www.
un.org/millennium/declaration/ares552e.pdf.
United Nations Economic Commission for Europe (1998). Convention on Access to Information,
Public Participation in Decision-making and Access to Justice in Environmental Matters, adopted at
Aarhus, Denmark, on 25 June 1998, http://www.unece.org/env/pp/documents/cep43e.pdf.
United Nations Geographic Information Working Group (2007). UNSDI Compendium. A UNSDI Vision, Implementation Strategy and Reference Architecture, http://www.ungiwg.org/docs/unsdi/UNSDI_Compendium_13_02_2007.pdf.
60
It’s not just a matter of numbers – the value of spatial data relected in the regulatory framework
United States Government (2010). www.data.gov.
Van Eechoud M. and J. Kabel (1998). Prijsbepaling voor elektronische overheidsinformatie, Deventer: Kluwer.
Van Eechoud M. (2006). The Commercialization of Public Sector Information: Delineating the Issues,
in Guibault, L. and Hugenholtz, P. (ed.), The Future of the Public Domain. Identifying the Commons
in Information law, Alphen a/d Rijn: Kluwer Law International, 279-301.
Van Loenen B. (2006). Developing geographic information infrastructures. The role of information
policies, Delft: DUP Science.
Voorhoof D. (2006). Europees Mensenrechtenhof maakt (eindelijk) toepassing van artikel 10 EVRM
in het kader van openbaarheid van bestuur en toegang tot bestuursdocumenten, Mediaforum Vol.
18, no. 10, 290-294.
Weiss P. N. (2004). Borders in Cyberspace: Conlicting Public Sector Information Policies and their
Economic Impacts, in Aichholzer, G. and Burkert, H. (ed.), Public Sector Information in the Digital
Age, Cheltenham: Edward Elgar Publishing Limited, 137-159,
Wetenschappelijke Raad voor het Regeringsbeleid (2000). Het borgen van publiek belang, The
Hague: Sdu Uitgevers.
Williamson I., Rajabifard A. and M.E. Feeney (2003). Developing Spatial Data Infrastructures: From
Concept to Reality, London: Taylor & Francis.
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
61
GeoValue - Hamburg 2010
62
Experimenting GeoInformation Prices
Gianluca Miscione1, Walter DeVries1, Jaap Zevenbergen1, Bastiaan van Loenen2
1 University of Twente, Faculty of Geo-Information Science and Earth Observation
[email protected], [email protected], [email protected]
2 Delft University of Technology
[email protected]
ABSTRACT
This paper presents research in progress that aims to extend our understanding of Spatial Data Infrastructures (SDIs) by looking at prices along inter-organizational relations. Our empirical focus is within and beyond the Dutch geo-sector. Methodologically, we plan to use experiments to identify what mechanisms regulate organizations' roles in price setting. If possible, we will use 'natural experiments' (courses of action happening without our intervention); otherwise, we will identify relevant players in geo-information price setting and simulate different situations to identify key variables.
Keywords: geo-information value, geo-information price, behavioral economics, inter-organizational
relations
1. INTRODUCTION
This extended abstract is submitted as research in progress along the research line proposed in "Relationality in Geo-Information value - Price as product of socio-technical networks" (De Vries and Miscione, 2010). In the quarrel between micro (focused on individual products and services) and macro (looking at spatial data infrastructures as wholes) approaches to price setting, we hypothesize that inter-organizational relations (IORs), along which SDIs crystallize as such (De Vries and Miscione, 2010), provide the anchor prices for products and services expected to be used within and beyond the geoIT sector.
Therefore, we propose similar research questions to those of De Vries and Miscione (2010), which will be answered by relying on a different methodological approach:
1) Which inter-organizational conditions define or affect the process of geo-information price setting?
2) How are prices set and inscribed into/by the IORs?
In summary, the answer to the first question was that a determinant condition is that formal and informal inter-organizational agreements act as a regulatory framework within which prices are accepted. Indeed, the Dutch Cadastre uses fees to balance organizational budgets, and this cost-recovery condition cannot be labeled as market-oriented because there is no competition on those fees. From the perspective of municipalities, the considerable amount of activities necessary to meet the requirements of the Cadastre is also an indicator of prices as costs. Evidence that those prices depend on stable IORs comes from the recent shrinking of the Dutch real estate market. Following the late-2008 financial crisis, real estate trade decreased substantially, and with it the revenues for the cost recovery of the Cadastre, which therefore increased its data prices. As those prices are paid by the Cadastre's partner organizations, the increase cannot be explained in terms of market or bureaucratic relations.
De Vries and Miscione (2010) ended with two final questions that are partly tackled here:
- What is the most relevant empirical granularity (of IORs) at which to look at price setting?
- How do the examples from the Netherlands compare to examples in other developed countries, and do similar examples from developing countries resemble these findings?
This effort is based on two works: Ariely's (2009) experiments on relative thinking and anchoring prices, and Krek/Poplin's (2009) work on transaction costs.
As the geoIT sector is at an early stage of development, its value chains are not yet stable. As a result, prices need to be established and accepted through anchoring mechanisms rather than supply/demand mechanisms (Ariely, 2009). Anchoring mechanisms consist of linking the price of a new product or service to other products or services which have accepted prices, usually within relatively stable value chains. The resulting price determines to a large extent how buyers and sellers start and continue to value the product. Krek/Poplin (2009) state that the high transaction costs of geodata are what make the geoIT sector special. For this research-in-progress report, this is important because if transaction costs are high, existing IORs become more relevant, as it is expensive to move out of them. In other words, high transaction costs make IORs central in anchoring prices. Our hypothesis is that those anchoring mechanisms are explained by infrastructural relations. By that, we mean that the variety of elements constituting spatial data infrastructures (e.g. databases, standards, gateways, interfaces, formats, procedures, users' skills) on one side facilitates data sharing among the organizations using them, but at the same time constitutes a barrier to easily moving to different geodata suppliers and users. If IORs and SDIs mutually reinforce each other, we propose to conceptualize IOR costs as anchoring prices. In other words, we think that the normal process of price setting is:
1) An initial anchoring price is set with people and organizations with whom relations are already in place;
2) Existing data-sharing activities become routinized (which also implies a consolidation of an SDI);
3) Alternative possibilities (with the trade-offs they bring) disappear from decision-making processes;
4) Stable infrastructural relations, embedding value chains, become:
   a. normal, and therefore
   b. accepted and stable.
These stances contrast with what would derive both from free-market economic theory (free fluctuation of prices) and from centralized decisions on prices.
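A minimal toy model, assuming a simple tolerance rule invented for this sketch, illustrates how an anchored price can override outside alternatives:

```python
# Toy model of price anchoring along inter-organizational relations (IORs).
# The tolerance rule and all numbers are invented for this illustration.

def accept(offer, anchor, tolerance=0.15):
    """A partner accepts an offer if it stays within 15% of the anchored price."""
    return abs(offer - anchor) <= tolerance * anchor

anchor = 100.0  # step 1: price first agreed upon within an existing relation

# Steps 2-4: once data sharing is routinized, new offers are judged against
# the anchor rather than against outside alternatives.
print(accept(110, anchor))  # True: a small deviation from the anchor is accepted
print(accept(60, anchor))   # False: a much cheaper outside offer reads as 'abnormal'
```

A supply/demand market would of course prefer the cheaper offer; the point of the sketch is that anchored relations can make such alternatives invisible.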
2. RESEARCH METHODOLOGY
If price setting takes place along existing IORs, then a view borrowed from information infrastructure studies (e.g. Bowker and Star, 1999) can help in investigating how that process takes place. Three dimensions have been considered:
1) Accreditation: which actors can guarantee access, and what mechanisms can allow the use of information;
2) Interoperability/integration: establishing couplings between data and the related activities and organizations;
3) Standardization: data and organizational processes' compliance with common guidelines.
Our research design is organized according to two principles: anchoring prices and relative thinking. We will base it on an experiment modeled on the black-pearls study of anchoring prices versus the supply/demand approach (Ariely, 2009), and possibly on a second one modeled on the Economist-subscription study of relative thinking (Ariely, 2009).
Experiment 1
The experiment we propose is designed as follows. Road-network data and travel time are the varying factors in the value of the geodata (because the data include traffic lights, road quality, etc.). We expect low use from the health care sector because of the scarce or absent budget allocated to geodata. Because of the low budget (and the high utility, for example for organizing prevention campaigns and emergency transport), we expect demand to be highly elastic with respect to price (lowering prices generates quite a high rise in consumption). Conversely, with high and routinized use, as in the municipalities, we expect low demand elasticity (high formalized dependency).
Two experiments (one for the control group) will be organized, each with two subgroups (one from the municipality and one from the health sector), to test their willingness to pay different prices.
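The elasticity hypothesis above can be stated with the standard arc-elasticity formula; the quantities and prices below are invented for illustration, and the experiment itself will supply real observations:

```python
# Arc (midpoint) price elasticity of demand, used to state the hypothesis of
# Experiment 1. All quantities and prices are invented for illustration.

def arc_elasticity(q1, q2, p1, p2):
    """Percentage change in quantity divided by percentage change in price,
    both computed against midpoints."""
    dq = (q2 - q1) / ((q1 + q2) / 2)
    dp = (p2 - p1) / ((p1 + p2) / 2)
    return dq / dp

# Health sector: halving the price (100 -> 50) triples use (10 -> 30 datasets).
print(arc_elasticity(10, 30, 100, 50))    # -1.5 -> |e| > 1: elastic

# Municipality: the same price cut barely changes routinized use (100 -> 105).
print(arc_elasticity(100, 105, 100, 50))  # about -0.07 -> |e| < 1: inelastic
```

The hypothesis is thus that measured |e| exceeds 1 for the health sector and stays well below 1 for the municipalities.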
Experiment 2
Through Woningwizard, people can retrieve via SMS text messaging the purchase price of a privately owned house purchased after 1992. The data originate from the Kadaster. In addition, one receives a second SMS text message with the current estimated value of the house, based on a valuation-model calculation (detailed procedure in the annex). Steps of the experiment:
1) If respondents / test people could choose between SMS-based information via Woningwizard and internet-based information via enormo.nl, what would they choose (and why)?
2) If respondents / test people could choose between the specific information of Woningwizard and the extensive information of KaData, what would they choose (and why)?
3) The additional information of KaData includes maps and third-party information. If the same information were available through mobile phones (for example with a (Google?) map via MMS or SMS, or another mobile application), what would be a reasonable price for the SMS/MMS? How would that price compare to the KaData price?
3. EXPECTED OUTCOME AND RELEVANCE
From experiment 1 we expect to find a confirmation (or falsification) that the accepted price does not depend on use but on the actual budget allocated across IORs (quite low or null between the health sector and municipalities, in spite of utility).
From experiment 2 (more similar to the Economist example) we expect to learn whether individual users (rather than organizations) accept a price by thinking in relative rather than absolute terms.
A positive confirmation of our hypotheses would support a model which contrasts with both free-market models and top-down (bureaucratic) price setting. Stabilized infrastructural relations (like integrated datasets in the case of health and municipality), and wide accessibility of geodata in the case of mobile phone users, would give centrality to information infrastructures, specifically in terms of integration and accessibility.
ANNEX
(Procedure for experiment 2)

On the basis of postal code and house number:
Text: <postal code> <house number> to 2233 (example: 1234AB 12)
On the basis of street name, house number and residential area:
Text: <streetname> <house number> <residence> to 2233 (example: Dorpsstraat 12a Groningen)
One receives in total two SMS responses, which cost € 1,50 per received message.

There are two alternatives via the internet:
1) Check the same information via www.enormo.nl. The price is € 0, but one needs to take the cost of internet access into account, and/or the cost of using the internet through a mobile phone.
2) Download the same information, with lots of extra information, via KaData. Via KaData internet one can receive a full private-house report containing (see http://www.kadaster.nl/index_frames.html?inhoud=/zakelijk/index_zakelijk.html&navig=/zakelijk/nav_serverside.html%3Fscript%3D1):
- Cadastral information:
  - Situation map
  - Cadastral object information
  - Cadastral map
- Information from third parties (if available):
  - 4 pictures by Cyclomedia
  - Municipal information by Dataland
  - Energy label by SenterNovem
- Value indication based on cadastral information:
  - Price index of existing residential buildings
  - Purchasing price
  - Reference buildings

Price of such a ‘woningrapport’: € 23,50
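The out-of-pocket prices of the three channels described above can be compared directly (a sketch; internet-access costs, which the annex notes are non-zero, are ignored here):

```python
# Out-of-pocket cost per query for the three channels described in the annex.
# Internet-access costs are ignored (the annex notes they are non-zero).

CHANNELS = {
    "woningwizard_sms": 2 * 1.50,  # two received SMS messages at EUR 1.50 each
    "enormo_web": 0.00,            # free website lookup
    "kadata_report": 23.50,        # full private-house report ('woningrapport')
}

for name, eur in sorted(CHANNELS.items(), key=lambda item: item[1]):
    print(f"{name}: EUR {eur:.2f}")
```

In relative-thinking terms, the € 3,00 SMS query is about 13% of the € 23,50 report, which is exactly the kind of comparison experiment 2 asks respondents to make.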
REFERENCES
Ariely D. (2009). Predictably Irrational, Revised and Expanded Edition: The Hidden Forces That Shape Our Decisions. New York: HarperCollins.
Bowker G. C. and S. L. Star (1999). Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
De Vries W. T. and G. Miscione (2010). Relationality in Geo-Information value. Price as product of socio-technical networks. International Journal of Spatial Data Infrastructures Research 5: 77-95.
Krek/Poplin A. (2009). Methodology for Measuring the Demand Geoinformation Transaction Costs: Based on Experiments in Berlin, Vienna and Zurich. International Journal of Spatial Data Infrastructures Research, Vol. 5 (2010).
How Much does Free Geoinformation Cost?
Alenka Poplin1
HafenCity University Hamburg
[email protected]
ABSTRACT
Several national mapping agencies offer their geoinformation products and services for free. European national mapping agencies follow the example of the U.S. Geological Survey (USGS), which offers a variety of digital map products, such as the TIGER files, for free to all taxpayers. In 2003 Spain was one of the first European countries to offer its cadastral maps on the internet. Canada soon followed the Spanish example and, in 2009, the Ordnance Survey, the national mapping agency of the United Kingdom, announced that some of its products would be available for free. This development in Europe is interesting and opens up some new questions. We are interested in the main reasons for, and consequences of, offering geoinformation at low cost. We discuss the importance of geoinformation transaction costs in the case of free geoinformation and present our first experimental results. The article concludes with a discussion and our future work.
Keywords: free geoinformation, national mapping agencies, transaction costs
1. INTRODUCTION
What is the value of our time? How much do we value it? Often we are willing to invest a substantial amount of time in searching for the optimal solution, for the best and possibly cheapest product, but we might not be willing to pay with money. All these searching activities, related to the possible purchase of a product that is a subject of trade, cost time. Potential buyers have to invest some time to understand the offer, to educate themselves about the characteristics of the product, and to decide what to purchase and when. How complicated might it be to select a new mobile phone after two or three years of ignoring the market?
The cost associated with these activities is, in economic theory, called a transaction cost (Williamson, 1985; North, 1990; Williamson and Masten, 1995; Sholtz, 2001). Transaction costs are also incurred when searching for (geo)information products; it requires time to find the appropriate geoinformation provider, to find the dataset, and to test its fitness for use in the potential user's application.
In this research we concentrate on geoinformation offered for free and generated either by national mapping agencies or by citizens. In recent years societies have moved from 'data poor' to 'data rich' environments. Due to the advent of modern data-processing technology, it has become easy to derive new data from existing data, to copy it, and even to produce new data. Spatial data, and with it geoinformation, is available in abundance in many situations. Several organisations and individuals, including national mapping agencies, create spatial data and offer it for free. Individual volunteers contribute their knowledge of the environment and create maps that are available to everybody for free. At the beginning of the 1980s, Toffler (1980) coined the term "prosumer", predicting that the consumer would act as consumer and producer at the same time and that the differences between these two roles would begin to blur. It took almost 30 years for geoinformation science and practice to experience this predicted move. In 2007 Goodchild (Goodchild, 2007; Goodchild, 2007a) captured the volunteer contribution to geoinformation in the term "Volunteered Geographic Information" (VGI). Several other expressions, such as crowdsourcing, user-generated content, neogeography (O'Reilly, 2005), and citizen science (Goodchild, 2009), have been used in related contexts.
At about the same time, some European national mapping agencies (NMAs) also recognised possible advantages in offering their products for free or at the low cost of reproduction. This practice has been known for many years in the United States, where the U.S. Geological Survey (USGS) offers a variety of digital map products, such as the TIGER files, for free or for a minor reproduction cost to all taxpayers.
1 Alenka Poplin earlier published under her maiden name Alenka Krek
In our research we concentrate on the reasons for such strategies and focus on the transaction cost of free geoinformation, which is mostly neglected. In this study we use the expression geoinformation in a general sense, including spatial data and a variety of geoinformation products and/or services. The notion of "geoinformation for free" might be misleading: in spite of the availability of geoinformation at low cost, we observed relatively high costs of searching for geoinformation and testing its fitness for use in a particular application. In this article we give some examples of geoinformation offered by NMAs at the price of reproduction or completely for free. We are interested in the recent phenomenon of free geoinformation offered by governmental organisations such as ministries and national mapping agencies. How free is free data, really? In this contribution we review the main reasons for this strategy and show our first experimental results in measuring demand-side geoinformation transaction costs. We conclude the extended abstract with a discussion and our further work.
2. FREE GEOINFORMATION: VARIETY OF EXAMPLES
Examples of data offered for free come from national mapping agencies as well as from initiatives started by people who act as both users and producers (prosumers) of the data. We list some examples of national mapping agencies that currently offer their geoinformation for free or at a low reproduction cost, and provide some examples of volunteered geoinformation.
USA: The U.S. Geological Survey (USGS) offers a variety of its map products for free. One of the best-known examples is its TIGER (Topologically Integrated Geographic Encoding and Referencing) files. These digital files contain geographic features such as roads, railroads, and rivers, as well as legal and statistical geographic areas, for the entire United States, Puerto Rico, and the Island Areas. The data is available for download over the internet, for the cost of reproduction, in various formats.
Spain: In March 2003, the Spanish Surveying and Cadastre Authority made the country's cadastral maps available on the internet. In 2004 it launched the portal IDEE, available at www.idee.es. The main goal of the IDEE, the Spanish Spatial Data Infrastructure, is to integrate, through the internet, the data, metadata, services and geographic information produced in Spain at the state, regional and local government levels, according to their respective legal frameworks. The portal is based on Web Map Services (WMS) and enables free access to the most important geoinformation.
Canada: In 2007, Natural Resources Canada (NRCan) decided to release for free the digital datasets that underpinned its topographic paper map products. The impact was a significant increase in download volumes: from 100 thousand datasets in 2006/07 to 5.5 million in 2007/08 (OS, 2009). The Canadian federal government has since gone one step further by compiling a series of nationwide thematic map layers, similar to the U.S.A., and offering them to the public at no charge at www.geobase.ca.
United Kingdom: Since 2009, the Ordnance Survey, the UK national mapping agency, has offered free geoinformation to potential users. The UK Prime Minister announced proposals on 17 November 2009 to release a selection of Ordnance Survey products free of charge and without restrictions on use and re-use. These proposals were made in the context of the Making Public Data Public initiative.
3. REASONS FOR OFFERING “FREE GEOINFORMATION”
There is a variety of reasons why national mapping agencies offer geoinformation for free or at a low reproduction cost. This trend reflects a shift, over the last few years, in the understanding of the value of geoinformation, its cost and its consequences, and a shift towards higher availability and affordability of data in general. We summarise the main reasons in the categories described below, trying to understand the motivations behind the new strategies of the national mapping agencies.
3.1 Increased sales
One of the reasons is most probably an expected increase in sales of geoinformation and, with it, in revenue for the geoinformation provider. Some geoinformation providers have reported increased sales of their products as a result of offering geoinformation to potential users at no charge. Below are the examples of Canada and Austria.
Canada: The federal government of Canada reported increased sales. It provides a series of nationwide thematic map layers to the public at no charge on the website www.geobase.ca. In its first year of operation, the Canadian Geobase site recorded 10 times the number of free data downloads as it did when the data was sold (GITA, 2005).
Austria: In 2006, the Austrian mapping and surveying agency (BEV) changed its pricing model and reduced some prices by up to 97%, which led to a substantial increase in sales. The study executed by Micus (Micus, 2009) reports a substantial increase in demand for its products; for example, the demand for orthophotos increased by 7,000%. Even though BEV substantially reduced its prices, it managed to keep the total revenue from sales at the same level as before the reduction, which means that it sold geoinformation to a higher number of interested users at the lower price.
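The BEV figures can be made concrete with a back-of-the-envelope calculation; the unit price and baseline volume below are invented, and only the 97% price cut and the roughly constant revenue come from the Micus study:

```python
# Back-of-the-envelope check of the BEV case: how much must sales volume grow
# for revenue to stay constant after a 97% price cut?
# The unit price and baseline volume are invented; only the 97% cut and the
# roughly constant revenue are reported figures.

old_price = 100.0                    # hypothetical unit price before the 2006 reform
new_price = old_price * (1 - 0.97)   # price reduced by 97%

old_volume = 1_000                   # hypothetical units sold before the reform
old_revenue = old_price * old_volume

# Volume needed at the new price to keep revenue unchanged:
required_volume = old_revenue / new_price
growth_factor = required_volume / old_volume

print(round(growth_factor, 1))  # 33.3: volume must grow more than 33-fold
```

With reported demand for some products rising by 7,000% (a factor of 71), such growth is plausible, which is consistent with BEV keeping its total revenue at the pre-reform level.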
3.2 Growth of the geoinformation market
Offering free geoinformation opens new opportunities for the growth of the geoinformation market. Firstly, value-added companies along the value chain add value to the basic data provided by national mapping agencies and can create new ways of offering geoinformation. This becomes possible due to the relatively low prices of geoinformation offered at the beginning of the value chain, enabling value-added companies to buy the data as raw material at very low cost. This is described in the GITA report (GITA, 2005) as a very positive and important factor in Canada. Secondly, additional vertical, specialised markets, such as geomarketing and health care services, can use geoinformation offered at an acceptable price or even at no charge. In this way the data, as raw material, becomes accessible to new potential users. New business models can be developed on the basis of this new situation, and the geoinformation value chain can potentially develop into additional new markets. Thirdly, volunteered geoinformation activities and their products can be integrated with the basic data provided by national mapping agencies, leading to the eventual integration of a variety of geoinformation sources.
3.3 Development of standards
The creation of new standards as a result of data being available for free is a benefit observed especially in Canada. Users of the online Geobase felt that the process of making data freely available has forced government agencies and industry organizations to develop data standards, which ultimately fosters more widespread sharing and use of geospatial data (GITA, 2005). "The primary economic benefit, to Canadian citizens, of accessible government maps lies in the legislated standardization of content…not in the 'free' access to incongruent data sets" (GITA, 2005). This appears especially important for Canada, which currently has no standard format for land-base data (GITA, 2005).
3.4 Improved knowledge of the geoinformation market
The availability of geoinformation for free or at a low reproduction cost can potentially increase awareness of the products offered on the geoinformation market. An example of higher awareness and use was achieved by Google Maps and Google Earth: many users enjoy the simple user interface and an application available to everybody at no cost. In the case of national mapping agencies, the number of visitors to the Virtual Cadastral Office (VCO), launched in Spain in 2003, increased when it was offered for free. The research done by the Micus company (Micus, 2009) demonstrates a 50% increase in the number of VCO visitors since 2003, when the new strategy of offering the information at very low cost was implemented.
4. HOW FREE IS FREE GEOINFORMATION?
The search for free geoinformation and the testing of its quality require time and effort, and thus additional resources allocated to these activities. Our previous research on geoinformation transaction costs (Krek, 2003; Krek, 2003a; Krek, 2003b; Krek, 2004; Krek, 2009a; Krek, 2009b; Poplin, 2010) clearly demonstrates that there is a cost, the transaction cost, involved in geoinformation search and acquisition.
4.1 Geoinformation transaction cost
A transaction cost exists also for geoinformation offered for free. This is the cost associated with carrying out a transaction (Williamson, 1985; North, 1990; Williamson and Masten, 1995; Sholtz, 2001). Every exchange of goods is a transaction and entails costs that result from both parties attempting to determine the valued characteristics of the good (North, 1990). We are interested in the transaction cost of geoinformation offered for free: how much does the potential user have to invest, in time spent, in order to find, acquire and test the geoinformation offered for free?
[Figure 1: a diagram of the demand-side process. A user with a geoinformation need searches among providers (Provider 1, Provider 2, …, Provider n), inquires, acquires and tests the geoinformation; when the geoinformation does not satisfy the need, the search continues, and the trade concludes with negotiating the conditions of exchange. A dotted rectangle marks the measurement of geoinformation transaction costs.]

Figure 1. Demand geoinformation transaction cost (changed after Poplin (2010))
Transaction costs of free geoinformation appear in different phases of the search for and acquisition of geoinformation. It takes resources, including time, to find the appropriate geoinformation and geoinformation provider, to check the quality of the offered geoinformation, and to transform it into the needed format. This cost is, following North (1990), called measurement cost. Figure 1 presents the demand geoinformation transaction cost, which is the cost incurred on the potential user's side. The rectangle marked with the dotted line shows the measurement transaction cost of geoinformation offered at low cost.
4.2 Setting up an experiment
In our research we continued the experiment started in 2009 (Krek, 2009a; Krek, 2009b; Poplin, 2010), in which the potential user has to find the layouts of specific university buildings. The data had to be in a format that could be imported into the software packages AutoCAD or ArcGIS, which we used in our experiment.
In this experiment, an urban planning student who was not aware of the possibilities offered on the geoinformation market had to find the geoinformation offered for free. The simulated potential user was not a geoinformation expert, but rather a representative of a young generation able to comprehend the new possibilities of electronic markets rather quickly. He did not have any particular prior knowledge of the geoinformation market or of the geoinformation offered on the web for free.
4.3 First experimental results
The potential user rather quickly found the OpenStreetMap project, a collaborative volunteer project that aims at creating a free, editable map of the whole world. The initial map was started in 2004 and created by volunteers using handheld GPS receivers, digital cameras, notebooks or voice recorders; this data was then entered into the OpenStreetMap database. More recently, the availability of aerial photography and of other data from commercial and government sources has greatly increased the speed of this work and has allowed land-use data to be collected more accurately. The potential user recorded the time needed for his activities, as presented in Table 1.
Phases of the process | Measurement categories | Time in minutes
Activity 1: Searching for the geoinformation provider | Searching for the providing organisation; searching for the responsible contact person; inquiring via e-mail; inquiring via phone | 48
Activity 2: Inquiring about the general conditions of the exchange | Inquiring about the pricing policy; inquiring about the availability of the dataset | 10
Activity 3: Inquiring about the specific conditions of the exchange (phone or e-mail) | Defining the features of the dataset, understanding the offer and explaining the need | 3
Activity 4: Defining the exact characteristics of the geoinformation product | - | 20
Activity 5: Acquiring and testing the geoinformation product | Free sample data acquisition and storage; testing the "fitness for use" | 111
Activity 6: Reading the documentation about the trade conditions and pricing | Reading and understanding the conditions of use and pricing policies | 21
Total time | | 213 min.
Table 1. Demand geoinformation transaction cost of free geoinformation
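The total in Table 1 is simple arithmetic; as a quick cross-check, the per-activity minutes can be tallied in a short sketch (a minimal illustration; the minutes are taken from Table 1, the activity labels are paraphrased, and the attribution of 111 minutes to acquiring and testing follows the text):

```python
# Demand-side transaction times per activity in minutes, as reported in Table 1
activity_minutes = {
    "Activity 1: searching for the provider": 48,
    "Activity 2: general conditions of the exchange": 10,
    "Activity 3: specific conditions of the exchange": 3,
    "Activity 4: exact product characteristics": 20,
    "Activity 5: acquiring and testing the product": 111,
    "Activity 6: trade conditions and pricing documentation": 21,
}

total = sum(activity_minutes.values())
testing_share = activity_minutes["Activity 5: acquiring and testing the product"] / total

print(f"Total demand transaction time: {total} min")                 # 213 min
print(f"Share spent on acquiring and testing: {testing_share:.0%}")  # 52%
```

The tally confirms the reported 213-minute total and shows that acquiring and testing the "free" dataset alone accounts for roughly half of the demand-side transaction time.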
The potential user spent the majority of the time searching for the geoinformation offered for free. This time is relatively high due to the fact that the user had very limited pre-knowledge of the geoinformation market and GIS technologies. Testing the fitness for use of the dataset acquired from the OpenStreetMap project was a rather complex task for an inexperienced user and lasted 111 minutes.
At the moment of writing we have executed one experiment with one potential user. We assume that the results cannot be generalised in this very early phase of research; these numbers are the result of the first experiment and serve only as an orientation. We will repeat the experiments with several potential users in order to improve the reliability and accuracy of the reported times for each activity.
4.4 Cost of quality can be an issue
The cost of quality was not measured in this case, but it is a very important factor. Some providers deliver quality descriptions and parameters together with the geoinformation; in the case of volunteered geoinformation this is often not the case. Figure 2 shows one example: the layouts of the buildings taken from the OpenStreetMap dataset available for Hamburg. They are marked in red.
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
GeoValue - Hamburg 2010
Figure 2. Comparison of OpenStreetMap and DSGK
The comparison was, in this case, done with the digital city map (DSGK) produced by the Mapping and Surveying Authority of the City of Hamburg (LGV). The layouts coming from this dataset are marked in light green. In our comparison we assumed the correctness of the DSGK. Figure 2 shows the simplification of the forms of the buildings provided by OpenStreetMap. The layouts of the buildings are often generalised, forming inexact shapes. A more profound analysis is needed to better understand the differences between these two datasets.
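The generalisation visible in Figure 2 can also be quantified. As a minimal sketch (with hypothetical coordinates, not the actual OSM or DSGK geometries), the shoelace formula gives the area of each footprint, and the relative area difference measures how much a simplified outline overstates the reference:

```python
def shoelace_area(vertices):
    """Area of a simple polygon via the shoelace formula (vertices given in order)."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical reference footprint (DSGK-like, detailed L-shape), in metres
reference = [(0, 0), (10, 0), (10, 6), (6, 6), (6, 10), (0, 10)]
# Hypothetical generalised footprint (OSM-like simplification to a rectangle)
generalised = [(0, 0), (10, 0), (10, 10), (0, 10)]

a_ref = shoelace_area(reference)     # 84.0 m^2
a_gen = shoelace_area(generalised)   # 100.0 m^2
rel_error = (a_gen - a_ref) / a_ref  # ~0.19: the area is overstated by about 19%
print(a_ref, a_gen, round(rel_error, 2))
```

In practice such an analysis would also consider positional offsets and shape similarity, not only area.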
5. DISCUSSION AND CONCLUSIONS
The recent development of offering geoinformation for free or at the low cost of reproduction is especially interesting for Europe. In this article we gave some worldwide examples of national mapping agencies that have opted for this strategic step. We are especially interested in the reasons for such strategies, which are summarised in this article.
One of the main issues that needs more consideration is the demand-side geoinformation transaction cost. This cost also exists for geoinformation offered for free or at the low cost of reproduction, and essentially has to be paid by the potential user of geoinformation. In our experiments we observed substantial transaction costs related to searching for geoinformation and identifying what is being exchanged, and these are relevant for geoinformation offered for free. The main cost was associated with searching for an appropriate provider and testing fitness for use. The search for an appropriate provider was time-consuming due to the lack of knowledge about the geoinformation market and the geoinformation available for free. In our future work we will extend our experiments and simulate the real-world situation with several users in order to obtain more reliable quantitative measures.
Acknowledgements
Thank you to Stephen R. Poplin for the language improvements of this text and to my students at
HafenCity University Hamburg, especially to Jan Thomas, for his experimental work.
REFERENCES
Goodchild M. F. (2007). Citizens as sensors: the world of volunteered geography. GeoJournal, 69(4),
211-221.
Goodchild M.F. (2007a). Citizens as Voluntary Sensors: Spatial Data Infrastructure in the World of
Web 2.0, International Journal of Spatial Data Infrastructures Research, 2, 24-32.
Goodchild M.F. (2009). NeoGeography and the nature of geographic expertise. Journal of Location Based Services, 3(2), June 2009, 82-96.
Krek A. (2003). What are transaction costs and why do they matter? 6th AGILE Conference on Geographic Information Science, Lyon, France.
Krek A. (2003a). Key Reduction in Transaction Cost. In: Geo-Information Engineering, Changing
Technology in a Changing Society, by Lemmens M., GITC publications, The Netherlands.
Krek A. (2003b). Key Reduction in Transaction Costs. Invited reply, In: GIM International, The Worldwide Magazine for Geomatics, February 2003, Volume 17 (2), GITC, The Netherlands.
Krek A. (2004). Cost in GI Product Transaction. In: GIM International, The Worldwide Magazine for
Geomatics, January 2004, Volume 187, Number 1, GITC, The Netherlands.
Krek A. (2009a). Measuring Transaction Cost in Spatial Data Infrastructures: Example of Sweden
and Germany. The International Conference on Advanced Geographic Information Systems & Web
Services, Cancun, Mexico.
Krek A. (2009b). Quantifying Transaction Costs of Geoinformation: Experiments in National Information Structures in Sweden and Germany. In: Proceedings of the 27th Urban Data Management Symposium, Taylor & Francis, June 24-26, Ljubljana, Slovenia.
Mansfield E. (1993). Managerial Economics, W.W. Norton & Company, Inc.
Micus (2009). Assessment of the Re-use of Public Sector Information (PSI), Micus, Duesseldorf,
Germany.
North D. C. (1990). Institutions, Institutional Change and Economic Performance, Cambridge University Press.
O’Reilly T. (2005). What Is Web 2.0 - Design Patterns and Business Models for the Next Generation
of Software. At: http://oreilly.com/web2/archive/whatis-web-20.html
Poplin A. (2010). Methodology for Measuring the Demand Geoinformation Transaction Costs: Based
on Experiments in Berlin, Vienna and Zurich, The Journal of Spatial Data Infrastructures Research,
Vol. 5, 168-193.
Sholtz P. (2001). Transaction Costs and Social Costs of Online Privacy. First Monday 6(5).
Toffler A. (1980). The Third Wave. Bantam Books, USA.
Williamson O. E. (1985). The Economic Institutions of Capitalism, Free Press.
Williamson O. E. and S. E. Masten (1995). Transaction Cost Economics, Theory and Concepts,
Edward Elgar Publishing Limited.
Valuing volunteered geographic information (VGI): Opportunities and
challenges arising from a new mode of GI use and production
Rob Feick, University of Waterloo, Canada
Stéphane Roche, Laval University, Quebec, Canada
[email protected]
[email protected]
ABSTRACT
The collaborative and user-centred Web 2.0 model that has emerged in recent years has had transformative impacts on the processes by which spatial data are used, created and exchanged. While
spatial data were once the domain of a select group of experts, it is now common for individuals with
little to no training in geographic concepts to use online mapping services and GPS-enabled mobile
devices to create, edit and update spatial data that relate to features or events that are of personal
interest. This new mode of producing and using spatial data through mass participation and collaboration introduces several intriguing dimensions to the already complex problem of valuing spatial
information products. This presentation examines how the emergence of this new phenomenon of volunteered geographic information (VGI) is inducing, and may continue to induce, change in both the way we assign value to spatial data and the business models that support the production of spatial data. In particular, the presentation examines parallels to the business models that have developed to support the
production of free and open source software.
Keywords: volunteered geographic information, user generated content, spatial data value, Web 2.0
1. INTRODUCTION
The landscape of geographic information (GI) use and production, as well as the nature of what constitutes GI, has changed substantially over the past five-plus years. Once the purview of a relatively small and sophisticated set of experts, geographic information has become a more commonplace resource throughout broader society in most developed nations. In large part, this expanded presence of GI can be attributed to improvements in data standards, spatial data infrastructures and computing platforms, which facilitate spatial data exchange, enable consistent documentation through metadata, and improve access to spatial data authored by governments and private firms (Maguire and Longley, 2005).
However, evolving patterns of spatial data use and production are also being transformed from the bottom-up by what is referred to as user-generated spatial content (UGSC) or volunteered geographic information (VGI) (Goodchild, 2007). In contrast to the spatial data produced traditionally by professionals in government agencies and private firms, VGI is predominantly created and used by amateurs or "citizen scientists" to record locations of features or phenomena that are of interest to them (e.g. a GPS track of a canoe route, bird nesting locations, etc.) or to reference other media to particular places (e.g. geotagging of vacation photos on a commercial web map).
The literature relating to user-generated content in general, and VGI specifically, is somewhat embryonic, given that the technical means to produce these data have existed for a relatively short time. For the most part, this literature has examined key questions related to the social dimensions of VGI use and creation (e.g. democratisation of GI, motivations for contributions, etc.) and issues of data quality and credibility (Elwood, 2008; Flanagin and Metzger, 2008). Our contribution to the GeoValue workshop will build on this foundation to consider how VGI products and processes may
engender change in our understanding of how GI is valued in general and particularly the impacts
that this phenomenon may have for existing and emerging GI business models. The next section
provides an overview of VGI as an increasingly important mode of spatial data production. This is
followed by a section where we discuss how GI business models are evolving in response to the
growth of VGI.
2. VGI AND SPATIAL UGC - TOWARD MASS USE AND PRODUCTION OF GI
The challenges associated with establishing the value of spatial data and appropriate pricing and exchange mechanisms have bedevilled governments, private and non-governmental organisations for decades, and relate in part to the processes by which these data are produced. Within these contexts, spatial data have typically been created by domain experts, either as an outcome of satisfying their departmental mandates (e.g. government mapping of topography, collecting census data, managing national parks) or as a distinct marketable information product (e.g. NavTeq data sales). Early efforts to extend the user communities of these data led to some of the first questions concerning how the value of spatial data could be conceptualised in social and economic terms and what pricing and exchange mechanisms flow logically from such value assessments (Longhorn and Blakemore, 2008).
Even a cursory examination of government policies in most nations will reveal a patchwork of spatial
data pricing and exchange practices that vary across jurisdictions and often within them. For example, while investments in spatial data infrastructures (SDIs) and state and municipal geoportals have
increased the volume of base or framework data that are available at no charge in many countries, it is not uncommon for sub-national government bodies (e.g. municipalities) to use revenue from sales of GI to recover some data creation and maintenance costs. Pricing of GI produced by private firms through direct primary data collection or value-added enhancements of government products can be equally complex, given the ability of firms to segment markets based on capacity to pay and to finance more frequent updates to data through subscription fees (Longhorn and Blakemore, 2008).
The complexity of the social and economic context for GI use and production has evolved substantially over the past decade. Individuals increasingly expect to be able to access information in a rapid and highly selective manner and, enabled by self-reinforcing developments in
information and communications technology (ICT), to be able to participate in generating information. In contrast to established models of production where a good is created through an
ordered “chain” of inputs and processes (Porter, 1985), this new cadre of “produsers” (Bruns,
2007) collaboratively build, edit and refine shared data resources. This trend toward co-produced information and shared online "communities" is most apparent in sectors with relatively low barriers to entry (e.g. journalism, photography); however, its influence is becoming increasingly pervasive across many other sectors and application contexts (Singer and Ashman, 2009).
In the context of GI, this distributed and volunteer-based approach offers several key advantages. It
permits citizens to identify and, in some cases, correct errors in commercial or government data and
also to create new data that the market or public agencies cannot support (Goodchild, 2007; Feick and
Roche, 2010). The data that these communities of volunteers create are, with exceptions, largely local
scale in nature and reflect citizens' varying interests and perceptions of their surroundings (Coleman
et al, 2010; Elwood, 2008). Moreover, this collaborative mode of (re)producing GI from the efforts of
many can also foster the representation of multiple rich and socially-relevant geographies based on
differences in citizens’ local and experiential knowledge (Hall et al, 2010; Crutcher and Zook, 2009).
This bottom-up and distributed mode of data production is not without its challenges. GI produced
within government and private firms tends to be governed by known and documented standards of
completeness and quality control. While some mature VGI initiatives with large communities of active
contributors such as OpenStreetMap (OSM) and Wikimapia provide online documentation to guide
data collection and use wiki or crowdsourcing approaches to regulate data quality, a considerable volume of VGI appears to be created in the absence of such standards. Our capacity to assess the quality of VGI that is authored by many untrained, if well-intentioned, volunteers is therefore limited and
subject to a host of uncertainties relating to the patchiness of VGI contributions and individual contributors’ motivations, credibility and expertise (De Longueville et al, 2010; Flanagin and Metzger, 2008).
While these uncertainties limit our ability to quantify the economic and societal value of VGI and to predict precisely how increased volumes of VGI may affect the GI sector over the longer term, there are some early indications of how the business models that underlie public and private production of GI may evolve. The following section briefly explores how these business models of (V)GI production and use may evolve in the near future and, to a lesser extent, speculates on how the changing nature of VGI may affect its economic and social value.
3. IMPLICATIONS FOR THE "BUSINESS" OF GI PRODUCTION AND USE
Within the scope of this abstract, we explore three ways that this outsourcing of certain types of GI
production and maintenance has affected, or may yet affect, GI business models. The increasing reliance of individuals, firms and governments on a limited number of "free" base maps distributed through the Web by firms previously considered external to the GI sector (e.g. Microsoft, Google) has led to perhaps the most apparent changes to GI business models over the past five years. Their success in packaging seamless rendered maps of streets, topography and aerial photography from inputs of varying quality, currency and origin has demonstrated two realities to the GI sector. First, most end-users
value access to data that is fast, free and easy to use over alternatives that may be more limited in
geographic scope or have data quality that is documented and consistent from place to place. Second, users will capitalise on framework GI that is supported by advertising, mining of users' search activity, or transaction fees, and they will volunteer their time and local knowledge to derive GI that satisfies personal or comparatively specialised online community interests. Although it is difficult to
quantify, the recent rise of web mapping tools targeted at non-expert consumers may signal a basic bifurcation of the GI market, where revenue from the mass advertising-supported segment pays
for more frequent updating of base data (e.g. roads, orthophotography) and/or higher quality data for
analytical purposes.
In addition to the advertising-supported model, there are interesting parallels between recent developments in VGI and business models in the free and open source software (FOSS) sector. Three distinct business models for the production of FOSS, with many variations in between, appear to be particularly interesting here:
a) Software that is built and maintained solely by volunteers and is available to all for no charge (e.g. Drupal),
b) Software that is partially supported by corporate contributions of code or direct funding for core staff who set the strategic direction for the project and control release schedules. Communities of volunteers perform testing and debugging and contribute new code to enhance the project (e.g. Mozilla, MapGuide), and
c) Software that is available at no charge and is paid for by support subscriptions or through revenues gained from enhanced versions available for sale (e.g. Sun OpenOffice and StarOffice). The users' role is restricted to providing feedback on errors and to developing extensions.
The correspondence between these three models and the current VGI landscape is not perfect; however, there is a reasonable degree of alignment. For example, much of the VGI produced by individuals for personal use, by geotagging images or tagging locations relative to an online base map (e.g. Wikimapia) or a personal GPS unit, appears to fall within the first (volunteer-only) model listed above. The only proviso to this classification occurs when geotagging is done relative to a commercial (i.e. advertising-supported) base map.
The hybrid model that involves a range of corporate and volunteer support appears to be the most
prevalent means of supporting VGI production. OpenStreetMap’s stated mandate is to build an
“editable map of the whole world” that is free of charge and copyright restrictions through the efforts
of thousands of users, aided by corporate and government data donations (OpenStreetMap, 2010).
Users contribute to this wiki-like effort by uploading GPS tracks of streets, correcting errors, and locating points of interest (e.g. schools, hospitals, etc.). The social value of this massively collaborative
approach to GI production was illustrated well following the January 2010 earthquake in Haiti when
volunteers from around the world compiled a road network from imagery within two days to aid relief
efforts (OpenStreetMap, 2010; Ushahidi, 2010). In the same vein, Goodchild and Glennon (2010) show to what extent user-generated content and crowdsourcing models of GI can be relevant for disaster and crisis management (e.g. the 2007-2009 Santa Barbara wildfires).
Private firms and some government agencies have also begun to incorporate volunteers' contributions into their data production processes, particularly for data that require frequent updates such
as street networks. Typically, users’ reports of data errors or their contributions of new information
are subjected to review by in-house experts prior to being released to other users, as illustrated by NavTeq's Map Reporter and Google Maps' "report a problem" and "community edits" applications. However, initiatives like TomTom's MapShare allow users to choose whether to view only approved data or to augment them with other users' contributions concerning road closures and traffic updates (TomTom, 2010). This value-added form of co-production results in data that are more current and thereby more valuable in both market and end-use terms.
4. CONCLUDING REMARKS
These emerging business models point to several research questions that we are interested in examining in the context of this workshop including:
- How can the value of VGI be determined in the absence of purchase prices and with near-zero transaction and delivery costs?
- How can the economic value of users’ contributions to a commercial product be assessed?
- How does the emergence of VGI alter our understanding of what constitutes GI and, more specifically, to what extent will it be valued as data or as a form of media?
REFERENCES
Coleman D.J., Sabone B. and N. Nkhwanana (2010). Volunteering Geographic Information to Authoritative Databases: Linking Contributor Motivations to Program Characteristics. Geomatica 64(1):
27-39.
Crutcher M. and M. Zook. (2009). Placemarks and waterlines: Racialized cyberscapes in post-Katrina Google Earth. Geoforum 40(4): 523-534.
De Longueville B., Luraschi G., Smits P., Peedell S. and T. de Groeve (2010). Citizens as Sensors for Natural Hazards: A VGI Integration Workflow. Geomatica 64(1): 41-59.
Elwood S. (2008). Volunteered geographic information: future research directions motivated by critical, participatory, and feminist GIS. GeoJournal, 72 (3-4), 173-183.
Feick R. and S. Roche (eds) (2010), special issue on Volunteered Geographic Information, Geomatica Journal, 64(1).
Flanagin A.J. and M.J. Metzger (2008). The credibility of volunteered geographic information. GeoJournal, 72 (3-4), 137-148.
Goodchild M. (2007). Citizens as voluntary sensors: spatial data infrastructure in the world of Web
2.0. International Journal of Spatial Data Infrastructures Research, 2, 24-32.
Goodchild M. and J. A. Glennon (2010). Crowdsourcing geographic information for disaster response: a research frontier, International Journal of Digital Earth, 1-11, iFirst article.
Hall G.B., Chipeniuk R., Feick R., Leahy M. and V. Deparday (2010). Community-based production
of geographic information using open source software and Web 2.0. International Journal of Geographic Information Science 24(5): 761–781.
Longhorn R. and M. Blakemore (2008). Geographic information: Value, pricing, production and consumption. Boca Raton, FL, CRC Press.
Maguire D.J. and P.A. Longley (2005). The emergence of geoportals and their role in spatial data
infrastructures. Computers, Environment and Urban Systems, 29: 3-14.
OpenStreetMap (2010). http://www.openstreetmap.org/ Last accessed on July 23, 2010.
Porter M. (1985). Competitive Advantage: Creating and Sustaining Superior Performance. New York:
Free Press.
Singer J.B. and I. Ashman (2009). Comment Is Free, but Facts Are Sacred: User-generated Content
and Ethical Constructs at the Guardian. Journal of Mass Media Ethics 24:3–21.
Re-use of Public Sector Hydrographic Information:
Is the maritime sector a guiding beacon or a slow turning ship?
Frederika WelleDonker
Delft University of Technology
OTB Research Institute for the Built Environment
[email protected]
ABSTRACT
In the last decade, there has been ample research to determine the potential economic value of public sector information (PSI), both at the national and the European level. Most of this research has focused on the re-use of PSI in the land-based geo-sector, but little is known about re-using PSI in the maritime sector. Hydrographic information in particular is an important resource for re-use and value adding by the private sector. This paper provides the first findings of a case study, carried out as part of a PhD study, assessing the impact of the legal framework on value adding to public sector hydrographic information.
Keywords: value of public sector information, hydrographic information, electronic navigation charts
(ENC), case study
1. PROBLEM STATEMENT
It has been acknowledged that public sector information (PSI) forms a rich resource for the private
sector to create value added products and services (e.g. Pira International, 2000; MEPSIR, 2006;
MICUS, 2008). However, in spite of European initiatives to facilitate PSI re-use, the private sector still
faces a number of obstacles, such as over-restrictive licences or public sector bodies acting as value
added resellers in direct competition with the private sector. Earlier research into public sector spatial
data licences (e.g. WelleDonker et al., 2010) suggested that the marine sector might have overcome
some of the obstacles faced by the land-based geo-sector. In the marine sector, public sector marine
information is disseminated under harmonised licence conditions to the private sector for re-use and
value adding. However, has the marine sector managed to clear all obstacles for re-use and value adding, or is this a superficial impression? Can the marine sector be viewed as a best-practice case? Only by studying the marine sector in more detail can this question be answered.
2. METHODOLOGY
To provide an answer, a single case study into the re-use of hydrographic information was conducted
in the Netherlands. The case study first started in 2009 and was finalised between May and June
2010. The case study was limited to re-users of hydrographic information of the Dutch Hydrographic
Service and concerned interviewing both public sector and private sector organisations. For the public sector organisations, representatives on senior operational and policy advice level were interviewed. For the private sector, directors or managers of nautical products distributors were interviewed.
The distributors included both resellers and value added resellers. The interviews were held using
a questionnaire with open-ended questions. Although the interviews were limited to Dutch organisations, the outcomes will probably be representative of Western Europe, as the Dutch marine sector
operates internationally.
3. HYDROGRAPHIC INFORMATION
3.1 Nature of hydrographic information
Hydrographic information covers a broad spectrum and includes data related to depths, bathymetry,
coastlines, tidal data, and obstructions and wrecks. Hydrographic information is used to produce
nautical publications. Many maritime nations established national hydrographic offices1 (NHOs) to carry out hydrographic surveys for the production of nautical publications. Such surveying is time-consuming and expensive, as all depth details have to be included. Therefore, most maritime nations deem the collection of hydrographic data to be a public task. This hydrographic information is an important resource for re-use and value adding by the private sector. To illustrate the potential value of hydrographic information, the income from the supply of information of the United Kingdom Hydrographic Office (UKHO) – the leading supplier of hydrographic information in Europe – amounted to £116.6m in 2009/10 (UKHO, 2010, p.46).
3.2 Nautical Publications
Nautical publications include charts, books and periodicals related to e.g. shipping routes, port entries, regulations, lighthouses, buoys and tide tables. Today, more than 80% of international trade in the world is carried by sea (IHO, 2010, p.4). Thus, maritime traffic is an important part of a nation's economy. If shipping routes are poorly charted, voyages may be subject to increased costs. Even worse, safety at sea may be in jeopardy if shipwrecks or other underwater obstacles are not clearly indicated. Therefore, it is essential that nautical publications be continually updated. Apart from certified nautical publications, some NHOs also produce so-called half-fabricates, such as paper charts of coastal zones and rivers for non-international shipping, and software applications for pilots.
3.3 Nautical Charts
An important task of an NHO is to produce high-quality nautical charts. As seas and oceans do not stop at national borders, NHOs all over the world are authorised to produce the certified international nautical charts mandatory for all international vessels. These certified charts are produced as both paper charts and electronic navigation charts. Electronic navigation charts may be available as raster charts (RNCs, effectively digitised paper charts) or as vector charts, the so-called ENCs2. ENCs are databases that contain many features, such as coastal lines, depths, buoys and lights. Users can select specific layers, or zoom in and out. Although paper charts are still in use, electronic navigation charts are being phased in on all ships.
Following international rules, ENCs will be mandatory on all ships engaged on international voyages
by 2018. A schedule has been set for all ships to be fitted with an electronic chart display and information system (ECDIS). An ECDIS is a system that consists of hardware and software required to
display the position of the ship on the relevant charts from its database and to perform navigational
tasks. The driving factor behind the mandated use of ECDIS is the major improvement in safety and
the reduction of the risk of grounding. Many of the safety benefits of ECDIS are immediately obvious,
namely, improved situational awareness, faster and more accurate updating, and reduced workload
(Amels, 2010).
4. REGIONAL ENC COORDINATING CENTRES AND DISTRIBUTION
4.1 Regional ENC Coordinating Centres
Regional ENC Coordinating Centres (RENCs) were specifically established to produce certified ENCs. These RENCs work on the basis of bilateral agreements with NHOs to combine their hydrographic data into ENCs for their members. In 1999, the Norwegian Hydrographic Service (NHS) and the UKHO established PRIMAR as a RENC. In 2002, a number of European NHOs split off from PRIMAR and set up the International Centre for ENC (IC-ENC), coordinated by the UKHO. Today, the IC-ENC with
1 Different countries use different names for these types of organisations, such as "maritime administration". As the International Hydrographic Organization (IHO) uses the term "hydrographic offices" for its member organisations, this paper will follow the same usage.
2 As the legal framework related to certified ENCs refers to them as ENCs, the acronym "ENC" will refer to the certified ENCs only.
its 28 members3 has, by far, a larger coverage than PRIMAR with its 11 members4 and is the biggest
supplier of ENCs in Europe. The NHS is the coordinator of PRIMAR. Since there are only two RENCs
in Europe, the RENCs have an oligopoly with respect to ENCs.
4.2 Distribution of nautical publications
The RENCs use a network of distributors to market their products. IC-ENC has only seven so-called
VARs (value added resellers) to distribute their products. VARs are specialist distributors who are
allowed to develop their own services based around ENCs. Usually the value adding amounts to
adapting the displaying format or facilitating the ordering process. Some VARs are developing innovative web services or mobile phone applications. VARs resell their value added products and ENCs
to agents and to end-users. Apart from being IC-ENC’s coordinator, the UKHO also operates as one
of the seven VARs as Admiralty Charts and Publications, a full subsidiary of UKHO. PRIMAR has
a network of over 50 distributors. As PRIMAR was not included in this case study, it is not clear if
all distributors are allowed to add value. All of IC-ENC’s VARs are also part of PRIMAR’s distributor
network and as such, add value to PRIMAR’s products.
NHOs distribute their products both directly (usually to other public sector bodies) and through agents
(mostly for the private sector). The Dutch Hydrographic Service (DHS) further divides its 18 agents
into so-called A-agents and B-agents. A-agents distribute all DHS‘ nautical publications to the international shipping industry, including ENCs. B-agents distribute all DHS’ publications except ENCs,
RNCs and certified paper charts to the inland shipping industry and recreational shipping industry.
In practice, this amounts to distributing the half-fabricate products of the DHS. Almost all A-agents
order paper publications from the DHS and ENCs from IC-ENC or one of its VARs. As it is a core task
of the DHS to ensure its publications are up to date, the DHS sends regular updates to the paper
publications to the A-agents. A-agents must use these updates to revise their paper charts manually,
which amounts to a large part of their daily activities. B-agents also receive updates, but less frequently
than A-agents. B-agents are not allowed to make corrections to charts. Therefore, paper charts sold
by B-agents may be out-of-date. Some B-agents also use raw data or half-fabricates of the DHS to
produce value added products, such as inland water atlases or coastal RNCs.
[Figure 1 depicts the value chain: NHOs (among them the DHS and the UKHO) supply data to the RENCs IC-ENC and PRIMAR, which distribute nautical publications through VARs (including the UKHO subsidiary Admiralty Charts and Publications) and agents to the international, inland and recreational shipping sectors; other public sector bodies receive publications directly. Data flows and publication flows are shown separately.]
Figure 1. Distribution of nautical publications
3 IC-ENC’s members are: Argentina, Australia, Bahrain, Belgium, Brazil, Chile, Colombia, Cuba, Ecuador, Germany, Greece,
Iceland, India, Indonesia, Mexico, Mozambique, Netherlands, New Zealand, Pakistan, Peru, Philippines, Portugal, Russia,
South Africa, Spain, Turkey, United Kingdom and Venezuela.
4 PRIMAR’s members are: Croatia, Denmark, Estonia, Finland, France, Greece, Latvia, Norway, Poland, Russia and Sweden.
GeoValue 2010 Proceedings, Sep. 30 - Oct. 2, Hamburg, www.geo-value.net, Editors: Poplin A., Craglia M., Roche S.
GeoValue - Hamburg 2010
5. FRAMEWORK OF THE HYDROGRAPHIC SECTOR
Mandating organisations
There are two international organisations related to institutional aspects of hydrographic information.
The International Maritime Organisation (IMO)
The IMO is a United Nations organisation specialised in setting standards and regulations for the maritime sector. As safety is one of the principal responsibilities of the IMO, a set of mandatory security
measures for international shipping entered into force on 1 July 2004, amending the 1974 Safety of
Life at Sea Convention (SOLAS)1. SOLAS deals, inter alia, with Safety of Navigation, including the
mandatory use of ECDIS and ENCs.
The International Hydrographic Organisation (IHO)
The IHO was established to support safety of navigation and the protection of the marine environment. Two of the objectives of the IHO are to coordinate the activities of NHOs and to achieve uniformity in nautical publications, including ENCs. To realise the latter objective, the IHO has developed
technical standards related to the digital data format, specifications for ECDIS content and display,
and data protection. One of those standards is the S-63 standard for encrypting ENCs. S-63 allows
for mass distribution of ENCs on CDs, without losing protection of the data and allows access to only
those cells that a user has been licensed for.
Hydrographic information licences
There are different licence regimes for paper charts and for ENCs. Often, licence conditions prohibit
any alterations but in the case of paper charts, A-agents must carry out corrections according to
updates sent by the DHS. VARs are not allowed to make any changes to the integrity of the ENCs
but are allowed to produce value added services around the ENCs. However, there are limitations to
how far they may go. For instance, a Dutch VAR had developed a value added service to overcome
the problem with pre-selecting ENCs using a tracking system to activate the appropriate ENC permit
only after entering a certain sector. The service had to be severely limited for legal reasons (Amels,
2010).
Selective access to individual ENC cells is supported by providing users with a licensed set of permits containing the encrypted cell (IHO, 2008, p.3). Once an ENC is loaded into ECDIS, the ENC will
remain in ECDIS, even after the licence has expired. Every time a ship sails into another sector, an
ENC for that sector is required. Therefore, in the voyage pre-planning phase, all ENCs that may be
necessary must be preloaded. However, there are still problems, such as ENC coverage overlaps. It
may be hard to determine in advance which ENCs are required without having access to the actual
ENC cells (Amels, 2010). A warning will appear on the display only after the ENC cell is activated.
In addition, no updates can be uploaded to an ENC with an expired permit. A permit may be taken
out for a 3-month, 6-month, 9-month or 12-month period. NHOs receive royalties for every copy of a
nautical chart sold to an end-user. If the paper chart covers an area of another NHO, it will receive
part of the royalties. Contrary to the UKHO, the Dutch HO does not have to return a profit on sales.
During interviews with VARs, it emerged that S-63 is viewed as a major obstacle to creating value
added services. Neither the interviewed agents nor the VARs were aware of an EU legal framework
dealing with public sector information. They were fully aware of the IMO and IHO regulations but had
never heard of the PSI Directive or of INSPIRE. The VARs were interested in the possibilities offered
by services to be developed under INSPIRE. The free-viewing and catalogue services were of particular interest, as these may allow them to find and view data from alternative sources. The
agents were more interested in IMO regulations than EC directives, as IMO regulations compel international shipping companies to purchase nautical publications and logbooks more frequently.
1 SOLAS is a treaty covering aspects related to safety such as life saving applications and warning lights. SOLAS has been
accepted by more than 156 countries (http://www.imo.org, accessed 20-06-2010).
6. CASE STUDY OBSERVATIONS
From the case study, a few observations can be made. Firstly, the international shipping sector overall could be considered a sector led by traditions. Business models have often been in place
for centuries. Most of the interviewed A-agents indicated that, even though nautical publications are
shifting from analogue to digital, the old business models are essentially surviving. Most of the
A-agents are satisfied with the current system of distributing hydrographic information because they
are assured of constant quality. As they resell without adding value, they are less hampered by
licence restrictions. The dominant position of the UKHO was seen to be an advantage rather than a
disadvantage as the UKHO maintains a high standard of products and fast distribution services.
Secondly, the international shipping sector has to adapt to using mandated electronic navigation
aids, such as ECDIS and ENCs. Although the international shipping sector has to invest heavily to
implement ECDIS, in general they consider the advantages to outweigh the disadvantages.
The VARs view ENCs as a prime opportunity to develop new services and to facilitate the maritime
sector. However, the VARs feel that they are severely limited by the current licence conditions and
digital rights management of the NHOs and RENCs. The VARs see parallels with the music industry: organisations can only hold on to protecting information for so long but eventually new ways of
offering information will gain acceptance with the producers. The VARs see the dominant position of
the UKHO – especially with its multiple roles of NHO, IC-ENC Coordinator and VAR – as one of the
major obstacles.
Thirdly, the inland shipping sector is probably led by traditions to a lesser extent than the international
shipping sector. This may be because most inland shipping vessels are owner-operated, whereas international vessels are owned by shipping companies. The international shipping vessels represent
a large investment compared to inland shipping vessels. The mandated ECDIS equipment is expensive, thus shipping companies are reluctant to invest in additional state-of-the-art equipment that
may not even be allowed to be used in some international waters. From interviews, it appeared that
inland skippers are less reluctant to spend money on new navigation systems and services. However, with the current economic downturn, the inland shipping market is stagnating rapidly.
Fourthly, the recreational market is a fast growing market. Although most of the traditional yacht owners still prefer paper charts, the younger generation prefers electronic navigation systems. Again,
the VARs and B-agents see opportunities to develop value added products for this market. However,
the obstacles faced in this market relate more to organisational and quality problems than to
restrictive licence conditions. In the Netherlands, inland waters are the responsibility of
many local and some national authorities, all with varying standards, data quality and access regimes. Contrary to international and coastal waters, there is no overarching authority setting standards
for all inland waters.
7. SOME PROVISIONAL CONCLUSIONS
If we compare the hydrographic sector with the land-based geo-sector, the private sector faces
some of the same obstacles for value adding, such as restrictive licences and public sector bodies
acting as VARs. However, there are also some dissimilarities. Unlike land-based geo-data, there are
uniform standards in place for hydrographic information. Prices are less of an issue, as the
private sector can pass these prices on to the end-user. Due to the IMO’s regulatory framework, the
international shipping sector can be considered a captured market. The main obstacle to developing
value added services by the private sector is the hydrographic data monopoly position held by the
NHOs and the RENCs. The monopolistic position finds its roots in the restrictive licence conditions
and digital rights management standards implemented by the NHOs and RENCs. At this stage, there
are no alternative data suppliers for hydrographic information for the international shipping market
due to the IMO framework. For the inland shipping and recreational markets, there are alternative
data suppliers. However, the quality and interoperability of the data are wanting.
The triple role (NHO, RENC coordinator and VAR) of the UKHO may amount to market distortion. As
an NHO, because of its reputation and (national) mandates, the UKHO can secure data from third
parties to produce better quality charts (OFT, 2006). As IC-ENC coordinator, the UKHO sets policy
guidelines, determines licence agreements and pricing regimes, and selects VARs for distribution. As
Admiralty Charts and Publications, it is also one of only seven VARs. Thus, the UKHO is in effect the
RENC as well as one of its distributors. However, the other VARs are reluctant to complain openly,
as they do not want to bite the hand that feeds them. VARs are dependent on the UKHO as a major
supplier of nautical publications as the coverage of the UKHO’s publications exceeds by far that of
PRIMAR. In addition, the network is relatively small: there are only a limited number of VARs, thus
the VARs have to protect their position.
The strict framework related to certification and distribution of public sector hydrographic information
provides clear guidelines. This framework works well for paper charts. In that respect, the framework
is a beacon. However, as the maritime sector becomes more dynamic due to technological advances, the same framework also limits the opportunities these advances offer.
In that respect, the framework hampers ships from manoeuvring more quickly.
REFERENCES
Amels W. (2010). Electronic Navigation Charts. Outside-the-box Approach to Improve Licensing.
Hydro International, 14(2) (March/April 2010), at http://www.hydro-international.com/issues/articles/
id1170-Electronic_Navigation_Charts.html. [accessed 24-06-2010].
IHO International Hydrographic Office (2010). National Maritime Policies and Hydrographic Services. Monaco: International Hydrographic Office, at http://www.iho-ohi.net/iho_pubs/misc/M2ENG_
amended_18-03-10.pdf.
IHO International Hydrographic Office (2008). IHO Protection Scheme, edition 1.1 – March 2008,
special publication nr. 63. Monaco: International Hydrographic Bureau at http://www.iho-ohi.net/iho_
pubs/standard/S-63/S-63_e1.1_EN_2008.pdf.
MEPSIR (2006). Measuring European Public Sector Information Resources. Final report of study on
exploitation of public sector information - benchmarking of EU framework conditions. HELM Group
of Companies of Moira, Northern Ireland & ZENC, the Netherlands, at http://ec.europa.eu/information_society/policy/psi/docs/pdfs/mepsir/final_report.pdf.
MICUS Management Consulting GmbH (2008). Assessment of the re-use of Public Sector Information (PSI) in the Geographic Information, Meteorological Information and Legal Information sectors Final Report. Study commissioned by EC in 2007. Dusseldorf, at http://ec.europa.eu/information_society/policy/psi/docs/pdfs/micus_report_december2008.pdf.
Office of Fair Trading (OFT) (2006). The commercial use of public information (CUPI), at http://www.
oft.gov.uk/shared_oft/reports/consumer_protection/oft861.pdf.
Pira International Ltd, University of East Anglia and KnowledgeView Ltd (2000). Commercial exploitation of Europe's public sector information - Final report. Pira International Ltd, European Commission Directorate General for the Information Society, at ftp://ftp.cordis.lu/pub/econtent/docs/commercial_final_report.pdf.
UK Hydrographic Office (UKHO) (2010). Annual Report and Accounts for 2009/2010. London: The
Stationery Office, at http://www.ukho.gov.uk/AboutUs/Documents/UKHO_Annual%20Report_0910.
pdf.
WelleDonker F., Loenen van B. and J. Zevenbergen (2010). Geo Shared licences: a base for better
access to public sector geoinformation for value-added resellers in Europe. Environment and Planning B: Planning and Design 37(2): 326-343.
Statistical analysis of routing processes using OpenStreetMap road data of
Hamburg with different completeness of information about one-way streets
Matthias Fessele, Alenka Poplin
HafenCity University Hamburg
[email protected]
[email protected]
ABSTRACT
In this article we present a statistical analysis of routing processes using OpenStreetMap road
data of the inner city of Hamburg. Our main focus is the impact of the completeness
of information on the quality of decisions. As an example we use a road network with varying
completeness of information about one-way streets. We examine the relation between the
completeness of one-way information and the driving time the agent needs to navigate between two
points. A further interest is the learning of one-way streets during navigation. Both questions are
discussed based on static maps as well as on a converging one-way edge detection process.
Keywords: completeness of geoinformation, navigation in a city, quality of decisions
1. INTRODUCTION
We are interested in how the quality of the map information affects the quality of the decision the user
makes with the help of this information. Therefore we analyzed the behavior of a data user (agent)
who wants to navigate between a start point A and a destination point B, both located in the city of
Hamburg. The navigation in this case represents an example of a decision-making process in which
information is needed. In our previous research (Krek, 2002) we focused on the interrelation of data
completeness of one-way streets and the quality of the user decisions while navigating in the city.
An agent has different information sources he can use for making his decisions. Frank (2003) focused on measuring the information in route descriptions. His data user accesses three different
sources of information:
• The knowledge from the world: the information the agent gains from the environment while solving his navigation task;
• The information from the navigation message, which is derived from the map, or a navigation device containing a map, the agent uses for the routing;
• The previous knowledge: the knowledge the agent already has about the real world situation when he starts routing.
In the scenario presented in this article, the agent navigates in the city and uses road data with
varying completeness of information about one-way streets in his decision-making
process. The navigation messages are derived from a map. A map used by the agent for navigation
can provide between 0% and 100% of the one-way streets.
The routing algorithm used in our scenario is based on the shortest-path algorithm of Dijkstra
(1959) as described in Bondy and Murty (1982). When the agent starts at point A, he computes a
shortest path from A to B with the help of the Dijkstra algorithm. He follows this route as long
as it is consistent with the real road situation. There are situations where he cannot follow the route
any more. This happens when the agent is not allowed to pass the next edge in the given direction
because it is a one-way street which is not included in the data he used for computing the route. In these
situations he has to compute a new shortest path from the current position to his destination
point B. While navigating from A to B, the agent often has to re-compute his route several times when
many one-way streets are missing.
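This re-planning behaviour can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the dictionary graph structure, the function names, and the small test network are our own assumptions.

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest-path search over a dict graph[u] = {v: travel_time}.
    Returns the node sequence from start to goal, or None if unreachable."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:                       # reconstruct path from predecessors
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, t in graph[u].items():
            if d + t < dist.get(v, float("inf")):
                dist[v], prev[v] = d + t, u
                heapq.heappush(heap, (d + t, v))
    return None

def navigate(agent_map, real_map, start, goal):
    """Drive from start to goal using the (possibly incomplete) agent_map,
    re-planning whenever the real road situation forbids the next edge.
    Assumes the goal stays reachable in agent_map."""
    pos, driven, replans = start, 0.0, 0
    while pos != goal:
        route = dijkstra(agent_map, pos, goal)
        for u, v in zip(route, route[1:]):
            if v not in real_map[u]:        # hidden one-way edge: cannot pass
                del agent_map[u][v]         # learn it for the rest of this trip
                replans += 1
                break
            driven += real_map[u][v]        # edge passing time as cost
            pos = v
    return driven, replans
```

On a toy network in which the agent believes A→B is drivable but it is in fact one-way in the opposite direction, the agent plans A→B, detects the wrong edge, re-plans via C, and arrives with one re-computation.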
In contrast to the implementation of this algorithm in Krek (2002), a change was made: the Dijkstra algorithm now uses the edge passing time as the cost function, i.e. the time t(e) = l(e) / v(e) needed to pass an edge e of length l(e) at its maximum allowed speed v(e).
Learning about one-way streets means that the one-way properties of the map the agent uses for
the navigation change with time, as the agent learns about edges it cannot pass while solving the
navigation tasks. Thus our map becomes spatio-temporal. Spatio-temporal maps are discussed in
George and Shekhar (2006) and George et al. (2007). In these sources, spatio-temporal maps are
defined to be graphs whose nodes and edges have time-dependent properties. This means that the
values of the properties can be modelled as a time series. In a graph representing the road network,
information about one-way streets can be provided as an edge property. In our scenario, we focus
on the information about the allowed driving direction of the edges. An edge can be passable in both
directions, or only in the direction of its definition (from the start point of the edge to the end point).
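As an illustration of such a time-dependent edge property, the allowed driving direction can be stored as a time series of observations. This is a minimal sketch under our own assumptions, not code from the cited works; the class and attribute names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Edge:
    """Edge of a spatio-temporal road graph: the driving-direction property
    is kept as a time series of (time_step, both_directions_allowed) pairs."""
    length_m: float
    history: list = field(default_factory=lambda: [(0, True)])

    def record(self, t, both_directions):
        # append a new observation of the one-way property at time step t
        self.history.append((t, both_directions))

    def passable_both_ways(self, t):
        # the latest observation at or before time step t determines the value
        value = True
        for step, both in self.history:
            if step <= t:
                value = both
        return value
```

An edge observed to be one-way at step 5 is still treated as two-way when the map is queried for an earlier time step, which is exactly the time-series semantics described above.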
Furthermore, a learning functionality was implemented similar to the implementation in Krek (2002).
Learning means that the agent learns about the property values of the edges or nodes while he
navigates. In our case, he learns about the allowed driving directions of the edges. The learning
functionality is realized in a way that the agent continuously compares his map with the real road
situation and detects the edges that cannot be driven. The agent then updates the properties of
these detected edges in its navigation map. The map the agent uses for navigation thus converges,
through a learning process, to a complete map. At the end of this process the complete map will
show 100% of the one-way streets.
The novelty in our approach is that the agent now archives the information about one-way streets it
learns during the navigation process. The collected information remains available not only until the
end of the current navigation process, as in Krek (2002), but also for future
navigation tasks. In this case the map becomes dynamic.
The article is organized as follows. The navigation data and an example routing are presented in section 2. An approach with static maps is discussed in section 3. In section 4 the results of the navigation with the learning functionality are shown. We conclude the paper with a discussion of further
research directions.
2. THE NAVIGATION DATA FROM OPENSTREETMAP
The street data was derived from OpenStreetMap (OSM), which models the road map as a graph.
The OSM data used in our research contains the centre axes of the roads of Hamburg City as a line
graph. The geometry of the roads is combined with information about one-way streets as well as
information about the maximum speed allowed.
Figure 1. The road map from OSM in an overview (a) and in detail (b)
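A minimal sketch of how such OSM attributes can be turned into a weighted, directed routing graph. The way/tag layout, the default speed, and the function name are illustrative assumptions, not the authors' code:

```python
def ways_to_graph(ways):
    """Turn OSM-style ways into a directed graph weighted by edge passing time.
    Each way is a dict with a node list and (possibly missing) 'oneway' and
    'maxspeed' tags; segment lengths are given per node pair in 'lengths_m'."""
    graph = {}
    for way in ways:
        speed_ms = float(way.get("maxspeed", 50)) / 3.6   # km/h -> m/s, assumed 50 default
        for (u, v), length in zip(zip(way["nodes"], way["nodes"][1:]),
                                  way["lengths_m"]):
            t = length / speed_ms                         # edge passing time [s]
            graph.setdefault(u, {})[v] = t
            if way.get("oneway") != "yes":                # two-way: add reverse edge
                graph.setdefault(v, {})[u] = t
            else:
                graph.setdefault(v, {})                   # node exists, no reverse edge
    return graph
```

A one-way way thus produces an edge in only one direction, which is what lets the routing algorithm respect the driving restriction.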
Using a dataset from OpenStreetMap, we have to consider some specifics: OSM not only models the
topology of the road situation, but also the exact location and course of the roads. A road connecting
two crossings very often consists of multiple edges. Crossings are composed of multiple edges and
vertices showing the detailed situation of the crossings, including for example turn lanes and the precise courses of the curves. They are often represented with several vertices where the agent has to
decide if he should drive left, right or straight on. Figure 1a shows the OSM road graph used for
the analysis and Figure 1b shows the city centre (Jungfernstieg) in detail.
Figure 2a demonstrates the real road situation and a result of an example routing from the start point
1439 to the destination point 1221. Figure 2b shows the road network used for the navigation which
does not contain information about one-way streets. In this case our agent assumes that all edges
can be used for driving in both directions. With this information the agent starts navigating as described in section 1. In the navigation process it can happen that the agent arrives at a node where it
cannot follow its routing instructions any more; this means it had planned to drive on an edge which
is a one-way edge and cannot be used for its navigation. This edge is called a wrong
edge because its information about the allowed driving direction is wrong. The green arrows in Figure
2b show the wrong edges which were detected during the navigation.
Figure 2. Example routing: resulting path and detected one-way edges
The agent needed 2.53 min for navigating between the two points. If the agent had used a road map
containing all the information about one-way streets it would have needed only about 1.44 min. The
time for planning the navigation was not considered. During this example navigation the agent detected
five one-way streets. The number of one-way streets detected by the agent during one navigation
can also serve as a measure for the relation between the completeness of the information and the quality
of the decision the agent makes with the help of this information.
3. NAVIGATION WITHOUT A LEARNING ALGORITHM FOR ONE-WAY STREETS
We now let the agent navigate between randomly selected start and destination points using road
maps with different levels of completeness of information about one-way streets. We have six categories of maps providing 0% to 100% of the one-way streets in 20% steps. The one-way edges provided by the map were chosen randomly, and for every navigation task a new map was prepared.
Figure 3. Statistical analysis of routing time and learning effect
The agent processed 1000 routes in each category. In this way we simulated how the completeness
of information about one-way streets affects the quality of the routing decision of the agent. The
agent learns about one-way streets during the navigation, but this information is not memorized for
further use in future navigation tasks. All 1000 cases were performed independently with randomly
selected start and end nodes. The quality of the agent's decision was measured by the routing time, as the agent
uses the edge passing time as a cost function for making his decisions in the routing. We determined,
over the 1000 navigations in each category, the average driving time difference dt between the navigation with a map providing incomplete information about one-way streets and a map containing the real road situation with complete information. Furthermore, we counted the number of routings where the
time difference dt was 0. If dt is 0 for a navigation task, the incompleteness of information
did not affect the decision quality of the agent. Figure 3 shows the results. On the left
y-axis in Fig. 3a the average time difference between navigation with 100% known one-way streets
and the road map used with p% known one-way streets is presented. It shows a maximum at 60%
known one-way streets. The right y-axis shows the number of routes where the agent needed the same or
less time for the navigation with the imperfect map than with a map containing all data about one-way
streets.
For 35% of the routes there was no difference between routing with an incomplete map and routing with
a complete map. In the routings described in this section the agent learns about one-way streets
during the navigation, but this information is not memorized for further use in future navigation tasks.
We also measured the success of this learning algorithm. Figure 3b shows the average number of
detected one-way streets computed in each of the categories. Both figures show the same result: the
user of the information experiences no difference in his decision quality between a decision based
on a map containing 60% of the one-way streets and a map without any information about one-way
streets. Even when the agent uses a map that contains information about 80% of the one-way streets,
it needs only 12 seconds longer for an average navigation than it would have needed for navigating
with a complete map.
4. NAVIGATION WITH A LEARNING ALGORITHM FOR ONE-WAY STREETS
In a second step we simulated 2500 routings with an agent using a routing algorithm including a
learning function for one-way streets. The start and destination points were also chosen randomly.
The agent started with a map containing no information about one-way streets. At the beginning the
agent assumes that he can use all edges for driving in both directions. In contrast to the previous
section the agent now memorizes the information he has learned in his current navigation task and
can access it for the use in the next routing. The agent started with a map containing no information
about the 1255 one-way streets in the inner city of Hamburg. Figure 4 shows the number of one-way
streets detected against the number of routing turns the agent went through. During the 2500 navigation processes the agent learned 551 one-way streets, which equals 43.9% of the one-way street
edges in the road map of the city centre of Hamburg.
Figure 4. Number of detected one-way streets during 2500 navigation
processes between random start and destination points
Figure 5 shows the average time differences between navigation with a complete and navigation
with an incomplete map. To compute average time differences, the 2500 routing turns of the learning
process were executed in intervals with 100 routing turns each. The average time difference in the
last interval (routings 2400 to 2500), between a routing with a complete and an incomplete map, was
0.72 seconds (Fig. 5) even though only 43.9% of the one-way edges were known to the agent. The
missing information about 56.1% of the one-way streets in the inner city of Hamburg had no effect
on the navigation time of the agent.
Figure 5. The time difference between routing with a perfect and
an imperfect map in relation to the number of random routings
Many one-way street edges are not detected with our algorithm. This concerns the one-way street
edges in the middle of one-way streets constructed from multiple edges. An example would be the
road from point 1191 to 1265 in Figure 2a. The learning algorithm only detects the one-way street
edge next to the node the agent reached in the navigation process, because we did not implement
a backtracking functionality. A backtracking function in this case is defined to be a function that
determines all edges of a road to be one-way edges when the last one-way edge of the road is
detected by the agent. If this function is included, the learning algorithm will result in a complete map
in a much shorter time. In our model we assumed that a road connecting two crossings consists of
edges with the same passing direction. An example of an undetected one-way street edge is
the one-way street shown in the red circle in Fig. 2b. When this one-way street is detected, all edges
between point 1265 and point 1191 can be determined to be one-way streets with a backtracking
algorithm. In our case, however, the remaining one-way edges between these two points are only detected
when a routing starts at one of the points between them. Thus the convergence of the learning
algorithm needs a very long time. As the one-way streets learned in such a way are always learned
in the first step, no detour is needed because of these yet unknown one-way edges. The fact
that only 43.9% of the one-way streets are known in the routing map of the inner city of Hamburg
has no effect on the routing time of our agent. It shows that even a map with incomplete information
about one-way streets can meet the decision quality needs of the agent. This is because the map is the
result of a converging learning process.
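Such a backtracking function could be sketched as follows. This is a hypothetical illustration: the grouping of edges per road is our own assumption, consistent with the model that all edges of a road share the same passing direction.

```python
def backtrack_one_way(road_of_edge, detected_edges):
    """Propagate a detected one-way edge to the whole road it belongs to.
    road_of_edge maps an edge (u, v) to the list of all edges forming the
    same road between two crossings; all of them share the passing direction."""
    marked = set()
    for edge in detected_edges:
        # edges not listed in road_of_edge form a single-edge road by default
        marked.update(road_of_edge.get(edge, [edge]))
    return marked
```

When the last edge of a multi-edge one-way road is detected, every edge of that road is marked at once, which is what would make the learning process converge much faster.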
5. CONCLUSIONS
We analyzed the impact of data quality on the quality of decision-making of an agent navigating in
the inner city of Hamburg using information about one-way streets from a map. We considered the
learning aptitude of the agent as an important part in a decision-making process. In our work we
implemented the learning functionality according to which the agent is able to learn about its environment. Using a learning algorithm for one-way streets and memorizing the learned information for
one-way streets in a navigation task improves the quality of the agent's decisions.
As a result of our experiments we can conclude that a map does not need to provide 100% of the
information about one-way streets for a successful navigation. In the inner city of Hamburg the average
time difference between two routing processes with a map containing complete information and a
map containing only 43.9% of the information about one-way streets is less than 1 second. The next
step of our research work will be the implementation of a backtracking function in the learning algorithm, investigating the impact of learning on the quality of decisions.
REFERENCES
Bondy J.A. and U.S.R. Murty (1982). Graph Theory with Applications. Elsevier Science Publishing
Co., Inc and North Holland, New York, 5th edition, 1982. ISBN 0-444-19451-7.
Dijkstra E.W. (1959). A note on two problems in connection with graphs. In A. Householder and et al.,
editors, Numerische Mathematik, volume 1, pages 269–271. Springer, 1959.
Frank A.U. (2003). Pragmatic information content - how to measure the information in a route description. In Mike Goodchild, Matt Duckham, and Mike Worboys, editors, Perspectives on Geographic Information Science, pages 47–68. Taylor and Francis, London, 2003.
George B. and S. Shekhar (2006). Time-aggregated graphs for modeling spatio-temporal networks.
In John F. Roddick, editor, Advances in conceptual modeling - theory and practice, volume 4231 of
Lecture notes in computer science, pages 85–99. Springer, Berlin, 2006. ISBN 978-3-540-47703-7.
URL http://www.spatial.cs-umn.edu.
George B., Kim S. and S. Shekhar (2007). Modelling spatio-temporal network computation: A summary of results. In Frederico Fonseca, editor, GeoSpatial semantics, volume 4853 of Lecture notes
in computer science, pages 177–194. Springer, Berlin, 2007. ISBN 978-3-540-76875-3.
Krek A. (2002) An agent-based model for quantifying the economic value of geographic information.
PhD thesis, University of Technology, Vienna, 2002.