Geomorphology 130 (2011) 5–9

Scale in GIS: An overview


Michael F. Goodchild
Center for Spatial Studies, and Department of Geography, University of California, Santa Barbara, CA 93106-4060, USA

Article history: Received 7 November 2009; Received in revised form 29 June 2010; Accepted 4 October 2010; Available online 8 October 2010.

Keywords: Scale; Representative fraction; Resolution; Extent; Ecological fallacy; Modifiable areal unit problem

Abstract: Scale has many meanings, but in GIS two are of greatest significance: resolution and extent. Ideally models of physical process would be defined and tested on scale-free data. In practice spatial resolution will always be limited by cost, data volume, and other factors. Raster data are shown to be preferable to vector data for scientific research because they make spatial resolution explicit. The effects of resolution are discussed for two simple GIS functions. Three theoretical frameworks for discussing spatial resolution are introduced and explored. The problems of cross-scale inference, including the modifiable areal unit problem and the ecological fallacy, are described and illustrated.

© 2010 Elsevier B.V. All rights reserved.

1. Introduction

From the extensive literature on the topic it is clear that scale is a problematic issue in many sciences, notably those that study phenomena embedded in space and time. Numerous books and articles have provided perspectives, many of them focusing on specific disciplines (e.g., Lam and Quattrochi, 1992; Levin, 1992). Of particular interest here are the various books that have examined the role of scale in the social and environmental sciences (e.g., Foody and Curran, 1994; Quattrochi and Goodchild, 1997; Alexander and Millington, 2000; Tate and Atkinson, 2001; Sheppard and McMaster, 2004), where the space of interest is that of the Earth's surface and near-surface, and where the geographic information technologies – geographic information systems (GIS), remote sensing, and the Global Positioning System (GPS) – play an increasingly important role in support of research. The purpose of this paper is not to add anything new to this extensive literature, but to provide an overview and summary of the major issues of scale that arise when GIS is used as a research tool, with particular emphasis on geomorphology.

The first and most obvious problem is semantic: the noun scale is used in three distinct senses in science, and in many other senses in society generally. To a cartographer, scale normally refers to the representative fraction, the parameter that defines the scaling of the Earth's surface to a sheet of paper. Like all analog representations, a paper map sets a given ratio between distance on the map and the corresponding distance on the ground, and this ratio has traditionally been used to define a map's level of detail, its content, and its positional accuracy (despite the fact that in principle the ratio cannot be exactly constant over a paper map that flattens the curved surface of the Earth, as all paper maps must). Representative fraction also plays a key role in the analog models that are still used in some areas of engineering. Proctor and I (Goodchild and Proctor, 1997) have argued that representative fraction is undefined for digital data, although a series of conventions have been adopted to give it meaning in specific circumstances.

The remaining two meanings, both of which are defined for digital data, refer on the one hand to the extent of a study area, primarily its extent in space but also to some degree its extent in other dimensions including time; and on the other hand to its resolution, or degree of detail, again primarily in the spatial and sometimes the temporal dimensions. Both can be expressed for the spatial dimensions in either linear or areal measure (or volumetric if the third spatial dimension is included), and in other work (Goodchild, 2001) I have argued that their dimensionless ratio, which I have termed the large over small ratio (LOS), is in practice remarkably constant across a wide range of data sources and applications, within the range 10^3 to 10^4.
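As a quick numerical illustration of the LOS ratio (the figures below are hypothetical, not taken from the paper), a minimal sketch in Python:

    # Hypothetical example: large-over-small (LOS) ratio for a DEM covering a
    # 100 km x 100 km study area at 30 m cell size.
    extent_m = 100_000.0    # linear spatial extent of the study area, in meters
    resolution_m = 30.0     # linear spatial resolution (cell size), in meters
    los = extent_m / resolution_m
    print(round(los))       # about 3333, within the 10^3 to 10^4 range noted above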
In what follows, and in the overwhelming majority of the literature on scale, the emphasis is on the small or resolution dimension of spatial scale. The Earth's surface is infinitely complex, and in principle could be mapped down to the sub-millimeter and even molecular level. In practice, however, given our limited ability to sense, capture, and handle massive volumes of data, it is essential to reduce detail, capturing only the largest and therefore likely the most important features of any spatially distributed phenomenon. Sampling achieves this, as do processes of cartographic generalization (McMaster and Shea, 1992), aggregation, and approximation. These processes tend to remove unwanted short-wavelength detail, though it is often difficult to express the effects in formal terms as the action of a low-pass filter.


Domain sciences such as geomorphology are concerned with the modeling of processes that occur on the physical landscape. If the spatial resolution of data is always limited, it is essential therefore in the study of any given process that the data used to model the process include all of the important detail needed for accurate modeling. If the process is significantly influenced by detail smaller than the spatial resolution of the data, then the results of analysis and modeling will clearly be misleading. Thus in addition to the spatial resolution of data, it is also important to consider the spatial resolution of processes. However, few theories of process make spatial resolution explicit, most being essentially scale-free. The Darcy equations of groundwater flow, for example, or the Navier–Stokes equations of viscous fluid motion, are expressed as partial differential equations in scale-free variables. When solved using finite-difference or finite-element methods, spatial resolution is suddenly introduced in the size of the raster cells or finite elements, and it is difficult to characterize the uncertainties associated with the inevitable information loss. Thus the researcher whose model fits reality to a level that is less than perfect, as all models must, is left not knowing whether the misfit is due to the effects of spatial resolution, or due to an imperfection in the model, or both.

The first major section discusses scale within the framework of alternative conceptualizations and representations of spatial data, with emphasis first on discrete objects and continuous fields, and then on raster and vector data structures. This is followed by a section dealing with scale and semantics, or the effects of scale on the definitions of the terms, classes and variables that are commonly acquired and used in GIS-supported research. The third major section discusses efforts to formalize scale through concepts of fractals, Fourier analysis, and the underlying assumptions of geostatistics, followed by a section discussing the difficulties of cross-scale inference. The paper ends with a short summary of the major points.

2. Scale in GIS representations

It is now widely accepted that two methods exist for conceptualizing phenomena distributed over the surface of the Earth (Couclelis, 1992). The discrete object conceptualization imagines that the world is a surface like a table-top, empty except where there exist discrete, countable things. These things can overlap, and in many cases will maintain their integrity through time and when moved. This conceptualization is particularly appropriate when dealing with biological organisms, vehicles, or buildings. In practice discrete objects may be represented as points, lines, areas, or volumes depending on their size and the purposes of the representation.

However, other phenomena present distinct problems when conceptualized in this way. Many natural features, including mountains, lakes, and rivers, have only vaguely defined limits and may be better conceptualized as continuous, in what is known as the continuous-field conceptualization. In this view phenomena are expressed as mappings from location to value or class, such that every location in space–time has exactly one value of each variable (the full four-dimensional space–time is often simplified by ignoring time, in the case of static phenomena, or ignoring the third spatial dimension, or both). Thus, for example, topography is better conceptualized as a mapping everywhere from location (x,y) to value z than as a collection of vaguely defined discrete features, and soil type is better conceptualized as a mapping from location to class c than as a collection of homogeneous and non-overlapping areas separated by infinitely thin boundaries, across which class transition is instantaneous. Many physical phenomena, from air temperature to soil moisture content and land cover class, are more often conceptualized as fields than as collections of discrete objects.

In principle both field and object conceptualizations are independent of scale. In practice, however, their digital representations always embody scale to some degree. Both discrete-object and continuous-field conceptualizations can be represented as either raster or vector data. In the raster case, resolution is always explicit in the size of the raster cells, which are almost always square in the two-dimensional case (in the spirit of the previous comment about the impossibility of a constant representative fraction on a flat map, note that a raster cannot be laid over a curved surface, and note the existence of an extensive literature on discrete global grids, e.g., Sahr et al., 2003). In the three-dimensional case it is common for the vertical dimension to be sampled differently from the two horizontal dimensions, leading to differential resolution. In atmospheric science, for example, the intervals between sampled heights may be quite different from the horizontal intervals between sampled profiles. Modifications to simple raster structures are sometimes made to allow for a degree of sub-cell representation, for example by recording more than one attribute per cell. But spatial resolution remains unaffected unless the within-cell locations of each attribute are also recorded, as they are for example in the hierarchical quadtree structures (Samet, 2006).

Resolution is more difficult to define in the vector case. If data are captured at irregularly spaced points, there may be some justification for using the distance between points as a measure of resolution, but little basis for deciding whether the minimum, mean, or maximum nearest-neighbor distance should be used. When data are captured as attributes of areas, it is common to represent the geometric form of each area as a polygon, and volumes are similarly represented as polyhedra. Resolution now appears in several forms: in the willingness to represent boundaries as infinitely thin, in the density with which boundaries are sampled, in the within-area or within-volume variation that has been replaced by an assumption of homogeneity, and in the sizes of areas and volumes. Unfortunately this often creates the mistaken impression that vector data sets have infinitely fine resolution. For such maps we can state in general that at finer resolutions the numbers of areas and volumes would increase, their boundaries would be given more detail, and they would be more homogeneous. In the case of lines the conventional digital representation is as a polyline, a set of points connected by straight-line segments, and a similar issue now arises over the density of sampling of the line and its infinitesimally small width.

In short, vector representations leave resolution poorly defined. Moreover, it is difficult if not impossible to infer resolution a posteriori from the contents of a vector data set. While the choice between raster and vector is often guided in practice by the nature of existing data, by the software available for handling the data, and by the types of analysis and modeling that are to be conducted, in principle the poor definition of resolution for vector data is a strong argument for the use of raster data in rigorous scientific research.
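As a minimal sketch of the point made above about irregularly spaced vector points, the following Python fragment (with hypothetical coordinates) computes the minimum, mean, and maximum nearest-neighbor distances that might each be offered as the "resolution" of such a data set:

    import numpy as np

    # Hypothetical irregularly spaced sample points (x, y).
    points = np.array([[0.0, 0.0], [1.2, 0.3], [0.4, 2.1], [3.0, 1.0], [2.2, 2.8]])

    # Pairwise distances; ignore each point's zero distance to itself.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=2))
    np.fill_diagonal(dist, np.inf)
    nearest = dist.min(axis=1)   # nearest-neighbor distance for each point

    print(nearest.min(), nearest.mean(), nearest.max())
    # As noted above, there is little basis for preferring any one of these
    # three statistics as "the" resolution of the data set.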

3. The semantics of scale

The power of GIS lies in its ability to transform, analyze, and manipulate geographic data. But since all geographic data must be specific to resolution, as discussed in the previous section, it follows that all such transformations, analyses, and manipulations must also be scale-specific. Consider, for example, the simple task of measuring the length of a digitized line. If the line is represented in vector form as a polyline, an easily computed estimate of length will be the sum of the straight-line segments that compose the polyline. If we assume that each sampled point lies exactly on the true line, the length of each segment will be a lower bound on the length of the true line. In general, estimates of length obtained in this way (and of perimeters of areas and surface areas of volumes) will be underestimates of the truth, by an amount that depends on the sampling density. This phenomenon has long been recognized in the fractal literature, in Mandelbrot's 1967 question "How long is the coastline of Britain?" (Mandelbrot, 1967), and the relationship between length and sampling density is often termed a Richardson Plot in recognition of the work of Lewis Fry Richardson (1960). More generally, we can state that the lengths of natural features such as coastlines cannot be defined – or measured – independently of scale. Note however that in the cases of area for polygons and volume for polyhedra, there is the potential for both under- and over-estimation of measures.
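The following Python sketch (hypothetical data, not an example from the paper) makes the point concrete: the length of a polyline is the sum of its straight-line segments, and the estimate shrinks when the same curve is sampled more coarsely:

    import numpy as np

    def polyline_length(xy):
        # Sum of straight-line segment lengths for an (n, 2) array of vertices.
        seg = np.diff(xy, axis=0)
        return float(np.sqrt((seg ** 2).sum(axis=1)).sum())

    # A wiggly, coastline-like curve sampled densely...
    t = np.linspace(0.0, 10.0, 1001)
    dense = np.column_stack([t, np.sin(3 * t) + 0.3 * np.sin(17 * t)])

    # ...and the same curve with only every 20th vertex retained.
    coarse = dense[::20]

    print(polyline_length(dense))    # longer estimate
    print(polyline_length(coarse))   # shorter estimate: short-wavelength detail is lost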
This same conclusion about the importance of scale in length measurement applies to a remarkable number of geographic data types. Consider the measurement of slope, which is typically done from a raster of elevations, otherwise known as a digital elevation model (DEM). A common algorithm due to Horn (1981) takes the eight cells forming a given cell's Moore (or queen's case) neighborhood, weights the eight neighbors depending on their distance from the central cell, and obtains estimates of the two components of slope. But the resulting estimates are dependent on the raster cell size, and in general larger cells will yield smaller estimates of slope. Hence again slope cannot be defined or measured independently of scale. More fundamentally, topographic surfaces are often subject to breaks of slope where derivatives are undefined (another link to the fractal literature, see for example Mandelbrot, 1982).
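A sketch of this slope calculation, written from the commonly cited statement of Horn's (1981) weights rather than from the paper itself, with a hypothetical 3 x 3 elevation window:

    import numpy as np

    def horn_slope_deg(z, cellsize):
        # Slope (degrees) at the center of a 3 x 3 elevation window z,
        # using the eight surrounding cells weighted by proximity.
        a, b, c = z[0]
        d, e, f = z[1]   # e is the central cell; it carries no weight here
        g, h, i = z[2]
        dzdx = ((c + 2 * f + i) - (a + 2 * d + g)) / (8.0 * cellsize)
        dzdy = ((g + 2 * h + i) - (a + 2 * b + c)) / (8.0 * cellsize)
        return float(np.degrees(np.arctan(np.hypot(dzdx, dzdy))))

    window = np.array([[105.0, 104.0, 103.0],
                       [106.0, 105.0, 104.0],
                       [107.0, 106.0, 105.0]])
    print(horn_slope_deg(window, cellsize=10.0))
    # Repeating the calculation on progressively coarser (resampled) DEMs
    # generally yields smaller slopes, because short-wavelength relief is
    # averaged away before the differences are taken.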
These two examples both concern geometry, but insofar as they address the ways parameters such as length and slope are defined, they can be regarded as problems of meaning. More broadly, the meanings of many more geographic data types prove to be scale-dependent. Consider land cover, for example. At a coarse scale it may be possible to define broad categories of land cover, such as urban or oak savannah. But in both of these examples definitions break down at finer scales, as the landscape begins to fragment itself into individual buildings, gardens, and roads, or into individual oak trees and surrounding grassland. At even finer scales further fragmentation occurs, as the individual leaves of the trees or tiles of the roofs become apparent. Such issues tend to be far more obvious in the raster case, particularly when data have been obtained by remote sensing or other types of imaging, since resolution is explicit in the raster cell size. Efforts are often made to extend the range of resolutions that are valid for any given set of class definitions, by recognizing mixed pixels, for example. But even though a pixel may be regarded as split between two or more classes, the inhomogeneities of those classes will become glaringly apparent as resolution becomes finer.

Unfortunately this essential importance of scale in geographic data is often missed by the systems that have been developed for searching and discovering geographic data sets. Such systems, variously known as data warehouses, digital libraries, or portals, give greatest prominence to the area of interest and the data type or theme. For example, in the search screen for the US Geospatial One-Stop portal (www.geodata.gov; Goodchild et al., 2007), a project initiated by the Bush Administration and designed to provide a single point of entry to the geographic data resources of the US Federal government, the user is able to specify area of interest, data type, date, and storage medium, but has no ability to specify scale despite its critical importance.

4. Formalizing scale

The previous section ended with a comment on the paucity of useful information about scale in geographic data sets. In part this is attributable to the lack of theory about scale, and thus to the difficulty of formalizing it, a problem that was discussed earlier in the context of vector data. The purpose of this section is to review the available frameworks for a formal discussion of scale, drawing from a variety of literatures.

Some of the earliest efforts to deal with scale in geographic data were concerned with very practical issues. If scale is an essential parameter of any data set, and the costs of acquiring, handling, storing, and processing data are dependent on scale, then it would be helpful to have some objective bases for resolving the associated issues of GIS design. How much information is lost when data are collected at coarser resolution? What would be the benefits of collecting data at finer resolution, and would these justify the increased costs? What confidence limits can be placed on estimates of properties obtained from coarse data?

Frolov and Maling (1969; see also Maling, 1989) analyzed the errors introduced by representing a perfectly known area in a raster of a given cell size, using simple models from the literature of geometric probability. In a subsequent paper (Goodchild, 1980), I showed that their results could be framed within the theory of fractals (Mandelbrot, 1977), given that many geographic phenomena exhibit fractal behavior (see below), allowing one to estimate exactly how much information is lost by coarsening resolution. More recently Shortridge and I (Shortridge and Goodchild, 2002) showed that such problems could be addressed by an extension of Buffon's Needle, a classic problem in geometric probability.

Mandelbrot's essential thesis (Mandelbrot, 1977) in advancing the theory of fractals was that the rate of information loss or gain with scale change was orderly and predictable through principles that have come to be called scaling laws. The Richardson Plot referenced in the previous section shows length plotted against resolution on double-log paper, and commonly results in a close fit to a straight line. Mandelbrot generalized this result to argue that phenomena exhibiting what he termed fractal behavior would show such power-law behavior whatever the method used to obtain the measure. In the case of length measurement this might be the spacing of a pair of dividers used to step along the line, or the size of raster cells in a raster representation of the line, or a resolution parameter from any of a host of other methods. Fractal behavior includes the property of self-similarity, meaning that any part of the feature is statistically indistinguishable from the feature as a whole. Simulations of self-similar lines and surfaces show striking resemblance to certain geomorphological features, such that exceptions to power-law behavior become scientifically interesting (Goodchild and Mark, 1987).

The fractal properties of geomorphic features have been the subject of an extensive literature (Xu et al., 1993). Andrle investigated the surfaces of talus slopes (Andrle and Abrahams, 1989) and coastlines (Andrle, 1996); Mark (1984) investigated coral reefs; Klinkenberg (1994) examined regional topographies of the United States; Clarke and Schweizer (1991) measured the fractal dimensions of some natural surfaces; and Tarboton et al. (1988), La Barbera and Rosso (1989), Liu (1992) and Nikora et al. (1993) studied the fractal properties of river networks.
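As an illustration of the power-law behavior described above, a minimal Python sketch (synthetic data, not one of the studies cited) builds a Richardson-style plot and reads a fractal dimension from its slope; the "resolution" here is simply the vertex spacing of a decimated polyline, one of the many possible resolution parameters mentioned in the text:

    import numpy as np

    def polyline_length(xy):
        seg = np.diff(xy, axis=0)
        return float(np.sqrt((seg ** 2).sum(axis=1)).sum())

    rng = np.random.default_rng(0)
    # A synthetic rough profile standing in for a digitized natural line.
    x = np.linspace(0.0, 100.0, 4001)
    y = np.cumsum(rng.normal(scale=0.5, size=x.size))
    line = np.column_stack([x, y])

    dx = x[1] - x[0]
    steps = [1, 2, 4, 8, 16, 32]
    resolutions = [dx * s for s in steps]               # vertex spacing
    lengths = [polyline_length(line[::s]) for s in steps]

    # For L ~ r**(1 - D), the slope of log L against log r is (1 - D).
    slope, _ = np.polyfit(np.log(resolutions), np.log(lengths), 1)
    print(1.0 - slope)    # estimated fractal dimension D of the synthetic line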
Geostatistics (Goovaerts, 1997) provides another powerful framework for addressing issues of scale. One of the most widely used functions in GIS is spatial interpolation (Longley et al., 2005), the task of estimating the value of some variable z at locations x where it has not been measured, based on measured values (z1, z2, ..., zn) at some set of locations (x1, x2, ..., xn). In effect this proposes to refine the spatial resolution of a point data set artificially, replacing a finite number of observations with potentially an infinite number. It is used for interpolating contours between point observations, and for resampling raster or vector data to a different set of points (a different support in the terminology of geostatistics). A wide range of techniques have been proposed, varying with the set of assumptions the user is willing to make.

Geostatistics provides what is perhaps the technique of spatial interpolation with the strongest theoretical basis. In brief, the theory of regionalized variables proposes that values of variables z distributed over geographic space (continuous fields in the sense discussed above) are not sampled independently, but instead show strong spatial autocorrelation. This is an entirely reasonable proposition, since physical laws ensure that properties such as atmospheric temperature, soil moisture, or elevation show strong autocorrelations over short distances. Geostatistics further proposes, however, that the mathematical form of the decline of spatial autocorrelation (or the increase in variance) with distance is a general and measurable property of each field that can be estimated from a sample of data points. Armed with such a correlogram (or more commonly its close relative the variogram, depicting the increase in variance with distance) estimated from sampled values of the field, it is possible to make generalized least-squares estimates of values at points where the field was not measured, and to compute estimation variances. This last property, together with the use of the data to parametrize the correlogram, are what give this class of techniques, generally known as Kriging, their claim to theoretical rigor.

The correlogram or variogram allows the researcher to see the distances over which values are strongly correlated. Pairs of observations separated by such distances are partially redundant, since one partially duplicates the other. More specifically, the distance beyond which observations are not even partially redundant is termed the range of the variable, and provides an empirical estimate of the detail that can be discarded without substantial loss of information. Thus we have a rigorous way of defining the spatial scales of a variable that has interval or ratio properties and is conceptualized as a continuous field.
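A minimal sketch (synthetic data, not the paper's) of the empirical variogram described above: half the mean squared difference between pairs of observed values, grouped into separation-distance bins, with the distance at which the curve levels off playing the role of the range:

    import numpy as np

    rng = np.random.default_rng(1)
    xy = rng.uniform(0.0, 100.0, size=(200, 2))       # sample locations
    # A smoothly varying synthetic field plus a little noise, so that nearby
    # values are similar and distant values are not.
    z = np.sin(xy[:, 0] / 15.0) + np.cos(xy[:, 1] / 20.0) + rng.normal(0.0, 0.1, 200)

    diff = xy[:, None, :] - xy[None, :, :]
    h = np.sqrt((diff ** 2).sum(axis=2))              # pairwise separations
    sq = (z[:, None] - z[None, :]) ** 2               # squared value differences

    iu = np.triu_indices(len(z), k=1)                 # use each pair once
    h, sq = h[iu], sq[iu]

    for lo in np.arange(0.0, 50.0, 5.0):
        mask = (h >= lo) & (h < lo + 5.0)
        gamma = 0.5 * sq[mask].mean()                 # semivariance in this bin
        print(f"{lo:4.0f}-{lo + 5.0:4.0f}: gamma = {gamma:.3f}")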
Recently, Boucher et al. (2008) have shown how geostatistics can be used to support the challenging task of downscaling fields. In remote sensing, for example, the researcher is often constrained by the spatial resolution of the instrument used to create images of the Earth's surface. The 1 km resolution of the AVHRR instrument, for example, clearly limits the applicability of its imagery, since it is difficult to identify and impossible to position features that are only a fraction of 1 km across. Thus it is reasonable to ask how much information has been lost due to the coarse resolution, and what uncertainties this introduces in the results of analysis. Downscaling attempts to insert the missing detail, using properties that can be identified from the coarse imagery. In a geostatistical framework, in essence this means estimating a correlogram using the coarse data, inferring its behavior at shorter distances, and then using that inferred correlogram to generate simulated data that are consistent with the coarse imagery.

An alternative theoretical framework to geostatistics is provided by spectral or Fourier analysis, which decomposes any field variable into its harmonic components. Spatial resolution now becomes a matter for the spectrum: variations over wavelengths less than the spatial resolution are discarded or already missing. Clarke (1988), for example, has shown how Fourier analysis can be used to discard detail in topography. However, Fourier analysis has not achieved the same level of popularity in the research community as geostatistics, due perhaps to the obvious lack of regular periodicity in most geographic fields. Wavelet analysis provides an interesting extension to Fourier analysis by allowing spectral properties to vary spatially in a hierarchical fashion.

5. Cross-scale inference

The effects of coarsening spatial resolution on the results of analysis have long been a topic for investigation, and recently this has expanded into a major effort by the GIS research community. Consider, for example, an analysis of the correlation of two variables that are expressed on area support; in other words, their values are both known for a set of geographic areas. In the investigation of social processes the areas might be counties, or the smaller reporting zones used by the US Bureau of the Census. In hydrology, they might be the lumped properties of small watersheds. It has been known for a long time, and it is easy to show analytically, that aggregation of the small areas into larger ones, and averaging of the variables over each aggregation, results in a stronger correlation between the variables; it tends to be no more significant, however, because the improvement in correlation is offset by the loss of degrees of freedom.

What is less well known, however, is that significant variation can also be produced by holding spatial resolution constant, but repeating the analysis over alternative sets of areas at the same spatial resolution. Openshaw and Taylor (Openshaw, 1983) provided what has become the classic example of the modifiable areal unit problem (MAUP), by showing that reaggregation of county data for Iowa could be used to produce correlations ranging from extremely negative to extremely positive. When the 99 counties were aggregated to alternative arrangements of 12 regions, correlations between % over 65 and % registered Republican voters ranged from -0.936 to +0.996. While one might think of this issue as akin to the random variation produced by alternative samples, Openshaw and Taylor argue convincingly that the effect must be treated as systematic.
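A toy demonstration of this zoning effect (synthetic data, ignoring spatial contiguity; it is not a reproduction of Openshaw and Taylor's Iowa analysis): the same 99 unit-level values, averaged over different groupings into 12 regions, give noticeably different correlations:

    import numpy as np

    rng = np.random.default_rng(2)
    n_units, n_regions = 99, 12
    x = rng.normal(size=n_units)
    y = 0.3 * x + rng.normal(size=n_units)      # weakly correlated unit-level data

    def regional_correlation(zoning):
        # Correlation of the two variables after averaging within each region.
        xr = np.array([x[zoning == r].mean() for r in range(n_regions)])
        yr = np.array([y[zoning == r].mean() for r in range(n_regions)])
        return np.corrcoef(xr, yr)[0, 1]

    print("unit level:", np.corrcoef(x, y)[0, 1])
    for seed in range(3):
        # A random assignment of the 99 units to 12 regions of 8 or 9 units each.
        zoning = np.random.default_rng(seed).permutation(np.arange(n_units) % n_regions)
        print("zoning", seed, ":", regional_correlation(zoning))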

Closely allied to the MAUP is the problem of cross-scale inference – of inferring behavior of a system at one scale from its observed behavior at another, coarser scale. In the extreme this is known as the ecological fallacy, which is made when a researcher assumes that correlations observed for aggregates can be transferred to the individual. King (1997) has provided a comprehensive analysis of the problem and has proposed some powerful methods for appropriate inference. To my knowledge neither the MAUP nor cross-scale inference has been investigated extensively for physical properties that are typically analyzed at the aggregate level, such as the properties of watersheds or geomorphic units.

The problem of downscaling, of which the ecological fallacy is an extreme example, has been widely examined in the geomorphological literature, and was discussed in the previous section in the context of geostatistics. Luoto and Hjort (2008) downscaled data on periglacial features using simple techniques, and compared the results to ground truth. Zhu et al. (2001) and Toomanian et al. (2006) experimented with downscaling soil properties, while Zhang et al. (1999) attempted the downscaling of slope estimates.

6. Conclusions

One of the effects of the development and widespread adoption of GIS has been the attention that has been devoted to issues of scale. While many naïve users, including the multitude of users who have encountered digital geographic data through Web services such as Google Earth, might assume otherwise, seasoned users of GIS know well that scale, and specifically spatial resolution, is an important factor in any application. People who think about the broader implications of their use of GIS, rather than worrying about which button to push next, are increasingly referred to as critical spatial thinkers (NRC, 2006); and much of that thinking inevitably revolves around scale.

GIS forces these issues to the fore because it insists on formalizing geographic data and analysis, by reducing both to a series of coded, reproducible binary signals. This is often perceived as an advantage, particularly in the use of GIS by regulatory agencies, because it forces all manipulations to meet standards of replicability and accountability. On the other hand it is clear that in many cases the data input to such manipulations cannot stand up to the same level of objective, scientific scrutiny.

The norms of science, as expressed in countless volumes (e.g., Harvey, 1969), establish certain principles that are intended to apply to the process of scientific research. The issues discussed in this paper intersect with the norms of science in several important and fundamental ways. First, spatial resolution is in practice a result of a mostly implicit analysis of the benefits of detail versus the costs. But science provides no general guidance on this issue, and no basis on which to value science-on-the-cheap against expensive science. Second, the norms of science include replicability, and oblige a researcher to report his or her results to a level of detail sufficient to allow someone else to repeat the research. But in GIS-based research it is common for the documentation of software to fall short of this standard, and the algorithms of commercial software are often regarded as valuable intellectual property. Moreover, it is common in GIS for researchers to use data created by others, without full documentation of the provenance and lineage of the data. Third, reference has already been made to the dilemma faced by a scientist who cannot determine whether the failure of a model to fit perfectly is due to the model itself, or to the coarse spatial resolution of the analysis.

Finally, the paper has drawn attention to the fact that issues of spatial resolution span, and raise similar questions in, a wide range of sciences from the social to the environmental, in fact in all sciences that deal with the surface and near-surface of the Earth. This creates an enormous opportunity for cross-fertilization, as researchers experiment with techniques and ideas generated in disciplines that are often far removed from their own. The last section, on cross-scale inference, suggested for example that much could be gained by transferring ideas concerning the MAUP to environmental sciences that use spatially lumped models. In the four decades since its initial development, GIS has become a common technology across many disciplines, and a basis for conversation and the transfer of ideas between them.

References

Alexander, R., Millington, A.C. (Eds.), 2000. Vegetation Mapping: From Patch to Planet. Wiley, New York.
Andrle, R., 1996. Complexity and scale in geomorphology: statistical self-similarity vs. characteristic scales. Mathematical Geology 28, 275–293.
Andrle, R., Abrahams, A.D., 1989. Fractal techniques and the surface roughness of talus slopes. Earth Surface Processes and Landforms 14, 197–209.
Boucher, A., Kyriakidis, P.C., Cronkite-Ratcliff, C., 2008. Geostatistical solutions for super-resolution land cover mapping. IEEE Transactions on Geoscience and Remote Sensing 46, 272–283.
Clarke, K.C., 1988. Scale-based simulation of topographic relief. The American Cartographer 15, 173–181.
Clarke, K.C., Schweizer, D.M., 1991. Measuring the fractal dimension of natural surfaces using a robust fractal estimator. Cartography and Geographic Information Systems 18, 37–47.
Couclelis, H., 1992. People manipulate objects (but cultivate fields): beyond the raster–vector debate in GIS. In: Frank, A.U., Campari, I. (Eds.), Theories and Methods of Spatio-Temporal Reasoning in Geographic Space: Lecture Notes in Computer Science, 639. Springer-Verlag, Berlin, pp. 65–77.
Foody, G., Curran, P. (Eds.), 1994. Environmental Remote Sensing from Regional to Global Scales. Wiley, New York. 250 pp.
Frolov, Y.S., Maling, D.H., 1969. The accuracy of area measurements by point counting techniques. Cartographic Journal 6, 21–35.
Goodchild, M.F., 1980. Fractals and the accuracy of geographical measures. Mathematical Geology 12, 85–98.
Goodchild, M.F., 2001. Metrics of scale in remote sensing and GIS. International Journal of Applied Earth Observation and Geoinformation 3, 114–120.
Goodchild, M.F., Mark, D.M., 1987. The fractal nature of geographic phenomena. Annals of the Association of American Geographers 77, 265–278.
Goodchild, M.F., Proctor, J., 1997. Scale in a digital geographic world. Geographical and Environmental Modelling 1, 5–23.
Goodchild, M.F., Fu, P., Rich, P., 2007. Sharing geographic information: an assessment of the Geospatial One-Stop. Annals of the Association of American Geographers 97, 249–265.
Goovaerts, P., 1997. Geostatistics for Natural Resources Evaluation. Oxford University Press, New York. 496 pp.
Harvey, D., 1969. Explanation in Geography. Edward Arnold, London. 542 pp.
Horn, B.K.P., 1981. Hill shading and the reflectance map. Proceedings of the Institute of Electrical and Electronic Engineers 69, 14–47.
King, G., 1997. A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data. Princeton University Press, Princeton, NJ.
Klinkenberg, B., 1994. A review of methods used to determine the fractal dimension of linear features. Mathematical Geology 26, 23–46.
La Barbera, P., Rosso, R., 1989. On the fractal dimension of stream networks. Water Resources Research 25, 735–741.
Lam, N., Quattrochi, D.A., 1992. On the issues of scale, resolution, and fractal analysis in the mapping sciences. Professional Geographer 44, 88–98.
Levin, S.A., 1992. The problem of pattern and scale in ecology. Ecology 73, 1943–1967.
Liu, T., 1992. Fractal structure and properties of stream networks. Water Resources Research 28, 2981–2988.
Longley, P.A., Goodchild, M.F., Maguire, D.J., Rhind, D.W., 2005. Geographic Information Systems and Science, 2nd Edition. Wiley, Hoboken, NJ.
Luoto, M., Hjort, J., 2008. Downscaling of coarse-grained geomorphological data. Earth Surface Processes and Landforms 33, 75–89.
Maling, D.H., 1989. Measurement from Maps: Principles and Methods of Cartometry. Pergamon, New York.
Mandelbrot, B.B., 1967. How long is the coastline of Britain? Statistical self-similarity and fractional dimension. Science 156 (3775), 636–638.
Mandelbrot, B.B., 1977. Fractals: Form, Chance and Dimension. Freeman, San Francisco.
Mandelbrot, B.B., 1982. The Fractal Geometry of Nature. Freeman, San Francisco.
Mark, D.M., 1984. Fractal dimension of a coral reef at ecological scales: a discussion. Marine Ecology Progress Series 14, 293–294.
McMaster, R.B., Shea, K.S., 1992. Generalization in Digital Cartography. Association of American Geographers, Washington, DC.
National Research Council (NRC), 2006. Learning to Think Spatially: GIS as a Support System in the K-12 Curriculum. National Academies Press, Washington, DC.
Nikora, V.I., Sapozhnikov, V.B., Noever, D.A., 1993. Fractal geometry of individual river channels and its computer simulation. Water Resources Research 29, 3561–3568.
Openshaw, S., 1983. The Modifiable Areal Unit Problem. Geobooks, Norwich, UK.
Quattrochi, D.A., Goodchild, M.F. (Eds.), 1997. Scale in Remote Sensing and GIS. CRC Press, Boca Raton, FL.
Richardson, L.F., 1960. Statistics of Deadly Quarrels. Boxwood, Pittsburgh.
Sahr, K., White, D., Kimerling, A.J., 2003. Geodesic discrete global grid systems. Cartography and Geographic Information Science 30, 121–134.
Samet, H., 2006. Foundations of Multidimensional and Metric Data Structures. Morgan Kaufmann, San Francisco.
Sheppard, E., McMaster, R.B. (Eds.), 2004. Scale and Geographic Inquiry: Nature, Society, and Method. Blackwell, Malden, MA.
Shortridge, A.M., Goodchild, M.F., 2002. Geometric probability and GIS: some applications for the statistics of intersections. International Journal of Geographical Information Science 16, 227–243.
Tarboton, D.G., Bras, R.L., Rodriguez-Iturbe, I., 1988. The fractal nature of river networks. Water Resources Research 24, 1317–1322.
Tate, N.J., Atkinson, P.M. (Eds.), 2001. Modelling Scale in Geographical Information Science. Wiley, New York. 292 pp.
Toomanian, N., Jalalian, A., Khademi, H., Eghbal, M.K., Papritz, A., 2006. Pedodiversity and pedogenesis in Zayandeh-rud Valley, Central Iran. Geomorphology 81, 376–393.
Xu, T., Moore, I.D., Gallant, J.C., 1993. Fractals, fractal dimensions and landscapes: a review. Geomorphology 8, 245–262.
Zhang, X., Drake, N.A., Wainwright, J., Mulligan, M., 1999. Comparison of slope estimates from low resolution DEMs: scaling issues and a fractal method for their solution. Earth Surface Processes and Landforms 24, 763–779.
Zhu, A.X., Hudson, B., Burt, J., Lubich, K., Simonson, D., 2001. Soil mapping using GIS, expert knowledge, and fuzzy logic. Soil Science Society of America Journal 65, 1463–1472.
