The Hazards of Formal Geographical Modeling in Bouckaert et al.—and Elsewhere
The linguistic and historical failings of the Bouckaert et al. Science article have been examined in previous posts and will be revisited in subsequent ones. The model’s cartographic miscues have also been dissected. The present post takes on the more abstract geographical issues associated with the authors’ approach.
The Bouckaert et al. article is overtly geographical. “Mapping” is the first word in its title, and the second sentence focuses on “explicit geographical models.” But the geographical model employed is so stripped of substance as to become almost anti-geographical. No allowances are made for actual geographical features other than the basic differentiation of “land” and “water” (with the latter term apparently meaning “seas and oceans”). In one of several sub-models employed, the authors assume that “movement into water is less likely than movement into land by a factor of 100.” But the ease of movement over water depends on the technology at hand and the cultural proclivities of the people in question. Would one ever make such an assumption when modeling the spread of the Austronesian language family, which depended on the double-outrigger canoe? The language map of the Philippines posted here clearly shows that in this case it is water that links linguistic communities and land, specifically the mountainous interiors of the main islands, that separates them. It is also notable that those who model pre-modern transportation networks generally assume that movement over water is vastly more efficient than movement over land.*
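The sub-model's water penalty is easy to make concrete. The sketch below, a hypothetical reconstruction rather than the authors' actual code, implements the stated assumption as a weighted random walk on a toy land/water grid: a move onto a water cell is 100 times less likely than a move onto a land cell.

```python
import random

# Toy grid: 'L' = land, 'W' = water (seas and oceans).
GRID = [
    "LLLWW",
    "LLLWW",
    "LLLLW",
    "WLLLL",
    "WWLLL",
]

WATER_PENALTY = 100  # the sub-model's assumption: water 100x less likely than land

def step_weights(r, c):
    """Weight each in-bounds neighboring cell: land = 1.0, water = 1/100."""
    weights = {}
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]):
            w = 1.0 if GRID[nr][nc] == "L" else 1.0 / WATER_PENALTY
            weights[(nr, nc)] = w
    return weights

def random_step(r, c, rng=random):
    """Choose the next cell in proportion to its weight."""
    weights = step_weights(r, c)
    cells = list(weights)
    return rng.choices(cells, weights=[weights[x] for x in cells])[0]
```

Note that the penalty is a single fixed constant: modeling the Austronesian case discussed above would require inverting it, which is precisely the point about technology and cultural proclivity.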
In the Bouckaert et al. model, geography is essentially reduced to geometry, which in turn becomes merely a matter of distances and directions. Mountains, passes, rivers, badlands, dense forests, and so on count for nothing. Such a stripped-down view of geography is convenient for mathematical modeling, but only at the expense of truth. We know from numerous historical studies that the movement of peoples (which is not necessarily the same as the movement of languages) is often guided by variation in the physical landscape. Agricultural settlers typically sought out appropriate soils, such as loess in the case of Neolithic farmers venturing into central Europe; heavy clay soils were avoided for millennia. Pastoralists, by the same token, sought out good pastures; it is no accident that the equestrian Magyars, like the Huns and Eurasian Avars before them, settled on the grassy Alföld of the Danubian Basin. Agricultural settlers, like the supposed carriers of Indo-European languages in the Bouckaert model, do not simply “diffuse” over a landscape like pathogens jumping from host to host. The process is rather more intentional, and much more molded by the variegated features of actual physical landscapes.
Bouckaert and company’s modeling is by no means the first attempt to flatten geography into geometry. I am particularly concerned about this maneuver because an earlier attempt to do the same thing within geography greatly weakened the discipline. I am often asked why geography is such a weak field in the United States, absent from most leading universities. The issue is complicated, but a key event was geography’s own “quantitative revolution” of the early 1960s, an intellectually aggressive refashioning of the discipline into a positivistic, statistics-dominated “science” centered on the discovery of supposedly invariant spatial laws. To accommodate the statistical methods that the young revolutionaries favored, geography had to be reduced to distance and direction. Most of their studies began by assuming that the landscape being investigated—or merely hypothesized—was an “isotropic plain,” completely uniform and featureless in all directions. Such an assumption rules out everything that differentiates actual landscapes. The main result was a set of conceptual structures that were mathematically elegant but empirically questionable, and often worthless.
A prime example of geography losing its way was Central Place Theory, initially developed in Germany in the 1930s and then celebrated by Anglo-American geographers as a conceptual breakthrough in the 1960s. Central Place Theory postulates that the distribution of cities and towns of various sizes follows regular hexagonal patterns generated automatically by retail marketing behavior. The theory is almost entirely deductive, beginning with a set of assumptions and then working out their logical consequences. The assumptions**, however, do not hold, and as a result the theory did not work as promised. It is true that in some relatively flat areas urban patterns approximate the expected form, but in such cases administrative hierarchies generally played a much larger role than retail marketing. In regard to the United States, moreover, geographer James Vance showed in the early 1970s that wholesaling was far more important than retailing in determining the location and relative standing of major cities. Vance was attacked at the time not so much for being incorrect as for challenging the new theoretical underpinnings of a discipline in the desperate thrall of physics envy.
It is difficult to exaggerate the damage done to geography by the quantitative revolution. Suddenly, cultural and historical geography were deemed trivial, widely viewed as examining little more than noise that distracted attention from the underlying spatial laws. Exploring the complex interactions found in any given region now seemed quaint if not pathetic, a mere descriptive exercise deemed insignificant when contrasted with mathematically rigorous and supposedly scientific investigations. For the same reason, world geography—the core of the field, as constituted since antiquity—virtually vanished from the curriculum. Teaching “the world” came to be viewed as the mere cataloging of facts, failing to provide the conceptual purchase necessary for real understanding. Field study in distant lands was for the same reason actively discouraged by many; why go to Ghana and suffer the inconveniences and indignities of travel in a poor country when the same invariant spatial laws could be discovered in Iowa in the comfort of one’s own lab? Armed with such scientific-seeming techniques, geographers could now reach the height of their profession without knowing much of anything about the actual world.
Needless to say, the “laws” discovered by the quantitative revolutionaries of the early 1960s seldom proved very powerful, and the explanations they offered seldom explained much. It is no accident that the doyen of the movement, David Harvey, abandoned the entire effort soon after publishing Explanation in Geography. In the early 1970s, Harvey—“the 18th most-cited intellectual of all time in the humanities and social sciences”—abruptly converted to Marxism, a transition followed by many other geographers at the time. Within a few years, radically leftist social theory had displaced positivism as the “cutting edge” of the discipline. Despite the huge intellectual shift that this entailed, including the general rejection of mathematical methods, the insistence on high theory and the corresponding denigration of empirical study remained firmly entrenched. Throughout this period, important geography departments continued to be shuttered by budget-conscious university administrations.
Admittedly, a number of scholars did attempt to link the abstract models of geography’s quantitative revolution to real landscapes, with mixed results. A key figure here was the preeminent historical anthropologist of China, G. William Skinner (1925-2008). Skinner had become enamored of Central Place Theory in the 1960s, which he used to “explain” the location of cities and towns in China’s Sichuan Basin. He later turned his attention to larger regions, brilliantly arguing that the structure of Chinese history had to be conceptualized around a handful of “physiographic macro-regions” loosely coincident with drainage basins. Skinner subsequently tried to integrate such regional analysis with Central Place Theory, along with several other abstract spatial schemas, into what he called the “Hierarchical Regional Space Model.” He was convinced that this model applied to any preindustrial agrarian society, and he went to heroic efforts to show that it worked as well in France and Japan as it did in China. In the Skinner model, geographic cores and peripheries of varying scales coincide with drainage basins to form all-encompassing spatial structures. Everything from the average age of marriage to the average wage rate was supposedly predicated on positioning within such highly structured spaces. Unfortunately for Skinner, empirical verification proved elusive, and his project essentially came to naught. All that his three decades of lavishly funded research produced was a few minor articles and a massive, idiosyncratic cartographic archive. As it turns out, human geography is an intrinsically complex affair that is not so easily reduced to clean conceptual structures.
More recently, genuine progress has been made in applying technical analysis to geographical issues. The key has been to use such techniques as tools rather than as ends in themselves. Geographical Information Systems (GIS), for example, offer no “explanations” on their own, but rather allow scholars to uncover patterns and visualize evidence and complexity more effectively, as noted by Andrew Zolnai in a comment on the previous GeoCurrents post.
Quentin Atkinson claims that he would like to refine his own model of Indo-European expansion to encompass actual geographical variation beyond the land/water dichotomy. Doing so would surely be advantageous, but as long as his underlying assumptions fail to withstand scrutiny, the end result will still be untenable. Again, this is not to argue that abstract models are of no use in geographical or historical analysis, but only to insist that they be applied with great care.
* For an impressive model of the transportation networks of the Roman Empire, see Orbis: The Stanford Geospatial Network Model of the Roman World.
** Walter Christaller, who originated Central Place Theory, made the following assumptions, as outlined in the Wikipedia article on the theory:
▪ an unbounded, isotropic (all flat), homogeneous surface (abstract space)
▪ an evenly distributed population
▪ settlements equidistant from one another, arranged in a triangular lattice pattern
▪ evenly distributed resources
▪ a distance-decay mechanism
▪ perfect competition, with all sellers acting as economic agents maximizing their profits
▪ consumers of the same income level and the same shopping behaviour
▪ all consumers having similar purchasing power and demand for goods and services
▪ consumers visiting the nearest central place that provides the function they demand, minimizing the distance to be travelled; no provider of goods or services is able to earn excess profit (each supplier has a monopoly over a hinterland)
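The final assumption, that each consumer simply patronizes the nearest central place, is what generates the theory's geometry. A minimal sketch, using hypothetical coordinates for the central places, makes the mechanism explicit:

```python
import math

# Hypothetical central places on an isotropic plain, as (x, y) coordinates.
CENTERS = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]

def nearest_center(consumer, centers=CENTERS):
    """Assign a consumer to the nearest central place, minimizing
    travel distance (each supplier thus monopolizes a hinterland)."""
    return min(centers, key=lambda c: math.dist(consumer, c))
```

Applied over a whole plain, this rule partitions space into Voronoi market areas; when the centers sit on the assumed triangular lattice, those areas are precisely the hexagons that Central Place Theory predicts. Relax any of the assumptions above (uneven population, varied terrain, non-retail factors such as wholesaling or administration) and the hexagons dissolve.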