Development of Archaeological Dating

Establishing chronology was not always an overriding concern among archaeologists. In North America at the turn of the twentieth century, for example, resolution of the ‘moundbuilder’ and ‘American Paleolithic’ controversies actually dampened interest in chronological issues, because it appeared that human occupation of the New World was relatively shallow, perhaps representing only a few thousand years. Throughout much of the eighteenth and nineteenth centuries, establishing age tended to be a coarse-grained affair, usually involving identification of a particularly well-known historical period (e.g., Roman) or documentation of the presence or absence of materials such as ceramics or iron that were thought to be broadly indicative of time (e.g., the stone-bronze-iron age distinctions of Christian Thomsen and Jens Worsaae). Material types became chronological indicators when they consistently produced a generalized stratigraphic pattern (e.g., bronze artifacts in deeper layers were presumed to be older than iron artifacts in shallower layers if the strata were undisturbed) (see Artifacts, Overview; Fiber Artifacts). Although detailed stratigraphic studies were making vast improvements to geological chronology at this time, comparable advances in archaeology were slow to be realized. Another early and unsophisticated archaeological technique involved estimating the minimum age of earthen mounds in the American Midwest by assessing the age of trees growing on them, although it was impossible to know how much older the mounds might actually be. The slow development of archaeological chronology began to change in the late nineteenth and early twentieth centuries, when it was realized that measuring change in the frequency of artifact types from different contexts and/or different depths could produce a temporal sequence.

In the late nineteenth century, Sir William Matthew Flinders Petrie, working with predynastic Egyptian ceramics from grave contexts, appears to have been the first to realize that the various combinations of artifact types found in different contexts could be ordered so that the artifact-type distributions formed overlapping, single-peaked frequency curves, with some forms gradually replacing others through time. Petrie appreciated that this arrangement represented chronological order and was able to deduce the relative ages of the graves, although identifying which end of the order was younger still required independent evidence such as stratigraphic superposition. Petrie’s innovation came to be known as ‘seriation’, a method independently elaborated upon in the 1910s by American scholars such as Alfred L. Kroeber, Nels C. Nelson, Leslie Spier, and Alfred V. Kidder, working in the American Southwest. The development of the seriation method had an enormous impact on the discipline. One of its chief advantages is that initial chronological frameworks can be tested empirically against other artifact assemblages and subsequently refined, giving the method a science-like quality that was recognized early on, even by those outside the discipline. Seriation was soon applied to archaeological assemblages throughout the world, and constructing chronological frameworks became the primary interest of most archaeologists working from the 1920s through the 1950s.
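To make the logic concrete, the following is a minimal sketch in Python (a toy illustration, not any published seriation algorithm; the grave labels and counts are invented). It searches for the ordering of a few assemblages whose type-frequency columns come closest to the single-peaked ‘battleship’ shape that frequency seriation assumes:

```python
from itertools import permutations

# Toy frequency matrix: rows are grave assemblages, columns are pottery types.
# The counts are invented purely for illustration.
assemblages = {
    "grave_A": [9, 1, 0],
    "grave_B": [5, 4, 1],
    "grave_C": [1, 5, 4],
    "grave_D": [0, 2, 8],
}

def unimodality_penalty(column):
    """Measure how far a column is from a single-peaked 'battleship' curve.

    Every value that dips below both of its neighbours adds its shortfall
    to the penalty; a perfectly unimodal column scores 0.
    """
    penalty = 0
    for i in range(1, len(column) - 1):
        dip = min(column[i - 1], column[i + 1]) - column[i]
        if dip > 0:
            penalty += dip
    return penalty

def seriate(data):
    """Brute-force search (fine for a handful of contexts) for the ordering
    whose type-frequency curves are closest to unimodal."""
    names = list(data)
    best_order, best_score = None, float("inf")
    for order in permutations(names):
        columns = zip(*(data[name] for name in order))
        score = sum(unimodality_penalty(list(col)) for col in columns)
        if score < best_score:
            best_order, best_score = order, score
    return best_order

print(seriate(assemblages))  # e.g., ('grave_A', 'grave_B', 'grave_C', 'grave_D')
```

Note that the reversed ordering scores identically, which echoes Petrie’s situation: the seriation itself cannot say which end of the sequence is younger without independent evidence.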

Another archaeological dating method, ‘dendrochronology’, was being developed in the American Southwest at around the same time that the seriation method was being refined. Rather than simply counting annual tree rings, dendrochronology exploits the distinctive patterns of varying ring widths in tree species that are sensitive to fluctuations in climate and precipitation. By linking overlapping tree samples of different ages, master sequences of this patterning have been built that extend back thousands of years. Samples of unknown age can then be matched against a master sequence to derive an age estimate. One of the principal advantages of dendrochronology is its high precision, with the potential to identify the exact year in which an event took place. Seriation and dendrochronology were remarkable achievements, developed well before the advent of radiometric dating techniques such as radiocarbon dating, and both methods continue to be used today.
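The matching step lends itself to a short sketch (illustrative Python only; the ring widths and start year are invented, and real dendrochronological cross-dating detrends the series and applies significance tests before accepting a match). The sample is slid along the master sequence, and the best-correlated position yields a calendar year:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length ring-width series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0  # a flat window carries no pattern to match against
    return cov / (vx * vy) ** 0.5

def crossdate(master, master_start_year, sample):
    """Slide `sample` along `master` and return the calendar year of the
    sample's outermost ring at the best-correlated alignment."""
    best_year, best_r = None, -1.0
    for offset in range(len(master) - len(sample) + 1):
        window = master[offset : offset + len(sample)]
        r = pearson(window, sample)
        if r > best_r:
            best_r = r
            best_year = master_start_year + offset + len(sample) - 1
    return best_year, best_r

# Contrived demo: widths for AD 1500-1509 and a four-ring core of unknown age.
master = [1.2, 0.8, 1.5, 0.6, 1.1, 0.9, 1.4, 0.7, 1.3, 1.0]
sample = [0.6, 1.1, 0.9, 1.4]
year, r = crossdate(master, 1500, sample)
print(year, round(r, 3))  # 1506 1.0 for this deliberately exact match
```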

In the last few decades, a veritable explosion of radiometric and chemical dating methods has occurred, concordant with advances in the ability to detect and measure various physical and chemical changes in materials. Many of these methods are referred to as ‘absolute dating’ methods, in which measurements are made at an interval or ratio scale and can therefore connect the age of an event to a calendrical system. By comparison, seriation is a form of ‘relative dating’, in which measurements are made at an ordinal level and, when tied to outside information such as stratigraphic superposition, yield older-than/younger-than distinctions in age. While absolute-dating methods have many advantages, all were developed outside the discipline of archaeology and thus require rigorous bridging arguments linking the event actually dated to the archaeological event of interest. Furthermore, the technical sophistication of many absolute-dating methods is costly in terms of specialized expertise and equipment. Even so, these methods offer archaeologists the possibility of assessing the age of archaeological events with greater precision and accuracy than ever before.
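Radiocarbon dating illustrates the ratio-scale character of these methods: by convention (following Stuiver and Polach), a conventional radiocarbon age follows directly from the measured ratio of a sample’s normalized 14C activity to the modern standard, using the Libby mean-life of 8033 years. A minimal sketch in Python:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years: the 5568-year Libby half-life divided by ln(2)

def conventional_c14_age(activity_ratio):
    """Conventional radiocarbon age in radiocarbon years BP, given the ratio
    of a sample's normalized 14C activity to the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(activity_ratio)

print(round(conventional_c14_age(0.5)))   # ~5568 BP: one Libby half-life
print(round(conventional_c14_age(0.25)))  # ~11136 BP: two half-lives
```

The result is an uncalibrated age in radiocarbon years; tying it to a calendar date requires calibration against curves built from independently dated samples, one of the bridging steps noted above.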



 
