
Fingerprints: A History


Unless you belong to one of the four extended families known to have adermatoglyphia, you have a unique and unmistakable set of fingerprints. They form early in fetal development, when buckling pressures between the deep layers of the skin produce the ridges visible on the surface of your digits. Because the pattern is laid down so early and so deep in the skin, fingerprints are a permanent fixture on your fingers and toes: even after severe burns from fire or acid, the ridges eventually return.

The process by which fingerprints develop is highly random, with small factors such as amniotic fluid, blood pressure, and hormone levels affecting the outcome. These and many other conditions are why our fingerprints vary not only from person to person but from finger to finger. Scientists believe the ridges help us discern fine textures when we feel surfaces with our digits and may also help us grip objects. But what does the history of fingerprinting around the world look like?


History of Fingerprinting

Fingerprinting as a means of criminal identification is a relatively recent phenomenon, but the understanding that fingerprints are unique dates back much further. Historians and archaeologists have unearthed documents and art from ancient Babylon, China, and other early civilizations in which fingerprints and handprints were used in place of signatures, on both royal seals and business contracts.

Several systems have since been devised to categorize and identify individual fingerprints, most notably by grouping them according to shape and ridge structure; the main patterns are arches, loops, and whorls. In 1684, Nehemiah Grew published the first scientific paper on the ridge structure of fingerprints, but it was not until the 18th century that Johann Mayer declared that no two fingerprints are identical, though civilizations of antiquity likely understood this in some capacity.

Crime scene investigation as we know it today did not use fingerprints until the late 19th century, when officials in British India developed what later became known as the Henry Classification System. By 1903, US police departments had begun classifying and cataloging criminal fingerprints.


Modern Fingerprinting

Today, many security systems and government-run citizen databases incorporate fingerprint technology. From phone lock screens to door codes, fingerprinting has left a monumental imprint on the worlds of criminology, security, and technology.

Fingerprints (or the lack thereof) can tell investigators a great deal about a crime scene. They can identify potential suspects, victims, and witnesses, and help reconstruct what took place when the crime occurred. Fingerprints can also support reasonable inferences about a person's age, sex, body type, race, and occupation. An absence of fingerprints, or apparent attempts to conceal them, can reveal valuable information as well.

As government projects expand the use of fingerprinting technology, it is important to remember its potential drawbacks. While fingerprints themselves are unique, the methods used to identify them are prone to human error. Many modern cases have produced false positives, which in turn have led to false accusations and even the imprisonment of people later found to be innocent. In 1993, New York police officers were found guilty of planting fabricated fingerprint evidence to implicate a man in a murder he did not commit. Despite these cases of error, fingerprints remain a key modern forensic tool and means of identification.