Ballistic Imaging: Not Ready for Prime Time

Policy Backgrounders | Crime

No. 160
Wednesday, April 30, 2003
by David B. Kopel, J.D., & H. Sterling Burnett, Ph.D.


Accuracy of Computer Matching

The most extensive examination so far of the accuracy of ballistic matching found that the number of possible matches in a comprehensive database would be so large as to require a substantial diversion of police resources from other, more productive crime-fighting efforts.

The California Department of Forensic Services Study. An October 2001 study for the California Department of Forensic Services concluded that an imaging database for new handguns would be unmanageably large:

  • When applying this technology to the concept of mass sampling of manufactured firearms, a huge inventory of potential candidates will be generated for manual review. This study indicates that this number of candidate cases will be so large as to be impractical and will likely create logistic complications so great that they cannot be effectively addressed.29

"The California Department of Forensic Services concluded that an imaging database for new handguns would be unmanageable."

Led by Frederick Tulleners,30 the California researchers conducted tests to gauge the accuracy of ballistic imaging. In the first test, two rounds of Federal-brand ammunition were fired from each of 792 Smith & Wesson Model 4006 .40 caliber semiautomatic pistols.31 The ballistic image of one test-fired cartridge case from each of the 792 guns was entered into an IBIS database.

From the second set of test-fired cartridges, 50 cases were randomly selected and imaged. These images were run through the database to look for matches. A suggested match was counted as a success if the IBIS computer listed the parent gun of the queried case among the top 15 most likely guns.
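
To make the study's success criterion concrete, here is a minimal sketch (in Python) of how a "top-15 hit rate" of this kind can be computed; the function name and the toy data are hypothetical illustrations, not part of IBIS or the California protocol.

    # Illustrative sketch only: scores query cartridge cases against ranked
    # candidate lists and reports how often the parent gun appears in the top k.
    # IBIS's actual correlation scoring is proprietary; the data here are toys.
    def top_k_hit_rate(queries, k=15):
        """queries: list of (parent_gun_id, ranked_candidate_gun_ids) pairs."""
        hits = sum(1 for parent, ranked in queries if parent in ranked[:k])
        return hits / len(queries)

    # Hypothetical example with three query cases (the study used 50):
    example = [
        ("gun_042", ["gun_107", "gun_042", "gun_311"]),  # hit: parent ranked 2nd
        ("gun_511", ["gun_023", "gun_089", "gun_640"]),  # miss: parent absent
        ("gun_640", ["gun_640", "gun_215", "gun_777"]),  # hit: parent ranked 1st
    ]
    print(top_k_hit_rate(example, k=15))  # 0.666... for this toy data

Under the California criterion a query counted as a success at k = 15; ordinary casework, as discussed below, uses a tighter cut of roughly k = 10, which can only lower the reported rate.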

As mentioned previously, various parts of a firearm may mark the cartridge casing and bullet, but for automated imaging only the firing pin impressions, breech face marks and ejector marks are used.

  • The computer failed to suggest any top 15 match in 38 percent of the test runs.
  • In the remaining 62 percent of the tests, at least one correct match was indicated based on the markings from either the firing pin or the breech face.
  • In 48 percent of the tests in which IBIS suggested a correct match, the correct gun appeared in the top 15 based on distinguishing characteristics from both the breech face and the firing pin.32

"California test had poor computer match rate with less than 800 handguns in a test database."

In other words, in this very limited test, distinguishing characteristics from more than one part of the firearm were matched and listed in the top 15 suggested matches less than one-third of the time (48 percent of 62 percent, or roughly 30 percent of all test runs). This is a problem because expert ballistic examiners are relatively scarce and their limited time is valuable. Their talents are called upon only after the computer-imaging database has found a reasonable likelihood of a match - usually a top 10 match on more than one characteristic. The more matching marks, the more likely a ballistic examiner is to conduct a final comparison. In the real world, it would not be surprising to find that ballistic examiners, as a rule, did not even bother to inspect cartridges for comparison when the computer failed to find at least two matching characteristics between casings.
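
A minimal sketch of the kind of triage rule just described, assuming (hypothetically) that a candidate gun is referred to an examiner only when it ranks in the top 10 for at least two characteristics; this illustrates the workflow, not an actual laboratory protocol.

    # Illustrative triage rule (hypothetical, not an actual lab protocol):
    # refer a candidate gun for manual comparison only if it places in the
    # top 10 for at least two characteristics, e.g. breech face and firing pin.
    def refer_for_manual_review(ranks, cutoff=10, min_characteristics=2):
        """ranks: dict of characteristic name -> candidate's rank,
        e.g. {"breech_face": 4, "firing_pin": 12}."""
        strong = sum(1 for r in ranks.values() if r <= cutoff)
        return strong >= min_characteristics

    print(refer_for_manual_review({"breech_face": 3, "firing_pin": 7}))   # True
    print(refer_for_manual_review({"breech_face": 3, "firing_pin": 40}))  # False

On the California data, only about 30 percent of queries would clear even a looser two-characteristic filter with a top-15 cutoff.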

A second test used 22 Smith & Wesson pistols, firing one shot using each of five different brands of ammunition. (The brands used were PMC-Eldorado, CORBON, ARMSCOR, Remington and Winchester.) Seventy-two of these cartridge cases were then tested against the ballistic database that had been constructed using Federal ammunition.

  • On this test, only 11 percent of the computer tests put the correct gun in the top 15 for both breech face and firing pin images.
  • Thirty-eight percent of the tests put the correct gun in the top 15 for breech face or firing pin images.33

The second test illustrated the tremendous degradation in ballistic imaging accuracy when the recovered test cartridge comes from a different manufacturer than the cartridge in the database.

The failure rate might have been even greater if the California researchers had not used looser "success" criteria than ballistic examiners actually use. Ordinarily, examiners limit their search for a ballistic match to the top 10 ranking cartridge cases.34 Examiners then visually compare those 10 cases. Below the top 10, examiners generally find that the odds of matching a case do not warrant the time and resources required. However, the California study counted as a "success" any ballistic identification ranked among the top 15 matches.35 Notably, the testing was conducted on cartridge casings, since "fired cartridge casings are much easier to correlate than fired bullets."36

The California test had a dismal computer match rate with fewer than 800 handguns in its test database. The report explained that if the number of database records were a hundred thousand (one year after database implementation) or a million (after a decade), the computer matching rate would be much lower.37
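
The report's scaling warning can be illustrated with a deliberately simplified model; the binomial assumption below is ours, chosen only to show the direction of the trend, not to reproduce the study's numbers.

    # Toy model (our simplifying assumption, not the study's analysis): suppose
    # each unrelated database entry independently outranks the true gun with a
    # small probability p. The true gun stays in the top k only if fewer than k
    # entries outrank it, so the hit probability falls as the database grows.
    from math import comb

    def prob_true_gun_in_top_k(n_entries, p_outrank, k=15):
        # P(fewer than k of the n entries outrank the true gun), binomial model
        return sum(comb(n_entries, i) * p_outrank**i * (1 - p_outrank)**(n_entries - i)
                   for i in range(k))

    for n in (800, 100_000, 1_000_000):   # database sizes discussed in the text
        print(n, round(prob_true_gun_in_top_k(n, p_outrank=0.01), 4))

Any such model oversimplifies real correlation scoring, but it captures the report's point: at a hundred thousand or a million entries, chance near-duplicates crowd the true gun out of the top ranks.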

In contrast to NIBIN, which focuses on crime guns, comprehensive ballistic imaging would likely produce so many candidate "hits" that the resulting pool of potential "suspect" guns would be too large to be useful to forensic examiners.

That markings change over time is also a problem. Even the limited success with computer matching was achieved by comparing cases fired from the same gun at nearly the same time. The report cautioned: "Firearms that generate markings on cartridge casings can change with use and can also be readily altered by the user. They are not permanently defined identifiers like fingerprints or DNA."38 And of course fingerprints and DNA are permanently associated with only one individual; consumer goods like firearms are not.

"A ballistic imaging database of all handguns would cost hundreds of millions of dollars and require an enormous number of personnel."

As the report from California details, a ballistic imaging database of all guns, or of all new handguns, would require substantial funding and an enormous number of personnel. At the federal level, the BATFE would have to receive significant increases in funding and staff to create and maintain such a database. This funding increase could come from a number of sources, none of which seems politically palatable. These include cutting the budgets of other programs and shifting the savings to the BATFE, substantially increasing taxes or fees on firearms or ammunition and dedicating the revenue to the BATFE database, or increasing deficit spending.

BATFE's Criticism of the California Study. A May 2002 BATFE report criticized the California study, in large part because Federal-brand ammunition was used.39 BATFE argued that other ammunition would produce better results: "The bearing surface of the bullet metal and case primer could not be too hard to get good consistent detail for correlations and later visual examination, yet the ammunition components could not be too soft, as that effect would give the correlation search a different benchmark to be compared against."40 In other words, BATFE argued that ballistic imaging studies should be performed under a Goldilocks standard, with test ammunition neither too hard nor too soft. However, criminal shootings rarely occur under controlled laboratory conditions.

Indeed, the Federal ammunition used in the California study is one of the three most popular brands sold in the United States and thus seems as likely as others to be used in a crime.

If the California study was flawed, the flaw was that it made matches much easier to achieve than they would be in real-world conditions. The study involved a much smaller number and variety of gun models than would be registered in an all-encompassing database. In addition, the guns used were all new, lacking the diverse histories of use and firing conditions that would have changed their ballistic images over time. Getting useful results from a real-world database - matching evidence recovered at or near crime scenes against images from lawfully owned guns - would present immense difficulties, much greater than either the California tests or BATFE's Goldilocks standard would suggest.

A further evaluation ordered by California Attorney General Bill Lockyer found BATFE's complaints meritless and concluded that the original California study was correct.

Peer Review of the California Study. After the California study reported dismal prospects for ballistic imaging of noncrime guns, Attorney General Lockyer ordered an evaluation of the study by Dr. Jan De Kinder, head of the Ballistics Section of the National Institute for Forensic Sciences in the Belgian Department of Justice. The De Kinder report was released in January 2003.41 The evaluation examined the California test of 50 random cartridge cases (of a single brand) and the separate test for various brands of ammunition. De Kinder stated: "I fully agree with the analysis of the data as it was performed."42

"As the number of firearms in the database is increased, the results worsen considerably."

As De Kinder explained, "For the system to be successful, the correct gun should be listed in the top few ranks."43 For the Federal ammunition, the tests had found that 38 percent of the pistols did not even place in the top 15 ranks; for those pistols, IBIS ranked at least 15 other guns as more closely matched with a particular cartridge case than the gun from which the case was actually fired.44

The California test using a variety of ammunition brands had achieved even worse results, with 62 percent of pistols not placed in the top 15 ranks in either breech face imaging or firing pin imaging. De Kinder commented: "[T]he trends in the obtained results show that the situation worsens as the number of firearms in the database is increased."45 This is precisely why collecting ballistic images for guns not associated with crimes - such as all new guns or all new handguns - would make current ballistic imaging programs much less effective.

BATFE had criticized the California study because it used Federal cartridges, rather than Remington; BATFE claimed that Federal primers are too hard, and thus do not mark well. De Kinder explained that Federal primers are actually significantly softer than Remington's, and indeed are the softest of seven different brands of ammunition tested by Erich Randich of Lawrence Livermore National Laboratory.46

De Kinder further noted that BATFE had previously used Federal ammunition in its own protocol testing.47 The manufacturer of IBIS, Forensic Technologies, Inc. (FTI), had argued that results from eight of the 50 cartridges should be ignored, since those cartridges did not match their parent gun even when carefully studied by a firearms examiner. De Kinder replied: "FTI proposed to remove them from the statistics to achieve better results. This is unacceptable...all data points have to be taken into consideration."48
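
A quick back-of-envelope (our arithmetic) shows why such exclusions flatter the results; it assumes, as the text implies, that the eight cartridges FTI wanted dropped were among the cases IBIS failed to match.

    # Back-of-envelope on excluding data points. The hit count of 31 is 62
    # percent of the 50 queries in the first test, as reported above; we assume
    # the 8 cases FTI proposed to drop were among the misses.
    total_queries = 50
    hits = 31                         # ~62 percent of 50
    excluded = 8                      # cases FTI proposed to drop
    print(hits / total_queries)               # 0.62 -> rate over all data
    print(hits / (total_queries - excluded))  # ~0.74 -> inflated rate after exclusion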

De Kinder's agreement with the California study was unequivocal: "As progressively larger numbers of similarly produced firearms are entered into the database, images with similar signatures should be expected that would make it more difficult to find a link. Therefore, this increase in database size does not necessarily translate to more hits."49

In other words, collecting ballistic images from guns not involved in crime (such as all new guns) would degrade existing ballistic imaging forensic efforts. The existing city-based databases of crime-related ballistic images would be flooded with orders of magnitude more images from ordinary firearms sales. This flood of additional data would seriously impair the ability of NIBIN to produce "cold hits" linking a bullet or cartridge case to a gun owned by a criminal who was not a suspect in the crime where the bullet or cartridge case was found.

Maryland and New York Databases. The problems caused by creating databases of guns owned by law-abiding citizens are illustrated by the experience of Maryland and New York.

A 2000 Maryland law requires that images of test-fired cartridge cases for every new handgun sold be added to the state's ballistic database. Gun buyers are charged $20 per gun for this system, and the state government has so far spent $5 million on it. The database now includes images from over 17,000 guns. The program has been used 155 times by investigators and has not solved a single violent crime - even though the database comprises new handguns, which are more likely to be used in a crime than are older handguns, rifles or shotguns. The database did help identify two stolen handguns.50

The Maryland State Police spent $1.1 million to purchase the Integrated Ballistic Identification System (IBIS) software to run its database. That money was cut from community-policing funds.

"Maryland's ballistic database has images from over 17,000 guns and has been used 155 times by investigators, but has not solved a single violent crime."

The warranty on IBIS costs $150,000 per year. Seventeen people have been hired to administer the system at an annual cost of about $643,000, while annual operating costs are about $112,000. Meanwhile, the Maryland government cut 12 state trooper positions as an economy measure. Had the money spent on "ballistic fingerprinting" been used to maintain the existing community police programs or to maintain state police levels, many more crimes might have been solved or prevented. Even one solved crime would outweigh the nonexistent crime-solving accomplishments of the "ballistic fingerprinting" program.
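
A back-of-envelope tally of the Maryland figures cited above (rounded; our arithmetic, for illustration only):

    # Rough arithmetic on the Maryland figures cited above (rounded).
    guns_imaged = 17_000
    fee_per_gun = 20                                  # dollars charged to buyers
    spent_to_date = 5_000_000                         # state spending so far
    annual_recurring = 150_000 + 643_000 + 112_000    # warranty + staff + operations

    print(guns_imaged * fee_per_gun)           # 340000 -> total fee revenue, dollars
    print(round(spent_to_date / guns_imaged))  # 294    -> dollars spent per imaged gun
    print(annual_recurring)                    # 905000 -> recurring cost per year, dollars

On these figures, total fee revenue of roughly $340,000 covers only a small share of the $5 million spent so far, while recurring costs alone approach $1 million per year.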

The Maryland law has been quite effective in suppressing firearms sales. For example, the Thompson/Center Encore is a custom pistol for which the buyer can choose from a variety of calibers and barrels. The gun is shipped to the dealer without a barrel, so the gun cannot be test-fired at the factory and therefore cannot be sold in Maryland - even though this high-quality single-shot pistol costs over $500 and is virtually never used in crimes.

New York initiated its own statewide ballistic database program for law-abiding gun owners in March 2001, at a startup cost of $4.5 million. Thus far, not a single case has been prosecuted in New York as a result.51 As of November 2002, the system had yet to produce a single hit.52

