Untangling identities

Probabilistic genotyping is widely accepted as the more accurate way of extracting profiles from mixed DNA samples, but courts are questioning the software being used. Police Professional examines whether the algorithms can pass the evidential test.

Nov 15, 2017

In 2013, 108 forensic science laboratories across the US took part in a study examining a mixed DNA sample taken from a ski mask left at a crime scene following a series of bank robberies. They were asked to determine whether a separate DNA sample – ‘X’ – supposedly obtained from a suspect in the crimes, was also part of the mix. The profile, containing a mixture of DNA from four individuals, had been created by geneticist Dr Michael Coble of the National Institute of Standards and Technology (NIST), a non-regulatory agency of the US Department of Commerce. Seventy-three of the laboratories got the answer wrong, producing results that suggested DNA belonging to ‘X’ had been found on the mask even though it was not present. “It’s the Wild West out there,” Dr Coble said at the time. “Too much is left to the analysts’ discretion.”

DNA has long been the ‘gold standard’ of forensic science, its ability to link a suspect to a crime scene considered irrefutable. And advances in DNA profiling, such as the polymerase chain reaction, mean it is now possible to detect DNA at levels hundreds, or even thousands, of times lower than when the technique was developed in the 1980s. Investigators can even collect ‘touch DNA’ – the handful of cells left behind when someone briefly touches a drinking glass or the handle of a door.

However, this advanced profiling can just as easily create ‘false’ positives, with analysts now potentially ‘picking up’ DNA that has been transferred from one person to another by way of an object that both have touched.

With a simple two-person sample, analysts examine two sets of peaks at a given ‘genetic locus’ – one for the victim and one for the perpetrator. Analysis becomes far trickier when DNA from a number of possible contributors is detected in a single crime scene sample. With mixtures, analysts are looking at clusters of peaks, with no indication of which pairs go together, or their source – apart from those of the known victim. At this point the analysis becomes highly subjective.

Until a few years ago, most laboratories tasked with analysing mixed DNA profiles used a statistical approach called the ‘combined probability of inclusion’, but this has now largely been replaced by a newer technique, known as probabilistic genotyping, which uses computer software to analyse mixed samples. As demand for this type of analysis has grown, many companies are vying for a greater share of the market by claiming their program produces the most accurate results.

There are around eight probabilistic genotyping software programs currently on the market. They include LiRa, produced by LGC Forensics in the UK; likeLTD, created by Professor David Balding of University College London; TrueAllele, from Cybergenetics in Pittsburgh, Pennsylvania; and STRmix, from New Zealand. All four have been used in court cases around the world, including a number of high-profile trials in the UK.

Traditional DNA analysis involves manually and visually interpreting DNA markers. Probabilistic genotyping software instead runs the DNA data through a series of statistical algorithms to calculate how much more likely the evidence is if a particular person’s DNA is present in the mixture than if a random person’s DNA is there instead – a figure known as the likelihood ratio. The software is said to remove human bias from the equation, delivering accurate, consistent results.
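To make the two statistics concrete, here is a minimal single-locus sketch in Python. The allele frequencies, genotypes and three-locus repetition are invented for illustration only and are not drawn from any of the programs named above; real casework software also models peak heights, stutter, degradation and allele dropout on top of this arithmetic.

```python
# Toy illustration of two mixture statistics: the older combined probability
# of inclusion (CPI) and the likelihood ratio (LR) that probabilistic
# genotyping software reports. All numbers are invented for illustration.

from math import prod

# Hypothetical population frequencies for four alleles at one locus.
freq = {"A": 0.10, "B": 0.20, "C": 0.05, "D": 0.15}

# --- Combined probability of inclusion (CPI) ---
# At each locus, PI = (sum of frequencies of all alleles seen in the mixture)^2:
# the chance that both of a random person's alleles fall inside the observed set.
def probability_of_inclusion(observed_alleles):
    p = sum(freq[a] for a in observed_alleles)
    return p ** 2

# CPI multiplies PI across loci; here the same toy locus is reused three times.
loci = [{"A", "B", "C", "D"}] * 3
cpi = prod(probability_of_inclusion(alleles) for alleles in loci)
print(f"CPI across {len(loci)} loci: {cpi:.4f}")

# --- Likelihood ratio (LR) ---
# Toy two-person mixture at one locus: observed alleles {A, B, C, D},
# known victim genotype (A, B), suspect genotype (C, D).
# Hp: victim + suspect contributed, so the evidence is certain: P(E|Hp) = 1.
# Hd: victim + unknown contributed, so the unknown must carry (C, D),
#     which a random person does with probability 2 * p_C * p_D.
p_e_given_hp = 1.0
p_e_given_hd = 2 * freq["C"] * freq["D"]
lr = p_e_given_hp / p_e_given_hd
print(f"LR at this locus: {lr:.1f}")  # evidence is LR times more likely under Hp
```

In this toy case the likelihood ratio says the evidence is roughly 67 times more likely if the suspect contributed than if an unknown person did. Disputes in real cases centre on how the software arrives at the per-locus probabilities when the peaks are ambiguous, not on this final division.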
Mark Perlin, chief executive officer of Cybergenetics and the creator of TrueAllele, started building the program for casework in 1999, a few years after working on the Human Genome Project. In the early 2000s, his company helped clear the backlog of DNA samples awaiting interpretation for the UK’s national DNA database, and later used TrueAllele to help identify victims’ remains at the World Trade Center site after the 9/11 terrorist attacks. TrueAllele was used for the first time in a criminal case in 2009 and now encompasses some 170,000 lines of computer code. The program claims to be able to untangle mixed profiles and “produce accurate results on prev
