Wednesday, Aug 7: 11:50 AM - 12:05 PM
3047
Contributed Papers
Oregon Convention Center
The persistent problem of innocent people being wrongly convicted underscores the need for scrutiny and improvement in the US criminal justice system. Statistical methods for evaluating forensic evidence, such as glass fragments, fingerprints, and DNA, have helped resolve complex criminal investigations. Yet no national standards have been established that would enforce rigorous statistical analysis of forensic evidence. We investigate the use and misuse of statistical methods in criminal investigations, such as the likelihood ratio approach to hypothesis testing. We further consider graphical models, in which hypotheses and evidence are represented as nodes connected by arrows describing association or causality. We emphasize the advantages of special graph structures, such as object-oriented Bayesian networks and chain event graphs, which allow different types of evidence to be examined concurrently. Finally, we discuss strategies for making the interpretation of statistical analyses of forensic evidence more accessible to non-statisticians, especially in the courtroom, where decisions about the fate of potentially innocent individuals are made every day.
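As context for the likelihood ratio approach mentioned in the abstract, here is a minimal sketch of how an LR for a matching DNA profile might be computed. The numbers and hypothesis labels are illustrative assumptions, not values from this work:

```python
# Minimal sketch of the likelihood ratio (LR) approach to forensic
# evidence evaluation. All numeric values are illustrative
# assumptions, not data from the abstract.

def likelihood_ratio(p_e_given_hp: float, p_e_given_hd: float) -> float:
    """LR = P(E | Hp) / P(E | Hd), where Hp is the prosecution
    hypothesis (e.g., the suspect is the source of the DNA) and Hd
    is the defense hypothesis (e.g., an unrelated person is)."""
    if p_e_given_hd <= 0:
        raise ValueError("P(E | Hd) must be positive")
    return p_e_given_hp / p_e_given_hd

# If the evidence is certain under Hp (the suspect's profile matches),
# and the matching profile occurs in 1 of 4 people in the relevant
# population under Hd, the evidence is 4 times more probable under Hp:
lr = likelihood_ratio(1.0, 0.25)  # LR = 4.0
```

An LR above 1 supports the prosecution hypothesis and an LR below 1 supports the defense hypothesis; in casework the probabilities themselves come from population genetics or other validated models, not fixed constants as in this sketch.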
Forensic statistics
DNA typing
Hypothesis testing
Likelihood ratio
Graphical models
Bayesian networks
Main Sponsor
IMS