IN THE CIRCUIT COURT OF COOK COUNTY - CRIMINAL DIVISION, STATE OF ILLINOIS

No. 16 CR 1321601
v.
JUDGE WILLIAM HOOKS

MOTION TO EXCLUDE FINGERPRINT TESTIMONY DUE TO

Courtney Henderson, through the Cook County Public Defender, requests that this Court exclude the testimony of Officer Malone. In support thereof, Mr. Henderson states the following:

I. INTRODUCTION

The practices of the Chicago Police Department Latent Print Unit (LPU) violate every professional norm and standard of good practice in the scientific community, with the result being unreliable forensic opinions by examiners who use flawed methods and misstate the probative value of the evidence. The problems with the LPU are many, and they include the following:

- Protocols - While standard practice in the fingerprint community requires that examiners follow written laboratory protocols that define the examination process and the opinions that result from that process, the LPU does not operate pursuant to such protocols.

- Quality Assurance - While fingerprint labs in the U.S. are required to maintain a documented quality assurance program designed to identify and correct errors, the LPU has no such program.

- Training - While other fingerprint labs have documented training programs designed to re-educate examiners on the fundamental changes and reforms in the field, the LPU has no such training program and its examiners lack basic knowledge about the present state of their own forensic discipline.

- Validation - While standard practice in the scientific community requires fingerprint alteration techniques to be validated for scientific reliability, the CPD uses techniques to alter the appearance of original fingerprint evidence without ever having validated any of those techniques to establish reliability.

- Methodology -
While some fingerprint labs in the past permitted themselves to cheat by looking at the suspect's prints when attempting to identify ambiguous features in latent prints, this older method is no longer generally accepted yet is still in use at the LPU.

- Accreditation - While hundreds of forensic labs across the country have gone through the accreditation process to establish the validity and reliability of their examinations, the LPU has side-stepped this process and sought to shelter its substandard practices from meaningful oversight.

As will be explained in detail below, the flaws with the LPU have been discovered through extensive subpoenas, through interviews1 with and cross-examination of LPU fingerprint examiners, through examinations of LPU casework, and through consultation with three of the country's leading fingerprint experts. The resulting picture is of a lab that is substandard in many ways, that has never met any external and objective criteria for proper functioning, and that employs examiners who overstate the probative value of fingerprint evidence due to a lack of training and understanding of their own discipline.

1 More recently, LPU examiners have refused to answer basic questions during pre-trial interviews, based on inappropriate advice of CPD legal counsel.
Based on this advice, LPU examiners refuse to answer numerous basic questions about fingerprint comparison, including but not limited to the following: 1) is there a recognized error rate for fingerprint comparisons; 2) is your method of fingerprint analysis subjective; 3) has the LPU validated the fingerprint enhancement techniques it uses; 4) are you certified to conduct fingerprint examinations; 5) do you read forensic journals to stay current on developments in your field; 6) have any other LPU examiners disagreed with your opinions during verification in the last 5 years; 7) are there sources of uncertainty in your examinations; 8) does your lab have written professional development procedures; 9) describe your lab's Quality Assurance program; 10) does your lab have any written Quality Assurance procedures; 11) does your lab have a written ethics code; and 12) does your lab have a written procedure for conflict resolution and corrective action. (Attachment A). While this obstructionist tactic needlessly slowed discovery of the LPU's substandard practices, it did not prevent discovery of the problems documented in this Motion.

II. IN VIOLATION OF PROFESSIONAL NORMS, THE LPU DOES NOT MAINTAIN PROPER PROTOCOLS DIRECTING ITS EXAMINERS IN HOW TO CONDUCT RELIABLE FINGERPRINT EXAMINATIONS.

Adequate protocols are a pre-requisite for the proper functioning of any crime laboratory.2 Protocols assist crime labs in producing reliable results by defining appropriate examination methods, setting thresholds for what justifies association opinions, and providing for quality assurance programs to handle problems when they arise. According to the U.S. Department of Justice, fingerprint labs must have written procedures detailing how examinations of latent prints are to be conducted, dictating what conclusions should be reported after an examination is complete, and establishing a verification process.3 This requirement is so
fundamental to the reliable functioning of a crime lab that the American Bar Association adopted a resolution directed to all crime labs, stating that "procedures should be standardized and published to ensure validity, reliability, and timely analysis of forensic evidence."4 Because a lab cannot produce consistent and reliable results without written procedures, a forensic lab cannot attain accreditation unless the lab adopts complete written procedures. For instance, Forensic Quality Services (a national forensic auditor) requires that forensic labs maintain written procedures that detail the following: proper use of fingerprint processing techniques, proper examination methods, the process for resolving differences of opinion between an original examiner and a verifying examiner, the collection and preservation of complete case file documentation, the corrective action process, a schedule for periodic management review of lab practices, training needs of examiners, and quality control procedures to monitor the validity of examination results. (Attachment B).

2 "The Fingerprint Sourcebook," U.S. Department of Justice, p. 12-5 (2012).

The failure to maintain proper written procedures has been a contributing cause of the systemic failures of numerous crime labs across the country. For instance, when the Governor of Massachusetts ordered that a law enforcement crime lab cease operations, a post-failure audit by the Massachusetts Inspector General determined that the lab lacked basic written procedures outlining proper examination methods, and that this failure "allowed [forensic examiners] to create their own discordant (and sometimes incorrect) practices." (Attachment C, p. 115). The Michigan State Police made a similar finding when they audited the failed Detroit Police Department crime lab. (Attachment D).
The MSP determined that the procedure manual for the Detroit crime lab "provides only an administrative overview with limited technical direction on how to perform an analysis." (Attachment D). Further, the audit concluded that "the lack of well-defined procedures may allow an examiner to draw conclusions on an analysis that may be inaccurate," and also concluded that "accountability is impossible" when a lab maintains inadequate written procedures. (Attachment D, p. 21-22). Similarly, auditors at the North Carolina Crime Lab determined that inadequate written procedures were a cause of that lab's systemic problem of examiners misrepresenting testing results. (Attachment E). Finally, when the FBI discovered that one of its DNA analysts had been falsifying data from DNA testing, it determined after the fact that inadequate protocols prevented earlier discovery of this forensic misconduct.5

The Illinois State Police Crime Lab (ISP) provides an example of a fingerprint lab with extensive written procedures, both administrative procedures and substantive examination procedures.6 (Attachment F). With regard to substantive protocols, the ISP has written protocols that define the proper method to be used, describe each step of the method and applicable decision standards, provide for proper verification, and define the scope of quality assurance. (Attachment F). ISP protocols specifically instruct examiners to use the ACE-V method, and define each required step in the ACE-V process, from analysis through verification. (Attachment F). ISP protocols define the threshold that must be met for an examiner to offer an opinion of an association between a latent print and a suspect, as well as the standard for a determination of "inconclusive." (Attachment F).

5 U.S. Department of Justice, "The FBI DNA Laboratory: A Review of Protocol and Practice Vulnerabilities," p. 130 (2004).
With regard to cases involving AFIS searches, ISP protocols define the minimum parameters for a latent print to be suitable for a computerized search (8 identifiable features and a discernible core and axis), and further require the preservation of these features in the case file. (Attachment F). Finally, examiners are specifically required to comply with a written quality assurance manual. (Attachment F).

In stark contrast, the LPU's substandard collection of memos that purport to be fingerprint protocols does not resemble a real laboratory protocol in any way. (Attachment G). To start with, the memos do not even mention the method or methods to be used by LPU fingerprint examiners. The memos do not enumerate any steps to be taken by LPU examiners when analyzing latent print evidence, and do not define in any way the criteria for an association opinion or an exclusion opinion. Additionally, the memos do not mention a quality assurance plan of any kind, nor describe any quality assurance procedures for documenting and resolving problems with analyses.

6 Some of the ISP procedures are administrative in nature, while others are substantive and go to the heart of the examination process. Administrative protocols describe various processes other than the actual examination of the print evidence: evidence handling and labeling, use of computer systems to generate documentation, and guidelines for interacting with other agencies. Substantive protocols describe the methods to be used for examination of print evidence, standards for determining whether a print can be associated with a suspect, and quality assurance procedures used to detect errors in analysis. While forensic labs must have both administrative and substantive protocols, it is substantive protocols (defining examination methods and providing for quality assurance) that are critically important in making sure that fingerprint examinations are done correctly and result in reliable opinions.
Rather, the memos deal almost exclusively with administrative matters (such as inventorying print evidence and uploading information into LPU computer systems), as this list of every LPU "procedure" memo establishes:

Date          Type            Description
4 Jan 2007    Administrative  Using E-track system, sealing evidence with tape
4 Jan 2003    Administrative  Putting and Inventory #s on reports
24 Jan 2008   Administrative  Don't use AFIS when case is already closed
27 Feb 2003   Administrative  Run prints in CPD database first, then ISP database
8 Sept 2008   Administrative  When CPD can accept latents from other agencies
18 Nov 2008   Administrative  Retrieval of digital latents from computer system
28 Dec 2008   Administrative  Use of new identifying number for digital latents
12 Mar 2009   Substantive     When testimony is needed, conduct second analysis
16 Jun 2009   Administrative  Instructions on reporting use of digital latent
8 Oct 2009    Administrative  Using eTrack system to send evidence to Forensic Services
14 Jul 2010   Administrative  Paperwork necessary prior to examination of latent
9 Aug 2010    Administrative  Using the CLEAR system to notify of suitable latents
16 Aug 2010   Administrative  Procedure for incorrect RD numbers on latent evidence
16 Aug 2010   Administrative  Using CLEAR to track requests for analysis
25 Jan 2012   Administrative  Order of analysis in property crime cases
1 Feb 2012    Administrative  Number/order of comparisons in multiple-latent cases
27 Feb 2012   Administrative  Wording regarding cases with open latents
16 Aug 2012   Administrative  How to inventory print evidence in computer system
20 Aug 2012   Administrative  Documenting IR numbers and SID numbers
12 Oct 2015   Administrative  Providing discovery to defense attorneys
4 Dec 2015    Administrative  Providing discovery to defense attorneys
No date       Administrative  User guide for automated report system

Confirming that this collection of administrative memos does not constitute a procedure manual in any meaningful sense, a fingerprint examiner with the LPU (Officer Thurston Daniels) has admitted in sworn testimony that LPU examiners follow no substantive procedures when conducting fingerprint examinations:

Q: But, importantly, in these memos that serve as, I guess, what your unit calls procedure, you don't rely on these memos to tell you anything about how to conduct ACE-V, right?
A: Right.
Q: Okay. And you don't rely on these memos to tell you what characteristics you should be looking for in a possible source of a latent, right?
A: That's correct.
Q: And the memos don't include any standard whatsoever for when you can say that a print has been identified, right?
A: That's correct.
Q: And they don't give you any guidance on how verification is supposed to be conducted?
A: No. You don't need guidance for verification.
Q: Okay. And they don't mention what to do if you have an opinion that's different from a colleague, right?
A: No.
Q: And just so we're clear, these memos you've been talking about, these are the only written directives that your unit keeps whatsoever?
A: As far as I know.

(Attachment H, p. 134).

In addition to Officer Daniels' admissions, three leading fingerprint experts have reviewed the LPU memos and conclude that they are substandard in every way. Glenn Langenburg7 reviewed the LPU memos and concludes that these memos "do not contain any procedures and indications of how the CPD LPU conducts fingerprint examinations." (Attachment J). Langenburg notes that maintaining adequate written protocols is one of the three fundamental requirements of any forensic lab, and further notes that the lack of written protocols at the LPU is "terribly surprising" given that "all of this information and industry accepted examples of SOPs are readily available for free on the internet." (Attachment J). In addition to Langenburg, Cedric Neumann8 reviewed the same LPU memos, and concludes that "it is impossible that this collection of documents could guarantee any form of scientific rigor in the examinations performed by the latent print examiners [at the LPU]." (Attachment L). Neumann explains that none of the memos address the scientific elements of fingerprint examinations, and further explains that "none of the documents attempts to describe, even in vague terms, the process that latent print examiners in the Chicago PD Latent Print Unit need to follow to examine latent prints and reach conclusions." Because of these flaws and others in the CPD memos, Neumann concludes that the collection of memos "does not ensure that all examiners examine fingerprints in a transparent, reproducible, and reliable manner." Finally, fingerprint expert David Stoney9 reviewed these same CPD memos and reaches a similar conclusion: the LPU does not "maintain SOPs that define the latent print examination procedures that are used." (Attachment N).

Due to the lack of written protocols alone, this Court should exclude the results of fingerprint examinations by the LPU. There is no more basic requirement in forensic science than operation pursuant to generally accepted protocols. Without them, there is no assurance that examiners are properly applying generally accepted methods, no assurance that quality problems are being identified and addressed, and no assurance of valid and reliable results.

7 Glenn Langenburg has a Masters in Analytical Chemistry and a Ph.D. in Forensic Science, and his thesis evaluated best practices for conducting fingerprint examinations pursuant to the ACE-V method. He has been a practicing fingerprint examiner for 17 years, testifying frequently for the prosecution. He has been an appointed/elected member of many national fingerprint bodies, including SWGFAST, the NIJ Human Factors Working Group, and the IAI Standardization Committee (winning the Distinguished Member Award in 2007). He has published more than 20 articles in the field of forensic fingerprint comparison, including one study funded by the U.S. Department of Justice that involved participation of 146 fingerprint examiners from across the country, and including another study examining sources of bias in fingerprint analyses that was funded by the Midwest Forensics Resource Center and involved participation of 43 fingerprint examiners. In addition, he has given 100s of lectures and workshops on forensic fingerprint comparison to audiences around the world. By any measure, Mr. Langenburg is a recognized leader in the field of forensic fingerprint comparisons. Mr. Langenburg's CV is provided at Attachment I.

8 Cedric Neumann has a Ph.D. in forensic science (Magna Cum Laude), has directed extensive research on fingerprint evidence funded by the U.S. Department of Justice, has been an editor of two of the most important professional journals for the forensic fingerprint community (Journal of Forensic Identification and Forensic Science International), is a past board member of the International Association for Identification (the leading professional association for fingerprint examiners in the U.S.), was a contributor to the Human Factors Working Group Report of 2012, was a member of SWGFAST (the leading organization for setting standards for best practices in the forensic fingerprint community), is a former forensic scientist for the Forensic Science Service in England (the government-funded national forensic laboratory) and a former visiting scientist with the U.S. Secret Service, and has published extensively in peer-reviewed forensic journals on the topic of forensic fingerprint comparison. Mr. Neumann's CV is provided at Attachment K.
This Court should exclude the opinion of the LPU examiner in this case, and in all cases, until such time as the LPU complies with basic industry standards and adopts adequate operating procedures.

9 David Stoney has a Ph.D. in Forensic Science, with Bachelor of Science degrees in Chemistry and Criminalistics. He is the Chief Scientist at Stoney Forensics. He has been an editor of several of the most important peer-reviewed forensic journals (Journal of Forensic Sciences, Journal of Forensic Identification, The Microscope). He is the former Director of the Forensic Sciences Program at the University of Illinois-Chicago. He is a grant-funded forensic science researcher. He has published many articles in peer-reviewed journals about forensic fingerprint comparison. He was a contributor to the Human Factors Report, as well as a reviewer of the NAS Report. Mr. Stoney's CV is provided at Attachment M.

III. IN VIOLATION OF PROFESSIONAL NORMS, THE LPU DOES NOT MAINTAIN A QUALITY ASSURANCE PLAN, GIVING THE LPU NO WAY TO DETECT AND CORRECT FLAWED EXAMINATIONS AND ERRONEOUS FORENSIC CONCLUSIONS.

Operating pursuant to a written Quality Assurance (QA) plan is a fundamental requirement of fingerprint labs. According to the U.S. Department of Justice, the purpose of a QA plan "is to ensure that all examiners meet the quality standards set by the discipline and by the individual laboratory."10 According to SWGFAST11 guidelines, a QA program is necessary for every fingerprint lab so that each lab can evaluate its practices "to ensure standards of quality are being met."12 For these reasons, SWGFAST states that a written QA plan "shall be established for organizations conducting friction ridge examinations."13 As Glenn Langenburg states, a meaningful QA program is one of the three "fundamental cornerstones of measuring competency and performance,"
and further states that any laboratory operating without a QA program "cannot objectively demonstrate that their examiners are competent in fingerprint examinations." (Attachment J). Similar to the problems described in the preceding section with substandard protocols, the failure of crime labs to maintain adequate QA programs has been a common finding in labs that suffer systemic failure.14

An adequate QA program includes "those planned and systematic actions necessary to provide sufficient confidence that a laboratory's product or service will satisfy given requirements for quality."15 Components of an adequate QA program for a fingerprint lab include the following written documents: a code of ethics, a training program, procedures for conflict resolution, procedures for testimony review, and procedures for corrective actions.16

10 U.S. Department of Justice, "The Fingerprint Sourcebook," p. 12-3 (2012).
11 The Scientific Working Group on Friction Ridge Analysis, Study, and Technology publishes guidelines and standards for fingerprint analysis in the U.S. First organized by the FBI, the SWGFAST membership includes 40 experts representing every type of law enforcement fingerprint lab, including the FBI, the U.S. Army Crime Lab, various state crime labs, and various fingerprint units of local law enforcement. swgfast.org/Members
12 SWGFAST Document, "Standard for a Quality Assurance Program in Friction Ridge Examinations," (2012).
13 SWGFAST Document, "Standard for a Quality Assurance Program in Friction Ridge Examinations," (2012).
14 The Massachusetts crime lab was cited for a failed QA program that was "insufficient to detect any malfeasance or issues related to chemist errors in drug analysis." (Attachment C, p. 47). Among the many systemic failures of the Detroit Police Crime Lab were the failures to conduct technical reviews, administrative reviews, and testimony reviews. (Attachment D, p. 25-27).
The need for a comprehensive QA program would not be so critical if the opinions of fingerprint examiners were objective, consistent from one examiner to the next, and error-free. However, the opposite is true: the opinions of fingerprint examiners are subjective and vary significantly from examiner to examiner. Examiners routinely disagree on the basic question of whether a fingerprint is suitable for comparison:17

[Figure 24: Reported conclusions following the Analysis phase for each trial for examiners using Approach #2. Outcomes are VID (value for identification), VEO (value for exclusion only), and NV (no value). The results for same-source comparisons are presented in aqua; the results for different-source comparisons are presented in red.]

Examiners disagree on how many comparison features appear in a latent print:18

[Figure 6: Detail of Figure 5 for the 39 image pairs that had median corresponding minutia counts between 6 and 9.5, with the addition of box plots showing interquartile range, minima, and maxima. (452 responses; 6 to 16 responses per image pair.)]

15 SWGFAST Document, "Standard for a Quality Assurance Program in Friction Ridge Examinations," (2012); U.S. Department of Justice, "The Fingerprint Sourcebook," p. 12-3 (2012).
16 Id. at 12-5 to 12-3.
17 Neumann et al., "Improving the Understanding and the Reliability of the Concept of 'Sufficiency' in Friction Ridge Examination," U.S. Department of Justice, p. 56 (2013) (For example, forty-five fingerprint examiners concluded that latent print #11 was suitable for comparison, while 53 examiners claimed that the same latent print was not suitable for comparison).
18 Ulery et al., "Measuring What Latent Fingerprint Examiners Consider Sufficient Information for Individualization Determinations," PLoS One 9(11) (2014). (For
some prints in this study, some examiners identified as few as 5 features while other examiners analyzing the same print identified 20 or more features, leading the authors of the study to state "Although we expected variability in minutia counts, we did not expect the counts to vary as much as they did, especially in those critical cases in which examiners do not agree on their determinations and precise counting might be pivotal. The differences in minutia count understate the variability because annotations not only differ substantially in total minutia counts, but also in which specific minutiae were selected.").

Finally, with regard to the ultimate question of whether a print can be associated to a suspect, comprehensive studies demonstrate the same trend: examiners often disagree.19

[Table 4: Repeatability and reproducibility of individualization and exclusion decisions, by examiner assessment of difficulty. For comparisons rated Obvious/Easy/Moderate, individualization decisions were repeated 92% and reproduced 85% of the time, and exclusion decisions were repeated 88% and reproduced 77% of the time; rates for comparisons rated Difficult/Very Difficult were substantially lower.]

All of this modern research in the field tells the same tale: fingerprint comparisons are subjective individual assessments that differ significantly from examiner to examiner. Given the subjectivity of the discipline, it should come as no surprise that errors in fingerprint cases20 are common. The most reliable error rate data in the discipline comes from two large-scale error rate studies published in the last several years. One such study, conducted by the Miami-Dade Police Department, generated a false-positive rate of 1 in 24 cases, and possibly as high as 1 in 18 cases.21 A second study, conducted by the FBI, generated a false-positive rate of 1 in 604 cases, and possibly as high as 1 in 306 cases.22 False exclusion errors (examiners concluding that a print did not come from a suspect when it in fact did come from the suspect)

19 Ulery et al., "Repeatability and Reproducibility of Decisions By Latent Print Examiners," Proceedings of the
National Academy of Sciences, p. 8 (2012) (Examiners analyzing the same latent print evidence disagree about 50% of the time on final opinions in more difficult cases, and disagree about 20% of the time on even the easiest cases).
20 Although there are many different types of errors that fingerprint examiners can make, one of the most troubling is the false positive: claiming that a latent print can be associated to a suspect when the suspect is in fact not the source of the latent print.
21 President's Council of Advisors on Science and Technology, "Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods," September 2016, p. 95.
22 Id. at 94.

were even more common than false positive errors. Based on this data, it is reasonable to expect errors in forensic fingerprint cases on a routine basis. And while there is no way to determine how many defendants in criminal cases have been convicted based on erroneous fingerprint claims, there are many reported instances of this phenomenon.23

Looking at the big picture, the data described above tells the tale of the need for a robust QA system: a program designed to referee differences of opinion, detect errors, and remediate poor practices that lead to errors. Despite this reality, the LPU has no QA program to ensure the reliability of its forensic analyses. First, the collection of LPU memos listed above does not contain any of the basic elements of a QA program. Second, the undersigned attorneys sought to

23 Higgins, "Fingerprint Evidence Put on Trial," Chi. Trib., 25, 2002 (where a fingerprint examiner from the
Chicago Police Department erroneously matched a critical latent print in the Brown's Chicken killings to an innocent person); Jofre, "Falsely Fingered," Guardian, July 9, 2001 (where 4 different law enforcement fingerprint examiners made two different false matches in one case, with a detective being falsely implicated by the 4 examiners, who found 16 or more points of similarity between the latent print and the detective); Scheier, "New Trial Sought in U. Darby Slaying," Phila. Inquirer, Aug. 16, 1999 (where the defendant was wrongfully convicted of murder based on a fingerprint misidentification by 3 different fingerprint examiners); Vigh, "Evidence Bungled in Slaying," Salt Lake Trib., Feb. 19, 2003 (where a man was falsely implicated in an attempted murder case when a fingerprint examiner erroneously matched two bloody prints from the crime scene to the defendant); Michael Coit, "Santa Rosa Woman Identified as Vegas Slaying Victim Turns up Alive," Santa Rosa Press Democrat, Sept. 13, 2002 (documenting a case where a female crime victim was misidentified through an erroneous fingerprint match, when the latent fingerprint was attributed to a living woman); Commonwealth v. Cowans, 756 N.E.2d 622 (Mass. App. Ct. 2001) (where the defendant was erroneously matched to a latent fingerprint from the scene, only to be exonerated by DNA thereafter); State v. Caldwell, 322 N.W.2d 574 (Minn. 1982) (where 3 different fingerprint examiners misidentified a fingerprint with 11 points of similarity to the defendant); Cooper v. Dupnik, 963 F.2d 1220 (9th Cir. 1992); Starrs, "More Saltimbancos on the Loose? Fingerprint Experts Caught in a World of Error," 12 Sci. Sleuthing Newsl. 1 (1988) (where 2 different print examiners misidentified two latent fingerprints with 11-12 points of similarity to the defendant in a rape case) (where a review of work by a fingerprint examiner in North Carolina found 4 erroneous identifications); Grey, "Yard in Fingerprint Blunder," London Sunday Times, Apr. 1997 (where a suspect was incarcerated for a rape of a child based on a fingerprint misidentification) (also documenting another erroneous fingerprint match in a burglary case from 1987, when 3 separate examiners falsely attributed two latent prints from a burglary scene to the suspect); Noble and Averbuch, "Never Plead Guilty: The Story of Jake Ehrlich," p. 295, 1955 (where a defendant was convicted based on a 14-point fingerprint misidentification); McRoberts et al., "Forensics Under the Microscope," Chicago Tribune, October 17, 2004; Cole, "More Than Zero: Accounting for Error in Latent Fingerprint Identification," J. of Criminal Law & Criminology, 95, p. 985-986, 997-998 (documenting the existence of numerous unreported cases of fingerprint error); Saltzman & MacDaniel, "Man Freed in 1997 Shooting of Officer: Judge Ruling After Fingerprint Revelation," Boston Globe, Jan. 24, 2004, at A1 (where DNA exonerated a man after he was convicted by erroneous fingerprint examiners); WA Sims-.pdf (Appendix H) (documenting 36 cases of erroneous fingerprint opinions, including 30 cases of false identification, 5 cases of false exclusion, and 1 case of erroneously failing to exclude).

obtain through subpoena any documentation that represents the LPU QA program, and the LPU responded by admitting that they have no such program (Attachment O):

Subpoena - 1.
Regarding RD file HW579017, copies of any and all documents that constitute the Quality Assurance Plan for the Chicago Police Department Fingerprint Unit, including but not limited to: copies of any written QA plans and blank copies of any forms used by examiners to record and report QA issues such as differences of opinions between examiners.

Response: no responsive documents.

Third, LPU examiners admit to the complete lack of a QA program. For instance, Officer Thurston Daniels admitted to this failure in his flippant testimony below:

Q: Okay. And in your unit, for all the examiners that are doing fingerprint examination, there's no error management system that's in place in your unit, right?
A: Other than the Public Defender's Office - -
Q: I appreciate the compliment.
A: - - checking my work.
Q: Your unit has nothing that you know of in place to deal with what would happen if an error occurred?
A: No, not from the Chicago Police Department.
Q: Okay. And pertaining to your work personally, Mr. Daniels, there's no documentation of you having ever disagreed with the opinion of another examiner from your unit?
A: No.
Q: Okay. And there's no recording of any discrepancy in your casework?
A: No.
Q: Okay. And [the memos] don't mention what to do if you have an opinion that's different from a colleague, right?
A: No.
Q: And just so we're clear, these memos you've been talking about, these are the only written directives that your unit keeps whatsoever?
A: As far as I know.

(Attachment H, p. 130-134).

Officer Iwona Dabrowska admitted the same, asserting in a pre-trial interview that the LPU "has no quality assurance measures for documenting and dealing with disagreements." (Attachment P). Finally, Officer Malone (the LPU examiner in this case) has likewise admitted to a complete lack of a QA program:

Q: You don't have any written error management system if something went wrong, right, Officer Malone?
A: That is correct.
Q: There is no quality assurance system, right?
A: Correct.

(Attachment Q, p. 27).

The complete failure of the LPU to adopt industry-required QA procedures is disturbing, and has resulted in the failure to detect erroneous opinions by LPU examiners. As detailed in pre-trial interviews, some LPU examiners (including Officer Malone) make the absurd claim that there are never differences of opinion in casework between two LPU examiners.24 These claims of uniform opinions across an entire fingerprint lab over years of operation are scientifically indefensible given the studies above showing that examiners disagree about 50% of the time on difficult cases and even 20% of the time on easy cases. Therefore, the failure of the LPU to detect and report any differences of opinion across thousands of cases is only possible if the LPU turns a total blind eye to quality issues.

24 Officer Michael Malone asserts that no verifier has disagreed with his opinions in the thousands of cases he has handled over the past 5 years. (Attachment Q)(Attachment R). Officer Iwona Dabrowska asserts that no verifier has disagreed with her opinions in casework in 5 years. (Attachment P). Officer Seavers asserts that no verifier has disagreed with her opinions in casework in 8 years. (Attachment S).

When coupled with the complete lack of adequate written procedures, the absence of a QA program should be fatal to the admissibility of the fingerprint evidence in this case. The lack of written protocols means that examiners have no guidance in how to conduct a reliable fingerprint examination, which means that errors are far more likely. The lack of a QA program means that the LPU has no system designed to catch these errors and to remedy recurring problems with examiners. Taken together, these two fundamental failures mean that the LPU is missing the most important attributes of a crime lab. In this context, this Court can have no confidence in the reliability of the fingerprint opinions being reported by the LPU.

IV.
IN VIOLATION OF PROFESSIONAL NORMS, THE LPU DOES NOT MAINTAIN A WRITTEN TRAINING PROGRAM OF ANY KIND, GIVING THE LPU NO CAPACITY TO ENSURE THAT ITS EXAMINERS UNDERSTAND THE MANY FUNDAMENTAL CHANGES IN THE DISCIPLINE.

According to a panel of leading experts in the field,25 fingerprint examiners should receive ongoing professional training, not only on substantive examination issues, but also on ethics, cognitive bias, and proper scientific research methods.26 According to the National Commission on Forensic Science,27 fingerprint examiners have an ethical obligation to "commit to continuous learning in relevant forensic disciplines and stay abreast of new findings,

25 The Human Factors Working Group was funded by the U.S. Department of Justice, and consisted of fingerprint examiners from every level of law enforcement, including fingerprint experts from the FBI, the Maryland State Police, the U.S. Secret Service, the Massachusetts State Police, the Las Vegas Police Crime Lab, the Indiana State Police Crime Lab, and the Los Angeles County Sheriff Crime Lab. In 2012, the Working Group issued a report reviewing practices and problems with the forensic fingerprint discipline. papers.ssrn.com/sol3/papers.cfm?abstract_id=2050067

26 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, p. 168 (2012).

27 The NCFS is a collaboration between the U.S. Department of Justice and the National Institute of Standards and Technology. The stated mission of the NCFS is to provide recommendations concerning "national methods and strategies for strengthening the validity and reliability of forensic science." The NCFS Board includes representatives from law enforcement forensic agencies from around the country, including but not limited to: the founder of the Armed Forces DNA Identification Lab, a chemist formerly with the U.S.
Customs Lab, a member of the FBI DNA Advisory Board, the Director of the Palm Beach Sheriff's Office Crime Lab, a forensic chemist with ATF, the Director of the Virginia State Crime Lab, and a forensic chemist with the FBI. The NCFS is co-chaired by the Deputy U.S. Attorney General.

equipment, and techniques." (Attachment T, p. 3). For a fingerprint lab to obtain and maintain accreditation, it must have written procedures "for identifying the training needs and providing training of personnel." (Attachment B, p. 20). Moreover, the lack of such a comprehensive professional development plan not only precludes accreditation, but also "can lead to perpetuation of improper or inappropriate methods."28 In addition, a lack of training can lead fingerprint examiners to mislead the trier of fact by overstating the probative value of fingerprint associations.

A training program that meets industry standards must be written, and should contain modules, listing the objectives of the modules and the reading materials required to achieve the stated objectives. Examiner progress through the training modules must be documented, and examiners must pass a competency test at the end of the last training module. (Attachment J). Separately, once an examiner successfully completes this initial training, the examiner must attend at least 80 hours of continuing training in a 5-year cycle. This retraining is critical because the field of forensic fingerprint comparison is changing rapidly: "more critical research and professional issues have occurred in the last 15 years than in the entire history of the [discipline]." (Attachment J). Only through structured retraining can a fingerprint examiner maintain competency.

As with every other fundamental requirement of a properly functioning crime lab, the LPU fails this one. This should come as no surprise to anyone who is familiar with the U.S.
Department of Justice report identifying the many systemic problems with the CPD.29 With regard to training and retraining of CPD officers in general, the DOJ determined that CPD practices are deficient:

"Our investigation revealed engrained deficiencies in the systems CPD uses to provide officers with supervision and training. CPD's inattention to training needs, including a longstanding failure to invest in the resources, facilities, staffing, and planning required to train a department of approximately 12,000 members, leaves officers underprepared to police effectively and lawfully. . . CPD and the City of Chicago have not provided such training to CPD officers for many years, to the disservice not only of those officers but to the public they serve."30

While the DOJ did not separately report on forensic training, the LPU response to subpoenas for their training documentation tells the whole story. In response to a subpoena requesting the production of written training materials, the LPU responded by admitting that they have "no responsive documents." (Attachment U). The very foreseeable consequence of this failure is that LPU examiners are shockingly unfamiliar with basic and important concepts in their own field. In particular, Officer Malone (the fingerprint examiner in this case) is unfamiliar with the following fundamentals of his own discipline:

28 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, p. 165 (2012).

29 As part of a broad investigation into unlawful CPD practices, the U.S. Department of Justice conducted a year-long investigation, including interviews of 340 members of the CPD, meetings with 90 community organizations, and a review of thousands of pages of CPD documentation. This investigation resulted in a 164-page report entitled, "Investigation of the Chicago Police Department."

30 Id.
at 93-94.

• It is widely acknowledged in the forensic fingerprint community that fingerprint opinions derived from the ACE-V method are subjective.31 Every important large-scale study in the field clearly establishes subjectivity.32 Yet, Officer Malone denies this truism. (Attachment Q, p. 53).

• Although fingerprint examiners testified in the past to 100% certainty in their conclusions, this was never scientifically defensible, and examiners have been told to stop providing this improper testimony.33 Nonetheless, Officer Malone persists in providing misleading testimony on this concept, recently testifying under oath that "if an identification is made, I am, I'm speaking for myself, 100 percent certain that I have made an identification. . . when I make an identification, I am 100 percent certain that it is an identification." (Attachment Q, p. 51-52).

• It has never been proven in any meaningful or systematic way that every fingerprint of every person in the world is unique.34 Yet, Officer Malone testifies without scientific justification that fingerprints are unique. (Attachment Q, p. 31).

• Modern fingerprint examiners recognize the uncertainty inherent in fingerprint comparisons, in part due to phenomena such as connective ambiguity35 (the fact that examiners cannot always distinguish between important fingerprint features such as

31 The NAS Committee reported in 2009 that "the outcome of a friction ridge analysis is not necessarily repeatable from examiner to examiner. . . This subjectivity is intrinsic to friction ridge analysis." The NAS Committee further reported that the final opinion of an examiner regarding whether a fingerprint came from a certain person "is a subjective assessment." National Academy of Sciences, "Strengthening Forensic Science in the United States: A Path Forward," National Academies Press, 2009, p.
184; See also, Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012 ("At every step in the process human factors can affect the outcome. Latent print examiners rely heavily on their training and experience to make the required judgments. Subjectivity is an inextricable part of the process.").

32 See, Ulery, et al., "Repeatability and Reproducibility of Decisions By Latent Print Examiners," Proceedings of the National Academy of Sciences, 2012.

33 According to the Human Factors Working Group, "Because empirical evidence and statistical reasoning do not support a source attribution to the exclusion of all other individuals in the world, latent print examiners should not report or testify, directly or by implication, to a source attribution to the exclusion of all others in the world." Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012, p. 72. Likewise, the U.S. Army Crime Lab has rejected claims of source attribution, stating "several well respected and authoritative scientific committees and organizations have recommended forensic science laboratories not report or testify, directly or by implication, to a source attribution to the exclusion of all others in the world or to assert 100% certainty and state conclusions in absolute terms when dealing with population issues." (Attachment V).

34 Page et al., "Uniqueness in the Forensic Identification Sciences: Fact or Fiction?," Forensic Science International (2011)(stating that "the concept of uniqueness has more the qualities of a cultural meme than a scientific fact. . . most of the studies attempting to prove the uniqueness of a particular forensic feature suffer flaws that render their conclusion . . ."); See also, Knapton, "Why Your Fingerprints May Not Be Unique,"
The Telegraph, 2014 ("Nobody has yet proved that fingerprints are unique and families can share elements of the same . . ."); See also, Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, p. 165 (2012)(stating that "uniqueness does not guarantee that prints from two different people are always sufficiently different that they cannot be confused, or that two impressions made by the same finger will also be sufficiently similar to be discerned as coming from the same source.").

35 fprints.nwlean.net (defining connective ambiguity as "minutia that cannot be specifically determined due to distortion").

bifurcations and ridge endings), close non-matches36 (the fact that fingerprints from different people can share the same constellation of features, raising the possibility of a false fingerprint association), and error rates37 (the two scientifically valid error rate studies in the field established upper-bound false positive rates of 1 in 306 cases and 1 in 18 cases respectively). Officer Malone has testified that he is unaware of each of these concepts. (Attachment Q, p. 18; Attachment R).

• The Journal of Forensic Sciences is the "internationally recognized scientific journal" for the "world's most prestigious forensic science organization," the American Academy of Forensic Sciences. The AAFS has 7,000 members, representing all 50 states and 70 other countries.38 The Journal routinely publishes articles describing advancements and reforms in the field of forensic fingerprint comparisons.39 Officer Malone has never heard of it. (Attachment Q, p. 25).

In terms of systemic failure, the lack of training and understanding does not stop with Officer Malone, but includes his co-workers in the LPU:

36 See, Office of Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, 2006, p.
137 (where the OIG determined that one factor contributing to the false identification was the fact that an AFIS search is designed to identify close non-matches, and further finding that the "likelihood of encountering a misleadingly close non-match through an AFIS search is therefore greater than in a comparison of a latent print with the known prints of a suspect whose connection to a case was developed through an investigation"); See also, Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012, p. 63 ("When comparing latent prints to exemplars generated through AFIS searches, examiners must recognize the possibility and dangers of incidental similarity. Adjustments such as a higher decision threshold, stricter tolerances for differences in appearance, and explicit feature weighing need to be considered. Modifying quality assurance practices for this scenario also should be considered.").

37 Office of the President's Council of Advisors on Science and Technology, "Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods," p. 149 (2016).

38

39 Langenburg et al., "Testing for Potential Contextual Bias Effects During the Verification Stage of the ACE-V Methodology When Conducting Fingerprint Comparisons," JFS (2009); Speckels, "Can ACE-V Be Validated?," JFS (2011); Dror and Rosenthal, "Meta-Analytically Quantifying the Reliability and Biasability of Forensic Experts," JFS (2008); Langenburg, "Distortion in Fingerprints: A Statistical Investigation Using Shape Measurement Tools," JFS (2014); de Jongh and Rodriguez, "Performance Evaluation of Automated Fingerprint Identification Systems for Specific Conditions Observed in Casework Using Simulated Fingermarks," JFS (2012); Christensen et al., "Error and Its Meaning in Forensic Science," JFS (2014).

Officer Dabrowska: (Attachments A, P).
• Does not know the meaning of blind analysis.40
• Has no understanding of a "close non-match" and the problems associated with close non-matches in AFIS cases.41
• Believes that there is a 0% chance that she makes errors.42
• Does not know what contextual bias means and does not recognize cognitive bias as a potential source of error.43
• Claims that all fingerprints are proven to be unique.44

40 The FBI has been using blind analysis of the latent fingerprint for years: "According to LPU Unit Chief Meagher, the analysis should be performed on the latent print before consideration of any available known prints, in order to 'limit or try to restrict any bias in terms of what appears in the known exemplar.' In other words, analysis of the latent is performed prior to the examination of the relevant exemplar, in order to avoid having the known print suggest features in the latent print to the examiner." Office of Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, p. 105-106 (2006).

41 Office of Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, 2006, p. 137 (where the OIG determined that one factor contributing to the false identification was the fact that an AFIS search is designed to identify close non-matches, and further finding that the "likelihood of encountering a misleadingly close non-match through an AFIS search is therefore greater than in a comparison of a latent print with the known prints of a suspect whose connection to a case was developed through an investigation"); See also, Dror and Mnookin, "The Use of Technology in Human Expert Domains: Challenges and Risks Arising From the Use of Automated Fingerprint Identification Systems in Forensic Science," Law, Probability and Risk, Vol. 9, 2010, p.
55 ("the chances of finding by pure coincidence a lookalike print, a print originating from another person but that is nonetheless extremely similar to the latent print, is much higher than when comparing the latent print to just a few dozens, hundreds or even thousands of prints prior to the introduction of [AFIS]"); See also, Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012, p. 63 ("When comparing latent prints to exemplars generated through AFIS searches, examiners must recognize the possibility and dangers of incidental similarity. Adjustments such as a higher decision threshold, stricter tolerances for differences in appearance, and explicit feature weighing need to be considered. Modifying quality assurance practices for this scenario also should be considered.").

42 According to an article published in the leading forensic science journal, "there is always a nonzero probability of error, and to claim an error rate of zero is inherently unscientific. . . We strongly recommend that educational programs in forensic sciences as well as training programs for practitioners address error and error analysis." Christensen et al., "Error and Its Meaning in Forensic Science," Journal of Forensic Sciences, Vol. 59, p. 123-126 (2014); See also, Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012, p. 127 (stating that a testifying expert "should be familiar with the literature related to error rates. . . The expert should not state that errors are inherently impossible or that a method inherently has a zero error rate").

43 According to the FBI Crime Lab, "examiners must be aware of. . . contextual bias and confirmation bias," and "continuing education can keep these topics fresh in an examiner's mind."
. . . munications/fsc/oct2009/review.

44 See footnote #34.

Officer Joseph Calvo: (Attachment R)
• Does not know that statements of 100% certainty are not permitted in the field.45
• Cannot discuss the concept of close non-matches.46
• Cannot discuss fundamental error rate studies in the field and admits that he is not "an expert in error rates."47
• Cannot discuss the validity research in his own field and admits that he is not "an expert in reproducibility studies."48
• Denies that fingerprint examinations lead to subjective opinions of individual examiners.49

Officer Seavers: (Attachment S).
• Could not define SWGFAST.50
• Unfamiliar with the research in the field discussing the increased chances of error in AFIS cases due to close non-matches in large databases.51
• Does not know that statements of 100% certainty are not permitted in the field.52
• Cannot define the concept of incidental similarity.53

45 See footnote #33.

46 See footnote #36.

47 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012, p. 127 (stating that a testifying expert "should be familiar with the literature related to error rates"); See also, President's Council of Advisors on Science and Technology, "Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods," p. 53 (2016)(stating that it is essential to know the error rate of a forensic method "[b]ecause without appropriate empirical measurement of a method's accuracy, the fact that two samples in a particular case show similar features has no probative value, and, as noted above, it may have considerable prejudicial impact because juries will likely incorrectly attach meaning to the observation"
and also stating that the field of forensic fingerprint comparison has only two valid error rate studies [the FBI Black Box Study and the Miami-Dade Study] establishing upper-bound false positive rates of 1 in 306 cases and 1 in 18 cases respectively).

48 See, Ulery, et al., "Repeatability and Reproducibility of Decisions By Latent Print Examiners," Proceedings of the National Academy of Sciences, 2012 (the most important and comprehensive large-scale study assessing the reproducibility and repeatability of opinions by fingerprint examiners).

49 See footnotes #31, 32.

50 The Scientific Working Group on Friction Ridge Analysis, Study, and Technology (SWGFAST) publishes guidelines and standards for fingerprint analysis in the U.S. First organized by the FBI, the SWGFAST membership includes 40 experts representing every type of law enforcement fingerprint lab, including the FBI, the U.S. Army Crime Lab, various state crime labs, and various fingerprint units of local law enforcement. For several decades, it has been recognized by examiners in the field as the most authoritative source of national standards and guidelines for forensic fingerprint comparison.

51 See footnote #36.

52 See footnote #33.

53 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, 2012 ("When comparing latent prints to exemplars generated through AFIS searches, examiners must recognize the possibility and dangers of incidental similarity"); See also, Dror and Mnookin, "The Use of Technology in Human Expert Domains: Challenges and Risks Arising From the Use of AFIS in Forensic Science," Law, Probability and Risk (defining incidental similarity as "highly similar, look-alike prints from different sources").

• Could not recall if she had ever read a study validating her method of fingerprint comparison.
• Refused to answer whether fingerprint comparison opinions are subjective.
• Refused to discuss her standard for determining whether a print can be associated with a suspect.
• Refused to answer any questions about published error rates in the field.
• Refused to state whether she understood the term cognitive bias or had received any training on the topic.

Whatever ad hoc training is made available to police officers in the LPU, it clearly is not working. As documented above, LPU examiners seek to testify to a laundry list of claims that are rejected by the field and that overstate the probative value of fingerprint associations. LPU examiners seek to testify that fingerprint comparisons lead to objective truths even though they do not. LPU examiners seek to testify that they can identify the source of a latent print to the exclusion of all others in the world, even though the field has rejected such unsupported claims. LPU examiners seek to testify that they are 100% certain of their "match" claims, even though fundamental research in the field proves that opinions are not reproducible and false positive rates may be as high as 1 in 18 cases. Finally, LPU examiners seek to deny the scientific truth that fingerprint investigations that begin with an AFIS search are more likely to lead to false positives due to close non-matches in large AFIS databases.

Given the scope of information documented in this Motion, this Court is in a better position than previous judges to assess the overall failures of the LPU training and professional development program. And given the many and fundamental failures of the LPU training program documented above, this Court should not admit fingerprint opinions of LPU examiners until such time as the LPU adopts a documented training program and establishes that its examiners are sufficiently trained.

V. IN VIOLATION OF PROFESSIONAL NORMS, THE CPD DOES NOT VALIDATE ANY OF THE FINGERPRINT PROCESSING OR ENHANCEMENT TECHNIQUES IT USES AND THEREFORE HAS NO BASIS TO ASSERT THAT ITS TECHNIQUES ARE RELIABLE.
In order to establish reliability, forensic labs must validate all forensic examination methods prior to use in casework.54 Validation involves a "comprehensive performance and documentation of measurements to verify a method is reliable and fit for purpose."55 Without such testing, the resulting data would be "scientifically . . . ."56 For a forensic lab to pass basic accreditation, its methods must be validated and the validation documentation must be maintained and available.57 (Attachment B, p. 30). Additionally, SWGFAST guidelines for fingerprint labs require that labs create and maintain "method validation records" as part of an acceptable quality assurance program.58

The particular importance of method validation in the context of fingerprint cases stems from the use by fingerprint examiners of processes to alter the appearance of latent fingerprint images. Sometimes referred to as "enhancement techniques," these methods can dramatically change the appearance of a latent print, creating apparent fingerprint ridges where none were

54 SWGFAST, "Standard for a Quality Assurance Program in Friction Ridge Examinations," p. 2.

55 Kevin Schug, "Forensics, Lawyers, and Method Validation: Surprising Knowledge Gaps," The Column, Vol. 11, p. 2 (2015); See also, Forensic Science Regulator, Guidance, p. 3 (2014); See also, UNODC, "Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens," p. 9-12 (2009)(Validating the specificity of a method involves determining the extent to which the method can be subject to interference by impurities and matrices that are not the target of the method; validating the precision of a method involves establishing that the method reproduces the same or similar results each time it is used; and validating the accuracy of a method involves establishing that the results of the method are consistent with ground truth).

56 Id.
57 Method validation is required as part of ISO/IEC 17025 accreditation.

58 SWGFAST, "Standard for a Quality Assurance Program in Friction Ridge Examinations," p. 2.

previously visible.59 For instance, fingerprint labs use chemicals to try to make fingerprint ridges appear where they were previously not visible. In addition to altering the latent fingerprint image through physical and chemical processing, fingerprint examiners can dramatically alter the appearance of a suspected latent print image through the use of computer programs such as Photoshop.60

[Image: a latent print shown in its original form alongside the same print after application of the "Forensic Photoshop Process."]

59 U.S. Department of Justice, "Fingerprint Sourcebook," p. 7-20 (showing the effects of chemical processing using the chemical reagent 1,8-diazafluoren-9-one).

60 Carasso, "Alternative Methods of Latent Fingerprint Enhancement and Metrics for Comparing Them," National Institute of Standards and Technology (2013).

Clearly, the use of techniques to alter the appearance of a latent fingerprint image can have dramatic effects on the quality and quantity of data available for interpretation. The concern with the use of these alteration methods, such as Photoshop, is that they may cause "unintended collateral damage" by altering the appearance of the fingerprint ridges.61 This collateral damage comes in two equally troubling forms: the loss of real data from the original print and the creation of false data in the form of artifacts.62 With regard to the loss of real fingerprint data, experts admit that the use of Photoshop "may inadvertently eliminate vital fine scale information, information that might exclude a suspect."63 With regard to the creation of false data, experts in the field admit that the use of Photoshop can create "misleading artifacts," leading to unreliable comparisons and questionable results.64 One leading expert in the field has described digital enhancement in particular as "an area of concern"
due to the fact that the processes used in digital enhancement are not always reproducible.65 Because of the possibilities both of losing real fingerprint data and of creating false data, "forensic service

61 Carasso, "A Framework for Reproducible Latent Fingerprint Enhancements," Journal of Research of the National Institute of Standards and Technology, p. 212, 214 (2014).

62 Id. at 225.

63 Carasso, "Alternative Methods of Latent Fingerprint Enhancement and Metrics for Comparing Them," National Institute of Standards and Technology (2013).

64 Carasso, "A Framework for Reproducible Latent Fingerprint Enhancements," Journal of Research of the National Institute of Standards and Technology, p. 212, 225 (2014)(stating that the "overzealous application" of Photoshop to latent prints can both eliminate significant information from the original latent print and add new artifacts to the latent print image); See also, Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, p. 80 (2012)(stating that digital alteration of a latent image "could create artifacts that an examiner might mistake for minutiae"); See also, SWGIT, "Guidelines for Image Processing," p. 2 (2010)(stating that examiners involved in image enhancement should "use caution to avoid the introduction of artifacts that add misleading information to the image or the loss of image detail that could lead to an erroneous interpretation").

65 Carasso, "A Framework for Reproducible Latent Fingerprint Enhancements," Journal of Research of the National Institute of Standards and Technology, p. 212 (2014); See also, Smith, "Computer Fingerprint Enhancement: The Joy of Lab Color," Journal of Forensic Identification, Vol. 62, p.
464 (2011)(stating that digital alteration processes must be documented "in enough detail that any person with similar training and experience would be able to follow the steps and produce a similar, although not necessarily mathematically exact, result.").

providers should validate latent print enhancement technologies prior to use in casework."66 According to SWGIT,67 labs that implement digital enhancement techniques must not only prove that the methods are valid, but must also "implement quality assurance programs to ensure that results achieved are repeatable and valid."68

Contrary to this generally accepted practice, the CPD has not conducted any validation testing of any kind on its alteration/enhancement techniques, some of which were used in this case. The undersigned attorneys requested through subpoena the production of "any/all validation studies that validate the reliability of techniques performed on latent images by members of the Forensic Services Section." (Attachment X). In response to this subpoena, the CPD provided a print-out of a PowerPoint presentation, apparently created by the CPD, that is neither a validation document nor refers to any other validation studies of any kind. (Attachment Y). Based on this subpoena response, it seems very safe to say that the CPD has not validated any of its processing and enhancing methods (chemical or digital).

66 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach," National Institute of Justice, p. 80 (2012); See also, Brunty, "Validation of Forensic Tools and Software: A Quick Guide for the Digital Forensic Examiner," forensicmag.com (2011)(stating that "one of the issues that continues to be of utmost importance is the validation of the technology and software associated with performing a digital forensic examination"); See also, SWGIT, "Field Photography Equipment and Supporting Infrastructure,"
p. 2 (2010)(stating that "validation is a necessary part of infrastructure design and usage").

67 SWGIT stands for Scientific Working Group for Imaging Technology. The SWGIT was formed by the FBI in 1997, with a purpose of "providing definitions and recommendations for the capture, storage, processing, analysis, transmission, and output of images." SWGIT consists of members of law enforcement from every level, including federal, state, and local representatives. The immediate past Chair of the SWGIT was FBI examiner Melody Buba.

68 SWGIT, "Overview of SWGIT and the Use of Imaging Technology in the Criminal Justice System," p. 5 (2010); See also, International Fingerprint Research Group, "Guidelines for the Assessment of Fingermark Detection Techniques," (Labs seeking to validate a fingerprint alteration method should conduct a pilot study, followed by an optimization and validation trial, and complete testing with an operational evaluation). ploads/2014/06/IFRG-Resea

Without validation testing, this Court has no proof of any kind that the LPU has the knowledge and skill to properly and reliably alter fingerprint images. The lack of validation testing is compounded by the fact that the LPU does not maintain any procedures that describe the proper ways to use fingerprint enhancement techniques. (Attachment ___). Therefore, this Court should exclude the fingerprint evidence in this case until such time as the CPD validates the methods it uses to alter the appearance of fingerprint images.

VI. IN VIOLATION OF PROFESSIONAL NORMS, THE LPU STILL USES AN OLDER, FLAWED METHODOLOGY THAT IS NOT RELIABLE AND HAS BEEN REJECTED BY THE SCIENTIFIC COMMUNITY.

The LPU's substandard approach to forensic fingerprint comparison carries over to the method its examiners use to examine latent prints.
The forensic fingerprint community has learned from its past mistakes, and has reformed the method to require blind analysis and documentation of features in a latent print prior to exposure to the biasing effect of the suspect's prints. No longer are fingerprint examiners permitted to peek at the suspect's prints while attempting to identify ambiguous features in a distorted latent print. While leading crime labs have reformed their methods and adopted blind analysis, the LPU has not; its examiners continue to use the old and unreliable method.

To understand the flaws with the older side-by-side analysis and the need for method reform, it is important to understand cognitive bias as a source of error and how the older method invited cognitive bias and led to error. At its most basic, cognitive bias is outside influence that can affect any scientific or pseudo-scientific decision-maker and cause error. Some simply refer to the phenomenon as "mental contamination."69 As one scientist explains:

"Cognitive bias is the tendency for an examiner to believe and express data that confirm their own expectations and to disbelieve, discard, or downgrade the corresponding data that appear to conflict with those expectations. The observer's conclusions become contaminated with a pre-existing expectation and perception, reducing the observer's objectivity and laying the groundwork for selective attention to evidence."70

Cognitive bias causes scientists to "seek information that they consider supportive of a favored hypothesis or existing beliefs and to interpret information in ways that are partial to those hypotheses or beliefs."71 Due to this phenomenon, scientists can "see in data the patterns for which they are looking, regardless of whether the patterns are really there."72 By influencing
decision-makers to ignore data and seek support for preconceived ideas, cognitive bias is a source of error in data interpretation.73 For instance, cognitive bias skews clinical trials of novel

69 Wilson and Brekke, "Mental Contamination and Mental Correction: Unwanted Influences on Judgments and Evaluations," Psychological Bulletin, 116, p. 119, 1994.

70 Bieber, "Fire Investigation and Cognitive Bias," Encyclopedia of Forensic Science, 2014.

71 Nickerson, "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises," Review of General Psychology, 2, p. 177.

72 Nickerson, "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises," Review of General Psychology, 2, p. 181.

73 See, Hrobjartsson et al., "Bias Due to Lack of Patient Blinding in Clinical Trials," Int. J. Epidemiol., p. 1272-83 (2014)(finding empirical evidence of bias in non-blind clinical trials); See also, Tape and Panzer, "Echocardiography, Endocarditis, and Clinical Information Bias," Journal of General Internal Medicine, 1, p. 303, 1986 (examples include the identification of abnormal cells in pathology, the counting of blood cells by medical technicians, the reading of orthopedic x-rays by medical doctors, and times for stellar transits in astronomy).

medical treatments where non-blinded assessors are used.74 Decades of research in the scientific community have confirmed the troubling effects of cognitive bias in science.75 By the mid-twentieth century, scientists in the broader community began adopting blind analysis methods to minimize errors that result from cognitive bias.76 The simple theory behind blinding is that an examiner can minimize bias by "avoiding contaminants that might bias one's . . . ."77 As one expert in the broader scientific community explains: "sometimes researchers can implement procedures in the research design that minimize observer effects. A clear-cut example are the double-blind trials used in biomedical . . . ."78 Today, blind testing methods have been adopted across numerous scientific disciplines.
For instance, blind methods are "widespread in areas of particle and nuclear physics," and "blind analysis is often considered the only way to trust many results."79 In pharmaceutical drug research, the failure to blind the clinicians from information about which patients received the

74 Hrobjartsson et al., "Observer Bias in Randomized Clinical Trials With Measurement Scale Outcomes: A Systematic Review of Trials with Both Blinded and Nonblinded Assessors," Canadian Medical Association Journal, March 5, 2013, p. 201 (establishing the effects of observer bias in clinical trials and concluding that "failure to blind outcome assessors in such trials results in a high risk of substantial bias"); See also, Nuzzo, "How Scientists Fool Themselves - And How They Can Stop," Nature, October 7, 2015 (detailing the failure of reproducibility in many areas of scientific research, attributable to cognitive bias); See also, Ioannidis, "Why Most Published Research Findings Are False," PLoS Medicine, August 30, 2005 (finding significant bias in the methodology and publication of research).

75 See also, Nuzzo, "How Scientists Fool Themselves - And How They Can Stop," Nature, October 7, 2015 (detailing the failure of reproducibility in many areas of scientific research, attributable to cognitive bias); See also, Ioannidis, "Why Most Published Research Findings Are False," PLoS Medicine, August 30, 2005 (finding significant bias in the methodology and publication of research).

76 Nuzzo, "Fooling Ourselves," Nature, Vol. 526, p. 183 (2015); See also, Wilson and Brekke, "Mental Contamination and Mental Correction: Unwanted Influences on Judgments and Evaluations," Psychological Bulletin, 116, p. 134, 1994.

77 Wilson and Brekke, "Mental Contamination and Mental Correction: Unwanted Influences on Judgments and Evaluations," Psychological Bulletin, 116, p. 134-135, 1994.

78 Salkind, Encyclopedia of Research Design, p. 564.

79 MacCoun and Perlmutter, "Hide Results To Seek the Truth," Nature, Vol. 526, p. 187 (2015).
drug and which patients received a placebo would be a serious flaw in the study, due to the fact that the doctors reviewing the data would be subject to cognitive bias.80

With regard to the narrower community of forensic scientists, the move to blind analysis methods did not pick up steam until the Mayfield misidentification, which highlighted the problems with cognitive bias and the old ACE-V method to anyone who cared to look.81 In 2004, the FBI mistakenly arrested Brandon Mayfield for a terrorist bombing after FBI fingerprint examiners claimed to have associated his fingerprint to one from the scene of the bombing. However, before the FBI focused on Mr. Mayfield as a suspect, their fingerprint examiners reviewed the latent print blindly (without a suspect to compare with the latent print) and labeled only 7 features in the latent print, identifying them as either bifurcations or ridge endings. (See left side of diagram on the next page). After keying in on Mr. Mayfield as a suspect, the same fingerprint examiners then re-examined the latent print side-by-side with Mr. Mayfield's fingerprint standards. The troubling influence of cognitive bias on the conclusions of the FBI examiners was clear in two important ways. First, the FBI examiners changed and re-labeled 5 of the 7 original features (from bifurcation to ridge ending, and vice versa) after viewing Mr. Mayfield's fingerprint standard, in an attempt to make the features "match" Mr. Mayfield. (See right side of diagram below):

80 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice through a Systems Approach," National Institute of Justice, 2012, p. 10.

81 Office of Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, 2006, p. 1.

[Figures 3A and 3B: side-by-side marked-up images of the latent print and the Mayfield exemplar, with a legend identifying ridge endings, bifurcations, features changed in type (e.g., E to B or B to E), and features moved one ridge down.]

Importantly, an after-the-fact independent investigation determined that the FBI examiners erred on all but one of the changes; all but one of the features were in fact inconsistent with Mr. Mayfield, yet nonetheless were switched to appear to "match" him after a side-by-side comparison. Second, the FBI fingerprint examiners added 9 new features after examining the latent print and Mr. Mayfield's print standard side-by-side. While the FBI examiners reported that all 9 of these features matched Mr. Mayfield's fingerprint standard, after-the-fact analysis established that the FBI examiners were wrong on 7 of these 9 features.82

The influence of bias on the misidentification in the Mayfield case was documented by the Inspector General, who conducted an inquiry into the faulty forensic analysis. The OIG reported that the erroneous feature identifications by FBI fingerprint examiners "were adjusted or influenced during the comparison phase by reasoning 'backward' from features that are visible in the Mayfield exemplars."83 The OIG concluded that this error was made due to "circular reasoning" that "occurred after the Mayfield prints were compared to [the latent print from the detonator bag]."84 This circular reasoning (cognitive bias) "began to infect the examiner's mental process" and led to the examiner creating inculpatory evidence out of thin air.85

82 Office of Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, p. 171 (2006).

Having identified the terrible effects of cognitive bias on the false fingerprint association, the OIG called for method reform. The Mayfield misidentification occurred because the
older ACE-V method did not require fingerprint examiners to analyze the latent print in isolation from the suspect's prints and permanently document their feature selections. The OIG recommended that the FBI reform its ACE-V method to require blind analysis, informing the FBI that it should require "documentation of features observed in the latent fingerprint before the comparison phase to help prevent circular reasoning."86

In the years following the Mayfield misidentification, others in the forensic community woke up and called for the adoption of blind fingerprint analysis methods. The National Academy of Sciences87 (NAS Committee) was the first blue-ribbon panel to address the problem of cognitive bias in forensic science and recommend method reform, calling for the adoption of

83 Id. at 138.

84 Id. at 139.

85 Id. at 150.

86 Id. at 271.

87 The NAS is mandated by Congress to "advise the federal government on scientific and technical matters." The NAS committee that authored a ground-breaking report on the status of forensic science in the U.S. included: the Director of the Virginia State Crime Lab (Pete Marone), a professor of Engineering at Stanford University (Channing Robertson), a forensic chemist and the former Director of the Forensic Science Program at Michigan State University (Jay Siegel), a forensic statistician at the University of Virginia (Karen Kafadar), and a federal Court of Appeals Judge (Hon. Harry Edwards).

blind analysis.88 The Human Factors Working Group89 issued even stronger findings and recommendations in 2012, calling for all fingerprint examiners to be trained in cognitive bias and recommending that "procedures should be implemented to protect examiners from exposure to extraneous (domain-irrelevant) information in a case."90 Moreover, the Working Group recommended that any features identified after exposure to the suspect's print standards should be "viewed with caution,"
and should be specifically labeled as having been identified after exposure to the suspect's print standard.91 Most recently, the older ACE-V method was rejected by the President's Council of Advisors on Science and Technology (PCAST):92 "As a matter of scientific validity, examiners must be required to 'complete and document their analysis of a latent fingerprint before looking at any known fingerprint' and 'must separately document any data relied upon during comparison or evaluation that differs from the information relied upon during analysis.' The FBI adopted these rules following

88 National Academy of Sciences, "Strengthening Forensic Science in the United States: A Path Forward," National Academies Press, 2009, p. 124, 184. See also, Melendez-Diaz v. Massachusetts, 129 S. Ct. 2527 (2009)(where the U.S. Supreme Court relied on the NAS report as persuasive authority in the forensic community when holding that "serious deficiencies have been found in the forensic evidence used in criminal trials").

89 The Human Factors Working Group was funded by the U.S. Department of Justice and consisted of fingerprint examiners from every level of law enforcement, including fingerprint experts from the FBI, the Maryland State Police, the U.S. Secret Service, the Massachusetts State Police, the Las Vegas Police Crime Lab, the Indiana State Police Crime Lab, and the Los Angeles County Sheriff Crime Lab. The Working Group issued a report reviewing practices and problems with forensic fingerprint comparison. Their findings, issued in 2012 in a 200-page report, can be considered authoritative in the field. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2050067

90 Working Group on Human Factors in Latent Print Analysis, "Latent Print Examination and Human Factors: Improving the Practice through a Systems Approach," National Institute of Justice, p. (2012).

91 Id.

92 President's Council of Advisors on Science and Technology, "Forensic Science in Criminal Courts:
Ensuring Scientific Validity of Feature-Comparison Methods," September 2016. The PCAST consists primarily of leading scientists from across the country, including: a geneticist from MIT, an engineer and Vice President of the National Academy of Engineering, a mathematician and former CEO of The Aerospace Corporation, a doctor who was the first female president of the American College of Physicians, a chemist who directs the Institute for Nanotechnology at Northwestern University, the director of The Laboratory for Geochemical Oceanography at Harvard University, a doctor of biochemistry and professor emeritus at the University of California Berkeley, and a physicist who is a Senior Vice President at a leading aerospace and technology corporation (to name a few). For several decades, the PCAST has reported to U.S. presidents on a wide range of scientific issues, including but not limited to nanotechnology, internet broadband development, cloning, and the uses of science and technology to combat terrorism. In short, the PCAST represents one of the most important and authoritative collections of scientists in the country.

the Madrid train bombing case misidentification; they need to be universally adopted by all laboratories."93

Acknowledging the weight of the current research in the field and the demands for reform discussed above, leading crime labs have rejected the older ACE-V method and reformed their procedures to require blind analysis and permanent documentation of features prior to exposure to the suspect's fingerprints. FBI fingerprint examiners are no longer permitted to look at a suspect's fingerprints prior to complete examination and documentation of comparison features on the latent print. The head of the FBI fingerprint laboratory has
made this practice clear:

"According to LPU Unit Chief Meagher, the analysis should be performed on the latent print before consideration of any available known prints, in order to 'limit or try to restrict any bias in terms of what appears in the known exemplar.' In other words, analysis of the latent is performed prior to the examination of the relevant exemplar, in order to avoid having the known print suggest features in the latent print to the examiner."94

Likewise, the ISP has adopted blind analysis and documentation procedures, requiring examiners at the ISP to assess latent prints in isolation and document their feature selection prior to exposure to the suspect's fingerprint standard. If an ISP examiner wishes to add features to her analysis after exposure, these features are color-coded to alert the defense to the fact that these additional features may be influenced by cognitive bias. Other crime labs have done the same.95

Unfortunately, the LPU still clings to the older, flawed method. LPU examiners, including Officer Malone, have stated that they do not document features in any way prior to

93 Id. at 100. See also, U.S. Department of Justice, National Commission on Forensic Science, "Ensuring that Forensic Analysis is Based Upon Task-Relevant Information," (acknowledging the effects of cognitive bias on forensic analyses and calling for blind analysis methods involving "sequential unmasking").

94 Office of Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, 2006, p. 105-106.

95 See, Interpol European Expert Group on Fingerprint Identification (IEEGFI II), "Method for Fingerprint Identification"; See also, European Network of Forensic Science Institutes, "Best Practice Manual for Fingerprint Examination," p. 13-15, 2015; See also, forensic- (Virginia State Crime Lab).
exposure to the suspect's prints, and have gone so far as to claim that the only reason they document features at all is out of courtesy for defense attorneys when requested in a case:

• Officer Malone testified that the reformed method of identifying and documenting features in the latent print prior to exposure to the suspect's prints "is not a process we use." (Attachment Q, p. 71).

• Officer Dabrowska has stated that she does not document comparison features during analysis. (Attachment A, p. 8, 10).

• Officer Calvo admits that he does not document any comparison features prior to reaching his final opinion in a case. (Attachment W, p. 2).

• Officer Seavers states that she does not document comparison features prior to looking at the suspect's prints. (Attachment S, p. 4).

• Officer Daniels claims that he documents comparison features only after-the-fact when creating a "demonstrative" for court, and further has testified that the number of features he identifies on the demonstrative is "basically random . . . just for foundation for the identification." (Attachment H, p. 19-21).

According to these officers, they document comparison features in the latent print only after examining the suspect's prints, or sometimes later, after reaching their opinion about whether the suspect is the source of the latent print. This flawed process, with its open invitation to error based on biasing signals from the suspect's prints, has been abandoned by leaders in the field and is no longer scientifically defensible given the modern understanding of cognitive bias. When combined with the other systemic failures detailed above, this Court should not hesitate to exclude the opinions of LPU officers obtained through this flawed method.

VII. THE LPU HAS FAILED TO ATTAIN ACCREDITATION, CALLING INTO QUESTION THE RELIABILITY OF ITS FORENSIC CASEWORK.
Today, forensic labs must be accredited, and the accreditation process presents very real hurdles that fingerprint labs must overcome to establish their professionalism and the reliability of their results. The U.S. Department of Justice, through its National Commission on Forensic Science (NCFS),96 issued the latest call for accreditation: "To improve the overall quality of forensic science, all entities performing forensic science testing, even on a part-time basis, must be included in universal [accreditation]." The NCFS explains that accreditation involves an assessment of the validation and reliability of crime lab methods, the competency and training of examiners at the lab, and the quality assurance procedures in the lab. Basic accreditation requires labs to adopt the fundamental attributes of good science: 1) validation of methods, 2) operation pursuant to written protocols, and 3) audits to assess the ongoing validity and reliability of their scientific methods.97 The question of accreditation is so fundamental to the proper functioning of a crime lab that the American Bar Association formally adopted a resolution demanding that all crime labs be accredited.98

The American Society of Crime Laboratory Directors (ASCLD) conducts accreditation reviews of crime laboratories and approves accreditation for any labs that pass muster.99 Labs seeking ASCLD accreditation must have written procedures defining testing methodology and interpretation of testing data, written training manuals, and a written quality assurance plan that involves corrective actions for non-conforming work. Currently, there are 350 government/law enforcement crime labs in the U.S. that have proper accreditation through ASCLD.100 Many police agencies similar to the CPD have sought and obtained accreditation, including but not limited to the following:

96 The NCFS is a collaboration between the U.S. Department of Justice and the National Institute of Standards and Technology.
The stated mission of the NCFS is to provide recommendations concerning "national methods and strategies for strengthening the validity and reliability of forensic science." The NCFS Board includes representatives from law enforcement forensic agencies from around the country, including but not limited to: the founder of the Armed Forces DNA Identification Lab, a chemist formerly with the U.S. Customs Lab, a member of the FBI DNA Advisory Board, the Director of the Palm Beach Sheriff's Office Crime Lab, a forensic chemist with ATF, the Director of the Virginia State Crime Lab, and a forensic chemist with the FBI. The NCFS is co-chaired by the Deputy U.S. Attorney General.

97 Huber, "Understanding and Implementing 17025," Agilent Technologies.

Alameda County (CA) Sheriff Crime Lab; Albuquerque (NM) Police Department Crime Lab; Austin (TX) Police Department Crime Lab; Broward County (FL) Sheriff Crime Lab; Chandler (AZ) Police Department Crime Lab; Charleston (SC) Police Department Crime Lab; Charlotte (NC) Police Department Crime Lab; Chicago Regional Computer Forensics Lab; Columbus (OH) Police Department Crime Lab; Corpus Christi (TX) Police Department Crime Lab; El Cajon (CA) Police Department Crime Lab; Eugene (OR) Police Department Crime Lab; Hagerstown (MD) Police Department Crime Lab; Las Vegas (NV) Police Department Crime Lab; Long Beach (CA) Police Department Crime Lab; Los Angeles (CA) County Sheriff Crime Lab; Los Angeles (CA) Police Department Crime Lab; Mansfield (OH) Police Department Crime Lab; Nashville (TN) Police Department Crime Lab; Miami-Dade (FL) Police Department Crime Lab; Monroe County (NY) Crime Lab; Montgomery (MD) Police Department Crime Lab; Oakland (CA) Police Department Crime Lab; Oklahoma City (OK) Police Department Crime Lab; Palm Beach (FL) County Sheriff Crime Lab; Prince William (VA) County Police Crime Lab; Rapid City (SD) Police Crime Lab; San Diego (CA) Police Department Crime Lab; San Francisco (CA) Police Department Crime Lab; Scottsdale (AZ) Police Department Crime Lab; St. Louis (MO) Police Department Crime Lab; Tucson (AZ) Police Department Crime Lab; Tulsa (OK) Police Department Crime Lab; Wilmington (NC) Police Department Crime Lab; Yonkers (NY) Police Department Crime Lab.

The ISP is accredited, and boasts that accreditation means that the ISP "must adhere to stringent standards of quality and sound scientific practice."101

Given the many systemic flaws of the LPU discussed above, it may not be surprising that the LPU has never sought accreditation. No independent authority has assessed the training program at the LPU for adequacy, has reviewed LPU policies to see if they provide minimal direction for examiners, or has examined LPU QA procedures to assess whether they are robust enough to identify and rectify substandard forensic work. The LPU has simply operated, year after year, while avoiding the oversight that is built into the accreditation process. This should be a scary prospect for everyone in the Cook County criminal justice system.

When other crime labs with deficiencies similar to the LPU's have been audited, systemic problems have been identified and plans for improvement have been implemented. For instance, when quality issues were identified with a drug lab in Massachusetts, the Office of Inspector General conducted an audit and identified numerous lab deficiencies: lack of accreditation, lack of written protocols, lack of examiner training, and lack of a QA program. (Attachment C, p. 19-46). The OIG recommended important reforms for the lab, including better training, better QA procedures, and accreditation for the lab. (Attachment C, p. 117-119). In North Carolina, the Attorney General appointed two retired FBI forensic experts to audit its crime lab. The FBI experts discovered that the NC lab lacked appropriate written procedures and suffered quality issues due to a lack of accreditation. (Attachment E). The auditors recommended that the N.C.
lab attain accreditation "at the earliest possible date," that the lab re-train its examiners, and that the lab post all operating procedures on-line so that "the operations of the lab are transparent and accessible to the public." (Attachment E, p. 29-30). Similar audits have been undertaken to address shortcomings with crime labs in Detroit,102 St. Paul,103 Washington, D.C.,104 and Austin (TX).105

Given the substantial flaws with the LPU, the lack of accreditation should preclude admission of fingerprint opinions from the LPU until this situation is rectified. This Court should not admit opinions of LPU examiners until a comprehensive accreditation audit establishes that LPU practices are reliable, their examiners are competent, and their results do not misstate the probative value of fingerprint evidence.

VIII. ILLINOIS LAW SUPPORTS JUDICIAL ACTION IN THIS MATTER

In Frye jurisdictions, judicial inquiry into general acceptance is only a small part of the court's duty when assessing the admissibility of scientific evidence. Even when a scientific method may be generally accepted by the broader scientific community, the application of the method in any given case may be so substandard that the evidence is no longer reliable.106 Acknowledging this, Illinois courts "act as 'the gatekeeper,' allowing through only reliable and relevant evidence for consideration by the jury." Roach v. Union Pacific Railroad, 19 N.E.3d 61, 70 (1st Dist. 2014)(holding that courts have "considerable leeway in deciding how to go about determining whether a particular [medical] expert's testimony is reliable"); See also, Decker v. Lain, 737 N.E.2d 623, 625 (Ill.)(holding that in assessing expert testimony, the trial court "serves the role as 'gatekeeper,' barring testimony that is not sufficiently relevant or reliable to be admitted into evidence"); See also, Soto v. Gaytan, 728 N.E.2d 1126 (2nd Dist.
2000)(holding that "as gatekeeper of expert opinions disseminated to the jury, the trial court plays a critical role in excluding testimony that does not bear an adequate foundation of reliability"); See also, People v. McKown, 236 Ill. 2d at 305; See also, People v. Luna, 2013 IL App (1st) 072253, ¶ 2 (1st Dist. 2013); People v. Floyd, 2014 IL App (2d) 120507, ¶¶ 22-24 (2d Dist. 2014); United States v. Frazier, 387 F.3d 1244, 1263 (11th Cir. 2004); Murray, 2014 D.C. Super. LEXIS at 33-35, 56-58; United States v. Van Wyk, 83 F. Supp. 2d 515 (D.N.J. 2000); United States v. Santillan, 1999 WL 1201765 (N.D. Cal. 1999); United States v. Reynolds, 904 F. Supp. 1529, 1558 (E.D. Okla. 1995); Bowers, "Forensic Testimony: Science, Law and Expert Evidence," Academic Press (2014); Mnookin, "The Courts, the NAS, & the Future of Forensic Sciences," Brooklyn L. Rev., Vol. 75, p. 51-55 (2010).

Illinois v. Taylor, 782 N.E.2d 920, 927 (2nd Dist. 2002)(holding that "as the gatekeeper of expert opinions disseminated to the jury, the trial court must look behind the expert's conclusion and analyze the adequacy of the foundation," and further holding that "the trial court is not required to blindly accept an expert's assertion that his or her testimony has an adequate foundation").

When conducting this increased scrutiny of proposed expert testimony, courts are guided by Illinois Rule of Evidence 403, which requires exclusion of evidence "if its probative value is substantially outweighed by the danger of unfair prejudice, confusion of the issues, or misleading the jury."107 Courts use Rule 403 when assessing the admissibility of forensic testimony,108 and exclude unreliable forensic testimony when the testimony would confuse or mislead the trier of fact.109 In fact, the U.S.
Supreme Court in Daubert held that trial courts should conduct vigorous scrutiny when applying Rule 403 to expert testimony.110 In so holding, the court stated that "[e]xpert evidence can be both powerful and quite misleading because of the difficulty in evaluating it. Because of this risk, the judge under Rule 403 of the present rules exercises more control over experts than over lay witnesses."111 In this case, the danger of exposing the jury to confusing and misleading scientific testimony could not be more

107 Ill. Rule Evid. 403.

108 See, U.S. v. Van Wyk, 83 F. Supp. 2d 515 (D.N.J. 2000)(holding that "in assessing a proffer of expert testimony the court must also consider other applicable rules, such as F.R.E. [403]"); See also, Bowers, "Forensic Testimony: Science, Law and Expert Evidence," Academic Press, 2014 (stating that courts can use Rule 403 to exclude expert testimony that is unfairly prejudicial).

109 See, U.S. v. Santillan, 1999 WL 1201765 (N.D. Cal. 1999)(holding that handwriting comparison testimony was more prejudicial than probative); See also, Williamson v. Reynolds, 904 F. Supp. 1529 (E.D. Okla. 1995)(holding that the probative value of hair comparison evidence is substantially outweighed by its prejudicial effect); See also, Mnookin, "The Courts, the NAS, and the Future of Forensic Science," Brooklyn Law Review, Vol. 75, p. 51-55 (2010).

110 Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579, 595 (1993).

111 Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579, 595 (1993); See also, United States v. Monteiro, 407 F. Supp. 2d 351, 358 (D. Mass. 2006)("The court's vigilant exercise of this gatekeeper role is critical because of the latitude given to expert witnesses to express their opinions on matters about which they have no firsthand knowledge, and because an expert's testimony may be given greater weight by the jury due to the expert's background and approach."); See also, Soto v. Gaytan, 728 N.E.2d 1126 (2nd Dist.
2000) (holding that trial courts must assess the foundational admissibility of expert testimony more closely than lay testimony "because the rules of evidence grant expert witnesses testimonial latitude unavailable to other witnesses on the assumption that the expert's opinion will have a reliable basis in the knowledge and experience of his discipline").

clear, and it requires judicial action. The unfair prejudice in this case comes from two sources: the fact that fingerprint examinations performed by the LPU are unreliable, and the fact that testimony by LPU examiners will mislead the trier of fact regarding the probative value of the fingerprint evidence. For all of the reasons explained above, fingerprint examinations by the LPU are completely unreliable. LPU examiners do not follow any written protocols, making errors more likely. The LPU has no QA system, so it is incapable of identifying errors and correcting them. The LPU has not properly trained its examiners, causing them to fail to understand the strengths and weaknesses of their discipline and to provide inaccurate opinions. The LPU has not validated any of the methods it uses to alter the appearance of fingerprint evidence, meaning that none of these methods have been shown to be reliable and accurate. The LPU still uses a flawed examination method that is more likely to lead to erroneous results. And the LPU has never passed an accreditation audit, raising red flags about the professionalism and accuracy of its examinations. At every turn, the LPU has failed to establish the reliability of its operation, leaving no option for this Court as gatekeeper but to exclude the proffered fingerprint testimony. Separately, the proffered testimony in this case would unfairly prejudice Mr.
Henderson because the CPD examiner will not accurately inform the trier of fact of the weaknesses with the LPU, the known limitations of fingerprint comparisons, the chance of error in this case, and the probative value of a fingerprint association. The list below details the claims that CPD examiners make about the probative value of a fingerprint association, and the claims that they should make based on accepted research in the field:

CPD Claims:
• Fingerprint opinions are objective.
• Fingerprint comparison can identify the source of a print to the exclusion of all others in the world.
• There is a 0% chance that an error was made in this case, and I am not willing to discuss recognized error rate studies.
• The use of AFIS in this case does not change anything, because the examiner still makes the final decision.
• The non-blind method I used in this case is generally accepted and cognitive bias is not a problem.

Accurate Claims:
• Fingerprint opinions are subjective.
• Fingerprint examiners must refrain from making claims of absolute source attribution.
• False positive errors happen, and studies show they happen in as many as 1 in 18 cases.
• Errors are more likely in AFIS cases because of close non-matches in large databases.
• The non-blind method I used in this case increases the chance of error due to cognitive bias.

In addition to these claims, fingerprint examiners from the LPU will fail to concede that the lack of written protocols, the lack of a quality assurance plan, the lack of accreditation, and the failure of validation, all mean that errors are more likely to occur in examinations conducted by the LPU. Although the fact that testimony by LPU examiners will overstate the probative value of print evidence is clear from the arguments summarized above, it is also the opinion of Glenn Langenburg.
After reviewing LPU policies and practices, Langenburg concluded that LPU examiners "cannot provide the proper foundation, context, and objectivity supporting a scientific expert opinion of latent print evidence." (Attachment J). Langenburg further concluded that the deficiencies with the LPU "raise serious concerns about their knowledge, skill and abilities to perform fingerprint examinations and appropriately convey their results to the trier of fact." (Attachment J). Despite the many deficiencies with the LPU and the mandate of Rule 403 calling on courts to exclude unreliable forensic testimony, this Court may be tempted to accept the argument that the concerns raised here go to weight rather than admissibility. However, this hands-off approach could only be justified if cross-examination or the conflicting opinion of a defense expert possessed the potential to expose the weaknesses of the fingerprint methods of the CPD and the resulting weakness of the fingerprint evidence in this case. However, the research bears out precisely the opposite conclusion. Study after study shows that jurors have a difficult time accurately assessing the real value of forensic evidence.112 These same studies also show that with forensic evidence in particular, cross-examination is ineffective in rectifying erroneous assessments of forensic evidence by jurors.113 Finally, even when the defense presents its own

112 See, Thompson, "Lay Understanding of Forensic Statistics," Law & Human Behavior, Vol. 31, p. 332-49 (2015) (reviewing studies on juror comprehension of statistics and concluding that jurors are susceptible to statistical fallacies, both prosecution and defense varieties); See also, Koehler, "If the Shoe Fits They Might Acquit," Northwestern University Public Law Research Paper, January 12, 2011 (concluding that jurors "are slow to revise incorrect probabilistic hypotheses,"
"fall prey to logical fallacies," and "failed to appreciate the role that error plays in interpreting the value of a reported match"); See also, Sanders, "Reliability Standards - Too High, Too Low, or Just Right?," Seton Hall L. Rev., Vol. 33, p. 881-1282, at 901, 919 (2003) (describing jurors as struggling with statistical information and unable to detect expert witness biases); See also, Dawn McQuiston-Surrett & Michael J. Saks, Communicating Opinion Evidence in the Forensic Identification Sciences: Accuracy and Impact, 59 Hastings L.J. 1159, 1170 (2008) ("most jurors have an exaggerated view of the nature and capabilities of forensic identification"); See also, People v. New, 2014 IL 116306, at ¶26 (Ill. 2014) (noting the "natural inclination of the jury to equate science with truth and, therefore, accord undue significance to any evidence labeled scientific"); See also, People v. Zayas, 131 Ill. 2d 284, 292 (1989) (in ruling hypnotically-assisted-recall testimony inadmissible, court emphasized the likelihood and danger of prior juror exposure to misleading information about hypnosis).
113 See, Sanders, "Reliability Standards - Too High, Too Low, or Just Right?," Seton Hall L. Rev., Vol. 33, p. 881-1282, at 934-936 (2003) (concluding that multiple studies bear out the sobering reality that even robust cross-examination of experts affects neither ultimate verdicts nor even juror confidence in said verdicts); See also, Koehler, "If the Shoe Fits They Might Acquit," Northwestern University Public Law Research Paper, January 12, 2011 ("Contrary to predictions, none of the source and guilt dependent measures in the main experiment were affected by the introduction of cross examination. There was no effect for cross examination on source confidence, source probability, guilt confidence, guilt probability, or verdict.
Likewise there was no effect for cross examination across the two individualization conditions on any of the dependent measures"); See also, Saks, "The Testimony of Forensic Identification Science: What Expert Witnesses Say and What Factfinders Hear," Law & Human Behavior (authors conducted a study and reviewed others, ultimately finding "little or no ability of cross-examination to undo the effects of an expert's testimony on direct examination, even if the direct testimony is fraught with weaknesses and the cross is well designed to expose those weaknesses." Interestingly, the authors conclude that cross examination can affect juror evaluation of expert evidence if it is presented honestly as a subjective guess, but that the unshakeableness of the traditional forms of match testimony "produce[s] something of a ceiling effect, which resist[s] moderation by the presentation of other information"); See also, Shari Seidman Diamond, et al., "Juror Reactions to Attorneys At Trial," 87 J. Crim. L. & Criminology 17, 41 (1996) (the author conducted an experiment, using 1925 jury-eligible residents of Cook County, which varied the strength of an attorney's cross examination of an expert witness and found that, "[a]lthough juror perceptions of the attorney appear susceptible to influence by the attorney's efforts during cross-examination, the strong cross-examination had no effect on the verdict").

expert, juror misconceptions about the forensic evidence can persist.114 It is for these reasons that researchers who investigate the effects of expert testimony on jurors conclude that their "results should give pause to anyone who believes that the traditional tools of the adversarial process will always undo the adverse effects of weak expert testimony."115 These crucial factors - jurors' misperceptions of the value of forensic evidence and the ineffectiveness of cross examination - make clear that a judicial approach of just leaving it to the jury to sort out is untenable.
Rather, this Court should exclude the prosecution's unreliable, misleading, and confusing forensic evidence.

IX. CONCLUSION

For the reasons stated above, Mr. Henderson requests that this Court do the following: 1) Exclude the results of the fingerprint examination in this case, 2) Order that the LPU pass an independent and comprehensive audit before the results of fingerprint examinations by the LPU will be admitted into evidence in this Court, 3) Conduct an evidentiary hearing to further establish the factual basis of claims made herein.

Respectfully Submitted, Brendan Max, Brett Gallagher, Cook County Public Defender Office

114 Sanders, "Reliability Standards - Too High, Too Low, or Just Right?," Seton Hall L. Rev., Vol. 33, p. 881-1282, at 934 (2003).
115 McQuiston-Surrett & Saks, "Communicating Opinion Evidence in the Forensic Identification Sciences," Hastings Law Journal, Vol. 59, at p. 1188; See also, Sanders, "Reliability Standards - Too High, Too Low, or Just Right?," Seton Hall L. Rev., Vol. 33, p. 881-1282, at 936 (2003) ("experimental findings should give pause to others who believe that the traditional tools of the adversarial process are a full substitute to restrictions on the admissibility of unreliable expert testimony"); See also, Murray v. Motorola, 2014 D.C. Super. LEXIS 16 (2014) (wherein the court decided not to leave it up to the jury to assess the methods of an expert epidemiologist, reasoning that "the court cannot be confident that effective advocacy can eliminate the risk that a jury would be misled by [the expert's] testimony and reach a result on an improper basis").

STATE OF ILLINOIS, SS. COUNTY OF COOK

IN THE CIRCUIT COURT OF COOK COUNTY, COUNTY CRIMINAL DIVISION

PEOPLE OF THE STATE OF ILLINOIS, Plaintiff, v. COURTNEY HENDERSON, Defendant. No. 16 CR 1321601. Hon. William Hooks, Judge Presiding.

RESPONSE TO MOTION TO EXCLUDE TESTIMONY

Now come the PEOPLE OF THE STATE OF ILLINOIS, by their attorneys, KIMBERLY M.
FOXX, State's Attorney of Cook County, Illinois, and her Assistant, Mark A. Ertler, and respectfully ask that this Honorable Court deny Defendant's Motion to Exclude Fingerprint Testimony Due to Multiple Systemic Failures. In support thereof, the People state as follows:

I. PROCEDURAL HISTORY

Defendant is charged by way of indictment with Attempt Murder, Aggravated Discharge of a Firearm, Armed Robbery and Aggravated Unlawful Restraint, stemming from an incident in which the victim was allegedly confronted with a sawed-off shotgun during a robbery in which items were eventually taken from the victim's car. The Chicago Police Department (CPD) recovered latent ridge impressions from the victim's vehicle. Subsequent analysis conducted by the CPD Latent Print Unit resulted in the identification of the defendant as the source of the latent impressions found in the car. On April 25, 2017, counsel for Defendant filed a "Motion to Exclude Fingerprint Testimony Due to Multiple Systemic Failures," which essentially argues that this Court should as a matter of law bar the introduction of evidence regarding the fingerprint comparison referenced above, as well as order the Chicago Police Latent Print Unit to submit to an audit. The People respectfully respond as set forth below.

II. THE RESULTS OF LATENT PRINT COMPARISON CONDUCTED BY THE CHICAGO POLICE DEPARTMENT SHOULD BE ADMITTED AT TRIAL

Evidence regarding the analysis and comparison of fingerprints and latent impressions has been admissible in Illinois for well over 100 years.1 Defendant does not suggest that this Court conduct a hearing pursuant to Frye v. United States.2 Therefore, evidence of fingerprint analysis is admissible, subject to the customary voir dire regarding any expert witness' qualifications at trial. Testimony by expert witnesses is admissible in Illinois under Rule 702 of the Illinois Rules of Evidence. "If scientific, technical, or other specialized
knowledge will assist the trier of fact to understand the evidence or to determine a fact in issue, a witness qualified as an expert by knowledge, skill, experience, training, or education may testify thereto in the form of an opinion or otherwise."3

1 People v. Jennings, 252 Ill. 534 (1911).
2 Frye v. United States, 293 F. 1013, 54 App. D.C. 46 (1923).

Fingerprint examiners from the Chicago Police Department (CPD) have for decades been routinely found qualified to provide expert testimony in the Circuit Court of Cook County. Defendant's assertion that this Court should now as a matter of law bar all evidence of fingerprint analysis conducted by CPD is unjustified and would constitute the most extreme of remedies where none is warranted. Defendant is free to cross-examine Latent Print Examiner Michael Malone or any other witness called by the People to testify to fingerprint analysis regarding that witness' qualifications at trial. His methodology and conclusions may likewise be subject to cross-examination. Defendant may argue to the trier of fact what weight should be given to LPE Malone's testimony, but there is no reasonable basis to hold that his testimony is inadmissible as a matter of law. While defendant argues that he would be unable to cross-examine Malone in a meaningful way, the sheer volume of material prepared by and presented by defendant in his motion demonstrates that quite the opposite is true.4 It appears that the defendant, through his counsel, is well-prepared to conduct a vigorous cross-examination. Defendant attempts to skirt the fact that Frye controls here, and the fact that no hearing is necessary to determine the admissibility of evidence regarding fingerprint comparison. He urges this Court to examine a number of factors that are not required under the Frye standard. Defendant is essentially asking this Court to apply an analysis akin to Daubert,5 which is simply not the law in Illinois.
In fact, when codifying the Illinois Rules of Evidence, the Illinois Supreme Court specifically chose to ensure that

3 Ill. R. Evid. 702.
4 See Defendant's Motion, p. 44-46.
5 Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993).

Frye remained the standard for determining the admissibility of evidence, in keeping with well-established Illinois law.6 Defendant argues that Illinois Rule of Evidence 403 mandates the exclusion of fingerprint evidence in this case because it would expose a jury to confusing and misleading scientific testimony.7 Rule 403 provides for a balancing test where evidence may be excluded "if its probative value is substantially outweighed by the danger of unfair prejudice, confusion of the issues, or misleading the jury, or by considerations of undue delay, waste of time, or needless presentation of cumulative evidence."8 In support of his assertion that this Court should apply Rule 403 in the instant matter, defendant cites federal court decisions which reference Federal Rule of Evidence 403.9 These decisions were reached in the context of Daubert, which is the standard used in federal courts, but which does not apply in Illinois. If this Court were to apply the balancing test under Illinois law, it is clear that the trier of fact should be permitted to hear the evidence in question and to make its own determination as to the weight to be given said evidence. Testimony regarding fingerprint comparison is not so complex or confusing that a jury would be incapable of understanding it or of determining the credibility of the evidence. Illinois juries have done so for over a century. Rule 403 should not serve as a back door to a Daubert analysis in conflict with deeply rooted Illinois law.
It is not necessary to address the accuracy of defendant's numerous assertions regarding the standards and practices of the CPD Latent Print Unit at this juncture, as the methodology employed by LPE Malone is appropriately the subject of cross-examination at trial.

6 Id.; See also, Donaldson v. Central Illinois Public Service Co., 199 Ill. 2d 63 (2002).
7 See Defendant's Motion, p. 42.
8 Ill. R. Evid. 403.
9 Defendant's Motion, p. 42.

Defendant urges this Court to paint with a broad brush and to anticipatorily assume that information garnered from multiple interviews with latent print examiners not involved in the instant case would be both relevant and impeaching at trial. Such an assumption is both premature and unjustified. Similarly, the criticisms expressed by defendant's purported experts are misplaced in this motion. The evidence of fingerprint comparisons conducted by LPE Malone is clearly admissible under the Frye standard. Defendant is free to call his own experts to testify, if he so chooses. The determination of LPE Malone's qualifications is an individual determination to be made by the Court at the appropriate time, not as part of an all-encompassing ban on the Chicago Police Department's Latent Print Unit. Perhaps the most informative aspect of defendant's motion lies in what he does not say. Defendant raises no suggestion that LPE Malone is wrong. Despite the massive amount of information he asks the Court to consider, defendant has not offered an opinion by one of his own experts that Malone made a misidentification. The best remedy available to the defendant at this time is to have his own expert conduct an evaluation of the evidence, yet that is the remedy of which he chooses not to avail himself.

III. DEFENDANT'S REQUEST FOR A COURT ORDERED AUDIT SHOULD BE STRICKEN, OR IN THE ALTERNATIVE, DENIED.

Defendant seeks the extraordinary remedy of a court-ordered audit of the CPD Latent Print Unit.
This motion is not properly brought before the Court, and no such remedy is available even if this Court had the jurisdiction to enter such an order. The request should therefore be stricken, or in the alternative, denied. Defendant seeks an adverse order against the Chicago Police Department. Neither the City of Chicago, nor specifically CPD, are party to the instant cause of action. Defendant has made no showing that the City of Chicago or the Chicago Police were served with proper legal notice of defendant's motion. The Office of the Cook County State's Attorney represents the People of the State of Illinois in this proceeding and is not an agent of the City or of CPD for purposes of service of process. It is wholly unfair for the defendant to ask this Court to enter an order against a non-party who has not been given the benefit of due process. Outside of rhetoric, defendant has failed to establish that the extraordinary remedy he seeks is legally available. Defendant has cited no statute or precedent that establishes the court-imposed audit of a police department's fingerprint unit as a remedy available to a defendant in a criminal case. The matter currently before the Court is unlike the exclusion of evidence in a motion to suppress, where the sanction is based on a finding that the past conduct of law enforcement agents violated a defendant's rights. Rather, the defendant here asks the Court to order future action by a non-party. Further, defendant has failed to make any showing that this Honorable Court even has the jurisdiction to hear such a request. Defendant's request that this Court order a comprehensive audit of the Chicago Police Latent Print Unit should be stricken. Defendant has failed to demonstrate that this Court has jurisdiction over the matter. Additionally, the remedy sought is not legally available to the defendant. In the alternative, defendant's request should be denied as a matter of law.

IV.
CONCLUSION

Fingerprint comparison evidence has long been and remains admissible in Illinois. It is not new or novel and it enjoys general acceptance in the relevant scientific community. Challenges to the qualifications of and methods employed by a particular expert should be limited to the specific case and expert involved and should not involve a sweeping review of an entire agency. Defendant has failed to establish that the remedies he seeks are available or that this Court has jurisdiction to order the audit he urges.

WHEREFORE, the People respectfully ask this Honorable Court to DENY the Motion to Exclude Fingerprint Testimony Due to Multiple Systemic Failures.

Respectfully submitted,
Mark A. Ertler
Assistant State's Attorney
2650 S. California Ave., Room 11C39
Chicago, IL 60608
773-674-5832

IN THE CIRCUIT COURT OF COOK COUNTY - CRIMINAL DIVISION, STATE OF ILLINOIS
16 CR 1321601
JUDGE WILLIAM HOOKS, PRESIDING
COURTNEY HENDERSON
FILED AUG 03 2017, CLERK OF CIRCUIT COURT

REPLY - MOTION TO EXCLUDE FINGERPRINT TESTIMONY DUE TO MULTIPLE SYSTEMIC FAILURES

Courtney Henderson, through the Cook County Public Defender, requests that this Court exclude the testimony of Officer Malone. In reply to the State's Response Motion, Mr. Henderson states the following:

I. INTRODUCTION

The State's Response Motion provides very limited insight into the important issues before this Court and does not provide justification for precluding further litigation of the issues contained in Mr. Henderson's Motion to Exclude Fingerprint Testimony. First, the State's Response does not address in any way the factual allegations supported in Mr. Henderson's Motion to Exclude the fingerprint evidence in this case - pointing to the inescapable conclusion that the LPU is a woefully substandard forensic lab that fails every benchmark of reliable science. Second, even though Mr.
Henderson's Motion to Exclude is not based on Frye and its progeny, the State attempts to muddy the waters of this litigation by claiming that Frye is at issue in this case, thereby seeking to sidestep the clear application of Rule 403 to the unreliable fingerprint evidence in this case. Third, the State's tactic to try to avoid dealing with the actual substance of the many failings of the LPU - getting a second opinion from another fingerprint examiner - is equally flawed, because second opinions have been demonstrably wrong in the past and because a second opinion does not cleanse the LPU of its many and fundamental deficiencies. For these reasons, this Court still has every reason to address the merits of Mr. Henderson's Motion to Exclude Fingerprint Testimony, and rule in favor of Mr. Henderson.

II. THE UNREBUTTED FACTUAL CLAIMS OF SYSTEMIC FLAWS AT THE LPU REQUIRE FURTHER LITIGATION, INCLUDING AN EVIDENTIARY HEARING.

Supported by extensive investigation, the latest research in the relevant scientific community, and the signed statements of three of the nation's leading forensic experts, Mr. Henderson alleges that the LPU suffers from multiple systemic flaws that violate every tenet of reliable science. (Motion to Exclude Fingerprint Testimony, p. 3-41). These flaws include the following:

• Protocols - While standard practice in the fingerprint community requires that examiners follow written laboratory protocols that define the examination process and the opinions that result from the examination process, the LPU does not operate pursuant to such protocols.
• Quality Assurance - While fingerprint labs in the U.S.
are required to maintain a documented quality assurance program designed to identify and correct errors, the LPU has no such program.
• Training - While other fingerprint labs have documented training programs designed to re-educate examiners on the fundamental changes and reforms in the field, the LPU has no such training program and its examiners lack basic knowledge about the present state of their own forensic discipline.
• Validation - While standard practice in the scientific community requires fingerprint alteration techniques to be validated for scientific reliability, the CPD uses techniques to alter the appearance of original fingerprint evidence without ever having validated any of the techniques to establish reliability.
• Methodology - While some fingerprint labs in the past permitted themselves to cheat by looking at the suspect's prints when attempting to identify ambiguous features in latent prints, this older method is no longer generally accepted yet still in use at the LPU.
• Accreditation - While hundreds of forensic labs across the country have gone through the accreditation process to establish the validity and reliability of their examinations, the LPU has side-stepped this process and sought to shelter its substandard practices from meaningful oversight.

Despite the seriousness of Mr. Henderson's claims, the State did not expend a single sentence of their Response trying to refute these factual allegations or justify the work conducted by the LPU. This fact alone should be fatal to the admissibility of the State's fingerprint evidence in this case. Of the many flaws with the LPU listed above, consider just the problem with lab protocols.
The forensic community universally agrees that forensic labs must operate pursuant to adequate written protocols - the Department of Justice and the American Bar Association have said as much, accrediting bodies refuse to certify labs that fail to maintain adequate written protocols, and auditors of failed crime labs have found the lack of adequate written protocols to be a major factor in the failure of numerous forensic labs. (Motion to Exclude Fingerprint Testimony, p. 3-9). With regard to protocols, Mr. Henderson alleges that the LPU's protocols are so flawed as to amount to a complete lack of meaningful direction for examiners. This allegation is supported by a simple reading of the documentation which the LPU passes off as "protocols," by statements of LPU examiners who admit that they don't follow protocols, and by the informed opinion of three of the nation's leading forensic scientists who reviewed the LPU's documentation. (Henderson Motion to Exclude Fingerprint Testimony, p. 3-9). Faced with this irrefutable record of scientific failure, the State simply encourages this Court to ignore this problem.1 This Court should decline the State's offer to have the criminal justice system look the other way while the LPU conducts unreliable science with flawed methods. The allegations in Mr. Henderson's Motion to Exclude Fingerprint Testimony with regard to protocols and other important scientific benchmarks should give this Court (and for that matter the State) great pause. By simply ordering that this matter proceed to an evidentiary hearing, this Court can take an important first step toward ensuring that Mr. Henderson and other indigent residents of Cook County are not faced with faulty forensic evidence by a substandard lab. Any other resolution to this litigation would strike a heavy blow to the notion of fairness and justice for all.

III. THE STATE'S LEGAL ARGUMENT IS A CLASSIC RED HERRING AND PROVIDES NO JUSTIFICATION FOR SHORT-CIRCUITING THIS LITIGATION.
The State seeks to evade meaningful pre-trial review of their flawed forensic evidence in this case by rewriting Mr. Henderson's legal arguments to focus on Frye. Even though Mr. Henderson raised no issues with regard to Frye,2 the State concentrates most of their efforts in their Response to propping up an imaginary Frye argument, and then shooting it down. The motivation behind this straw-man argument is easy to see - defendants almost always lose Frye motions regarding forensic fingerprint evidence.3 In referencing Frye, the State's flawed legal position boils down to the following: as long as a forensic discipline has passed a general acceptance inquiry under Frye, trial judges are without authority to exclude the forensic evidence for any reason, no matter how flawed and unreliable the evidence may be.

1 The same is true of each of Mr. Henderson's factual allegations - regarding failures of quality assurance, training, validation, methodology, and accreditation - in addition to those pertaining to protocols. Mr. Henderson supported each of these claims with extensive investigation, current scientific research and expert opinions. The State has likewise ignored the substance of these claims.
2 Mr. Henderson does not claim that the entire field of forensic fingerprint comparison is flawed such that Frye is implicated. Rather, Mr. Henderson claims that the particular flaws with the LPU render the evidence in this case unreliable.
3 This is so for several reasons, including because the burden on a Frye movant is high and because the case law strongly supports a finding of general acceptance. Also, the stakes are higher for the court in Frye cases because of the precedential value of Frye decisions; the granting of a Frye challenge can affect an entire class of evidence rather than only the admissibility of scientific evidence in one particular case.

This Court should reject the State's legal argument, both because this case is not a Frye
case, and because this Court has a separate and important gatekeeping responsibility under Rule of Evidence 403 to exclude unreliable fingerprint evidence. Pursuant to their gatekeeping function and independent of Frye, trial judges have an important role in scrutinizing expert claims prior to trial, and excluding such claims that are not demonstrably reliable. See, Roach v. Union Pacific Railroad, 19 N.E.3d 61, 70 (1st Dist. 2014) (holding that courts "act as 'the gatekeeper,' allowing through only reliable and relevant evidence for consideration by the jury," and also holding that courts have "considerable leeway in deciding how to go about determining whether a particular [medical] expert's testimony is reliable"); See also, Decker v. Libell, 737 N.E.2d 623, 625 (Ill. 2000) (stating that in assessing expert testimony, the trial court "serves the role as 'gatekeeper,'" barring testimony that is not sufficiently relevant or reliable to be admitted into evidence); See also, Illinois v. Taylor, 782 N.E.2d 920, 927 (2nd Dist. 2002) (holding that "as the gatekeeper of expert opinions disseminated to the jury, the trial court must look behind the expert's conclusion and analyze the adequacy of the foundation," and further holding that "the trial court is not required to blindly accept an expert's assertion that his or her testimony has an adequate foundation"). For instance, in Gaytan, the trial court denied the defendant's pre-trial motion to exclude medical expert testimony and the defendant appealed. Soto v. Gaytan, 728 N.E.2d 1126 (2nd Dist. 2000). On appeal, the Second District described the important duties trial judges have (none of them in this case Frye-based) when deciding whether expert testimony is admissible. The Soto court held that "as gatekeeper of expert opinions disseminated to the jury, the trial court plays a critical role in excluding testimony that does not bear an adequate foundation of reliability." Id. at 1133.
Unrelated to Frye, the court stated that trial judges need not "blindly accept the expert's assertion that his testimony has an adequate foundation." Rather, the Court held that "scrutiny is required because an expert's opinion bears an aura of reliability and trustworthiness," and the Court directed trial judges to assess the "bases and reasons for the opinion [of an expert]." Id. at 1132-1133. In addition, Rule of Evidence 403 authorizes trial judges to exclude evidence that confuses or misleads the trier of fact, and trial judges rely on this rule when excluding improper expert claims. Ill. R. Evid. 403; See also, U.S. v. Van Wyk, 83 F. Supp. 2d 515 (D.N.J. 2000) (holding that "in assessing a proffer of expert testimony the court must also consider other applicable rules such as F.R.E. 403"); See also, U.S. v. Santillan, 1999 WL 1201765 (N.D. Cal. 1999) (holding that handwriting comparison testimony was more prejudicial than probative); See also, Williamson v. Reynolds, 904 F. Supp. 1529 (E.D. Okla. 1995) (holding that the probative value of hair comparison evidence is substantially outweighed by its prejudicial effect); See also, Bowers, "Forensic Testimony: Science, Law and Expert Evidence," Academic Press, 2014 (stating that courts can use Rule 403 to exclude expert testimony that is unfairly prejudicial); See also, Mnookin, "The Courts, The NAS, and the Future of Forensic Science," Brooklyn Law Review, Vol. 75, p. 51-55 (2010). This obligation is separate and distinct from issues surrounding Frye and general acceptance. And while the State did not acknowledge in their Response the role that Rule 403 has in the Court's pre-trial assessment of the admissibility of the forensic evidence in this case, the State has conceded the importance of Rule 403 in other recent litigation regarding the admissibility of forensic evidence. (Attachment Z, p. 2, 16). For the many reasons argued in Mr.
Henderson's Motion to Exclude Fingerprint Testimony, this Court should exercise its gatekeeping function and exclude the proffered fingerprint testimony. (Motion to Exclude Fingerprint Testimony, p. 41-46). As explained at length, the proffered opinion of Officer Malone is unreliable due to the many systemic failures in the LPU. Additionally, the testimony of the examiner would greatly mislead the trier of fact due to the important ways in which the examiner (due to lack of adequate training) will misstate the scientific foundation and limitations of forensic fingerprint evidence:

CPD Claims:
• Fingerprint opinions are objective.
• Fingerprint comparison can identify the source of a print to the exclusion of all others in the world.
• There is a 0% chance that an error was made in this case, and I am not willing to discuss recognized error rate studies.
• The use of AFIS in this case does not change anything, because the examiner still makes the final decision.
• The non-blind method I used in this case is generally accepted and cognitive bias is not a problem.

Accurate Claims:
• Fingerprint opinions are subjective.
• Fingerprint examiners must refrain from making claims of absolute source attribution.
• False positive errors happen, and studies show they happen in as many as 1 in 18 cases.
• Errors are more likely in AFIS cases because of close non-matches in large databases.
• The non-blind method used in this case increases the chance of error due to cognitive bias.

IX. THE FACT THAT THE STATE SOUGHT A SECOND OPINION REGARDING THE FINGERPRINT EVIDENCE DOES NOT AFFECT THIS PRE-TRIAL ADMISSIBILITY LITIGATION.

Not only does a second opinion by another examiner not settle any of the important issues of systemic failure raised in Mr. Henderson's Motion to Exclude Fingerprint Testimony, it likewise does not settle another important question that will be contested later at trial: whether the fingerprint in this case is properly associated to Mr. Henderson.
One important lesson from the fingerprint misidentification fiasco in the Brandon Mayfield case is that second opinions, and third and fourth opinions, can be just as wrong as the initial opinion.4 The events leading to the Mayfield fingerprint misidentification began on March 11, 2004, when terrorists detonated bombs on trains in Madrid, Spain, killing 200 people.5 Within days of the bombing, the FBI crime lab (considered by some to be the world's leading crime lab and unquestionably better resourced than the LPU) assigned some of their most experienced fingerprint examiners to analyze a fingerprint found on a bag of detonators connected to the terrorists.6 Leading the FBI fingerprint team was Unit Chief Michael Wieners, who both reviewed the fingerprint evidence and assigned other FBI fingerprint examiners to the case. Wieners first assigned Terry Green to work on the high-profile case due to his "extensive experience and strong skills." Green conducted his examination and erroneously concluded that the print from the detonator bag matched an attorney named Brandon Mayfield, who lived in Oregon and had never been to Spain.7 Next, Wieners assigned another examiner, John Massey, to the case to verify the results. Massey had 35 years of experience as a latent print examiner, and Wieners selected Massey due to his "extensive skill and extensive experience." Massey, along with Wieners, both "verified" the erroneous fingerprint association to Mayfield.8 Finally, one additional experienced examiner (retained by Mayfield's lawyers) conducted another full review of the fingerprint evidence and likewise erroneously agreed with the false fingerprint association to Mayfield.9

4 Office of the Inspector General, "A Review of the FBI's Handling of the Brandon Mayfield Case," U.S. Department of Justice, 2006.
5 Id.
6 Id. at 30-33.
7 Id. at 30-31.
8 Id. at 32-33.
9 Id. at 80.
This complicated tale of forensic failure includes a very simple lesson for the criminal justice system: a second or third or fourth opinion does not settle the factual matter of whether a fingerprint can accurately be associated to a suspect. For this reason, this Court must reject the State's claim that the important issues contained in Mr. Henderson's Motion to Exclude Fingerprint Testimony are somehow moot because the State has sought a second opinion. Rather, this Court should proceed to the merits of this litigation to determine as a factual matter the extent of systemic failure with the LPU, the effects of these failures on the reliability of the forensic evidence in this case, and whether the trier of fact should be exposed to the misleading opinions of the fingerprint examiner in this case.

X. CONCLUSION

For the reasons stated above, Mr. Henderson requests that this Court do the following: 1) Exclude the results of the fingerprint examination in this case; 2) Order that the LPU pass an independent and comprehensive audit before the results of fingerprint examinations by the LPU will be admitted into evidence in this court; 3) Conduct an evidentiary hearing to further establish the factual basis of claims made herein.

Respectfully Submitted,

Brendan Max, Carly
Cook County Public Defender Office