Please use this identifier to cite or link to this item: http://dx.doi.org/10.25673/34885
Title: On digitized forensics : novel acquisition and analysis techniques for latent fingerprints based on signal processing and pattern recognition
Author(s): Hildebrandt, Mario
Referee(s): Dittmann, Jana
Granting Institution: Otto-von-Guericke-Universität Magdeburg, Fakultät für Informatik
Issue Date: 2020
Extent: xxiv, 243 pages
Type: Hochschulschrift
Type: PhDThesis
Exam Date: 2020
Language: English
URN: urn:nbn:de:gbv:ma9:1-1981185920-350853
Subjects: Kriminologie (criminology)
Maschinelles Sehen (machine vision)
Abstract: Forensic investigations are an important part of the analysis of committed crimes in order to reconstruct the sequence of events and eventually bring the offender to trial. While the forensic sciences have developed considerably, especially within the last century, the application of novel techniques poses challenges from practical and legal perspectives. In general, novel techniques need to be assessed prior to the admission of the resulting evidence in court. At the level of the U.S. Supreme Court, for example, this assessment is governed by the Federal Rules of Evidence, in particular Rule 702 addressing expert testimony, and the Daubert challenge for scientific evidence. During the latter, at least five so-called Daubert factors are assessed prior to the admission of novel techniques in court: "Whether a method/technique can be (and has been) tested", "Whether a method/technique has been subject to peer review and publication", "The known or potential rate of error of a method/technique", "The existence and maintenance of standards controlling the technique's operation" and "Whether a method/technique is generally accepted in the scientific community".
The introduction of novel sensors and processing techniques is a growing research field, as such techniques allow for analyzing new details of traces, reducing the impact on other traces, and increasing the repeatability of the processing steps. A part of such novel techniques could be represented by Computational Forensics, which describes the utilization of computer-based techniques in forensic investigations. As this domain is rather broad and lacks specific requirements, the focus of this thesis is narrowed down to the newly introduced terminology of digitized forensics: the projection of the analysis of physical traces into the entirely digital domain, including the specific requirements. The intention of digitized forensics is a similar application of computer-based acquisition and processing techniques to the investigation of various types of traces. Such a similar application procedure could assist judges in their role as gatekeepers for novel techniques in a Daubert hearing to evaluate the particular forensic soundness in a similar process. Furthermore, shared requirements and processing steps can help to derive standards, guidelines and best practices for the application of such novel computer-based forensic technologies. In conjunction with those intentions, the following set of research questions is derived within the scope of this thesis:
• How could a generic digitized forensic investigation be formalized as a process and validated for the selected domain of latent fingerprints?
• Which novel challenges need to be addressed within digitized forensic investigations, in particular with respect to latent fingerprints?
• Which requirements need to be fulfilled by metrology sensors for an application in digitized forensics, and what is the impact of the syntax and semantics of the captured sensor data with respect to error, loss and uncertainty?
In order to address the lack of particular standards for computational forensics, this thesis introduces a novel process model designed particularly for digitized forensics, structured as sketched below. The intention of a first tier of phases is the general structuring of the forensic investigation process and trace handling. Afterward, a second tier of trace-specific phases can be derived under the canon of the first-tier phases.
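As a rough illustration of how such a two-tier process model could be represented in software, the following Python sketch models a set of generic first-tier phases and derives trace-specific second-tier phases under them. The phase names and the derive_second_tier helper are purely illustrative placeholders and are not the phases or terminology defined in the thesis.

```python
from dataclasses import dataclass, field

@dataclass
class Phase:
    """A single phase of the investigation process model."""
    name: str
    sub_phases: list["Phase"] = field(default_factory=list)

# Hypothetical first-tier phases for structuring the overall investigation
# and trace handling (illustrative names only, not the thesis' terminology).
FIRST_TIER = [
    Phase("strategic preparation"),
    Phase("trace acquisition"),
    Phase("trace analysis"),
    Phase("documentation"),
]

def derive_second_tier(first_tier: list[Phase], trace_type: str) -> list[Phase]:
    """Derive trace-specific second-tier phases under each first-tier phase."""
    for phase in first_tier:
        phase.sub_phases.append(Phase(f"{trace_type}-specific {phase.name}"))
    return first_tier

if __name__ == "__main__":
    for phase in derive_second_tier(FIRST_TIER, "latent fingerprint"):
        print(phase.name, "->", [p.name for p in phase.sub_phases])
```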
Secondly, particular requirements and novel challenges regarding the digitization of physical traces are assessed and addressed within this thesis, in particular the challenges of the authenticity of the traces, the reproducibility of results, and the benchmarking of processing techniques. Toward benchmarking and authenticity preservation, particular supporting forensic tools are introduced. Furthermore, a scheme for formalizing sensors for the digitization is introduced as a foundation for the selection of application-specific sensors. The introduced process model is validated on the foundation of two application scenarios in the domain of latent fingerprint processing. In this context, additional research questions are defined:
• How could and should latent fingerprints be captured and analyzed within a digitized forensics process using signal processing and pattern recognition to ensure an accurate digital representation of the physical trace?
• Which classification scheme suits a pattern-recognition-based fingerprint-substrate segregation best?
• How and in which way could the new technology support the detection of forged fingerprint traces?
The first application scenario addresses the challenge of separating the fingerprint pattern from the substrate data by means of statistical pattern recognition. For that, a feature space is designed and evaluated within a two-class supervised learning approach. Subsequently, the performance of the introduced approach is evaluated using automatic biometric matching with an off-the-shelf matching algorithm to approximate the resulting comparison performance of highly trained experts. The second application scenario addresses the challenge of latent fingerprint forgeries for the example of artificially printed latent fingerprint patterns. Similar to the first application scenario, a feature space is designed and evaluated using a two-class supervised learning approach. In addition, particular influence factors are systematically evaluated using the introduced StirTrace benchmarking approach. Overall, this thesis presents a cross-sectional topic, applying computer science to the forensic sciences.
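To make the fingerprint-substrate segregation idea more concrete, the sketch below shows a minimal two-class supervised learning setup in Python with scikit-learn: small image blocks are described by simple statistical features and classified as fingerprint residue or substrate. The block size, the feature set, the SVM classifier and the synthetic data are assumptions for illustration only; they are not the feature space, classifier or sensor data evaluated in the thesis.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

def block_features(block: np.ndarray) -> np.ndarray:
    """Simple statistical features per block (illustrative, not the thesis' feature space)."""
    return np.array([block.mean(), block.std(), np.median(block),
                     block.max() - block.min()])

def extract_blocks(image: np.ndarray, truth: np.ndarray, size: int = 16):
    """Cut an intensity image and its per-pixel ground truth into labeled feature vectors."""
    X, y = [], []
    for r in range(0, image.shape[0] - size + 1, size):
        for c in range(0, image.shape[1] - size + 1, size):
            X.append(block_features(image[r:r + size, c:c + size]))
            # Majority vote of the ground-truth mask decides the block label:
            # 1 = fingerprint residue, 0 = substrate.
            y.append(int(truth[r:r + size, c:c + size].mean() > 0.5))
    return np.array(X), np.array(y)

if __name__ == "__main__":
    # Synthetic stand-in data; real inputs would be digitized sensor scans with
    # ground-truth masks, so the scores printed here are meaningless placeholders.
    rng = np.random.default_rng(0)
    image = rng.normal(size=(256, 256))
    truth = (rng.random((256, 256)) > 0.5).astype(float)
    X, y = extract_blocks(image, truth)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))
```

In a pipeline following the abstract, the blocks predicted as fingerprint residue would then be reassembled into a fingerprint image and passed to an off-the-shelf biometric matcher to assess the resulting comparison performance.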
URI: https://opendata.uni-halle.de//handle/1981185920/35085
http://dx.doi.org/10.25673/34885
Open Access: Open access publication
License: (CC BY-SA 4.0) Creative Commons Attribution ShareAlike 4.0
Appears in Collections: Fakultät für Informatik

Files in This Item:
File: Hildebrandt_Mario_Dissertation_2020.pdf
Description: Dissertation
Size: 8.05 MB
Format: Adobe PDF