From the Editor-in-Chief
Welcome to JDFSL’s first issue for 2015! First, I would like to thank our editorial board, reviewers, and the JDFSL team for bringing this issue to life. It has been a big year for JDFSL as the journal continues to progress. We are continuing our indexing efforts for the journal and are getting closer with some of the major databases.
Measuring Accuracy of Automated Parsing and Categorization Tools and Processes in Digital Investigations
This work presents a method for measuring the accuracy of evidential artifact extraction and categorization tasks in digital forensic investigations. Instead of focusing on the measurement of accuracy and errors in the functions of digital forensic tools, this work proposes the application of information retrieval measurement techniques that allow the incorporation of errors introduced by tools and analysis processes. This method uses a "gold standard" that is the collection of evidential objects determined by a digital investigator from suspect data with an unknown ground truth. This work proposes that the accuracy of tools and investigation processes can be evaluated against the derived gold standard using common precision and recall values. Two example case studies are presented showing the measurement of the accuracy of automated analysis tools as compared to an in-depth analysis by an expert. It is shown that such measurement can allow investigators to determine changes in the accuracy of their processes over time, and to determine whether such a change is caused by their tools or their knowledge.
Comment: 17 pages, 2 appendices, 1 figure, 5th International Conference on Digital Forensics and Cyber Crime; Digital Forensics and Cyber Crime, pp. 147-169, 201
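The precision/recall comparison the abstract describes can be sketched as follows. This is an illustrative example, not code from the paper: the artifact names are hypothetical, and each artifact set is modeled simply as a set of identifiers, with the expert's in-depth analysis serving as the gold standard.

```python
# Sketch: score an automated tool's extracted evidential artifacts
# against an expert-derived "gold standard" set using precision/recall.

def precision_recall(tool_artifacts, gold_standard):
    """Precision: fraction of the tool's output that is truly evidential.
    Recall: fraction of the gold-standard evidence the tool recovered."""
    tool = set(tool_artifacts)
    gold = set(gold_standard)
    true_positives = len(tool & gold)
    precision = true_positives / len(tool) if tool else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical case: the expert identified 4 evidential objects;
# the tool returned 4 objects, 3 of which match the expert's set.
gold = {"img_001.jpg", "chat.db", "note.txt", "mail_17.eml"}
tool = {"img_001.jpg", "chat.db", "note.txt", "cache.tmp"}
p, r = precision_recall(tool, gold)
print(p, r)  # 0.75 0.75
```

Tracking these two values across cases is what lets an investigator see whether accuracy drifts over time, and a drop in precision versus a drop in recall points to different causes (over-extraction by the tool versus missed evidence).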
FROM THE EDITOR
In this issue we have three papers that have made the cut. The first paper, titled “The Cost of Privacy: Riley v. California’s Impact on Cell Phone Searches,” is timely. In 2014, a unanimous decision established that a warrant is required for all cell phone searches. This has strong implications for the forensic analysis of mobile phones, and to that end, this article discusses and summarizes this legal precedent and its practical implications.
Generating system requirements for a mobile digital device collection system: A preliminary step towards enhancing the forensic collection of digital devices
Collecting digital devices in a forensically sound manner is becoming more critical, since 80% of all cases have some sort of digital evidence involved in them (Rogers, 2006, p. 1). The process of documenting and tagging digital devices is cumbersome and involves details that might not apply to other types of evidence, since each evidence item has unique physical characteristics (Hesitis & Wilbon, 2005, p. 17). The process becomes less manageable when a large number of digital devices are seized. This paper examines the information and issues investigators should be aware of when collecting digital devices at crime scenes. Furthermore, this paper proposes a mobile solution that can potentially improve the process of forensic digital device collection by keeping track of what has been collected at a crime scene.
Paper Session II: Forensic Scene Documentation Using Mobile Technology
This paper outlines a framework for integrating forensic scene documentation with mobile technology. Currently there are no set standards for documenting a forensic scene. Nonetheless, there is a conceptual framework that forensic scientists and engineers use that includes note taking, scene sketches, photographs, video, and voice interview recordings. This conceptual framework will be the basis on which a mobile forensic scene documentation software system is built. A mobile software system for documenting a forensic scene may help standardize forensic scene documentation by regulating the data collection and documentation processes across forensic disciplines.
From the Editor-in-Chief
Welcome to JDFSL’s second issue for 2015! First, I would like to thank our editorial board, reviewers, and the JDFSL team for bringing this issue to life. In this issue, we continue our multidisciplinary tradition. The first paper, “Two Challenges of Stealthy Hypervisors Detection: Time Cheating and Data Fluctuations,” showcases an important contribution to the computing discipline. The use of virtualization has dramatically increased given our strong reliance on cloud services, both private and public. Even though hypervisors enhance security, they can also be exploited by malware. This paper is therefore of importance, as it introduces a novel method for detecting stealthy hypervisors.
From the Editor-in-Chief
We are proud to share with you this special edition issue of the JDFSL. This year, JDFSL partnered with both the 6th International Conference on Digital Forensics and Cyber Crime (ICDF2C) and Systematic Approaches to Digital Forensic Engineering (SADFE) – two prominent conferences in our field that were co-hosted. Fifty-three papers were submitted, and the Technical Program Committee accepted only 17 after a rigorous review process.
Cybercompetitions: A survey of competitions, tools, and systems to support cybersecurity education
Over the last decade, industry and academia have worked towards raising students’ interest in cybersecurity through game-like competitions to fill a shortfall of cybersecurity professionals. Rising interest in video games, in combination with gamification techniques, makes learning fun, easy, and engaging. It is crucial that cybersecurity curricula enhance and expose cybersecurity education to a diversified student body to meet workforce demands, and gamification through cybercompetitions is one method to achieve that. With a vast list of options for competition type, focus areas, learning outcomes, and participant experience levels, we need to systematize knowledge of the attributes that improve cybercompetitions. In the wake of the COVID-19 pandemic and global lockdowns, competition hosts scrambled to move platforms from local to online infrastructure due to poor interoperability between competition software. Our paper aims to systematize cybersecurity, access control, and programming competitions by surveying the history of these events. We explore the types of competitions that have been hosted and categorize them based on focus areas related to the InfoSec Color Wheel. We then explore state-of-the-art technologies that enable these types of competitions and, finally, present our takeaways, including the lack of interoperability between state-of-the-art competition systems, the high knowledge barrier to participation, and the need for competition-type diversity, along with potential solutions and research questions moving forward.
File Detection on Network Traffic Using Approximate Matching
In recent years, Internet technologies have changed enormously, allowing faster Internet connections, higher data rates, and mobile usage. Hence, it is possible to send huge amounts of data/files easily, which is often exploited by insiders or attackers to steal intellectual property. As a consequence, data leakage prevention systems (DLPS) have been developed, which analyze network traffic and alert in case of a data leak. Although the overall concepts of the detection techniques are known, the systems are mostly closed and commercial. Within this paper we present a new technique for network traffic analysis based on approximate matching (a.k.a. fuzzy hashing), which is very common in digital forensics to correlate similar files. This paper demonstrates how to optimize and apply it to single network packets. Our contribution is a straightforward concept which does not need a comprehensive configuration: hash the file and store the digest in the database. Within our experiments we obtained false positive rates between 10−4 and 10−5 and an algorithm throughput of over 650 Mbit/s.
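The "hash the file, store the digest, check traffic against the database" idea can be illustrated with a minimal sketch. This is not the paper's approximate-matching algorithm (which builds on fuzzy hashing): it is a simplified stand-in that splits protected files into fixed-size chunks, stores a cryptographic digest per chunk, and flags a packet payload when enough of its chunks hit the database. The chunk size and match threshold are hypothetical parameters chosen for illustration.

```python
# Sketch of chunk-digest matching for leak detection on packet payloads.
# Simplified stand-in for approximate matching (fuzzy hashing), for
# illustration only: exact chunk hashes, unlike fuzzy hashes, will miss
# payloads that are only similar rather than identical to stored chunks.
import hashlib

CHUNK = 64  # bytes per chunk (assumed value)

def chunk_digests(data: bytes) -> set:
    """Digest every non-overlapping CHUNK-sized window of the data."""
    return {hashlib.sha1(data[i:i + CHUNK]).digest()
            for i in range(0, len(data) - CHUNK + 1, CHUNK)}

def build_db(protected_files) -> set:
    """Hash each protected file's chunks and store the digests."""
    db = set()
    for content in protected_files:
        db |= chunk_digests(content)
    return db

def payload_matches(payload: bytes, db: set, threshold: float = 0.5) -> bool:
    """Alert if at least `threshold` of the payload's chunks are known."""
    digests = chunk_digests(payload)
    if not digests:
        return False
    hits = sum(1 for d in digests if d in db)
    return hits / len(digests) >= threshold
```

A real DLPS along these lines would apply this per packet at line rate, which is why the paper's reported throughput (over 650 Mbit/s) and low false positive rates matter for practicality.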
