Issues Related to Data Acquisition
In analyzing special issues related to data acquisition, there are numerous factors to account for. The configuration, performance, security, and accountability of a data system determine the quality and depth of the infrastructure it can manage.
Knowledge of computer workstations and other processing devices.
Computer workstations are the core on which all data is stored, shared, or retrieved. Understanding the range of issues that can arise between a workstation and its network is essential for preventing, or successfully retracing, malware infections and other data corruption. A wide range of data acquisition tools and devices is used in the process of retrieving or supporting data. OpenNMS is an enterprise-grade network management platform developed as an open source tool. As studies of such resources note, “The end result of this, from a responder/analyst perspective, was that a malware infection became the least frequent activity to occur on a system” (Carvey, 2011). Tools built to secure open source systems are especially useful because they can adapt to a wide range of malware and new data sets that could potentially damage a network. Sipc is a voice over IP (VoIP) application that relies on the Session Initiation Protocol (SIP) to distribute a telephony network and support voice, video, and data media streams between clients (Luo, 2012). On open source networks such as this, malware is a very real threat. As many data forensics experts note, malware authors design their viruses to be unobtrusive: “As malware authors and intruders began taking specific steps to ensure that their actions became less noticeable and ‘flew beneath the radar’, these actions became more difficult to detect, as the infections did not result in massive amounts of file activity or memory consumption” (Carvey, 2011). This is why all certified Forensic Examiners must possess deep knowledge of computer workstations and devices. A mental map of partitions and underlying system tools is the single most important weapon against data corruption.
Live system forensics.
Live system forensics requires the extraction, identification, preservation, and documentation of computer data or evidence in real time, usually data at high risk of being corrupted. There are many aspects to live system forensics, from retrieving old files to tracing a particular data process, usually in order to resolve a system crash, error, or malware infection affecting the system (Luo, 2012). Integrity checking, otherwise known as change detection, is the act of scanning data for hashed signatures. These hashed signatures are then compared with those of the original files stored in the database to see whether there is a match. If the signatures match, the integrity of the file is sound (NY Computer Science Services, 2012).
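The change-detection comparison described above can be sketched in a few lines of Python. This is a generic illustration using SHA-256; the function names are hypothetical and do not come from any specific forensic product:

```python
import hashlib

def signature(data: bytes) -> str:
    """Hash a byte stream to produce its integrity signature."""
    return hashlib.sha256(data).hexdigest()

def integrity_ok(data: bytes, stored_signature: str) -> bool:
    """Change detection: re-hash the data and compare to the stored signature."""
    return signature(data) == stored_signature

# A stored signature matches only while the data is unmodified.
baseline = signature(b"system configuration v1")
assert integrity_ok(b"system configuration v1", baseline)
assert not integrity_ok(b"system configuration v2", baseline)  # one byte changed
```

In practice the baseline signatures are computed while the system is known to be clean and stored separately from the data they protect, so that an intruder cannot alter both at once.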
Knowledge of application-based file systems.
Knowledge of application-based file systems often refers to the way files are written on a computer. For example, the Windows, DOS, OS/2, UNIX, and Macintosh operating systems all have file systems. Analysis of application-based file systems through third-party applications incorporates theories like Locard’s Exchange Principle. As Carvey notes, “Locard’s Exchange Principle: This is an analysis concept that has been addressed and discussed in a number of resources; I’m including it here because no discussion of analysis concepts would be complete without it. In short, Locard was a French scientist who postulated that when two objects came into contact, material from each was transferred to the other” (Carvey, 2011). With application-based file systems, little work is necessary on the part of the user, but these file systems can also be more vulnerable to corruption if one is not careful about what data the system is exposed to. As the author further notes, “In short, any interaction between two entities (one being the computer operating system) results in the transfer or creation of data” (Carvey, 2011). In other words, application-based file systems create data in the very act of storing it.
Application-oriented data acquisition methods.
All modern applications deal with data in one way or another; this is particularly true of web applications, which are fundamentally built around manipulating data: search results, catalog inventory, user profile information, map coordinates, financial data, and so on. Data stored in a database is not usually the type best acquired through application-oriented acquisition methods. As Carvey notes, “Generally, there are two types of artifacts that you can expect to find when performing an examination: direct and indirect” (Carvey, 2011). These two forms come standard but still require presentation or extraction on an open source network. “A direct artifact is something that is the direct result of an incident, such as a malware infection or an intrusion. These are usually things like files that are added or copied to a system” (Carvey, 2011). A direct artifact falls into the classification of presentation tools: presentation tools arrange data from an extraction tool into a readable or useful format, while extraction tools process data to retrieve a subset. According to Carvey, the latter corresponds to an indirect artifact: “An indirect artifact is something that is the result of the ecosystem or environment in which the incident occurs, and is not a direct result of the incident” (Carvey, 2011). Both tool sets require knowledge of open source systems versus data storage, as it relates to the difference in their application.
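The division of labor between extraction and presentation tools can be illustrated with a minimal Python sketch. Every record, field name, and function here is hypothetical, invented for illustration only:

```python
# Hypothetical records from an acquired data set; the artifact labels
# follow the direct/indirect distinction quoted above.
records = [
    {"file": "invoice.pdf", "artifact": "direct",   "note": "copied by malware"},
    {"file": "prefetch.pf", "artifact": "indirect", "note": "created by the OS"},
    {"file": "report.docx", "artifact": "direct",   "note": "modified payload"},
]

def extract(records, artifact_type):
    """Extraction tool: process the data set to retrieve a subset."""
    return [r for r in records if r["artifact"] == artifact_type]

def present(subset):
    """Presentation tool: arrange extracted data into a readable format."""
    return "\n".join(f"{r['file']:<14} {r['note']}" for r in subset)

print(present(extract(records, "direct")))
```

The point of the sketch is the pipeline shape: extraction narrows the data, presentation only reformats what extraction hands it.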
Numerous geospatial studies provide application-oriented case studies showing that the most popular method of application-oriented retrieval is tracking target data from big datasets saved in a spatial database. This method is executed by inserting and retrieving tracks, analyzed in terms of different spatial data type designs. Multi-sourced database systems organize data and knowledge with integrity by organizing observed spectral data of ground objects. Utilizing remote sensing images and remote sensing models according to a given application is one of the core implementation methods for application-oriented data acquisition; another is to interpret the application’s request for data and models in a command line (i.e., in the format of many standard programming languages), which is stored in a job table. In many ways, executing the application’s request uses reverse logic.
Application-driven data forensics tools.
There are two main tool categories commonly used in data forensics: extraction and presentation. While most vendors are unwilling to publish the source code of their data forensics tools, the core goal of all of these tools is to provide a valid integrity check during the forensic process. In some cases, computer forensics is actually a search for evidence before the initial crash, as opposed to waiting until an error happens. The three main aspects of computer forensics examinations of a network that require integrity checks are active, archival, and latent data (NY Computer Science Services, 2012). Active data is usable information and the easiest type to retrieve; it consists of programs, data files, used files, and operating systems. Archival data is stored through alternative media: hard drives, USB drives, CDs, or even tapes. Latent data needs specific tools for proper access and is usually the hardest to retrieve, typically because it has been deleted or overwritten. Checking the integrity of files is the core focus of computer forensics.
Some very powerful computer forensics tools include Foremost, Scalpel, PhotoRec, FTK, and ddrescue. Here is more detailed information about some of the most effective application-driven forensic tools in the industry:
Foremost is a forensic data carving tool that works on Linux operating systems. It was originally designed by the U.S. Air Force. Foremost commands allow users to extract data from a wide range of file types including jpg, gif, png, ole, and pdf, among others. Once files are extracted, it produces an audit.txt file that gives the user detailed information about all the files and their influence on other processes.
Scalpel is an open source carving utility, very similar to Foremost but much faster, that reads a database of header and footer signatures and recovers files from FATx, NTFS, ext2/3, or raw partitions. Scalpel is ideal for file recovery and digital forensics investigations because its functionality allows the user to recover files from an extended file system. The tool is driven by a configuration file; specific settings, such as maximum carve bounds, can be adjusted, and the application provides file format specifications and detailed data on extended file systems that many other tools do not. Carving, or un-deleting, files can be very complicated and risky for a hard drive if not handled carefully; Scalpel informs the user which files can or cannot be overwritten or carved out of a file system. It does this through the use of header byte signatures and footer byte signatures. The tool also allows previews of files to make sure they render properly before they are carved. Due to these functions, Scalpel is considered one of the more secure tools for data recovery.
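Header/footer-signature carving of the kind Scalpel performs can be sketched in Python. This is not Scalpel's actual code; the JPEG signatures and the max-bounds value are standard illustrative choices, and the `carve` function is hypothetical:

```python
# Minimal sketch of header/footer carving. A real carver like Scalpel
# reads these signatures from its configuration file.
JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"

def carve(image: bytes, header: bytes, footer: bytes,
          max_size: int = 10_000_000):
    """Scan a raw byte image and return candidate files found between
    a header signature and the next footer signature."""
    carved = []
    start = image.find(header)
    while start != -1:
        end = image.find(footer, start + len(header))
        if end == -1:
            break                      # header with no matching footer
        end += len(footer)
        if end - start <= max_size:    # honor a max-bounds setting
            carved.append(image[start:end])
        start = image.find(header, end)
    return carved

# A toy disk image: junk bytes surrounding one embedded "JPEG".
disk = b"\x00junk" + JPEG_HEADER + b"payload" + JPEG_FOOTER + b"trailing"
assert carve(disk, JPEG_HEADER, JPEG_FOOTER) == [JPEG_HEADER + b"payload" + JPEG_FOOTER]
```

Note that nothing in the file system's metadata is consulted; the carver works purely on byte patterns, which is why such tools can recover files whose directory entries were deleted long ago.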
PhotoRec is a file-carving data recovery tool that can retrieve data from camera media such as CompactFlash and SmartMedia cards, USB flash drives, hard disks, and CD-ROMs. Application-driven data forensics tools offer a wide range of selections and functionality. The user launches PhotoRec by opening it like a normal program; the application then allows the user to export all of their files to a specific folder. Unlike Scalpel, PhotoRec opens data on a massive scale and does not provide the same depth of selection for partition table types.
TestDisk can rebuild partition tables and restore file allocations. It also allows users to recover files from formatted partitions. TestDisk is considered a more advanced companion to PhotoRec, but it is very simple to use. TestDisk excels at locating corrupt files thanks to its data recovery utilities, which can copy files and place them into directories based on their integrity.
Forensic Toolkit (FTK) is recognized worldwide as one of the standards in computer forensics software; it is accepted in courts as a credible digital investigations platform. Compared to many of the other applications on this list, it offers significant speed and is known for its stability and intuitive interface capabilities, such as e-mail analysis and customizable data views. Its vendor, AccessData, offers expansion modules for malware analysis, including the Cerberus add-on, a malware triage technology that provides analytics to determine the behavior and intent of suspect binaries. The system even provides scores to assess levels of vulnerability.
In sum, if a data system is corrupted, the smartest thing to do is act quickly by initiating the forensic process. Certified Forensic Examiners carry out all of the necessary diagnostics and integrity checks needed to re-secure a system. These professionals also have a depth of knowledge about data forensics that cannot be matched by civilian standards.
References
Carvey, H. (2011). Windows forensic analysis DVD toolkit. Syngress.
Luo, J. (2012). Affective computing and intelligent interaction (p. 980). Springer.
NY Computer Science Services. (2012). The computer forensic examination process. Retrieved from http://www.newyorkcomputerforensics.com/learn/forensics_process.php