Acquiring Volatile Operating System Data: Tools and Techniques

Iain Sutherland*, Jon Evans†, Theodore Tryfonas*, Andrew Blyth*

† Gwent Police High Tech Crime Unit, Gwent Police Headquarters, Croesyceilog, Cwmbran, NP44 2XJ, United Kingdom. 00 44 (0)1633 838111

* Information Security Research Group, Faculty of Advanced Technology, University of Glamorgan, Pontypridd, Wales, CF37 1DL, United Kingdom. 00 44 (0)1443 654085

[email protected]

{isutherl, ttryfona, ajcblyth} @glam.ac.uk

ABSTRACT
The current approach to forensic examination during search and seizure has predominantly been to pull the plug on the suspect machine and subsequently perform a post mortem examination on the storage medium. However, with the advent of larger memory capacities, drive encryption and anti-forensics, this procedure may result in the loss of valuable evidence. Volatile data may be vital in determining criminal activity: it may contain passwords used for encryption, indications of anti-forensic techniques, or memory resident malware which would otherwise go unnoticed by the investigator. This paper emphasizes the importance of understanding the potential value of volatile data and how best to collate forensic artifacts to the benefit of the investigation, ensuring the preservation and integrity of the evidence. The paper reviews current methods for volatile data collection, assessing the capabilities, limitations and liabilities of the tools and techniques available to the forensic investigator.

Categories and Subject Descriptors
K.4.2 [Social Issues]: Abuse and Crime Involving Computers.

General Terms
Algorithms, Measurement, Experimentation, Security, Standardization, Legal Aspects, Verification.

Keywords
Forensics, volatile data, live acquisition.

1. INTRODUCTION
The operating system can provide the investigator with a range of artefacts containing information of forensic value. Windows, for example, generates a number of records. The Windows Registry is one of the main locations in which to view information relating to the OS settings, the applications that have been installed on the machine, and information about users and user preferences. Internet Explorer caches the most recently typed URLs on the machine for each user, and Protected Storage holds information from the Internet Explorer form auto-complete function, which includes usernames, passwords and other user specific information [18]. Other essential information may be recovered from system logs, which keep a record of system, application or security related events. Assuming log integrity, the event logs within Microsoft Windows allow a forensic investigator to obtain information about software, hardware and system components, and to monitor security events on a local or remote computer [17].

One particular area of current research is obtaining volatile data from the operating system [2], [4], [7], [16]. Volatile data may be defined as any data which no longer exists when power is removed from a computer system [8]. The traditional forensic examination approach during search and seizure has principally been to pull the plug on the suspect machine and then to perform a post mortem examination on the storage medium. But with the increasing availability of drive encryption, anti-forensics tools and techniques, and the continued expansion in memory capacity, the 'pull the plug' approach may result in the loss of valuable evidence contained in volatile data. Such data may be vital to determining criminal activity.

Computer systems sold for domestic use now have memory capacities of around 4GB. Consequently the capture of data from within RAM is becoming increasingly significant. It may allow for the discovery of passwords used for encryption or the acquisition of a mounted volume which would otherwise appear as encrypted data. Other key issues of concern which may be further aided by live analysis are the effects of network intrusion [11] or malware in the form of memory resident Trojans or rootkits, which would otherwise go unnoticed by an investigator relying solely on a post mortem examination [5]. Further threats are posed by anti-forensics techniques aimed at obfuscating the traditional forensic approach. In addition to these concerns, pulling the plug on a live system may not necessarily be the best course of action in certain circumstances. One example scenario would be that of a business critical networked file server. It may not be feasible to power off such a server due to the following:

1) Loss of business productivity.
2) Acquisition of the whole drive, typically terabytes on this type of system, is difficult, so selective data acquisition must be considered.
3) Advanced RAID configurations may make the acquisition of a single logical image in a reasonable time frame difficult.
4) Powering off the machine may result in the loss of potential network evidence such as traffic session data.
5) There is no guarantee that a server that has been running continually for an extensive period of time can successfully be restarted.

A further example would be a home network where several machines belonging to friends or household members are connected. Powering off these machines may result in the loss of network evidence such as traffic session data. Such evidence may assist an investigator in determining who had access to what file shares or who had been allocated the relevant Internet Protocol (IP) addresses.

This paper examines current practice in the collection of live memory artifacts and assesses the impact of using particular tools and techniques in the acquisition of live memory. The paper focuses on Windows XP Service Pack 2 systems on the Intel architecture and proposes a series of assessments for tools to ensure an examiner is able to quantify the forensic footprint of any given tool.

2. THE CHALLENGE OF VOLATILE DATA COLLECTION

Live data analysis requires an understanding of some of the specific issues relating to volatile data. Volatile data is any data that is stored in memory, or in transit, that will be lost when the computer loses power or is powered off; it resides in registers, caches and Random Access Memory (RAM) [8]. Live volatile data that can be collected may consist of [7] (an illustrative collection sketch follows the list):

- System date and time
- Logged on user(s) and their authentication credentials
- Process information
- Network connections
- Network status
- Clipboard contents
- Command history
- Services/driver information
- Open files
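As an illustration only (this is not a procedure taken from the paper), collection of several of these items can be scripted so that each command, its start time and a hash of its output are logged as they are captured. The sketch below assumes the output is written back to removable media mounted as E:, and it invokes the native Windows command-line utilities; as discussed later, these rely on binaries and DLLs of the running system, which is itself a limitation on Windows.

    import datetime
    import hashlib
    import os
    import subprocess

    OUT_DIR = r"E:\output"   # assumed responder-controlled removable media

    # Items roughly ordered from most to least volatile; commands are the
    # standard Windows utilities and therefore depend on the target system.
    COMMANDS = [
        ("date_time",  "date /t & time /t"),
        ("processes",  "tasklist /v"),
        ("netstat",    "netstat -ano"),
        ("net_config", "ipconfig /all"),
        ("open_files", "net file"),
    ]

    os.makedirs(OUT_DIR, exist_ok=True)
    with open(os.path.join(OUT_DIR, "collection_log.txt"), "a") as log:
        for name, command in COMMANDS:
            started = datetime.datetime.now().isoformat()
            result = subprocess.run(command, shell=True, capture_output=True)
            with open(os.path.join(OUT_DIR, name + ".txt"), "wb") as out_file:
                out_file.write(result.stdout)
            digest = hashlib.md5(result.stdout).hexdigest()
            # Record what was run, when, and a hash of the captured output.
            log.write("%s %s md5=%s\n" % (started, command, digest))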

However, information can also be volatile if it is changed as a result of the system being shut down and rebooted, such as the access times on files that are accessed during shutdown or restart [6]. Conducting a live forensic examination involves a more complex approach than the traditional post mortem examination. Care must be taken by the examiner on a live system to minimize the impact of any tools used. However, there is no way to avoid making changes: in order to conduct a live examination it is necessary to deploy tools on the live system to capture data, and such tools will make changes to the running system. In addition, results cannot be reproduced, it may be difficult to address new questions which arise later regarding the data, and certain data collection tools rely on trust to gather accurate results. This may be a problem where the Operating System Environment (OSE) may be compromised, and may present difficulties when presenting evidence at court which was collated as a result of a live forensic examination.

2.1 Law Enforcement Guidelines in the UK
Criminal investigations involving live memory analysis carried out by law enforcement within the UK are guided by two of the underlying principles of the Association of Chief Police Officers (ACPO) Guidelines:

Principle 1: No action taken by law enforcement agencies or their agents should change data held on a computer or storage media which may subsequently be relied upon in court.

Principle 2: In exceptional circumstances, where a person finds it necessary to access original data held on a computer or on storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions [1].

The wording permits the use of live investigations, and elsewhere the ACPO Guidelines recognize the importance of the content of volatile data:

"If the traditional approach of removing the power to the device is followed such artifacts are lost. If captured beforehand and then removing the power, an investigator will have a wealth of information from the machine's volatile state in conjunction with the evidence on the hard disk. By profiling the forensic footprint of trusted volatile data forensic tools, an investigator will be in a position to understand the impact of using such tools and will therefore consider this during the investigation and when presenting evidence." [1]

The ACPO Guidelines do not, however, provide detailed directions on the acquisition of volatile data, as that is outside their remit. It is not possible to predict how the courts will view evidence based on live data collection, so the challenge now faced by many criminal investigations stems from the need for evidence which can withstand defence scrutiny at trial. For this reason it is important that the forensic investigator has a thorough knowledge and understanding of the impact of the tools he or she is deploying on the system, and is able to identify the changes that the deployed tools make to the system during a post-mortem examination. Although the techniques involved in memory analysis are an area of current research [16], [27], [29], there has been limited work on the impact of deployed tools. Such work would allow the investigator to justify to the court the actions taken, describe the impact of the investigator's actions on the computer system, and explain the benefits of taking that particular course of action.

To limit the rejection of evidence from live memory a methodological approach is required, as highlighted by a number of international standards; the ACPO Guidelines state:

"The recommended approach towards seizing a machine whilst preserving network and other volatile data is to use a sound and predetermined methodology for data collection." [1]

66

However, this begs the question: what is the pre-determined methodology? The rapid changes in memory development and technology have a major impact in this area, in that operating systems, and revisions of operating systems, handle data in different ways, and so new tools are developed and existing tools and techniques are revised rapidly to keep in step with changes in operating systems. It is therefore important to have a methodology for evidence tool testing which can provide some flexibility and resilience to change. In order to facilitate this it should be possible to develop a methodology to aid the investigator in assessing the impact of tools on a target's operating system.

2.2 Legal Impact of Live Analysis in the UK
The forensic examiner must consider the implications when conducting a live forensic examination and must be able to justify any subsequent actions. The ACPO Guidelines should be given careful consideration, and an examination should only be conducted if it is lawful, proportional and necessary. For the courts to accept evidence it has to be admissible and it has to have weight (persuasiveness or probative value). The item of evidence should be shown to be authentic, accurate and complete. Under UK law the principle of equality of arms requires that the defence have an opportunity to test the evidence presented by the prosecution. The defence could argue for an "abuse of process", which has been defined as "something so unfair and wrong with the prosecution that the court should not allow a prosecutor to proceed with what is, in all other respects, a regular proceeding" [9].

Likewise the defence could seek to have the evidence excluded under Section 78 of the Police and Criminal Evidence Act. Evidence collated from "live data analysis" presents some difficulties in its classification and admissibility as evidence. Where trusted, validated tools are deployed the reported results could be considered as "testimonial evidence" (testimonial: the eyewitness observations of someone who was present and whose recollections can be tested before the court) [15].

An example would be where a traffic officer provides evidence relating to the use of a speed camera instrument. Thus the deployment of trusted tools, the documentation relating to their use and the subsequent results should be admissible.

As volatile evidence is by its very nature not static, this presents difficulties. It should therefore be shown that the tools and techniques deployed can stand up to the rigour of cross examination by the defence. This paper aims to address some of these issues by suggesting a series of analyses forming a framework within which future use of such tools can be tested.

3. VOLATILE DATA COLLECTION AND TOOLS
There are a number of tools available to acquire volatile data such as physical memory, network state, running processes, and mounted encrypted volumes. Each should be tested and validated to ensure that its use is as minimally invasive as possible. From main memory it may be possible for the examiner to find passwords and data associated with user applications, such as the contents of a Word document. It may also be possible to retrieve chat logs, valuable data which may not be cached on a hard disk. From the network state it may be possible to capture the state of a malevolent connection with its associated IP address, providing further leads for an investigator. From running processes, in addition to any user data, it may be possible to capture process options; for example, hacking tools run from the command line may show the command-line options used together with IP addresses. Malevolent processes such as anti-forensics tools may be identified and acquired, allowing later analysis which may assist the examiner during the traditional post-mortem examination of the system.

It is desirable for the examiner to capture as much volatile data as possible, and the order in which data capture occurs could also be crucial to the investigation. The examiner should therefore consider carefully the order of data collection, obtaining the data most likely to be modified at the earliest stage of the investigation. The order of volatility could be described as follows (with 1 being the most volatile):

1. Registers, peripheral memory, caches etc.
2. Physical memory
3. Network state
4. Running processes
5. Mounted encrypted volumes or other encrypted volumes.

Physical memory should always be acquired first, but the tools that are subsequently deployed will largely depend on the type of investigation, including:

- What offences are being investigated?
- Is this a network related incident?
- Is encryption suspected to have been used?
- Who is the user of the machine, or who was recently using the machine?

Many of these questions can only be answered by interviewing suspects or other persons in charge or control of the equipment at the actual scene of the investigation.

When executing tools to obtain memory, network or system details, the use of static binaries in incident response is generally considered best practice [2]. This is easier to achieve on open source operating systems, as many of the required tools can be compiled with the libraries they need to function rather than using libraries from a possibly compromised system. Owing to the closed nature of the Windows operating system the same functionality is not achievable, leaving many processes with little option but to use Dynamic Link Libraries (DLLs) from the compromised system. However, Windows XP and later versions of Windows support side-by-side assemblies using manifests [19], meaning that a safe DLL can be used where a process requires it rather than relying on DLLs on a compromised system.



3.1 Assessing the Impact of Volatile Data Collection
When deploying a volatile data collection tool there are a number of areas of the system that may be affected. The most obvious impact is on the contents of memory as the tool is deployed; investigators need to be able to isolate this when examining any subsequent memory dump. As we are considering Windows XP systems, there is also the possibility of an impact on the Windows Registry. Some tools may leave direct traces in the Registry as keys are modified, or indirect traces, for example where an external file system is connected to the system to capture the results of the tool's analysis. There may be modifications to the file system, as active processes may be recorded in the prefetch directory (typically C:\WINDOWS\Prefetch). The use of a tool may also be recorded in audit logs, depending on how the audit policies on the system are configured; by default Windows XP does not collate such activity, but if appropriate audit policies and security settings are enabled an entry may appear in the Security event log. A further possible area of interaction with the system is that tools may use DLLs present on the system, although this may lead to questionable reliability as these DLLs may themselves be compromised. Key areas of assessment for volatile data collection tools used on Windows operating systems can then be summarized as follows (a simple footprint-snapshot sketch is given after the list):

- Amount of memory required by the tool
- The tool's impact on the Windows Registry
- The tool's impact on the file system
- Use of DLLs present on the system
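To illustrate how part of this footprint might be recorded during tool testing, the following is a minimal sketch (not part of the original study) that snapshots the prefetch directory and a small set of Registry keys before and after a candidate tool is run, and reports the differences. The tool path and the choice of watched keys are illustrative assumptions.

    import os
    import subprocess
    import winreg

    PREFETCH = r"C:\WINDOWS\Prefetch"
    # MRU keys of the kind later noted for GUI tools; the selection is an assumption.
    WATCHED_KEYS = [
        (winreg.HKEY_CURRENT_USER,
         r"Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\OpenSaveMRU"),
        (winreg.HKEY_CURRENT_USER,
         r"Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\LastVisitedMRU"),
    ]

    def registry_values(root, path):
        """Return {value name: data} for a key, or {} if the key does not exist."""
        values = {}
        try:
            with winreg.OpenKey(root, path) as key:
                index = 0
                while True:
                    try:
                        name, data, _ = winreg.EnumValue(key, index)
                        values[name] = data
                        index += 1
                    except OSError:
                        break
        except OSError:
            pass
        return values

    def snapshot():
        return {
            "prefetch": set(os.listdir(PREFETCH)),
            "registry": {path: registry_values(root, path) for root, path in WATCHED_KEYS},
        }

    before = snapshot()
    subprocess.run([r"E:\toolkit\ddnt.exe"], check=False)   # hypothetical renamed tool
    after = snapshot()

    print("New prefetch entries:", sorted(after["prefetch"] - before["prefetch"]))
    for _, path in WATCHED_KEYS:
        added = set(after["registry"][path]) - set(before["registry"][path])
        if added:
            print("New values under", path, ":", sorted(added))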

Figure 1: Physical Memory Acquisition Tools
Tool | Version used | Availability
FAUdd | v1.0.0.1036 (beta1) | http://users.erols.com/gmgarner/forensics/
Pmdump | v1.2 | http://ntsecurity.nu/toolbox/pmdump/
KnTTools | v1.1.0.2112 | http://users.erols.com/gmgarner/KnTTools/
Userdump | v8.1 | http://www.microsoft.com/downloads/details.aspx?FamilyID=e089ca41-6a87-40c8-bf69-28ac08570b7e&DisplayLang=en
Memdump | v2.0 | http://www.tssc.de/download/prods/memdump.zip
Nigilant32 | v0.1 beta | http://www.agileriskmanagement.com/download.html
ProDiscover Basic | v4.9 | http://toorcon.techpathways.com/uploads/ProDiscoverRelease49Basic.zip

Figure 2: Network Status Information Tools
Tool | Version used | Availability
Openports | v1.0 | http://www.diamondcs.com.au/downloads/openports.zip
Fport | v2.0 | http://www.foundstone.com/knowledge/proddesc/fport.html
PSFile | v1.02 | http://www.microsoft.com/technet/sysinternals/utilities/psfile.mspx
tcpvcon | v2.34 | http://www.microsoft.com/technet/sysinternals/Networking/TcpView.mspx
tcpview | v2.4 | http://www.microsoft.com/technet/sysinternals/Networking/TcpView.mspx

The way memory is addressed varies between operating systems and changes rapidly with each revision, so a definitive tool set would be difficult to develop and would rapidly become dated. It is therefore important to have a methodology for evidence tool testing which can provide some flexibility and resilience to change. The volatile data tools may be split into individual areas based on the purpose of each tool and the volatile data it collects. The tools detailed in figure one are some of those currently used to acquire live memory; although this is not a definitive listing, it serves to illustrate some of the software methods used in physical memory acquisition. It should be noted that there are also a number of hardware methods which may be appropriate under certain circumstances, for example the 'Tribble' device developed by Carrier and Grand [4]. Figure two details some of the more common tools used to obtain network status information, while figure three outlines a selection of tools used to gain information relating to system status.

Figure 3: System Status Information Tools
Tool | Version used | Availability
Pclip | v1.0 | http://unxutils.sourceforge.net/
Psinfo | v1.74 | http://download.sysinternals.com/Files/PsTools.zip
Pslist | v1.28 | http://download.sysinternals.com/Files/PsTools.zip
Psloggedon | v1.33 | http://download.sysinternals.com/Files/PsTools.zip
Psservice | v2.21 | http://download.sysinternals.com/Files/PsTools.zip
Tlist.exe | v5.1 | MS Debugging Tools

It is worth noting that Garner [9], [12] recommends renaming the tools as best practice to avoid malevolent processes identifying and hijacking them. It is also generally considered good practice to assign a unique name to any forensic tool used in volatile data collection; this sets the process apart and aids its identification as a "safe binary", but may also aid the investigator in identifying the tool in the acquired physical memory dump. The tools outlined in figure two can be used to capture network status information, including open files, while some may be used to associate network activity with the owning process.
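One simple way to follow this renaming advice, sketched below purely as an illustration, is to stage a copy of each vetted tool under a unique, case-specific name and record the mapping together with a hash, so that the renamed binary can later be recognised in a memory image and tied back to the original tool. The directory paths and the prefix are assumptions, not values from the paper.

    import hashlib
    import os
    import shutil

    SOURCE_DIR = r"E:\toolkit\original"   # assumed location of the vetted tools
    STAGE_DIR = r"E:\toolkit\deploy"
    CASE_PREFIX = "c2007-014_"            # hypothetical case-specific prefix

    def sha1_of(path):
        digest = hashlib.sha1()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(65536), b""):
                digest.update(block)
        return digest.hexdigest()

    os.makedirs(STAGE_DIR, exist_ok=True)
    with open(os.path.join(STAGE_DIR, "tool_manifest.txt"), "w") as manifest:
        for name in os.listdir(SOURCE_DIR):
            if not name.lower().endswith(".exe"):
                continue
            renamed = CASE_PREFIX + name          # e.g. c2007-014_dd.exe
            shutil.copy2(os.path.join(SOURCE_DIR, name),
                         os.path.join(STAGE_DIR, renamed))
            # Record original name, deployed name and hash for later identification.
            manifest.write("%s -> %s sha1=%s\n"
                           % (name, renamed, sha1_of(os.path.join(STAGE_DIR, renamed))))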

4. TOOL ASSESSMENT PLATFORM
The focus of this paper is to discuss possible tools and techniques suitable for collating volatile data on Windows XP Service Pack 2 systems and to demonstrate a process for measuring an evidence collection tool's impact on a system. Doing so should assist the investigator in developing an examination methodology and in justifying the appropriateness and use of any particular tool.

To quantify the impact of the tools in memory a test platform consisting of a Windows XP Service Pack 2 install with no user tasks being performed was created. The install was made to a VMWare Workstation ACE Edition e.x.p build-42757 machine.


The use of VMware in forensics is becoming increasingly recognised as a powerful tool [3]. In this examination VMware allowed the baseline environment to be archived and re-used repeatedly, either by replacing the VMware files from the archive or by using the VMware snapshot facility. This version of VMware was also chosen because it supports USB2 devices, an important factor where memory dumps were made to such devices in the VMware environment. Where VMware was not suitable, e.g. for memory acquisition using hardware techniques, an IBM laptop computer was used. Its hard drive was split into two logical partitions, with a baseline Windows XP SP2 installation placed on the first partition and the second partition used to store an image of the operating system created using Ghost 4 Linux (g4l64). This allowed the baseline installation to be restored onto the first partition when required.
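Where a virtual machine baseline is used in this way, reverting to the clean snapshot between tool runs can itself be scripted. The sketch below is illustrative only: it assumes VMware's vmrun command-line utility is available on the path for the product in use, and the .vmx path and snapshot name are placeholders.

    import subprocess

    VMX = r"C:\VMs\XP-SP2-baseline\XP-SP2.vmx"   # assumed path to the baseline VM
    SNAPSHOT = "clean-baseline"                   # assumed snapshot name

    def reset_baseline():
        """Revert the test VM to its archived state and power it back on."""
        subprocess.run(["vmrun", "revertToSnapshot", VMX, SNAPSHOT], check=True)
        subprocess.run(["vmrun", "start", VMX], check=True)

    if __name__ == "__main__":
        reset_baseline()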

The tools were then assessed to determine the size of the 'footprint' left by each tool in terms of the criteria listed above; as well as memory use, this comprised measuring the impact on the file system and the Windows Registry and assessing each tool's dependency on DLLs. The amount of memory used was monitored using a number of methods, including:

- Vadump.exe (available from the Windows Resource Toolkit) [20]
- WpfPerf.exe (available from the Windows Software Developers Kit) [21]
- The System Monitor Control snap-in using ActiveX
- A Perl script written by Pavel Gladyshev, which provides access to the same information as the System Monitor Control but from the command line rather than a GUI [22]

There are a number of other useful tools available for measuring memory usage, e.g. pslist from Sysinternals, or pmon.exe from the Windows Support Tools for XP, which is a Windows equivalent of the Linux top command. The selection above was deemed sufficient for the purpose of this analysis. Impact on the system Registry and the file system was monitored using Process Monitor by Sysinternals, available from Microsoft [23]. Artefacts not identified by the use of these tools were identified by later forensic analysis, which involved the use of EnCase® EE v6.03. In a post-mortem examination the Registry hive key HKEY_LOCAL_MACHINE\SYSTEM\Select must be queried to determine the CurrentControlSet, a process described by Carvey [6]; this then guides the investigator as to which subkeys hold relevant data.

It is possible to ascertain the DLLs an application uses with a tool such as Dependency Walker, which is included with the Windows SDK package; version 2.2.6000 was used to examine each of the applications. For security reasons the results of the examination of the commercial tools are not included in this paper.
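An equivalent dependency check can be scripted. The following sketch, which is an illustration and not one of the methods used in the study, relies on the third-party pefile module to list the DLLs named in a tool's import table; note that it will not reveal DLLs loaded dynamically at run time.

    import sys
    import pefile   # third-party module: pip install pefile

    def list_imports(path):
        """Print the DLLs (and a few imported functions) named in the PE import table."""
        pe = pefile.PE(path)
        for entry in getattr(pe, "DIRECTORY_ENTRY_IMPORT", []):
            dll = entry.dll.decode("ascii", "replace")
            functions = [imp.name.decode("ascii", "replace")
                         for imp in entry.imports if imp.name][:5]
            print("%-20s e.g. %s" % (dll, ", ".join(functions)))

    if __name__ == "__main__":
        list_imports(sys.argv[1])    # e.g. python list_imports.py pmdump.exe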

5. TOOL ANALYSIS
A number of the tools were assessed using the test platform outlined above to determine their impact on the target system in terms of the following counters:

- Page File Bytes
- Page File Bytes Peak
- Virtual Bytes
- Virtual Bytes Peak
- Working Set
- Working Set Peak

Additional criteria included the time elapsed when running the tool, the impact on the Registry (an important source of evidence on Windows systems [18]) and the use of DLLs. The following section demonstrates the analysis of one of the volatile memory collection tools, KnTTools, developed by Garner [10].
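The same counter set can also be sampled from the command line with the standard Windows typeperf utility; this was not one of the methods used in the study, and the sketch below is illustrative only, with the process name and sample count as assumptions.

    import subprocess

    PROCESS = "ddnt"   # assumed: the renamed acquisition tool, without the .exe extension
    SAMPLES = "60"     # one sample per second for a minute

    # Counter paths corresponding to the criteria listed above.
    counters = [
        r"\Process(%s)\Page File Bytes" % PROCESS,
        r"\Process(%s)\Page File Bytes Peak" % PROCESS,
        r"\Process(%s)\Virtual Bytes" % PROCESS,
        r"\Process(%s)\Virtual Bytes Peak" % PROCESS,
        r"\Process(%s)\Working Set" % PROCESS,
        r"\Process(%s)\Working Set Peak" % PROCESS,
    ]

    # Write comma-separated samples to a log on the collection drive.
    subprocess.run(["typeperf"] + counters +
                   ["-sc", SAMPLES, "-o", r"E:\output\ddnt_counters.csv"],
                   check=False)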

5.1 An Example Analysis: KnTTools
Memory usage of this tool was measured using the Microsoft Management Console and an ActiveX System Monitor Control snap-in to record counter sets for each of the factors listed in section 5 above. Following the good practice mentioned earlier, the KnTDD memory acquisition tool was renamed ddnt.exe. The computer used was the virtual machine with the Windows XP SP2 operating system configured with 256 MB of memory. It was necessary to deploy the tool once initially so that the System Monitor Control process list could be populated and the process name selected. After the System Monitor Control snap-in was generated it was saved for future use, and the VMware snapshot facility was used to reset the virtual machine. A USB2 hard drive was connected to the virtual machine and a physical memory dump was made to this attached storage device. Throughout the imaging process the memory usage of the ddnt process remained fairly constant, with a single peak which may be attributed to the additional work needed by the ddnt process to calculate the hash values for the created evidence files and to generate the system user state report. The maximum memory usage for the KnTDD process is shown in figure four.

A forensic examination of the file system using EnCase 6.03 indicated one entry in the prefetch folder (C:\WINDOWS\prefetch [28]) for the ddnt process. The prefetch directory can be a valuable source of information but has a limit of 128 .pf entries; this is a further reason why deployed tools should be renamed using a unique naming convention, as a tool may otherwise overwrite a prefetch entry pertinent to the investigation. Neither procdump.exe nor userdump.exe were suitable processes for measuring memory activity using this method, and hence Pavel Gladyshev's [14] Perl script was used. It should also be noted that these tools do not acquire all of the system's physical memory but rather the memory allocated to a single process; for pmdump and userdump the process acquired from memory was Mozilla Firefox v2.0.
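On a running test system the same check can be approximated with a short script. The sketch below is an illustration only: it lists .pf entries in the prefetch folder by creation time, so that an entry created for a renamed collection tool (such as a hypothetical DDNT.EXE-xxxxxxxx.pf) stands out from earlier user activity.

    import datetime
    import os

    PREFETCH = r"C:\WINDOWS\Prefetch"

    entries = []
    for name in os.listdir(PREFETCH):
        if name.lower().endswith(".pf"):
            path = os.path.join(PREFETCH, name)
            created = datetime.datetime.fromtimestamp(os.path.getctime(path))
            entries.append((created, name))

    # Newest first: entries created during the examination appear at the top.
    for created, name in sorted(entries, reverse=True)[:20]:
        print(created.strftime("%Y-%m-%d %H:%M:%S"), name)
    print("Total .pf entries:", len(entries), "(the folder is limited to 128 entries)")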




Figure 4: Memory usage for the KnTDD tool as shown by the System Monitor Control. Maximum observed counter values for the ddnt process (bytes): Page File Bytes 2641920; Page File Bytes Peak 3194880; Virtual Bytes 36663296; Virtual Bytes Peak 38387712; Working Set 5578752; Working Set Peak 5578752.

Figure 5: Memory footprint for memory acquisition tools (values in bytes)
Tool | Page File Bytes | Page File Bytes Peak | Working Set | Working Set Peak
FAUdd | 438272 | 442368 | 1449984 | 1449984
Pmdump | 6635520 | 6635520 | 7241728 | 7241728
KnTTools | 2641920 | 3194880 | 5578752 | 5578752
Userdump x86 | 1761280 | 1794048 | 1683456 | 1683456
ProDiscover Basic | 5844992 | 8749056 | 7708672 | 8769536
Nigilant32 | 2064384 | 3526656 | 7077888 | 7729152

5.2 The Impact of GUI Tools
A number of the tools selected for analysis provide the user with a Graphical User Interface (GUI). It was noted that tools with a GUI had a greater impact on the Registry. An example is the use of TCPView to save a log, which created Most Recently Used (MRU) entries in the user's NTUSER.DAT Registry file, within the LastVisitedMRU and OpenSaveMRU keys (see figure 9). Similar behaviour was noted for all of the GUI tools: more Registry entries were associated with these tools than with the command-line tools. The following are some of the Registry entries created as a result of the GUI tools:

- HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\OpenSaveMRU
- HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\LastVisitedMRU
- HKEY_CURRENT_USER\Software\Sysinternals\TCPView\Settings
- HKEY_CURRENT_USER\Software\Microsoft\Windows\ShellNoRoam\MUICache
- HKEY_CURRENT_USER\Software\Microsoft\Windows\UserAssist\
- HKEY_LOCAL_MACHINE\SOFTWARE\
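As an aside, and not part of the original experiments, entries under the UserAssist key noted above can be read back with the standard Python winreg module; on Windows XP the full path is HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\UserAssist, and the value names under each {GUID}\Count subkey are ROT13-encoded program names, so a deployed GUI tool may leave a decodable trace here. The sketch is illustrative only.

    import codecs
    import winreg

    USERASSIST = r"Software\Microsoft\Windows\CurrentVersion\Explorer\UserAssist"

    # Enumerate each {GUID}\Count subkey and decode its ROT13-encoded value names.
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, USERASSIST) as root:
        index = 0
        while True:
            try:
                guid = winreg.EnumKey(root, index)
            except OSError:
                break
            index += 1
            try:
                count_key = winreg.OpenKey(root, guid + r"\Count")
            except OSError:
                continue
            with count_key:
                value_index = 0
                while True:
                    try:
                        name, _, _ = winreg.EnumValue(count_key, value_index)
                    except OSError:
                        break
                    value_index += 1
                    print(codecs.decode(name, "rot_13"))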

Figure 6: Time and file system impact for memory acquisition tools
Tool | Registry Keys Written | Files Written | Shared DLL | Own DLL | Elapsed Time (seconds)
FAUdd | 0 | 1 | 7 | 3 | 38.73
Pmdump | 0 | 1 | 2 | 0 | 2.11
KnTTools | 0 | 1 | * | * | 416.62
Userdump x86 | 0 | 1 | 6 | 1 | 1.29
ProDiscover Basic | 3 | 3 | 15 | 4 | 116.12
Nigilant32 | 2 | 1 | 13 | 0 | 881.9
* Not Disclosed

Figure 7: Virtual bytes and virtual bytes peak for memory acquisition tools (bar chart; y-axis: memory in bytes).

Duplicate entries for some of the keys listed in section 5.2 were created under the HKEY_USERS hive for the user whose session the tools were deployed under; thus an entry was created in the user's NTUSER.DAT file, e.g. C:\Documents and Settings\User\NTUSER.DAT. For all of the tools examined, each resulting process created an entry in the prefetch folder, e.g. C:\WINDOWS\prefetch\NC.EXE-12414508.pf.



6. SELECTED TOOL ANALYSIS
Each of the memory acquisition tools highlighted in figure one was also assessed. The impact of the tools in terms of bytes used is shown in figure five, and on the other selected criteria in figure six. To enable comparison between tools, virtual bytes and virtual bytes peak are displayed graphically in figure seven. The results in figure seven imply that, on the selected criteria, FAUdd has the least overall impact on the target system. These steps were repeated for the network and system tools. The results for the network tools are shown in figures eight and nine, and figure ten displays virtual bytes and virtual bytes peak for the network status tools to enable a graphical comparison.
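Where the measured values are tabulated as in figures five and six, a simple combined ranking can be computed. The sketch below is illustrative only: it uses a subset of the published figures, and the equal weighting of the three criteria is an assumption rather than the paper's method.

    # Values transcribed from figures five and six (bytes and seconds).
    measurements = {
        "FAUdd":             {"page_file_peak": 442368,  "working_set_peak": 1449984, "elapsed_s": 38.73},
        "Pmdump":            {"page_file_peak": 6635520, "working_set_peak": 7241728, "elapsed_s": 2.11},
        "KnTTools":          {"page_file_peak": 3194880, "working_set_peak": 5578752, "elapsed_s": 416.62},
        "Userdump x86":      {"page_file_peak": 1794048, "working_set_peak": 1683456, "elapsed_s": 1.29},
        "ProDiscover Basic": {"page_file_peak": 8749056, "working_set_peak": 8769536, "elapsed_s": 116.12},
        "Nigilant32":        {"page_file_peak": 3526656, "working_set_peak": 7729152, "elapsed_s": 881.9},
    }

    criteria = ["page_file_peak", "working_set_peak", "elapsed_s"]
    scores = {tool: 0 for tool in measurements}
    for criterion in criteria:
        ordered = sorted(measurements, key=lambda t: measurements[t][criterion])
        for rank, tool in enumerate(ordered):
            scores[tool] += rank      # lower value -> lower rank -> smaller footprint

    for tool in sorted(scores, key=scores.get):
        print("%-18s combined rank score %d" % (tool, scores[tool]))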


Figure 8: Memory footprint for network status tools (values in bytes)
Tool | Page File Bytes | Page File Bytes Peak | Working Set | Working Set Peak
Openports | 4771840 | 4771840 | 77824 | 77824
fport | 372736 | 372736 | 1245184 | 1245184
psfile | 552960 | 552960 | 1716224 | 1716224
tcpvcon | 1454080 | 1454080 | 3670016 | 3682304
tcpview | 1630208 | 1695744 | 4362240 | 4440064

Figure 9: Time and file system impact for network status tools
Tool | Registry Keys Written | Files Written | Shared DLLs | Own DLL | Elapsed Time (ms)
Openports | - | 1 | 5 | 0 | 254
fport | 0 | 1 | 5 | 0 | 98.6
psfile | 0 | 1 | 8 | 0 | 11.1
tcpvcon | 0 | 1 | 7 | 0 | 486.2
tcpview | 4 | 2 | 10 | 0 | 30.62

Figure 10: Virtual bytes and virtual bytes peak for network status tools (bar chart; y-axis: memory in bytes).

Figure 12: Time and file system impact for system status tools
Tool | Registry Keys Written | Files Written | Shared DLL | Own DLL | Elapsed Time (ms)
pclip | 0 | 1 | 3 | 0 | 70.1
psinfo | 0 | 1 | 12 | 0 | 6935.9
pslist | 0 | 1 | 7 | 0 | 2140.6
psloggedon | 0 | 1 | 7 | 0 | 1109.6
psservice | 0 | 1 | 8 | 0 | 1741.6
tlist | 0 | 1 | 9 | 0 | -
Nigilant32 | 2 | 1 | 13 | 0 | 11.35 (seconds)

Figure 13: Virtual bytes and virtual bytes peak for system status tools (bar chart; y-axis: memory in bytes).

In terms of the virtual bytes and virtual bytes peak displayed in figure 10, Openports is highlighted as the tool with the lowest impact of those tested. However, if the figures for page file use are examined the reverse is true, with Openports displaying the highest number of bytes used. This emphasizes the importance of testing tools to ensure that the impact of a particular tool is understood and that the appropriate tool is used in a specific case.

Figure 11: Memory footprint for system status tools (values in bytes)
Tool | Page File Bytes | Page File Bytes Peak | Working Set | Working Set Peak
pclip | 233472 | 233472 | 905216 | 905216
psinfo | 2916352 | 3018752 | 4222976 | 4263936
pslist | 741376 | 786432 | 3182592 | 3215360
psloggedon | 684032 | 684032 | 3043328 | 3043328
psservice | 716800 | 716800 | 3088384 | 3088384
tlist | - | - | - | -
Nigilant32 | 4968448 | 5210112 | 9289728 | 9510912

The results of the analysis of the system status tools are shown in figures 11, 12 and 13. In this case pclip appears to have the least impact of the system tools assessed, with Nigilant32 among the highest, particularly in terms of time used (seconds rather than milliseconds). Due to the short time it took tlist.exe to query the system, Gladyshev's Perl script did not record any useful information when used to monitor this tool.


7. DISCUSSION
Using the criteria defined above, it is possible to compare the selected tools and identify those that have the least impact on the system under live review or investigation. Use of the framework and the results taken together should allow an investigator to assemble a toolset which provides the most valuable information while being minimally invasive on the system. The analysis also assists in determining the artifacts left behind as a result of deploying these tools. By assessing tools in this way, investigators should have a better understanding of the impact of the tools they wish to deploy. This process should also allow for the validation of new tools, which can then be assessed comparatively with other tools in respect of their impact on the system. It will also enable a judgement as to the reliability of the results provided by a particular tool. Many of the tools rely on the operating system to enumerate and provide the investigator with the live data, in particular through each tool's reliance on DLLs. This can pose a problem where there are concerns that the operating system may have been compromised.

This paper has explored the impact of tools within a Windows Operating System. Further work is required to assess other tools on other operating systems. This would be of value to the forensic investigator, but the way memory is handled and its analysis varies greatly between Windows Service packs let alone other


operating systems; as a result the area of memory forensics is deeply complex and requires a significant amount of time and effort invested by the forensic examiner to begin to comprehend how memory works in modern Operating Systems.

Many of the tools examined differ in functionality, and this should also be borne in mind when the investigator chooses the most suitable tool to perform a particular analysis. It may be that to obtain a particular piece of evidence a more invasive tool has to be used, provided this is appropriate to the investigation.

8. SUMMARY AND CONCLUSIONS
This paper has proposed a process for comparing digital forensic tools that assist volatile evidence acquisition, in order to understand their potential impact on the system being examined. Tools for acquiring data relating to memory, network and system activity were assessed to determine their impact on the file system, the Registry and memory, and their usage of DLLs. The criteria set out here should provide a guide for examiners investigating Windows XP workstation computers, and the overall approach may help examiners devise similar criteria for a variety of other systems and configurations.

9. AREAS FOR FUTURE WORK
When gathering volatile data, Farmer and Venema [12] state: "We strongly believe to obtain dependable results automation is a near-necessity for gathering forensic data". Best practice therefore recommends automation when collecting volatile data. Using the command line presents the smallest footprint on a system, but a study by Ray Panko [25] highlights the increased risk of human error in typing. If a FireWire port is available it may be possible to acquire physical memory using this hardware method; it is not necessary to log in to the target system and there is no requirement for administrator access, so this provides one option for automating the acquisition of physical memory. A further option is the work of Petroni et al. [26] on FATKit, a prototype system designed to automate the extraction of objects from memory.

With the release of Windows Vista, and since Windows 2003 Service Pack 1, access to the object which describes physical memory (\\.\PhysicalMemory) [24] was moved from user land to kernel land. User-land tools such as Garner's [13] Forensic Acquisition Utilities can therefore no longer acquire such memory, and further work will be required to determine how this memory might be acquired for forensic analysis.

It was noted during the analysis that event logs on a networked system could be affected as a result of deploying some of the volatile data acquisition tools. This is a neglected area and one deserving of more attention. Much work has been done in memory analysis to recover processes using the EPROCESS data structure, but little work has been done on "carving" process application metadata. This area could provide investigators with a wealth of valuable evidence and is highly recommended as an area for further research. What is also not known is what happens to memory allocated but not used by the tools deployed during the evidence gathering process: could there be remnants of data from a previous process?

10. ACKNOWLEDGEMENTS
The authors would like to acknowledge the following: George Garner Jr., President of GMG Systems Inc., Harlan Carvey, Peter Sommer, Esther George, and Detective Chief Inspector Christopher Dodd of South Wales Police, UK. The research in this paper is based on work carried out by JE while working on his M.Sc. thesis.

11. REFERENCES
[1] ACPO, Good Practice Guidelines for Computer Based Evidence, 2006, pp. 20-21.
[2] Adelstein, F., Live forensics: diagnosing your system without killing it first, Communications of the ACM, Vol. 49, 2 (Feb. 2006), 63-66.
[3] Bem, D., Huebner, E., Computer Forensic Analysis in a Virtual Environment, International Journal of Digital Evidence, Vol. 6, 2 (2007), (online) www.ijde.org [Accessed November 2007].
[4] Carrier, B. D., Grand, J., A hardware-based memory acquisition procedure for digital investigations, Journal of Digital Investigation, Vol. 1, 1 (2004).
[5] Carrier, B. D., Risks of live digital forensic analysis, Communications of the ACM, Vol. 49 (Feb. 2006), 56-61.
[6] Carvey, H. (2004), Windows Forensics and Incident Recovery, Addison-Wesley.
[7] Carvey, H., Instant Messaging Investigations on a live Windows XP System, Journal of Digital Investigation, Vol. 1, 4 (2004).
[8] CERT, First Responders Guide to Computer Forensics, (online), http://www.cert.org/archive/pdf/FRGCF_v1.3.pdf [Accessed March 2007].
[9] Crown Prosecution Service, Abuse of Process – Definition, (online), http://www.cps.gov.uk/legal/section13/chapter_c.html#_Toc3276980 [Viewed April 2007].
[10] DFRWS 2005 Forensics Challenge, Kntlist Analysis Tool by George M. Garner Jr., http://www.dfrws.org/2005/challenge/kntlist.shtml [Accessed January 2006].
[11] Casey, E., Friedberg, S., Investigating sophisticated security breaches, Communications of the ACM, Vol. 49, 2 (Feb. 2006), 48-55.
[12] Farmer, D., Venema, W. (2005), Forensic Discovery, Addison-Wesley, p. 6.
[13] Garner, G. M. Jr. (2006), Forensic Acquisition Utilities, (online), http://users.erols.com/gmgarner/forensics [Accessed May 2006].
[14] Gladyshev, P., Perl script for collecting information from MS Windows performance counters, School of Computer Science and Informatics, University College Dublin, Ireland, (online), http://cci.ucd.ie/mu_script [Accessed December 2007].


[15] IAAC, Directors and Corporate Advisors' Guide to Digital Investigations and Evidence, Information Assurance Advisory Council, (online), http://www.iaac.org.uk/LinkClick.aspx?link=Evidence+of+Cyber-Crime+v12rev.pdf&mid=680&contenttype=application/pdf [Viewed 11/3/2007], p. 25.
[16] MacLean, N. P. (2006), Acquisition and Analysis of Windows Memory, Masters Thesis, University of Strathclyde, UK.
[17] Mee, V., Sutherland, I. (2005), Windows Event Logs and their Forensic Usefulness, 4th European Conference on Information Warfare and Security, University of Glamorgan, July 2005.
[18] Mee, V., Tryfonas, T., Sutherland, I. (2006), The Windows Registry as a Forensic Artifact: Illustrating evidence collection for Internet usage, Digital Investigation, Vol. 3, 3, 166-173.
[19] Microsoft MSDN, Side-by-side Assemblies, (online), http://msdn.microsoft.com/library/default.asp?url=/library/en-us/sbscs/setup/supported_microsoft_side_by_side_assemblies.asp [Accessed May 2007].
[20] Microsoft Windows Resource Toolkit, (online), http://www.microsoft.com/downloads/details.aspx?familyid=9D467A69-57FF-4AE7-96EE-B18C4790CFFD&displaylang=en [Accessed April 2007].
[21] Microsoft Windows SDK, (online), http://www.microsoft.com/downloads/details.aspx?FamilyID=c2b1e300-f358-4523-b479-f53d234cdccf&displayLang=en [Accessed April 2007].
[22] Microsoft MSDN Library, System Monitor, (online), http://msdn2.microsoft.com/en-us/library/aa379688.aspx [Accessed April 2007].
[23] Microsoft, Process Monitor, (online), http://www.microsoft.com/technet/sysinternals/utilities/processmonitor.mspx [Accessed March 2007].
[24] Microsoft TechNet, Device\PhysicalMemory object, (online), http://technet2.microsoft.com/WindowsServer/en/Library/e0f862a3-cf16-4a48-bea5-f2004d12ce351033.mspx?mfr=true [Viewed October 2006].
[25] Panko, R., Human Error Web Site, (online), http://panko.cba.hawaii.edu/HumanErr [Accessed May 2007].
[26] Petroni, N. L., Walters, A., Fraser, T., Arbaugh, W. A., FATKit: A framework for the extraction and analysis of digital forensic data from volatile system memory, Journal of Digital Investigation, Vol. 3, 4 (2006).
[27] Solomon, J., Huebner, E., Bem, D., Szezynska, M., User Data Persistence in Physical Memory, Journal of Digital Investigation, Vol. 4, 2 (2007).
[28] Russinovich, M., Solomon, D. (2005), "Memory Management", Microsoft Windows Internals, 4th edition, Microsoft Press, pp. 458-462. ISBN 0-7356-1917-4.
[29] Urrea, J. N. (2006), An Analysis of Linux RAM Forensics, Masters Thesis, Naval Postgraduate School, Monterey, California, (online), http://cisr.nps.navy.mil/downloads/theses/06thesis_urrea.pdf [Accessed December 2007].

