
EHDF ut2

1. Describe different methods to create a forensic duplicate of static data in a UNIX operating system.
Ans:
1. Duplicating with dd and dcfldd: For creating a true forensic duplicate image, the dd utility is the most efficient tool. dd performs a bit-for-bit copy of the original, as long as the operating system kernel recognizes the storage medium. However, the process can be time-consuming.
2. Creating a Linux Boot Media: Preparing for duplication with a Linux boot environment differs from the other methods discussed in this section, but using Linux is worthwhile, as it can be the most flexible boot environment in the toolbox.
3. Performing a Duplication with dd: Sometimes, to fit on a specific media type such as CD/DVD, or on file systems that cannot hold files larger than about 2.1 GB, the duplicate will be stored as a series of files. This is usually referred to as a segmented image.
4. Duplicating with the Open Data Duplicator (ODD): ODD is a newer open-source tool. It follows a client-server model, so forensic duplication can be performed simultaneously on a number of computers over a LAN. To use the software on a single forensic workstation, you run both halves on the same computer.
The three portions of ODD are:
1. Bootable CD-ROMs: similar to the Trinux Linux distribution.
2. Server-side application: the server performs most of the processing, such as string searches, calculation of hashes, and storage of the true forensic duplicate.
3. Client-side application: if you are duplicating drives on a forensic workstation, this portion may be run locally.
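The bit-for-bit copy that dd performs can be sketched in a few lines of Python; this is an illustration of the concept only, not a replacement for dd (the function name and paths are made up):

```python
import hashlib

def forensic_copy(src_path, dst_path, block_size=512):
    """Bit-for-bit copy of src_path to dst_path, hashing the input on the fly
    so the duplicate can later be verified against the original."""
    md5 = hashlib.md5()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            block = src.read(block_size)  # fixed-size reads, like dd's bs= option
            if not block:
                break
            md5.update(block)             # hash while copying
            dst.write(block)
    return md5.hexdigest()
```

Real acquisitions point dd at a device node such as /dev/sde; here ordinary files stand in so the sketch can be run safely.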

2. Explain how to use the dcfldd tool to create a forensic duplicate of static data, with an example.
=> Duplicating with dcfldd:
• dcfldd and dc3dd are purpose-built applications derived from the original dd source code.
• The U.S. Department of Defense Computer Forensics Laboratory (DCFL) and the Defense Cyber Crime Center (DC3) are the creators of these tools.
• The modified code is currently kept in projects on the SourceForge website.
• Features were added that make them more suitable than the standard dd command for use as forensic duplication tools.
• Hashing on-the-fly: dcfldd can hash the input data as it is being
transferred, helping to
ensure data integrity.
• Status output: dcfldd can update the user on its progress in terms of the amount of data transferred and how much longer the operation will take.
• Image/wipe verify: dcfldd can verify that a target drive is a bit-for-bit match
of the
specified input file or pattern.
• Multiple outputs: dcfldd can output to multiple files or disks at the same
time.
• Where dd uses a default block size (bs, ibs, obs) of 512 bytes, dcfldd uses 32,768 bytes (32 KiB), which is substantially more efficient.
• Example:
dcfldd if=/dev/sde hash=md5,sha1 md5log=md5.txt sha1log=sha1.txt hashconv=after bs=512 conv=noerror,sync split=1500M splitformat=000 of=testusb
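As a rough illustration of the split= and hash= features above, here is a hypothetical Python sketch that writes numbered segments in the style of splitformat=000 while hashing the input stream. It is not dcfldd's actual implementation, and a real tool would stream in small blocks rather than read a whole segment at once:

```python
import hashlib

def segmented_image(src_path, out_prefix, segment_size):
    """Copy src into numbered segment files (out_prefix.000, .001, ...) while
    hashing the whole input stream, mimicking dcfldd's split=/splitformat=000."""
    sha1 = hashlib.sha1()
    segments = []
    with open(src_path, "rb") as src:
        idx = 0
        while True:
            chunk = src.read(segment_size)  # one segment's worth at a time
            if not chunk:
                break
            sha1.update(chunk)              # hash on the fly, like hash=sha1
            name = "%s.%03d" % (out_prefix, idx)
            with open(name, "wb") as seg:
                seg.write(chunk)
            segments.append(name)
            idx += 1
    return segments, sha1.hexdigest()
```

Concatenating the segments in order must reproduce the original bit for bit, which is exactly how a segmented image is verified.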

3. Explain the concept of image integrity and justify its importance.

4. Describe network forensics and list at least two network forensics tools.

Network forensics is a sub-branch of digital forensics relating to the monitoring and analysis of computer network traffic for the purposes of information gathering, legal evidence, or intrusion detection.
• Digital investigators/examiners must become skilled at following the
cybertrail to find related digital
evidence on the public Internet, private networks, and other commercial
systems.
• Network forensics follows the same basic principles of digital forensics as outlined above. For the task of performing network forensics, the OSCAR (Obtain Information, Strategize, Collect Evidence, Analyze and Report) methodology will be used.
• Different types of network-based evidence:
• Full content data is exactly what the name implies: every single piece of information that passes across a network (or networks). Nothing is filtered; exact copies of all the traffic (often called "packet captures", abbreviated PCAP) are stored.
• Another source of network-based evidence is called session data. It
usually consists of aggregated
traffic metadata and usually refers to the conversation between two network
entities, grouped
together into "flows" and/or groups of network packets related to one
another.
• Alert data are typically generated by Network Intrusion Detection Systems (NIDS) such as Suricata or Snort.
• Statistical data provide the analyst with network-related aspects such as
the number of bytes
contained in a packet trace, start and end times of network conversations,
number of services and
protocols being used, most active network nodes, least active network
nodes, outliers in network
usage, average packet size, average packet rate, and so on. It can
therefore also act as a useful source
for anomaly detection.
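The statistical measures listed above can be derived from simple per-packet records; a minimal sketch, assuming packets are given as (timestamp, size) tuples, which is an illustrative format rather than any tool's native output:

```python
def traffic_stats(packets):
    """packets: list of (timestamp_seconds, size_bytes) tuples, in time order.
    Returns total bytes, average packet size, and average packet rate."""
    total = sum(size for _, size in packets)
    avg_size = total / len(packets)
    duration = packets[-1][0] - packets[0][0]
    # Guard against a zero-length capture window
    rate = len(packets) / duration if duration else float(len(packets))
    return {"total_bytes": total, "avg_size": avg_size, "pkts_per_sec": rate}
```

Baselines built from such statistics are what make the anomaly detection mentioned above possible.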
Packet capturing tools: tcpdump and dumpcap:
• Beyond dedicated hardware sniffers, there are two software tools that perform the task of capturing network packets and writing them to disk: tcpdump and dumpcap.

• tcpdump is a standalone software package, while dumpcap is a tool in the Wireshark package, where it performs the actual packet capturing for Wireshark and tshark.
• Tcpdump
• A Unix-based command-line tool used to intercept packets
• Includes filtering to capture just the packets of interest
• Reads "live traffic" from the interface specified using the -i option ...
• ... or from a previously recorded trace file specified using the -r option
• You create these trace files when capturing live traffic using the -w option
• Tshark
• A tcpdump-like capture program that comes with Wireshark
• Very similar behavior and flags to tcpdump
• Wireshark
• A GUI for displaying tcpdump/tshark packet traces
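Trace files created with tcpdump's -w option use the classic pcap format (dumpcap defaults to the newer pcapng format instead). As an illustration of what sits at the front of such a trace, a minimal sketch of the 24-byte pcap global header:

```python
import struct

PCAP_MAGIC = 0xA1B2C3D4  # classic pcap magic number, microsecond timestamps

def write_pcap_header(path, snaplen=65535, linktype=1):
    """Write an empty capture file: magic, version 2.4, thiszone, sigfigs,
    snaplen, and linktype (1 = Ethernet)."""
    hdr = struct.pack("<IHHiIII", PCAP_MAGIC, 2, 4, 0, 0, snaplen, linktype)
    with open(path, "wb") as f:
        f.write(hdr)

def read_pcap_magic(path):
    """Read back the first 4 bytes to identify the file as classic pcap."""
    with open(path, "rb") as f:
        return struct.unpack("<I", f.read(4))[0]
```

Checking the magic number like this is also the first step a carving or triage tool takes when deciding whether a blob of data is a packet capture.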

5. Explain different formats of forensic images with their importance.

6. Explain forensic investigation through some utilities of the Windows registry.

7. Explain the concept of file carving with its use in forensic analysis.
What is file carving?
In layman's terms, file carving is the process of taking "chunks" of data out of disk images, memory dumps, and packet captures: basically, files or data in a raw state. In most cases this is done by looking for recognizable signatures in file dumps that look like garbage to the untrained eye.
There are two kinds of people in this world: those with automated forensics tools and those who carve files. This section is for the second kind. Digital forensics, like other branches of forensic science, relies on artefacts and the effects of those artefacts on an environment; the presence or absence of these artefacts helps prove or determine that an event occurred.

So why carve files?
File carving can often be time-consuming and tedious; however, the basic concepts of file carving are important cornerstones of data recovery and computer forensics. If you don't know how to carve files, I highly recommend you start now. Even though it can be time-consuming and tedious, it's an important skill to have, and hopefully, as this post will show, not that hard either.
What I used
I performed the majority of the file carving for this post on Windows, where I used HxD. On Mac OS X I used iHex, and on Linux I used the Bless Hex Editor. They can all be found here:
• HxD: http://mh-nexus.de/en/hxd/
• iHex: https://itunes.apple.com/au/app/ihex-hex-editor/id909566003?mt=12
• Bless: http://home.gna.org/bless/
Examples:
Identifying .jpeg/.jpg files
Signature: header FF D8, footer FF D9
So, to find a .jpg/.jpeg file, you need to locate the header, which starts with the hex values FF D8, and its footer, FF D9.
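The header/footer search described above is straightforward to automate. Below is a naive carving sketch over a raw byte buffer; it matches on the three-byte sequence FF D8 FF to cut down false positives on the two-byte SOI marker alone, and real carvers handle fragmentation and embedded thumbnails far more carefully:

```python
JPEG_HEADER = b"\xff\xd8\xff"  # SOI marker plus the start of the next marker
JPEG_FOOTER = b"\xff\xd9"      # EOI marker

def carve_jpegs(data):
    """Return (start, end) byte offsets of candidate JPEGs found in raw data."""
    found, pos = [], 0
    while True:
        start = data.find(JPEG_HEADER, pos)
        if start == -1:
            break                      # no more headers
        end = data.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break                      # header with no matching footer
        found.append((start, end + len(JPEG_FOOTER)))
        pos = end + len(JPEG_FOOTER)   # resume scanning after this file
    return found
```

Each returned range can then be written out to its own file and opened in an image viewer to confirm the carve.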
8. Explain where the evidence resides in Windows system for forensic
investigation.

The location will depend on the specific case, but in general, evidence can
be found in the
following areas:
• Volatile data in kernel structures.
• Slack space, where you can obtain remnants of previously deleted files that are otherwise unrecoverable.
• Free or unallocated space, where you can obtain previously deleted files, including those in damaged or inaccessible clusters.
• The logical file system.

• The event logs.


• The Registry, which you should think of as an enormous log file.
• Application logs not managed by the Windows Event Log Service.
• The swap file, which harbors information that was recently located in system RAM (named pagefile.sys on the active partition).
• Special application-level files, such as Internet Explorer’s Internet history
files
(index.dat), Netscape’s fat.db, the history.hst file, and the browser cache.
• Temporary files created by many applications.
• The Recycle Bin (a hidden, logical file structure where recently deleted
items can
be found)
• The printer spool.
• Sent or received email, such as the .pst files for Outlook mail.

During an investigation, you may need to search for evidence in each of these areas, which can be a complicated process. We will outline an investigative framework in this chapter.

CONDUCTING A WINDOWS INVESTIGATION:
After you've set up your forensic workstation with the proper tools and recorded the low-level partition data from the target image, you are ready to conduct your investigation. The following basic investigative steps are required for a formal examination of a target system:
Review all pertinent logs.
Perform keyword searches.
Review relevant files.
Identify unauthorized user accounts or groups.
Identify rogue processes and services.
Look for unusual or hidden files/directories.
Check for unauthorized access points.
Examine jobs run by the Scheduler service.
Analyze trust relationships.
Review security identifiers.
These steps are not ordered chronologically or in order of importance. You
may need to perform
each of these steps or just a few of them. Your approach depends on your
response plan and the
circumstances of the incident.

9. Describe how the pertinent log files will be investigated forensically in a UNIX system.

1. Reviewing Pertinent Logs:
Unix operating systems have a variety of log files that can yield important clues during incident response. Not only are system activities such as logons, startups, and shutdowns logged, but so are events associated with Unix network services. Most log files are located in a common directory, usually /var/log. However, some flavors of Unix use an alternate directory, such as /usr/adm or /var/adm. Some logs are placed in nonintuitive locations, such as /etc. When in doubt, consult operating system-specific documentation. Additionally, not all pertinent log files are on the system in question; you may find logs on a network server or security device, such as a firewall or an IDS. The types of logs in UNIX are:

• Network Logging (syslog)
• Remote Syslog Server Logs
• TCP Wrapper Logging
• Other Network Logs (service specific)
• Host Logging
• User Activity Logging
• Shell Histories: Users with interactive access to Unix systems have an
associated
command shell, such as the Bourne (sh), Korn (ksh), or Bourne-Again
(bash) shell. These
shells provide the capability to log all commands, along with their
command-line options.
Typically, the history file is stored as a hidden file in the user’s home
directory.
What Can Happen?
An attacker has just gained root access to your system. One of the first steps the attacker takes is to delete the .bash_history file. Then he links the file to /dev/null, rendering it incapable of logging commands.
Where to Look for Evidence?
Whenever you investigate a Unix system suspected of being compromised, check for shell history files. If the history feature is enabled and the history file does not exist, there is a good chance that the hacker deleted it. If the history file exists as a link to /dev/null, that is another strong indication that the system has been compromised. Also note the date/time of the file: the intruder has provided a clue for further investigation.
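The missing or /dev/null-linked history check described above is easy to automate; a minimal sketch (the classification labels returned here are made up for illustration):

```python
import os

def history_status(path):
    """Classify a shell history file such as ~/.bash_history:
    linked to /dev/null, missing, or present."""
    if os.path.islink(path) and os.readlink(path) == "/dev/null":
        return "linked to /dev/null"   # strong sign of an intruder covering tracks
    if not os.path.exists(path):
        return "missing"               # possibly deleted by an intruder
    return "present"
```

In a live investigation you would run a check like this across every user's home directory, flagging any history file whose status or timestamp looks suspicious.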

10. Explain how proprietary email files serve as an investigative path in Windows-based systems.

1. Web Browsers:
Web browsers are used by users to carry out many different activities on the Internet (Figure 7.15). Browsers serve many functions, such as information searches, access to e-mail accounts, e-commerce, banking, instant messaging, online blogs, and access to social networks. A web browser records much data associated with user activity.


2. E-mail: E-mail has emerged as one of the most widely used communication applications, used for the exchange of data and to carry out transactions. Due to the increased use of e-mail, its security has also become a major issue.
3. Mail Forensic Tools:
There are numerous tools that may contribute to the study of the sender and text of an e-mail message, so that an attack or the mischievous motive of an intrusion may be examined.
a. eMailTrackerPro
b. EmailTracer
c. Adcomplain
d. Aid4Mail Forensic
e. AbusePipe
f. AccessData’s FTK
g. EnCase Forensic
h. FINALeMAIL
i. Forensics Investigation Toolkit
1. Basic Static Analysis:
In the basic method of static analysis, the suspected malware program is tested by scanning it with antivirus software, computing its hashes, and detecting whether the program is packed or obfuscated.
2. Advanced Static Analysis:
In the advanced method of static analysis, further analysis is undertaken of the program's strings, linked libraries, and functions, as well as by using the IDA disassembler.
3. Basic Dynamic Analysis:
In the basic method of dynamic analysis, a virtual machine is built to serve as a safe place to run the malware. In addition, the malware is analyzed using a malware sandbox, by monitoring the malware's processes, and by analyzing the packet data the malware generates.
4. Advanced Dynamic Analysis:
In the advanced method of dynamic analysis, further analysis is undertaken by debugging the malware, analyzing the registry, and performing analysis on the Windows system.
5. Malware Analysis Report:
From the results of malware analysis using static and dynamic methods, we obtain a report on the characteristics of the malware.
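The hashing step of basic static analysis can be sketched as follows; the resulting digests would then be compared against databases of known malware, which this sketch does not attempt:

```python
import hashlib

def fingerprint(path, chunk_size=65536):
    """Compute the MD5 and SHA-256 of a suspect binary, reading in chunks
    so large samples do not have to fit in memory."""
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
            sha256.update(chunk)
    return md5.hexdigest(), sha256.hexdigest()
```

Because identical samples always produce identical digests, these fingerprints let an analyst recognize a known malware family without ever executing the file.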
