

  • Analysis of Super-Timeline: Created using Plaso (Log2timeline)

    Let's start with an example. The two drive letters present that could indicate USB/external device activity are "E:" and "F:". The creation of a LNK file for "E:\V" indicates that a device, likely a USB drive, was inserted and opened in something like File Explorer. Several LNK files were created immediately following this event, suggesting that files were opened from the external device. Internet Explorer was likely used to surf the web when the user accessed their webmail account; this can be inferred from the IE history entries in the timeline, which show activity related to the webmail account. The event log entries near the bottom of the timeline can provide further information about system activity, such as application launches, system reboots, or user logins. Live attack example: Account Logon: the account logs in as a local administrator (EIDs 4624/4672/4776) at 22:58:10, indicating privileged access to the system. Executable Creation: a new executable, spinlock.exe, is created in the C:\Windows\System32 folder at 22:59:43; the creation of a potentially malicious file in a critical system directory raises suspicion. Application Execution: spinlock.exe and netstat are executed at 23:09:26 and 23:10:02 respectively, suggesting further system exploration or potentially malicious activity. Directory Creation: the creation of the C:\Windows\System32\dllhost directory at 23:35:07 may indicate an attempt to hide or obfuscate malicious files within a system directory. Registry Modification: the \services\Netman\domain registry key, which contains an interesting URL, is modified at 23:42:04; this could indicate an attempt to manipulate system configuration or establish persistence. Suspicious Executables: svchost.exe and a.exe are executed for the first time, the latter being highly suspicious.
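To make that triage concrete, here is a minimal Python sketch of the kind of sorting and flagging described above. The event rows, names, and paths are synthetic stand-ins for real timeline output, not actual tool data:

```python
from datetime import datetime

# Synthetic timeline rows (timestamp, source, description) modeled on the
# attack narrative above; all values are illustrative only.
events = [
    ("2024-02-29 23:35:07", "FILE", "Directory created: C:\\Windows\\System32\\dllhost"),
    ("2024-02-29 22:58:10", "EVT", "EID 4624: logon as local administrator"),
    ("2024-02-29 22:59:43", "FILE", "Created: C:\\Windows\\System32\\spinlock.exe"),
    ("2024-02-29 23:09:26", "PREFETCH", "Executed: spinlock.exe"),
]

def triage(rows):
    """Sort events chronologically and flag new executables in System32."""
    rows = sorted(rows, key=lambda r: datetime.strptime(r[0], "%Y-%m-%d %H:%M:%S"))
    flagged = [r for r in rows
               if r[1] == "FILE" and "created:" in r[2].lower()
               and "\\system32\\" in r[2].lower() and r[2].lower().endswith(".exe")]
    return rows, flagged

ordered, suspicious = triage(events)
```

Reading the ordered list top to bottom reproduces the attack story: logon, executable creation, execution, then the staging directory.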
    Star Rule:
    1. Determine Timeline Scope: analyze the key questions and case type to identify the timeframe during which the suspicious activity occurred. This narrows the scope of the timeline analysis and keeps the data volume manageable.
    2. Narrow Pivot Points: identify the specific time-based or file-based pivot points closest to the suspected incident. This focuses the analysis on relevant data and reduces noise.
    3. Determine the Best Process for Timeline Creation: decide whether to create a super timeline or a targeted timeline based on the required data sources and investigative needs. Super timelines cover a broad range of data sources, while targeted timelines focus on specific sources.
    4. Filter the Timeline: filter the timeline data down to relevant information, de-duplicating and eliminating unnecessary entries. Keywords can help identify and extract relevant pivot points.
    5. Analyze the Timeline: focus on the context of the evidence discovered, examining activity before and after each pivot point to understand user behavior and the sequence of events. Reference tools such as the Windows Forensic Analysis Poster for interpretation.
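The pivot-point narrowing described above boils down to a time-window filter. A small sketch, with a five-minute window on either side of the pivot (a common default when slicing timelines) and synthetic rows:

```python
from datetime import datetime, timedelta

def around_pivot(rows, pivot, minutes=5):
    """Keep timeline rows within +/- `minutes` of the pivot time."""
    pivot = datetime.strptime(pivot, "%Y-%m-%d %H:%M")
    lo, hi = pivot - timedelta(minutes=minutes), pivot + timedelta(minutes=minutes)
    return [r for r in rows
            if lo <= datetime.strptime(r[0], "%Y-%m-%d %H:%M:%S") <= hi]

# Synthetic rows for illustration.
rows = [
    ("2024-02-29 14:54:00", "too early"),
    ("2024-02-29 14:57:30", "in window"),
    ("2024-02-29 15:04:59", "in window"),
    ("2024-02-29 15:06:00", "too late"),
]
hits = around_pivot(rows, "2024-02-29 15:00")
```

Widening `minutes` trades noise for context, which is exactly the scope/volume balance the steps above describe.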

  • A Deep Dive into Plaso/Log2Timeline Forensic Tools

    Plaso is the Python-based backend engine powering log2timeline, while log2timeline is the tool we use to extract timestamps and forensic artifacts. Together, they create what we call a super timeline—a comprehensive chronological record of system activity. Super timelines, unlike file system timelines, include a broad range of data beyond just file metadata. They can incorporate Windows event logs, prefetch data, shellbags, link files, and numerous other forensic artifacts. This comprehensive approach provides a more holistic view of system activity, making it invaluable for forensic investigations. Example: imagine you've been given a disk image, perhaps a full disk image or an image created with KAPE. Your task: find evil, armed with little more than a date and time when the supposed activity occurred. So you begin the investigation with the usual suspects: examining Windows event logs, prefetch data, various registry-based artifacts, and more. But after a while, you realize that combing through all these artifacts manually will take forever. Wouldn't it be great if there were a tool that could parse all these artifacts, consolidate them into a single data source, and arrange them in chronological order? That's precisely what we can achieve with Plaso and log2timeline. I am going to use Ubuntu 22.04 LTS (in VirtualBox) and Plaso version 20220724. Installation: https://plaso.readthedocs.io/en/latest/sources/user/Ubuntu-Packaged-Release.html Let's start. 1. We need an image or collected artifacts: the data we're dealing with could take various forms—it might be a raw disk image, an E01 image, a specific partition or offset within an image, or even a physical device like /dev/sdd. Moreover, it could be a live mount point; for instance, we could mount a VHDX image created with KAPE and point the tool at that mount point. With such versatility, we're equipped with a plethora of choices, each tailored to the specific nature of the data at hand.
    In the current case I captured the image using KAPE, mounted the image as a drive on my Windows host, and then shared the mounted drive with the (Ubuntu) VirtualBox VM. If you are not able to access the mounted drive in Ubuntu, run the following in a terminal and then restart the VM:
    sudo adduser $USER vboxsf
    2. Command and output (syntax). The general syntax is log2timeline.py --storage-file OUTPUT INPUT, so in our case the command is:
    log2timeline.py --storage-file akash.dump /media/sf_E_DRIVE
    Here akash.dump is the output file that will be created (a SQLite-based Plaso storage file); you can give it a path such as /path-to/akash.dump. /media/sf_E_DRIVE is the mounted drive path. Other source types:
    (1) Raw image: log2timeline.py /path-to/plaso.dump /path-to/image.dd
    (2) EWF image: log2timeline.py /path-to/plaso.dump /path-to/image.E01
    (3) Physical device: log2timeline.py /path-to/plaso.dump /dev/sdd
    (4) Volume via sector offset: log2timeline.py -o 63 /path-to/plaso.dump /path-to/image.dd
    3. If your artifact is an image of an entire drive, log2timeline will ask which partition you want to parse, and if it finds Volume Shadow Copies it will also ask which VSS stores to process. You can answer with a single identifier, a range, or all of them, e.g. 1, 1..4, or all. As a single command:
    log2timeline.py --partitions 2 --vss-stores all --storage-file /path-to/plaso.dump /path-to/image.dd
    In the current case there is no VSS or partition prompt because I collected only the needed artifacts (not the entire drive), so I did not get the options above; the screenshot below shows what it looks like once you hit enter.
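As a sketch of how those command variants fit together, the helper below assembles the argument list for each source type. It only mirrors the flags shown above; it does not run Plaso, and the file names are the examples from this post:

```python
def build_log2timeline_cmd(storage_file, source, partitions=None,
                           vss_stores=None, sector_offset=None):
    """Assemble a log2timeline.py argument list for a raw/E01 image,
    a physical device, a mounted drive, or a volume at a sector offset."""
    cmd = ["log2timeline.py"]
    if sector_offset is not None:
        cmd += ["-o", str(sector_offset)]          # volume via sector offset
    if partitions is not None:
        cmd += ["--partitions", str(partitions)]   # pre-answer the partition prompt
    if vss_stores is not None:
        cmd += ["--vss-stores", vss_stores]        # e.g. "1", "1..4", "all"
    cmd += ["--storage-file", storage_file, source]
    return cmd

cmd = build_log2timeline_cmd("akash.dump", "/media/sf_E_DRIVE")
full = build_log2timeline_cmd("plaso.dump", "image.dd",
                              partitions=2, vss_stores="all")
```

Passing the resulting list to something like subprocess.run would avoid the interactive partition/VSS prompts entirely.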
    You can also run parsers and filters against the image with Plaso/log2timeline and store the results in akash.dump or any other output file.
    1. Parsers tell log2timeline to concentrate only on certain specific forensic artifacts. To list all available parsers:
    log2timeline.py --parsers list | more
    To use a particular parser, in the current case:
    log2timeline.py --parsers windows_services --storage-file akash2.dump /media/sf_E_DRIVE
    You can also write your own parsers: https://plaso.readthedocs.io/en/latest/sources/developer/How-to-write-a-parser.html
    2. Filters tell log2timeline to go after specific files that contain forensically valuable data, such as /Users and /Windows/System32. There is a text file listing the important paths you can parse from an image:
    https://github.com/mark-hallman/plaso_filters/blob/master/filter_windows.txt
    Open the link, click Raw, copy the URL, and in Ubuntu run:
    wget https://raw.githubusercontent.com/mark-hallman/plaso_filters/master/filter_windows.txt
    After the file is saved, run:
    log2timeline.py -f filter_windows.txt --storage-file akash2.dump /media/sf_E_DRIVE
    This command walks the image, visits only the files and paths listed in the text file, and captures those artifacts into akash2.dump. You can combine a parser and a filter in the same command:
    log2timeline.py --parsers webhist -f filter_windows.txt --storage-file akash2.dump /media/sf_E_DRIVE
    Here I am telling log2timeline to target the paths and locations in the filter file and, against those locations, run the webhist parser, which parses our browser forensic artifacts. After running these commands you will have your output in output.dump, or in my case the akash.dump file.
    The output is a Plaso storage file and is very difficult to read directly, so the next step is to convert the dump file into CSV or whichever format you prefer (I prefer CSV because I will use Timeline Explorer for further analysis). 1. Using pinfo.py: as the name suggests, it furnishes details about a specific Plaso storage file (the output file). In our case, for akash.dump:
    pinfo.py akash.dump
    2. Using psort.py: this command controls which output format is created. To list the available formats:
    psort.py --output-time-zone utc -o list
    To analyze the output with Eric Zimmerman's Timeline Explorer we will use the l2tcsv format. The complete command (-w sets the output file to write):
    psort.py --output-time-zone utc -o l2tcsv -w timeline.csv akash.dump
    Within an investigation, it's common to have a sense of the time range in which the suspected incident occurred. For instance, say we want to focus on a specific day and even a particular time within that day, such as February 29th at 15:00. We can achieve this using a technique called slicing. By default it gives a five-minute window before and after the given time, although the window size can be adjusted:
    psort.py --output-time-zone utc -o l2tcsv -w timeline.csv akash.dump --slice '2024-02-29 15:00'
    Alternatively, use a start and end date to delineate the investigation timeframe by specifying a range bounded by two dates:
    psort.py --output-time-zone utc -o l2tcsv -w timeline.csv akash.dump "date > '2023-12-31 23:59:59' AND date < '2024-04-01 00:00:00'"
    Once the super timeline is created in CSV format, we can use Timeline Explorer to analyze it. The best part of Timeline Explorer is that loaded data is automatically color-coded based on the type of artifact. For example, USB device utilization is highlighted in blue, file openings in green, and program executions in red.
    This color-coding helps you quickly identify and interpret different types of activity within the timeline. Recommended columns to watch while analyzing: Date, Time, MACB, Source type, desc, filename, inode, notes, extra. Conclusion: Plaso/Log2Timeline stands as a cornerstone of digital forensics, offering investigators a powerful tool for extracting, organizing, and analyzing digital evidence. Rooted in the need for efficiency and accuracy, and continuously evolving, it is an essential asset for forensic practitioners worldwide. As digital investigations continue to evolve, Plaso/Log2Timeline remains at the forefront, empowering investigators to unravel complex digital incidents with precision.
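The date-range filtering described above can be mimicked on an exported l2tcsv file in a few lines. This is a sketch that assumes the standard l2tcsv `date` (MM/DD/YYYY) and `time` columns; the sample rows are synthetic:

```python
import csv
import io
from datetime import datetime

def rows_between(l2tcsv_text, start, end):
    """Keep l2tcsv rows whose date/time fall strictly inside [start, end]."""
    start, end = (datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t in (start, end))
    out = []
    for row in csv.DictReader(io.StringIO(l2tcsv_text)):
        ts = datetime.strptime(row["date"] + " " + row["time"],
                               "%m/%d/%Y %H:%M:%S")
        if start < ts < end:
            out.append(row)
    return out

# Two synthetic rows with a trimmed-down header.
sample = (
    "date,time,MACB,source,desc\n"
    "12/01/2023,08:00:00,M...,FILE,old event\n"
    "02/29/2024,15:00:03,.A..,WEBHIST,webmail visited\n"
)
hits = rows_between(sample, "2024-01-01 00:00:00", "2024-04-01 00:00:00")
```

For large timelines the same idea scales better inside psort itself or in Timeline Explorer's column filters; this is just the logic made explicit.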

  • Importance of Timestamp in Timeline Analysis while Forensic Investigations

    Introduction: Timestamp analysis plays a crucial role in forensic investigations, offering valuable insights into the timeline of events and activities on a system. Understanding Timestamp Behavior in Network File Transfers: When files are transferred over network shares using protocols like SMB, their timestamps behave similarly to local file operations. Despite being copied remotely, files retain their original modification time while being assigned a new creation time upon arrival at the destination host. This phenomenon provides forensic analysts with a clear "time of file copy," which serves as a pivotal point for investigative analysis. Tracking File Movements and Lateral Movement Techniques: By examining the creation time of transferred files, analysts can track the origin of the file and uncover insights into potential lateral movement techniques used by threat actors. For example, the creation time of an .exe file copied over SMB can indicate the time of file transfer, shedding light on unauthorized activities or malware execution on the remote system. Utilizing Timestamp Analysis as a Pivot Point: The creation time of transferred files serves as a valuable pivot point for forensic analysis, enabling analysts to delve deeper into the timeline of events on the remote system. By correlating creation times with other forensic artifacts such as event log entries and application execution events, analysts can gain a comprehensive understanding of the activities conducted by threat actors. Enhancing Forensic Investigations: Timestamp analysis provides forensic investigators with a powerful tool for detecting and responding to security incidents. By leveraging the mechanics of filesystem timestamps, analysts can uncover hidden insights, track file movements, and identify potential security breaches with greater accuracy and efficiency. 
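The "time of file copy" logic above reduces to a single comparison: a modification time that predates the creation time means the file's content existed before the file appeared on this host. The timestamps below are made up for illustration:

```python
from datetime import datetime

def looks_copied(modified, created):
    """A file whose M time predates its B time was likely copied in from
    elsewhere (e.g. over SMB): the copy keeps the source's modification
    time but stamps a fresh creation time at the destination."""
    return modified < created

m = datetime(2024, 2, 10, 9, 30)   # content last changed on the source host
b = datetime(2024, 2, 29, 15, 2)   # file appeared on this host (time of copy)
```

When `looks_copied(m, b)` is true, the B time is the pivot point: everything around it in the destination host's timeline is candidate lateral-movement activity.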
Conclusion: Timestamp analysis is a cornerstone of forensic investigations, offering forensic analysts a window into the timeline of events on a system. By understanding the behavior of timestamps in network file transfers and lateral movement techniques, analysts can uncover valuable insights, track file movements, and enhance their ability to detect and respond to security incidents effectively.

  • Understanding NTFS Timestamps (Timeline Analysis): With Example

    Let's understand with an example. We have created a table of NTFS operations. 1. Create Operation: when a file is created, according to the table, all timestamps (Modified, Accessed, Created) are updated. 2. Modify Operation: when a file is modified, only the Modified timestamp is expected to change, while the Accessed and Created timestamps remain unchanged. However, if last-access updates are enabled (NtfsDisableLastAccessUpdate set to 0), the Accessed timestamp will be updated along with the Modified timestamp. In this case it is enabled. 3. Copy Operation: when a file is copied using Windows Explorer, the Modified timestamp of the new file is inherited from the original file, while the Created and Accessed timestamps are set to the current time. Copying from the command line (cmd) behaves similarly: both methods update the Created and Accessed timestamps of the copied file. However, when we analyze the $MFT file we may actually see a difference, because the MFT shows us both sets of timestamps: the $STANDARD_INFORMATION ($SI) timestamps, which are the ones accessible through the Windows API, and the $FILE_NAME ($FN) timestamps, which are maintained by the Windows kernel. 4. File Access: the behavior of the Accessed timestamp depends on the NtfsDisableLastAccessUpdate registry setting; if last-access updates are enabled, the Accessed timestamp is updated upon file access. -------------------------------------------------------------------------------------------------------------
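A toy model of the operations table may help. This is a deliberate simplification of the $STANDARD_INFORMATION rules described above, with last-access updates assumed enabled and timestamps reduced to plain numbers:

```python
def apply_operation(ts, op, now, source=None):
    """Toy model of which of M/A/C/B each NTFS operation refreshes.
    `ts` is a dict like {'M': ..., 'A': ..., 'C': ..., 'B': ...};
    last-access updates are assumed enabled here."""
    ts = dict(ts)
    if op == "create":
        ts.update(M=now, A=now, C=now, B=now)      # everything stamped now
    elif op == "modify":
        ts.update(M=now, A=now, C=now)             # B keeps the original value
    elif op == "copy":                             # the NEW file at the destination
        ts.update(M=source["M"], A=now, C=now, B=now)  # M inherited from source
    elif op == "access":
        ts.update(A=now)
    return ts

src = {"M": 100, "A": 100, "C": 100, "B": 100}
dst = apply_operation({}, "copy", now=200, source=src)      # copied file
changed = apply_operation(src, "modify", now=150)           # modified in place
```

Note the copy result: M (100) is older than B (200), the classic "modified before created" pattern that flags copied-in files.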

  • Understanding NTFS Timestamps (Timeline Analysis)

    Introduction: In digital forensics, understanding NTFS timestamps is crucial for reconstructing events and analyzing user activities on a computer system. NTFS stores four significant filesystem times for files and directories: last modification time (M), last access time (A), last modification of the MFT record (C), and file creation time (B). Let's break down these timestamps and their significance in simpler terms. NTFS Timestamps Explained: Last Modification Time (M): this timestamp indicates when a file's content was last modified or changed. Last Access Time (A): historically, this timestamp recorded when a file was last accessed, but its reliability has been questionable due to delayed or disabled updates in some Windows versions. Last Modification of the MFT Record (C): this timestamp reflects changes to the file's metadata, such as renaming, changing file size, updating security permissions, or modifying file ownership. File Creation Time (B): indicates when a file was created on the filesystem. Significance of NTFS Timestamps: Focus on M and B Times: while all four timestamps provide valuable information, focusing on M (modification) and B (creation) times is recommended for most forensic queries due to their clarity and reliability. UTC Format: NTFS timestamps are stored in UTC, unaffected by time zone changes or daylight saving time, unlike some other filesystems such as FAT. Understanding Timestamp Updates: Granular Timestamps: NTFS timestamps use a high-resolution format, allowing precise tracking of events down to hundred-nanosecond intervals. Impact of Actions: different actions, such as file creation, modification, renaming, or moving, trigger updates to specific timestamps. For example, a file's C (metadata change) time is updated when file attributes or permissions are modified. Tips for Timestamp Analysis: Pattern Recognition: recognizing patterns in timestamp updates can provide valuable insights into file movements and actions taken by users.
Testing Hypotheses: When timestamp provenance is crucial, it's essential to test hypotheses multiple times on a close approximation of the original system to validate findings. Impact of Different Actions on Timestamps: File Creation: All four timestamps (MACB) are set to the time of creation when a file is created. File Access: Access times have been disabled since Windows Vista but may be re-enabled in some Windows 10 versions. Due to inconsistency, access times are often disregarded in forensic analysis. File Modification: Modifications to file content and size update both M and C times. File Rename and Local Move: Only the metadata (C time) of the file, such as name and parent folder, is updated. File Deletion: Windows does not update timestamps when a file is deleted, as there is no deletion time recorded. Understanding Patterns in Timestamp Updates: File Copying and Volume Moves: During file copying or volume moves via the command line (CLI), modified times may precede creation times, indicating files originating from elsewhere. This anomaly serves as a crucial indicator of file movement and copying activities. Unique Timestamp Patterns: Differences in timestamp updates between GUI desktop and command line moves highlight the complexity of timestamp behavior and provide valuable insights into file manipulation. Conclusion: Understanding NTFS timestamps is essential for digital forensic analysts to reconstruct events, track file movements, and uncover insights into user activities on computer systems. By focusing on key timestamps like M and B times and recognizing patterns in timestamp updates, forensic investigators can effectively analyze timelines and piece together a timeline of events with precision.
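The "granular, UTC-stored" point is easy to demonstrate: NTFS stores each timestamp as a 64-bit count of 100-nanosecond intervals since 1601-01-01 UTC (the FILETIME format), which a few lines of Python can decode:

```python
from datetime import datetime, timedelta, timezone

EPOCH_1601 = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(ft):
    """Convert a 64-bit FILETIME (100-ns ticks since 1601-01-01 UTC)
    to a timezone-aware datetime. Sub-microsecond ticks are truncated."""
    return EPOCH_1601 + timedelta(microseconds=ft // 10)

# 116444736000000000 hundred-nanosecond ticks is exactly the Unix epoch.
unix_epoch = filetime_to_datetime(116444736000000000)
```

Because the stored value is UTC, the same raw number decodes identically regardless of the examiner's or the suspect machine's time zone, which is the property the article highlights.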

  • Understanding Timeline Analysis in Digital Forensics

    What is Timeline Analysis? Timeline analysis in digital forensics is the process of examining chronological data to reconstruct events that occurred on a computer or digital device. It involves analyzing timestamps, file metadata, and other artifacts to piece together a timeline of user activities. Why is Timeline Analysis Important? Timeline analysis is crucial in digital forensics for several reasons: Reconstructing Events: It helps reconstruct the sequence of actions taken by users on a system, providing insight into their activities. Evidence Collection: It enables forensic investigators to collect and organize evidence in a chronological order, making it easier to understand and present findings. Identifying Patterns: By analyzing timelines, investigators can identify patterns of behavior, detect anomalies, and uncover potential security breaches or malicious activities. Correlating Data: Timeline analysis allows for the correlation of data across different sources, helping investigators establish connections and draw conclusions. Key Steps in Timeline Analysis: Determine Timeline Scope: Identify the timeframe during which the activity of interest occurred. Narrowing down the scope helps manage the volume of data to be analyzed. Narrow Pivot Points: Identify specific events or artifacts, such as timestamps or key files, that can serve as focal points for analysis. These pivot points help focus the investigation. Choose the Timeline Creation Process: Select the appropriate method for creating the timeline based on the data sources available and the requirements of the investigation. This could involve creating a filesystem timeline or a super timeline. Filter Timeline Data: Filter and de-duplicate the timeline data to focus on relevant information. Keywords can be used to identify pivot points and extract essential data for analysis. Analyze Timeline: Analyze the timeline data, focusing on the context of evidence discovered. 
Examine events before and after each pivot point to understand user activity and behavior. Types of Timelines: Filesystem Timeline: This type of timeline collects data from files and directories on a volume, including both allocated and unallocated metadata structures. It provides insights into file creation, modification, access, and deletion. Advantages of Filesystem Timelines: Flexibility: Filesystem timelines can parse various filesystem types, including NTFS, FAT, EXT, HFS+, and more. Comprehensive: They include information about both active and deleted files, allowing for a thorough analysis. Widely Applicable: Filesystem timelines can be used across a wide range of devices and operating systems. In conclusion, timeline analysis is a critical aspect of digital forensics, allowing investigators to reconstruct events, collect evidence, and uncover insights into user behavior. By following a systematic approach and leveraging appropriate tools, forensic analysts can effectively analyze timelines to support investigations and legal proceedings.
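A minimal filesystem timeline can be sketched with a directory walk: collect a timestamp per file and sort chronologically. This is only an illustration of the idea, not a replacement for fls/mactime or Plaso (it sees just allocated files and only the M time):

```python
import os
import tempfile
import time

def fs_timeline(root):
    """Return (mtime, path) pairs for every file under root, sorted
    chronologically -- the bare-bones shape of a filesystem timeline."""
    rows = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            rows.append((os.stat(p).st_mtime, p))
    return sorted(rows)

# Demonstrate on a throwaway directory with one backdated file.
with tempfile.TemporaryDirectory() as d:
    old, new = os.path.join(d, "old.txt"), os.path.join(d, "new.txt")
    for p in (old, new):
        open(p, "w").close()
    past = time.time() - 3600
    os.utime(old, (past, past))        # backdate old.txt by an hour
    timeline = fs_timeline(d)
```

Real tools extend exactly this shape with all four MACB times, deleted entries, and per-filesystem parsers.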

  • Artifacts for USB or Drive Usage Part 1: Key Identification || First/Last Times || User

    1. Key Identification: USB devices are commonly used for data transfer and storage, making them a crucial aspect of digital forensics investigations. By tracking the USB devices plugged into a machine, investigators can gather valuable information about usage history and activity on a system. Location of USB device information: it is stored in the Windows Registry under SYSTEM\CurrentControlSet\Enum\USBSTOR and SYSTEM\CurrentControlSet\Enum\USB. Query using CMD: reg query HKLM\SYSTEM\CurrentControlSet\Enum\USBSTOR and reg query HKLM\SYSTEM\CurrentControlSet\Enum\USB Manually collect the artifact using CMD, or use KAPE: reg save HKLM\SYSTEM\CurrentControlSet\Enum\USBSTOR C:\Users\User\Downloads\usbstor.hiv and reg save HKLM\SYSTEM\CurrentControlSet\Enum\USB C:\Users\User\Downloads\usb.hiv (use distinct output files so the second save does not collide with the first). After collecting the artifact, Registry Explorer can be used to investigate. Interpretation: • Identify the vendor, product, and version of a USB device plugged into the machine. • Identify a unique USB device plugged into the machine. • Determine the time a device was plugged into the machine. • Devices that do not have a unique serial number will have an "&" in the second character of the serial number. ------------------------------------------------------------------------------------------------------------- 2. First/Last Times: USB devices play a significant role in digital forensics investigations, and understanding when these devices were first and last connected to a Windows machine can provide valuable insights into user activity and potential security incidents.
    First Time Connection: Location: Plug and Play log files: Windows XP: C:\Windows\setupapi.log; Windows 7-10: C:\Windows\inf\setupapi.dev.log. View using CMD: C:\Windows\INF>setupapi.dev.log (it is a plain file, so pressing Enter opens it in Notepad). Manually collect the artifact using CMD, or use KAPE: copy C:\Windows\INF\setupapi* C:\Users\User\Downloads\Shell Interpretation: • search for the device serial number • log file times are recorded in the local time zone. Last Time Connection: NTUSER.DAT hive: Location: NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2\{GUID} Query using CMD: reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2" Manually collect the artifact using CMD, or use KAPE: reg save "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2" C:\Users\User\Downloads\output.hiv Interpretation: using the serial number as the marker, you can determine the time a specific USB device was last connected to the local machine. ------------------------------------------------------------------------------------------------------------- 3. User: USB devices often play a critical role in forensic investigations, and identifying the user who used a particular USB device can provide valuable insights into user activity and potential security incidents. Location: the GUID from SYSTEM\MountedDevices will be used to identify the user who plugged in the USB device. NTUSER.DAT hive path: NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2 Query using CMD: • look for the GUID from SYSTEM\MountedDevices: reg query HKLM\SYSTEM\MountedDevices • reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2" To extract the artifacts manually, use reg save with the registry key you want to save and the destination where you want to store it, or use KAPE.
Interpretation: Identify GUID from SYSTEM\MountedDevices: The first step is to locate the GUID associated with the USB device of interest from the SYSTEM\MountedDevices registry key. Locate User's Personal MountPoints Key: Once the GUID is identified, it is used to locate the user's personal mountpoints key in the NTUSER.DAT hive, specifically in the Explorer\MountPoints2 subkey. Determine Last Write Time: The last write time of the user's mountpoints key corresponds to the last time the USB device was plugged into the machine by that user. User Attribution: By examining the user's mountpoints key, forensic investigators can attribute the usage of the USB device to a specific user account. -------------------------------------------------------------------------------------------------------------
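Two of the interpretation rules above can be sketched in Python: the "&" test for non-unique serial numbers, and the first-connection lookup in setupapi.dev.log. The log excerpt below is synthetic and far simpler than a real setupapi.dev.log, so treat the parsing as an assumption about the general line shape, not a robust parser:

```python
import re

def serial_is_unique(serial):
    """Per the Key Identification note: devices without a unique serial
    number get an '&' as the second character of the instance ID."""
    return len(serial) > 1 and serial[1] != "&"

# Synthetic setupapi.dev.log-style excerpt (times are LOCAL time).
LOG = """\
>>>  [Device Install (Hardware initiated) - USBSTOR\\Disk&Ven_SanDisk&Prod_Cruzer\\4C530001230507118173&0]
>>>  Section start 2024/02/29 14:58:01.123
<<<  Section end 2024/02/29 14:58:05.456
"""

def first_install_time(log_text, serial):
    """Find the 'Section start' line that follows the device-install
    entry containing `serial` -- the device's first-connection time."""
    lines = log_text.splitlines()
    for i, line in enumerate(lines):
        if "Device Install" in line and serial in line:
            m = re.search(r"Section start (\S+ \S+)", lines[i + 1])
            if m:
                return m.group(1)
    return None

unique = serial_is_unique("4C530001230507118173&0")   # device-reported serial
shared = serial_is_unique("7&2f41a047&0")             # Windows-generated ID
first_seen = first_install_time(LOG, "4C530001230507118173")
```

Remember the article's caveat when reading the result: setupapi times are local time, unlike the UTC registry timestamps.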

  • Artifacts for Account Usage: Last Login || Success/Fail Logons || Last Password Change || Logon Types || RDP Usage.

    1. Last Login: Location: C:\Windows\System32\config\SAM, key SAM\Domains\Account\Users Interpretation: the last login time for local accounts is stored in this registry key. This information can be valuable for understanding user activity and identifying active accounts on the system. ------------------------------------------------------------------------------------------------------------- 2. Success/Fail Logons: Location: XP: %systemroot%\System32\config\SecEvent.evt; Win7-10: %systemroot%\System32\winevt\logs\Security.evtx Interpretation: event IDs provide information about successful and failed logon attempts: 528/4624: successful logon; 529/4625: failed logon; 538/4634: successful logoff; 540/4624: successful network logon (e.g., file shares). Monitoring these events helps track account usage and detect potential security breaches. ------------------------------------------------------------------------------------------------------------- 3. Last Password Change: Location: C:\Windows\System32\config\SAM, key SAM\Domains\Account\Users Interpretation: the registry key stores the last password change time for specific users, useful for monitoring password security and identifying potential security incidents. ------------------------------------------------------------------------------------------------------------ 4. Logon Types: Location: XP: Event ID 528; Win7-10: Event ID 4624 Interpretation: different logon types indicate the method used for account authorization: 2: logon via console; 3: network logon; 4: batch logon; 5: Windows service logon; 7: credentials used to unlock the screen; 8: network logon sending credentials in cleartext; 9: different credentials used than the logged-on user; 10: remote interactive logon (RDP); 11: cached credentials used to logon. ------------------------------------------------------------------------------------------------------------- 5.
RDP Usage: Location: XP: %systemroot%\System32\config\SecEvent.evt Win7-10: %systemroot%\System32\winevt\logs\Security.evtx Interpretation: Event IDs provide information about Remote Desktop Protocol (RDP) sessions: 682/4778: Session Connected/Reconnected 683/4779: Session Disconnected Hostname and IP address of remote machines making the connection are logged. -------------------------------------------------------------------------------------------------------------
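The logon-type table above is worth keeping at hand as a lookup when reviewing 4624 events; a trivial sketch (descriptions taken from the list in this post):

```python
# Logon Type field of Security event 4624, per the table above.
LOGON_TYPES = {
    2: "Logon via console",
    3: "Network logon",
    4: "Batch logon",
    5: "Windows Service logon",
    7: "Credentials used to unlock screen",
    8: "Network logon sending credentials (cleartext)",
    9: "Different credentials used than logged-on user",
    10: "Remote interactive logon (RDP)",
    11: "Cached credentials used to logon",
}

def describe_4624(logon_type):
    """Map the Logon Type field of EID 4624 to a human-readable meaning."""
    return LOGON_TYPES.get(logon_type, "Unknown")

rdp = describe_4624(10)
```

Filtering 4624 events for types 3 and 10 is a quick first pass for lateral movement and remote-access activity.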

  • Artifacts for file download Part 2: Firefox || Internet Explorer || Chrome.

    Tools of Analysis: DB Browser for SQLite/SQLCipher. Armed with "DB Browser for SQLite," forensic investigators gain a powerful lens into browser artifacts. This tool, available for download at sqlitebrowser.org, empowers analysts to navigate and dissect the SQL databases seamlessly. For a visual guide, the YouTube tutorial here offers step-by-step insights. Unveiling the downloads.sqlite artifacts: upon accessing the downloads.sqlite database, forensic analysts can extract a wealth of information: Filename, Size, and Type: details of each downloaded file. Download and Referring Page: insight into the source of downloads. File Save Location: the directory where downloaded files are stored. Application Used: information on the application used to open downloaded files. Download Start and End Times: temporal details capturing when downloads occurred. ------------------------------------------------------------------------------------------------------------- 1. Firefox: to access the repository of Firefox artifacts, the command cd %USERPROFILE%\AppData\Roaming\Mozilla\Firefox\Profiles\ serves as our gateway. Further exploration reveals a directory whose name ends in .default, within which lie the SQLite databases, a treasure trove of information waiting to be unearthed. ------------------------------------------------------------------------------------------------------------ 2. Index.dat: Tracing Internet Explorer Adventures. Internet Explorer, though evolving, still retains artifacts crucial for forensic scrutiny. The command cd %userprofile%\AppData\Local\Microsoft\Windows\History\Low\History.IE5\ leads investigators to the Index.dat repository. Here, details for each local user account are stored, recording the frequency of visits to specific locations. ----------------------------------------------------------------------------------------------------------- 3.
    Chrome Chronicles: Chrome, a modern-day browser juggernaut, leaves its mark on digital landscapes. The command cd %userprofile%\AppData\Local\Google\Chrome\User Data\Default\ unravels the Chrome artifacts, where forensic investigators can gather a variety of data. ------------------------------------------------------------------------------------------------------------- Tools for collecting these artifacts: 1. Unleashing KAPE: a forensic powerhouse. For a comprehensive approach to artifact gathering, KAPE emerges as a potent tool. With its versatility, KAPE can efficiently collect browser artifacts, providing investigators with a unified dataset for analysis. 2. Taking artifacts home: manually copying artifacts. Whether using KAPE or opting for a manual approach, the command copy "C:\Users\\AppData\Local\Google\Chrome\User Data\Default\History" "C:\Path\To\New\Location\HistoryCopy" allows forensic analysts to copy artifacts for further analysis, and the subsequent use of sqlite3 facilitates in-depth examination. (You can collect your artifacts whichever way you prefer; at the end of the day this blog is for information purposes.) As we conclude our forensic exploration into browser artifacts, the significance of each command and tool becomes evident. Firefox, Internet Explorer, and Chrome each contribute a unique chapter to the digital saga. Forensic investigators armed with commands, tools, and methodologies can unlock the secrets within browser histories, painting a vivid picture of user activities in the vast landscape of digital forensics.
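Since these browser histories are SQLite databases, a small example may help. The snippet builds a synthetic, heavily simplified stand-in for Chrome's downloads table (the real History database has many more columns) and decodes Chrome's timestamp format, which stores microseconds since 1601-01-01 UTC. Always work on a copy of the database, never the live file:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

def webkit_to_datetime(us):
    """Chrome/WebKit timestamps: microseconds since 1601-01-01 UTC."""
    return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=us)

# Synthetic stand-in for the Chrome 'downloads' table; column names and
# the sample row are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE downloads (target_path TEXT, start_time INTEGER)")
con.execute("INSERT INTO downloads VALUES (?, ?)",
            (r"C:\Users\User\Downloads\invoice.exe", 13352447103000000))

path, start = con.execute(
    "SELECT target_path, start_time FROM downloads").fetchone()
when = webkit_to_datetime(start)
```

The same query-then-decode pattern applies to Firefox's places.sqlite, with the difference that Firefox stores microseconds since the Unix epoch rather than since 1601.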

  • Artifacts for file download Part 1: Open/Save MRU Artifacts || Email Attachments || Skype History

    1. Open/Save MRU Artifacts: This key acts as a repository for the history of files opened or saved by users through common dialog boxes, offering a panoramic view of their digital footprint. Location in the Registry To get a glimpse into this trove of information, one need only venture into the registry. The Open/Save MRU key resides at: Command:- reg query HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\ On disk, you will find this artifact in the C:\Users\User\NTUSER.DAT hive. Collect All 3 Artifacts: Crafting a Seamless Forensic Workflow To craft a seamless forensic workflow, consider the following steps: KAPE Automation: leverage the power of KAPE for efficient and automated artifact collection. Manual Extraction: for those who prefer a more hands-on approach, manual extraction via registry exploration is a viable option. Forensic Image Considerations: ensure that the NTUSER.DAT registry hive is part of your forensic image to unlock a comprehensive array of artifacts. If you are using FTK Imager, simply export the NTUSER.DAT hive from each user profile. ------------------------------------------------------------------------------------------------------------- 2. Email Attachments The exploration begins by navigating to the user's Outlook data files, a realm rich in potential forensic artifacts. The command :- cd %USERPROFILE%\AppData\Local\Microsoft\Outlook leads us to the hub where OST and PST files reside. These files, OST (Offline Storage Table) and PST (Personal Storage Table), are the cornerstone of Microsoft Outlook's data storage. Understanding OST and PST Files OST Files: Offline Storage Table files facilitate offline access to mailbox data. Cached mode in Outlook relies on OST files so users can seamlessly interact with their mailbox even when offline. PST Files: Personal Storage Table files serve as local repositories for mailbox data. Users often use PST files to store mailbox data locally, providing a degree of autonomy and control. 
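The Outlook data directory can also be triaged programmatically. The sketch below is an illustration, not part of any official tooling: it enumerates .ost and .pst files under a given profile path and reports their sizes. It builds a temporary mock directory (with hypothetical file names) so it is self-contained; in practice you would point it at the Outlook path above.

```python
import pathlib
import tempfile

def find_outlook_stores(outlook_dir: pathlib.Path) -> list:
    """Return (filename, size-in-bytes) for every OST/PST file in the directory."""
    stores = []
    for pattern in ("*.ost", "*.pst"):
        for f in sorted(outlook_dir.glob(pattern)):
            stores.append((f.name, f.stat().st_size))
    return stores

# Mock stand-in for %USERPROFILE%\AppData\Local\Microsoft\Outlook
with tempfile.TemporaryDirectory() as tmp:
    outlook = pathlib.Path(tmp)
    # Hypothetical store names; real OST/PST files start with the "!BDN" signature.
    (outlook / "user@example.com.ost").write_bytes(b"!BDN" + b"\x00" * 60)
    (outlook / "archive.pst").write_bytes(b"!BDN" + b"\x00" * 28)
    for name, size in find_outlook_stores(outlook):
        print(f"{name}\t{size} bytes")
```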
To extract these files, the simple yet potent Command :- copy "OriginalFilename.ost" "DestinationPath" proves invaluable. It allows forensic analysts to create copies for analysis without compromising the integrity of the original data. Analysis Tools: Unveiling the Forensic Arsenal Forensic analysis of OST and PST files requires specialized tools equipped to navigate the intricate structures of these data repositories. Here are some stalwarts of the forensic arsenal: FTK (Forensic Toolkit): renowned for its comprehensive forensic analysis capabilities, FTK is adept at parsing and examining email artifacts, including those stored in OST and PST files. EnCase: a stalwart of digital forensics, EnCase provides a robust platform for dissecting OST and PST files, unraveling the layers of email data with precision. MailXaminer: tailored for email forensics, MailXaminer proves to be a versatile tool, offering a range of features to analyze and interpret OST and PST files. Kernel OST Viewer: designed specifically for OST files, this tool provides a streamlined view into the contents of Offline Storage Table files. ------------------------------------------------------------------------------------------------------------- 3. Skype History To unveil the secrets held within Skype chat history, the path :- C:\Users\\AppData\Roaming\Skype\ serves as our compass. It points to the directory where Skype diligently stores the traces of chat sessions and files exchanged between users. Forensic investigators armed with the knowledge of Skype's data storage location can navigate the intricacies of chat history. Analyzing these artifacts can reveal a wealth of information, shedding light on communication patterns, shared content, and potentially uncovering crucial details in digital investigations.
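Because evidence integrity matters, a plain copy is usually paired with hashing. The sketch below (illustrative, with hypothetical file names) copies a file and confirms that the SHA-256 of source and destination match, the same verification an examiner would document when manually extracting an OST or PST.

```python
import hashlib
import pathlib
import shutil
import tempfile

def sha256_of(path: pathlib.Path) -> str:
    """Hash a file in chunks so large OST/PST stores do not exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def verified_copy(src: pathlib.Path, dst: pathlib.Path) -> str:
    """Copy src to dst (preserving timestamps) and confirm the hashes match."""
    shutil.copy2(src, dst)
    src_hash, dst_hash = sha256_of(src), sha256_of(dst)
    if src_hash != dst_hash:
        raise IOError(f"hash mismatch copying {src} -> {dst}")
    return src_hash

# Demonstration with a mock data file (stands in for OriginalFilename.ost).
with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "OriginalFilename.ost"
    src.write_bytes(b"mock mailbox data")
    dst = pathlib.Path(tmp) / "evidence_copy.ost"
    print("sha256:", verified_copy(src, dst))
```

Recording the hash at collection time lets you later prove the working copy is identical to the original.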

  • Artifacts for Deleted File or File Knowledge Part 2: Search -WordWheelQuery || Index.dat file://

    1. Search - WordWheelQuery The WordWheelQuery registry key is a valuable artifact found in the Windows registry of Windows 7 through Windows 10 systems. It stores the keywords searched for from the START menu search bar, providing insight into user search behavior and interests. Location: Registry Hive: NTUSER.DAT Key Path: NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\WordWheelQuery You can capture the entire NTUSER.DAT hive, or you can extract this particular key from a live endpoint using the command line: Command:- Reg Save HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\WordWheelQuery C:\Users\User\Downloads\Shell\output.hiv After capturing this artifact, you can use Registry Explorer for further investigation. Example of Registry Explorer: Interpretation: Keywords: the registry key stores keywords searched from the START menu bar. They are stored in Unicode format, allowing for the representation of a wide range of characters and languages. Temporal Order: the keywords are listed in temporal order, indicating the sequence in which they were searched. This order can provide valuable insight into the frequency and recency of user searches. Analysis: MRU List: the keywords are ordered by an MRU (Most Recently Used) list, stored in the MRUListEx value, which organizes the keywords by usage, with the most recently searched keyword appearing first. Unicode Encoding: since the keywords are stored in Unicode (UTF-16LE), special attention should be paid to decoding and interpreting them correctly during analysis. Forensic Significance: User Behavior: analysis of the WordWheelQuery key can reveal valuable information about user behavior, interests, and activities on the system. Search History: investigators can reconstruct users' search history, uncovering patterns, trends, and areas of interest. 
Evidence Correlation: correlating keyword searches with other artifacts and evidence on the system can provide a comprehensive understanding of user activities and intentions. ------------------------------------------------------------------------------------------------------------- 2. Index.dat file:// Index.dat files are a lesser-known but crucial component of Internet Explorer history. While many users associate IE history solely with web browsing, index.dat files also record local and remote file access, providing valuable insight into the files and applications accessed on a system over time. Description: File Format: entries are stored in index.dat files in the form file:///C:/directory/filename.ext. Local and Remote Access: tracks both local file access (from the system's local drives) and remote access (via network shares). Not Browser Activity: it is essential to note that entries in index.dat files do not necessarily indicate that a file was opened in a browser. They capture instances where files were accessed or interacted with, regardless of the application used. Windows Vista and Later: in newer versions of Windows (Vista, 7, 8, and 10), index.dat files are located in the following directory: C:\Users\AppData\Local\Microsoft\Windows\Temporary Internet Files\Content.IE5 These are hidden system files, so you may need to configure your system to show hidden files and folders to access them. Index.dat files may also be present in other locations on the system, depending on the usage and configuration of Internet Explorer. ----------------------------------------------------------------------------------------------------------
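The Unicode decoding and MRU ordering described for WordWheelQuery can be sketched in a few lines. The values below are mock data; in a real case they would come from a parsed NTUSER.DAT (via Registry Explorer or a library such as python-registry). Each numbered value holds a null-terminated UTF-16LE keyword, and MRUListEx is an array of little-endian 32-bit indices, most recent first, terminated by 0xFFFFFFFF.

```python
import struct

# Mock WordWheelQuery values, as they would appear in NTUSER.DAT.
values = {
    "0": "invoice".encode("utf-16-le") + b"\x00\x00",
    "1": "mimikatz".encode("utf-16-le") + b"\x00\x00",
    "2": "passwords.xlsx".encode("utf-16-le") + b"\x00\x00",
}
# Search order, most recent first: 2, 0, 1; -1 packs as the 0xFFFFFFFF terminator.
mru_list_ex = struct.pack("<4i", 2, 0, 1, -1)

def decode_wordwheel(values: dict, mru: bytes) -> list:
    """Return searched keywords in most-recent-first order."""
    order = struct.unpack(f"<{len(mru) // 4}i", mru)
    keywords = []
    for idx in order:
        if idx == -1:  # 0xFFFFFFFF terminator marks the end of the list
            break
        raw = values[str(idx)]
        keywords.append(raw.decode("utf-16-le").rstrip("\x00"))
    return keywords

print(decode_wordwheel(values, mru_list_ex))
```

The most recently typed search appears first, which is exactly the "temporal order" interpretation discussed above.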

  • Lnk files Analysis: Tool-->LECmd.exe

    On a Live System (can also be used on collected LNK files): Key Data Extracted from LNK Files: when parsing LNK files, forensic investigators focus on extracting vital metadata, including: Source path of the file and its time tags Full path of the target Target file access time (UTC) Target file creation time (UTC) Target file modification time (UTC) Drive type Volume serial number (drive serial number) Volume label Target file size (bytes) Additionally, certain fields such as 'Droid file' and 'Birth droid file' may be present. A DROID (Digital Record Object Identification) represents the individual profile of a file and can be used by the Link Tracking Service to determine whether the file has been copied or moved. Command and Execution: forensic analysts use specialized tools like LECmd to parse LNK files efficiently. The following command demonstrates how LNK file parsing can be executed: LECmd.exe -d C:\Users\User\AppData\Roaming\Microsoft\Windows\Recent -q --csv .\ In this command: LECmd.exe is the executable of the parsing tool. -d selects a directory to process recursively (point it at the location where the LNK files are present). C:\Users\User\AppData\Roaming\Microsoft\Windows\Recent is the path where the LNK files are located. -q displays only the filename being processed, to speed up exporting to CSV. --csv specifies CSV output. .\ stores the output in the current working directory. Conclusion: parsing LNK files is a crucial step in digital forensics, enabling investigators to extract essential metadata and uncover valuable evidence. By leveraging specialized tools and understanding the key components of LNK file parsing, forensic analysts can effectively analyze file access history and user activities, contributing to comprehensive forensic investigations.
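The UTC time tags LECmd reports are stored in the LNK header as Windows FILETIME values: 100-nanosecond intervals since 1601-01-01 UTC, written as 8 little-endian bytes. The sketch below, using a made-up timestamp, shows the conversion an LNK parser performs for the creation, access, and modification fields.

```python
import struct
from datetime import datetime, timedelta, timezone

WINDOWS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_utc(ft: int) -> datetime:
    """Convert a 64-bit FILETIME (100 ns ticks since 1601) to a UTC datetime."""
    return WINDOWS_EPOCH + timedelta(microseconds=ft // 10)

# A FILETIME as it would sit in an LNK header: 8 bytes, little-endian (mock value).
raw = struct.pack("<Q", 133500000000000000)
(ft,) = struct.unpack("<Q", raw)
print("Target time (UTC):", filetime_to_utc(ft).isoformat())
```

A zero FILETIME means the field was not set, which is why parsers sometimes show blank time tags.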

bottom of page