
  • Google Chrome Forensics: Analyzing History and Cache

    Introduction

Since its release in 2008, Google Chrome has become one of the most widely used web browsers, thanks to its user-friendly interface, seamless integration with Google services, and efficient web rendering capabilities. From a forensic standpoint, Chrome's artifacts are well organized and primarily stored within the user's profile directory, making them a valuable resource for digital investigators.

-------------------------------------------------------------------------------------------------------------

Chrome User Data Storage Locations

Windows XP: %UserProfile%\Local Settings\Application Data\Google\Chrome\User Data
Windows 7 and later: %UserProfile%\AppData\Local\Google\Chrome\User Data

Most artifacts are stored in SQLite databases or JSON files. While these formats are widely documented, the stored data often requires additional processing for analysis. For example, timestamps and page transition data may not be human-readable at first glance.

-------------------------------------------------------------------------------------------------------------

Analyzing Chrome Browser History

Browser history is an essential artifact in forensic investigations, providing insight into a user's online activity. Chrome maintains an extensive history of visited websites, with a default retention period of up to 90 days. Key information extracted from browser history includes:

- URLs of visited websites
- Page titles and referring sites
- Frequency of visits
- Timestamps for each visit
- User profile associated with the visits

-------------------------------------------------------------------------------------------------------------

Chrome History Database

The History database, stored within the user's profile folder under User Data\, is the primary source for browsing activity. Chrome stores its history in SQLite format, and forensic analysts can extract valuable insights by querying specific tables.

Key SQLite tables in the History database:

- downloads, downloads_url_chains: download history, including URLs and file names
- keyword_search_terms: typed search queries (used for autocomplete)
- segments, segment_usage: frequently visited sites (for the Most Visited page)
- visit_source: source of URL information (local, synced, imported)
- urls, visits: comprehensive browser history, including timestamps and referrer data

Additional history artifacts:

- Top Sites database: stores thumbnails and metadata for frequently visited pages.
- Archived History: previously stored browsing history older than 90 days (removed in Chrome v37).
- History Index YYYY-MM: used to index page content for searches (removed in Chrome v30).

-------------------------------------------------------------------------------------------------------------

Key Tables in Chrome's History Database

The primary tables of interest are:

- urls: stores the URL, page title, and the last visit time.
- visits: keeps a detailed log of each visit to a website.

To get a complete picture of a user's browsing activity, you need to cross-reference both tables.

What Can We Learn from Chrome's History?

- Total visits: each time a site is visited, a new entry is made in the visits table.
- Last visit time: stored in the urls table, showing the most recent visit.
- Visit count: tracks how often a particular site was visited.
- Typed URLs: URLs that were physically typed or pasted into the address bar get a special typed_count value, indicating intentional user activity.
- Visit duration: unlike most browsers, Chrome records how long a site was open in the visit_duration field. This data is stored in microseconds, and the tab does not even have to be in focus for the duration to increase.
- Hidden URLs: the hidden field in the urls table does not mean the visit was hidden from the user. Instead, it controls whether the URL appears in auto-complete suggestions (0 = visible, 1 = hidden).

-------------------------------------------------------------------------------------------------------------

Understanding Page Transition Types

Every visit entry in Chrome has a transition field, which indicates how the user accessed a website. These values are stored as 32-bit numbers and can look cryptic without decoding. The core types include:

- 0: Link click
- 1: Typed URL
- 2: Auto bookmark
- 3: Auto subframe (embedded content)
- 4: Manual subframe
- 5: Omnibox suggestion
- 6: Start page visit
- 7: Form submission
- 8: Page reload
- 9: Keyword search
- 10: Generated keyword search

These transition types help investigators determine how a website was accessed. For example, a typed transition (1) suggests direct user interaction, whereas a link transition (0) indicates the user clicked a hyperlink. (A small decoding sketch appears at the end of this article.)

https://kb.digital-detective.net/display/BF/Page+Transitions
Check out the linked article for more on transitions and qualifiers.

-------------------------------------------------------------------------------------------------------------

What is an Internet Cache?

The internet cache is a feature designed to speed up web browsing. When you visit a website, your browser downloads and saves parts of the webpage (such as images, scripts, and HTML files) on your device. If you revisit the same site, the browser can load the saved content instead of downloading it again, making things much faster. This is why the previous page loads instantly when you press the back button: it is coming from the cache.

Why is Cache Important in Forensics?

From a forensic standpoint, the cache is a goldmine of information about a user's online activity. It stores actual webpage content, meaning investigators can reconstruct what a user saw and interacted with on a website. While browsing history only logs visited URLs, the cache holds richer data such as images, HTML files, and even downloaded attachments (e.g., in Outlook Web Access).

How is Chrome's Cache Structured?

Chrome stores cached files inside the user's profile directory. Before version 97, cache files were stored in the Cache folder. From version 97 onwards, they were moved deeper into Cache\Cache_Data. The cache consists of at least five key files:

- Index file (index): keeps track of cached entries.
- Data files (data_0 to data_3): store the actual cached content and metadata.
- Block files: organize cached data into fixed-size blocks for efficient storage.
- Separate files (f_xx format): used for storing larger files (above 16 KB).

What Information Can Be Extracted from Chrome Cache?

Each cached item comes with metadata that gives useful insights, such as:

- Filename: the name of the file downloaded from the website.
- URL: the web address the cached file came from.
- Content type: type of file (e.g., HTML, JPG, JavaScript).
- File size: size of the cached file.
- Last accessed time: the last time the cached content was used.
- Server time: the first time the cached content was saved.
- Response header: stores HTTP headers, which help Chrome retrieve cached data efficiently.
Timestamp Analysis in Chrome Cache

Chrome cache files contain four important timestamps stored in UTC:

- Last Accessed: the last time the user viewed the cached content.
- Server Time: when the content was first saved to disk.
- Server Last Modified: when the content was last updated on the website.
- Expire Time: when the cached content is expected to be removed (set by the website).

Additionally, large files stored separately (f_##### files) have filesystem-specific timestamps, including Created, Modified, Accessed, and MFT Change times (on NTFS systems).

Tools for Analyzing Chrome Cache

Manually extracting cache data can be challenging since it is stored in a structured format. Tools like NirSoft ChromeCacheView simplify the process by displaying cache details in an easy-to-read table. (Whitelist the tool in your antivirus first; it tends to get quarantined every time you try to run it.) With it you can:

- View cached file metadata.
- Extract and save cached files for analysis.

Limitations of Cache Analysis

- The cache is dynamic: older files get removed as new ones are stored.
- Websites can prevent caching for security reasons (e.g., Gmail does not cache sensitive content).
- Cache files corrupt easily, causing loss of data.
- Chrome rebuilds the cache if essential files are missing.

Conclusion

Chrome is one of the most data-rich browsers for forensic investigations. Its history database, visit logs, and metadata provide a detailed timeline of a user's web activity. Chrome's cache is also a very valuable forensic artifact that helps investigators piece together a user's browsing activity. By analyzing cache contents and timestamps, forensic experts can understand what sites were visited, what files were downloaded, and even reconstruct webpages. However, cache data is volatile, so timely acquisition and analysis are crucial!

--------------------------------------------------------------------------------------------------------

Stay with me, we will continue with Google forensics in the next article.

------------------------------------------------Dean------------------------------------------------
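As promised above, here is a minimal PowerShell sketch for working with the History values described in this article: Chrome stores timestamps as microseconds since 1601-01-01 UTC (for example in urls.last_visit_time and visits.visit_time in current schema versions), and the core transition type sits in the low byte of the 32-bit transition value. The function names and sample values are mine, for illustration only.

```powershell
# Convert a Chrome/WebKit timestamp (microseconds since 1601-01-01 UTC) to a readable UTC date.
function Convert-ChromeTime {
    param([long]$Microseconds)
    # FILETIME counts 100-nanosecond ticks since 1601-01-01, so multiply microseconds by 10
    [DateTime]::FromFileTimeUtc($Microseconds * 10)
}

# Extract the core page-transition type (low byte of the 32-bit transition value)
function Get-ChromeTransitionCore {
    param([uint32]$Transition)
    $Transition -band 0xFF
}

Convert-ChromeTime 13390718754000000     # sample value -> a 2025 UTC timestamp
Get-ChromeTransitionCore 0x30000001      # -> 1 (typed URL)
```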

  • Browser Forensics: Uncovering Digital Clues

    ----------------------------------------------------------------------------------------------------------

In today's digital world, tools like Belkasoft and Magnet Axiom are like the superheroes of browser forensics. You snap a screenshot, run the tool, and boom: you have all the answers. It's almost like magic! ✨ But let's be real: those tools aren't exactly cheap, and not everyone (especially freelancers or small businesses) can afford to shell out a small fortune for them.

So, what do we do when the fancy tools are out of reach? Well, we roll up our sleeves and dive into the exciting world of manual browser forensics! Yes, it's more time-consuming, but trust me, it's worth it. Plus, the best part? You'll get to be the digital detective you've always wanted to be. 🕵️‍♂️

Don't worry if you feel overwhelmed by articles and technical jargon. Stick with me, and by the end of this series, you'll be a browser forensics pro, without the hefty price tag! Let's get started, and have some fun along the way! 😎

----------------------------------------------------------------------------------------------------------

Internet access is one of the most frequent user activities, making web browsers a key portal for online interactions. In cases like employee misuse, internet activity alone can serve as crucial evidence. In other investigations, while not the primary focus, browser data can provide valuable corroborating information. For instance, analyzing browsing history can reveal access to local files or network shares during an intrusion investigation.

We are going to explore the dominant browsers on Windows: Google Chrome, Microsoft Edge, Internet Explorer, and Mozilla Firefox. If you haven't kept up with browser artifacts, you may be surprised at the vast amount of data stored by these applications.

----------------------------------------------------------------------------------------------------------

Understanding Browser Artifacts

We must determine what a piece of trace evidence represents and how it relates to key investigative questions. Internet browsers store a wealth of user data, commonly referred to as artifacts. While many types of browser artifacts exist, three fundamental categories form the foundation of most browser forensic investigations:

- History databases
- Browser cache
- Cookies

These artifacts help us build a profile of user activity: identifying visited websites, frequency of access, timestamps, and user interactions. While these primary sources are invaluable, other artifacts can further corroborate findings and provide additional context. These include:

- Bookmarks: indicating user intent and areas of interest.
- Download history and the default download folder: revealing past file retrievals.
- Temporary directories: storing forgotten downloads.
- Auto-complete data: providing insight into form submissions, search queries, and usernames.

However, history and cache files are often the first to be deleted by users. In such cases, these ancillary artifacts may be the only available sources of evidence.

----------------------------------------------------------------------------------------------------------

The Evolution of Web Browsers

The battle for browser dominance continues as organizations compete for market share in an increasingly web-driven world. Google Chrome has held the lead for years, while Internet Explorer and Mozilla Firefox have seen a decline. Microsoft has introduced multiple browsers, with the latest iteration of Edge gaining traction.
Meanwhile, Apple's dominance in the mobile space has bolstered Safari's market share. The leading engines include:

- Blink (used by Chrome, Edge, Opera, and Brave): a fork of the WebKit engine, dominating the market.
- Gecko (used by Mozilla Firefox): the primary alternative to Blink.
- WebKit (used by Safari): originally developed by Apple.

Microsoft Edge previously used a proprietary engine (EdgeHTML) but later adopted Blink due to limited success.

----------------------------------------------------------------------------------------------------------

Investigating Browser Artifacts

The similarity among modern browsers simplifies forensic investigations. If you can analyze Chrome artifacts, you will find Opera and Brave to be nearly identical. This similarity, however, presents challenges when carving artifacts from unallocated disk space or memory, as determining the exact source browser can be difficult. A strong set of forensic tools and the ability to manually parse browser databases are essential skills for investigators.

----------------------------------------------------------------------------------------------------------

Next Step: Google Chrome Forensics

In the next few sections, we will dive into forensics for multiple browsers, exploring how to extract and analyze their artifacts effectively. First, we are going to start with Google Chrome.

--------------------------------------------Dean------------------------------------------------------

  • Streamlining Cloud Log Analysis with Free Tools: Microsoft-Extractor-Suite and Microsoft-Analyzer-Suite

    When it comes to investigating cloud environments, having the right tools can save a lot of time and effort. Today, I'll introduce two free, powerful tools that are absolutely fantastic for log analysis within the Microsoft cloud ecosystem: Microsoft-Extractor-Suite and Microsoft-Analyzer-Suite. These tools are easy to use, flexible, and produce output in accessible formats like CSV and Excel, making them excellent resources for investigating business email compromises, cloud environment audits, and more.

About Microsoft-Extractor-Suite

The Microsoft-Extractor-Suite is an actively maintained PowerShell tool designed to streamline data collection from Microsoft environments, including Microsoft 365 and Azure. This toolkit provides a convenient way to gather logs and other key information for forensic analysis and cybersecurity investigations.

Supported Microsoft Data Sources

Microsoft-Extractor-Suite can pull data from numerous sources, including:

- Unified Audit Log
- Admin Audit Log
- Mailbox Audit Log
- Azure AD Sign-In Logs
- Azure Activity Logs
- Conditional Access Policies
- MFA Status for Users
- Registered OAuth Applications

This range allows investigators to get a comprehensive picture of what is happening across an organization's cloud resources.

----------------------------------------------------------------------------------------------------------

Installation and Setup

To get started, you'll need to install the tool and its dependencies. Here's a step-by-step guide:

Install Microsoft-Extractor-Suite:
Install-Module -Name Microsoft-Extractor-Suite

Install the PowerShell module Microsoft.Graph (for Graph API Beta functionalities):
Install-Module -Name Microsoft.Graph

Install ExchangeOnlineManagement (for Microsoft 365 functionalities):
Install-Module -Name ExchangeOnlineManagement

Install the Az module (for Azure Activity log functionality):
Install-Module -Name Az

Install the AzureADPreview module (for Azure Active Directory functionalities):
Install-Module -Name AzureADPreview

Once the modules are installed, you can import the suite using:
Import-Module .\Microsoft-Extractor-Suite.psd1

----------------------------------------------------------------------------------------------------------

Note: You will need to sign in to Microsoft 365 or Azure with appropriate permissions (admin-level access, a P1 or higher access level, or an E3/E5 license) before using Microsoft-Extractor-Suite functions.

----------------------------------------------------------------------------------------------------------

Getting Started

First, connect to your Microsoft 365 and Azure environments:

Connect-M365
Connect-Azure
Connect-AzureAZ

From here, you can specify start and end dates, user details, and other parameters to narrow down which logs to collect. The tool captures output in Excel format by default, stored in a designated output folder.

Link: https://microsoft-365-extractor-suite.readthedocs.io/en/latest/

----------------------------------------------------------------------------------------------------------

Example logs I collected:

One drawback to keep in mind is that logs are collected one by one: for example, you first collect the MFA logs, then run another command to collect the user logs. Also note that if you do not provide an output path, the output is captured in the default folder where the script is located.

----------------------------------------------------------------------------------------------------------

You might ask: why two different suites?
The answer is a second project, Microsoft-Analyzer-Suite, developed by evild3ad. This suite offers a collection of PowerShell scripts specifically designed for analyzing Microsoft 365 and Microsoft Entra ID data extracted with the Microsoft-Extractor-Suite. The list of analyses currently supported by Microsoft-Analyzer-Suite is on the project page.

Link: https://github.com/evild3ad/Microsoft-Analyzer-Suite

----------------------------------------------------------------------------------------------------------

Before I start, I will show you the folder structure of both tools:

Microsoft-Extractor-Suite
Microsoft-Analyzer-Suite-main

The Analyzer-Suite also lets you add specific IP addresses, ASNs, or applications to a whitelist by editing the whitelist folder in the Microsoft-Analyzer-Suite directory.

------------------------------------------------------------------------------------------------------------

Let's start. I will show two logs being collected and analyzed: the message trace log and the Unified Audit Log. Both are collected with the Microsoft-Extractor-Suite scripts, and then analyzed with the Microsoft-Analyzer-Suite.

Collecting Logs with Microsoft-Extractor-Suite

Now, let's go over collecting logs. Here's an example command to retrieve the Unified Audit Log entries for the past 90 days for all users:

Get-UALAll

After running this, the tool will output data in Excel format to a default folder. However, you may need to combine multiple Excel files into one .csv file, because the Analyzer-Suite scripts only accept .csv input.

------------------------------------------------------------------------------------------------------------

Combining CSV Files into One Excel File

When working with large data sets, it's more efficient to combine multiple log files into a single file. Here's how to do this in Excel (a quick PowerShell alternative is sketched a bit further below):

1. Place all relevant CSV files in a single folder.
2. Open a new Excel spreadsheet and navigate to Data > Get Data > From File > From Folder.
3. Select the folder containing your CSV files and click "Open".
4. From the Combine drop-down, choose Combine & Transform Data. This option loads your files into the Power Query Editor, where you can manipulate and arrange the data.
5. In the Power Query Editor, click OK to load your combined data. Edit any column formats, apply filters, or sort the data as needed.
6. Once done, go to Home > Close & Load.

Once done, the output will look like below. To ensure compatibility with Microsoft-Analyzer-Suite, save the file as a .csv.

Using Microsoft-Analyzer-Suite for Log Analysis

With your data collected and organized, it's time to analyze it with Microsoft-Analyzer-Suite.

UAL-Analyzer.ps1

Before using the UAL-Analyzer.ps1 script, there are a few dependencies you have to make sure are installed:

- An IPinfo account (free): https://ipinfo.io/signup?ref=cli
- ImportExcel for Excel file handling (PowerShell module): Install-Module -Name ImportExcel (https://github.com/dfinke/ImportExcel)
- IPinfo CLI (standalone binary): https://github.com/ipinfo/cli
- xsv (standalone binary): https://github.com/BurntSushi/xsv

To install xsv: as I had WSL, I used git clone https://github.com/BurntSushi/xsv.git. You can also download the folder, whichever you are comfortable with.

Once dependencies are set up, configure your IPinfo token by pasting it into the UAL-Analyzer script. To locate this in the script: open UAL-Analyzer.ps1 with a text editor like Notepad++, search for the token variable, and paste your token there.
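Before moving on: if you prefer to skip Excel for the CSV-combining step above, here is a minimal PowerShell sketch that merges the exported CSV files into a single .csv for the analyzer scripts. The folder and file paths are hypothetical placeholders; adjust them to wherever the Extractor-Suite wrote (or you converted) its output.

```powershell
# Hypothetical paths - point these at your own export folder
$inputFolder = 'C:\Extractor-Output\UnifiedAuditLog'
$combined    = 'C:\Extractor-Output\CombinedUALLog.csv'

# Import every CSV in the folder and write one merged CSV for the Analyzer scripts
Get-ChildItem -Path $inputFolder -Filter *.csv |
    ForEach-Object { Import-Csv -Path $_.FullName } |
    Export-Csv -Path $combined -NoTypeInformation
```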
    -------------------------------------------------------------------------------------------------------------

In the latest Microsoft-Analyzer-Suite there is a separate script called config.ps1; add your token there instead. If you are using an older Analyzer-Suite, the process is the same as described above; newer releases moved this setting into the scripts.

-------------------------------------------------------------------------------------------------------------

Running the Analysis Script

For Unified Audit Logs, use the UAL-Analyzer script. For example:

.\UAL-Analyzer.ps1 "C:\Path\To\Your\CombinedUALLog.csv" -output "C:\Path\To\Output\"

Once the script has run successfully and the output has been collected, you will get a pop-up.

------------------------------------------------------------------------------------------------------------

Let's check the output. As per the screenshot, the output is produced in both CSV and XLSX format. Why the same output twice? Because the XLSX is colour-coded: if something suspicious is found, it is highlighted automatically, whereas the CSV has no highlighting.

Example of XLSX:
Example of CSV:
Folder Suspicious Operation:

A kind note: the scripts are still being updated and modified, so if you open GitHub you might find a newer version that works even better. This is the output for the version I used; it made things easy for me, and I hope it does for you as well.

------------------------------------------------------------------------------------------------------------

The second log we are going to talk about is the Message Trace log.

Command (this will collect all logs):

Get-MessageTraceLog

Screenshot of output:

The next step is to combine all the Excel output into one file (.csv format). Once done, run the MTL-Analyzer script:

.\MTL-Analyzer.ps1 "C:\Path\To\Your\CombinedMTLLog.csv" -output "C:\Path\To\Output\"

(Make sure you add the token details inside the script before running it.)

Conclusion

By combining Microsoft-Extractor-Suite and Microsoft-Analyzer-Suite, you can effectively streamline log collection and analysis across Microsoft 365 and Azure environments. While each suite has its own focus, together they provide an invaluable resource for incident response and cybersecurity. Now that you have the steps, you can test and run the process on your own logs. I hope this guide makes things easier for you! See you, and take care!

Akash Patel

  • Google Workspace Email Collection: Data Extraction, eDiscovery, and Audit Logging

    Google Workspace is an integral part of many organizations, providing essential tools for communication and collaboration. However, when it comes to forensic investigations, compliance, and eDiscovery, knowing how to extract and analyze data from Google Workspace is crucial.

----------------------------------------------------------------------------------------------------------

Data Extraction in Google Workspace

There are three primary ways to extract data from Google Workspace:

Admin Console Data Export
- Available in all paid Google Workspace tiers.
- Exports data for all users or specific accounts.
- Covers a wide range of data, including Gmail, Drive, Calendar, Contacts, Chat, Tasks, Voice data, and even Vault-retained items.
- Data is first archived in cloud storage, from where it can be selectively downloaded.
- Similar to Google Takeout, but allows administrators to manage multiple user exports efficiently.

Google Vault (For eDiscovery and Compliance)
- Included in Business and Enterprise editions or available as an add-on.
- A powerful tool for data retention, searching, and exporting beyond standard exports.
- The only method to access Gmail's "Confidential Mode" messages.
- Supports retention policies, litigation holds, and compliance-related data archiving.
- Can search across Gmail, Drive, Shared Drives, Google Groups, Chat messages, Meet recordings, and Google Voice data.
- Provides search and filtering based on keywords, dates, and user accounts.

Gmail API (For Custom Data Collection)
- Allows programmatic access to Gmail data.
- Used by third-party email collection tools or for building custom forensic scripts.
- Grants access to Gmail History Records, which track message additions, deletions, and label changes (a minimal request sketch appears at the end of this article).
- Useful for tracking actions like message deletion, marking emails as spam, or email forwarding.

----------------------------------------------------------------------------------------------------------

Google Vault: A Deep Dive into eDiscovery

Google Vault is a must-use tool for organizations needing compliance and legal hold capabilities. It goes beyond basic exports, offering:

- Advanced search and filtering: using search operators similar to Gmail.
- Comprehensive export options: supports PST and MBOX formats, with additional metadata in XML and CSV formats.
- Confidential Mode access: unlike the Gmail API, Vault retains the full content of confidential messages.
- Draft message versioning: every version of a draft is saved and available in Vault for 30 days, even if deleted by the user.
- Retention and hold policies: enforceable for different data types to ensure compliance with organizational policies.

Critical pro-tip: if a user account is deleted, all associated data is permanently removed from Vault. Instead, suspend user accounts to retain data while restricting access.

----------------------------------------------------------------------------------------------------------

Audit Logging and Investigations

One of the most powerful aspects of Google Workspace is its audit logs, which help track user activity and identify security incidents.
Google provides different types of logs, including:

- Admin Log: actions taken by Google Workspace administrators (account, event description, date, IP address). Retention: 6 months.
- User Log: all login activity, including webmail and admin console (account, log-in type, date, IP address). Retention: 6 months.
- Email Log Search: search of emails sent and received by the organization (email headers only, no content searches). Retention: 30 days.
- OAuth Log: authorizations by email clients and mobile devices (user, application name, scope, IP address, date). Retention: 6 months.
- User Reports App Usage: consolidated view of user status and account activity (usage of Gmail, Drive, storage, and external apps). Retention: 6 months.

Log retention periods: most logs are retained for six months, except for Email Log Search, which is available for 30 days. Organizations using Google Workspace Enterprise can store logs indefinitely in Google BigQuery or export them to a SIEM for extended retention.

----------------------------------------------------------------------------------------------------------

Leveraging Open-Source Tools for Google Workspace Investigations

ALFA on GitHub: invictus-ir/ALFA. I will try to create an article on this tool in the near future (stay tuned).

----------------------------------------------------------------------------------------------------------

Email Header and Metadata Investigations

Google Workspace allows email header searches for messages from the last 30 days. Investigators can extract metadata such as:

- Sender and recipient email addresses.
- Subject lines and timestamps.
- Message ID and client IP address.
- Mail delivery tracking (e.g., failures, spam filtering).
- Matched Rules that flag emails for objectionable content, PII, or compliance violations.

Key limitation: email headers do not contain email message content (only metadata). For full content analysis, investigators must rely on Google Vault or exports.

----------------------------------------------------------------------------------------------------------

Final Thoughts

Google Workspace provides robust tools for forensic investigations, data compliance, and eDiscovery. By leveraging Admin Console exports, Google Vault, the Gmail API, and audit logs, organizations can effectively extract, search, and preserve critical data.

To ensure thorough investigations:

- Use Google Vault for advanced eDiscovery.
- Leverage audit logs for security analysis.
- Export logs to BigQuery or a SIEM for extended analysis.
- Suspend accounts instead of deleting them to retain forensic evidence.

Understanding these mechanisms ensures that organizations can respond effectively to incidents while maintaining compliance with legal and regulatory requirements.

--------------------------------------------Dean--------------------------------------
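As referenced in the Gmail API section above, here is a minimal sketch of pulling Gmail History Records with a plain REST call. It assumes you have already obtained an OAuth 2.0 access token with a Gmail read scope and a starting historyId; both placeholder values below are hypothetical.

```powershell
# Placeholders - supply a real OAuth 2.0 access token and a known starting historyId
$token   = '<access-token>'
$startId = '1234567'
$uri     = "https://gmail.googleapis.com/gmail/v1/users/me/history?startHistoryId=$startId"

# users.history.list returns message additions, deletions and label changes since $startId
$resp = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }
$resp.history
```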

  • Uncovering Deleted Items and File Existence in Digital Forensics.

    When investigating digital forensics cases, confirming which files were deleted or previously existed is crucial. Whether tracking user activity or validating forensic evidence, understanding where and how to find artifacts plays a key role in uncovering the truth. Many articles on my website discuss different deleted-item and file-existence artifacts. However, putting them all together in a structured way helps streamline forensic investigations. This article serves as a reference guide, consolidating the various forensic artifacts that indicate deleted items and file existence, along with their advantages, disadvantages, and relevant analysis techniques.

----------------------------------------------------------------------------------------------------------

Thumbnail Cache (Thumbs.db / Thumbcache)

Artifact: Thumbs.db (Windows XP) and Thumbcache (Windows Vista and later)
Forensic importance: stores thumbnail previews of images and documents, even after deletion.
Article: Understanding and Managing Thumbnail Cache in Windows
Tools: thumbcache_viewer_64

----------------------------------------------------------------------------------------------------------

Recycle Bin

Forensic importance: stores deleted files before permanent removal.
Articles: Windows Recycle Bin Forensics: Recovering Deleted Files; Analyzing Recycle Bin Metadata with RBCmd and $I_Parse

----------------------------------------------------------------------------------------------------------

User Typed Paths

Registry path: HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\TypedPaths
Forensic importance: tracks file paths typed in the Windows Explorer address bar.
Article: Windows Registry Artifacts: Insights into User Activity (Typed Path)

----------------------------------------------------------------------------------------------------------

Windows Search Database

Artifact: Windows.edb
Forensic importance: stores indexed metadata of files searched on the system.
Articles: Unlocking Windows Search Indexing for Forensics: A Deep Dive; A Deep Dive into Windows Search Database Parsing (WinSearchDBAnalyzer / SQLite / SIDR)

----------------------------------------------------------------------------------------------------------

Search WordWheelQuery

Registry hive: NTUSER.DAT
Registry key: NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\WordWheelQuery
Forensic importance: stores user-searched keywords from the Start menu.
Analysis tool: Registry Explorer (a quick PowerShell sketch for live systems follows at the end of this article)

----------------------------------------------------------------------------------------------------------

Conclusion

Analyzing deleted files and file-existence artifacts plays a vital role in forensic investigations. By leveraging Windows registry artifacts, cache files, and search history, investigators can reconstruct user activity, track deleted files, and build a strong case with digital evidence. A structured approach to investigating these artifacts ensures efficiency and thoroughness in forensic analysis.

-------------------------------------------------Dean------------------------------------------------------
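As mentioned under the TypedPaths and WordWheelQuery entries above, here is a minimal PowerShell sketch for a live system (the current user's loaded hive). For offline NTUSER.DAT hives, stick with Registry Explorer or mount the hive first; WordWheelQuery values are binary UTF-16 strings, which is why they are decoded below.

```powershell
# Typed paths from the Explorer address bar (live system, current user)
Get-ItemProperty 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\TypedPaths'

# Start menu / Explorer search terms (WordWheelQuery) - values are UTF-16 byte arrays
$key = Get-Item 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\WordWheelQuery'
foreach ($name in ($key.GetValueNames() | Where-Object { $_ -ne 'MRUListEx' })) {
    $bytes = $key.GetValue($name)
    [Text.Encoding]::Unicode.GetString($bytes).TrimEnd([char]0)
}
```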

  • A Deep Dive into Windows Search Database Parsing (WinSearchDBAnalyzer / SQLite / SIDR)

    WinSearchDBAnalyzer

Introduction

One of the most powerful tools for parsing the Windows Search Database is WinSearchDBAnalyzer. This tool effectively makes the contents of the Windows search index available for forensic investigation. It currently supports parsing the ESE (Extensible Storage Engine) database format up to Windows 10 (Windows.edb files) and can analyze already exported files or extract the database from a live system.

Key Features

- Provides control over which metadata fields to extract.
- Can parse both live and exported Windows.edb files.
- Can parse corrupt ESE databases, but repairing them beforehand ensures better results.
- Organizes indexed items by file extension, allowing easy filtering of specific file types (.docx, .zip, etc.). Investigators can explore indexed content structured as folders.
- Metadata columns can be sorted for better analysis, and a preview pane displays file details upon selection.
- The Find search bar enables keyword searches across multiple metadata fields, including filenames, folder names, and indexed file content (the Search_AutoSummary field).
- Carves deleted records from unallocated database space, which may contain traces of previously deleted files.

Practical Usage Example

In an investigation, a search for "secret" was filtered across all files and metadata. A file named "akash.pdf" was identified, and its extracted metadata was visible in the preview pane. The metadata included file system timestamps and indexed content, revealing that the term "secret" appeared within the indexed content (Search_AutoSummary field).

Useful Search Queries (Practical Search Tips)

- To view visited URLs: search for "http://" or "https://"
- To find internet queries: search for "q=" or "query="
- To locate records by date: search for "2021-11-"
- To view all records: select "ALL"
- To recover deleted records: check "Unknown"

Note: Parsing a full Windows Search Database can be slow due to the large volume of indexed data.

Esedatabaseview

----------------------------------------------------------------------------------------------------------

SQLite-Based Windows Search Index Parsing

While WinSearchDBAnalyzer automates parsing, manual review is also possible using a SQLite database viewer.

Key SQLite Databases in Windows 11

1. Windows-gather.db
Path: C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Windows-gather.db
This database provides a high-level overview of indexed files and folders, allowing full path reconstruction.

Important fields in the SystemIndex_Gthr table:
- ScopeID: foreign key linked to Scope in the SystemIndex_GthrPth table (to find the parent folder).
- DocumentID: unique identifier for a file, linked to WorkID in Windows.db.
- FileName: name of the indexed item.
- LastModified: last modified time (Windows FILETIME format).

Important fields in the SystemIndex_GthrPth table:
- Scope: foreign key linked to ScopeID.
- Parent: identifier for the parent folder.
- Name: folder name.

2. Windows.db
Path: C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Windows.db
Stores a massive collection of metadata for indexed files.

Important fields in the SystemIndex_1_PropertyStore table:
- WorkId: unique identifier for an indexed item.
- ColumnId: identifies the metadata property type (linked to SystemIndex_1_PropertyStore_Metadata).
- Value: metadata information.

Important fields in the SystemIndex_1_PropertyStore_Metadata table:
- Id: property ID (linked to ColumnId in the PropertyStore table).
- UniqueKey: describes the metadata property type.
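Before the Windows.db example below, here is a minimal sketch of joining the Windows-gather.db tables described above to rebuild file names, parent folders, and modification times. It assumes the community PSSQLite module (Install-Module PSSQLite) and an exported copy of the database at a hypothetical path; the table and column names come straight from the schema listed above.

```powershell
Import-Module PSSQLite
$db  = 'E:\Export\Windows-gather.db'     # hypothetical path to an exported copy
$sql = @"
SELECT g.DocumentID, g.FileName, g.LastModified, p.Name AS ParentFolder
FROM SystemIndex_Gthr AS g
JOIN SystemIndex_GthrPth AS p ON g.ScopeID = p.Scope
"@

Invoke-SqliteQuery -DataSource $db -Query $sql | ForEach-Object {
    # LastModified is a Windows FILETIME; depending on how it is stored it may come
    # back as raw bytes, so convert to Int64 before turning it into a UTC timestamp
    $ft = if ($_.LastModified -is [byte[]]) { [BitConverter]::ToInt64($_.LastModified, 0) }
          else { [long]$_.LastModified }
    [PSCustomObject]@{
        File     = $_.FileName
        Folder   = $_.ParentFolder
        Modified = [DateTime]::FromFileTimeUtc($ft)
    }
}
```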
Example of Parsing Windows.db ---------------------------------------------------------------------------------------------------------- WinEDB and Windows 11 Search Index Changes Their WinEDB project  provides useful SQL queries  for analyzing the new database schema. These queries can be executed in DB Browser for SQLite  to extract human-readable data. 📌 Reference:   GitHub – kacos2000 WinEDB Search Index Tool --------------------------------------------------------------------------------------------------------- Search Index DB Reporter (SIDR) SIDR is an incredibly effective tool for parsing the Windows Search Index Database. It can parse both the original Extensible Storage Engine (ESE) database found in Windows 10 and the newer SQLite database format introduced in Windows 11. Command: E:\Windows Forensic Tools\window.edb.db analysis>sidr.exe -f csv -o "C:\Users\Akash's\Downloads" "C:\ProgramData\Microsoft\Search\Data\Applications\Windows" SIDR is designed to export a curated view of well-known and important items from the database. The output from SIDR is divided into three reports: File Report Activity History Report Internet History Report While the executable exports pre-defined metadata types, advanced users can modify the report output by editing the YAML file within the source code (compilation of the source code would be required). File Report The File Report provides a curated list of metadata for indexed files. It includes: Full file name and path Created, modified, and accessed timestamps File size and owner Indexed content (System_Search_AutoSummary) System_Search_GatherTime (timestamp when the file information was recorded in the database) The System_Search_GatherTime  is particularly useful because it offers another indicator of when a file was present on the system , independent of file creation time. However, it is important to note that SIDR extracts only a subset of the available metadata. Out of potentially hundreds of metadata types, only about ten are included in this report. If an investigation requires more detailed metadata, tools like WinSearchDBAnalyzer (for Windows 10)  or manual SQLite database parsing (for Windows 11) can supplement the extracted data. Internet History Report The Internet History Report extracts browser-related items stored in the Windows Search Index. Currently, this report includes data from Internet Explorer  and Microsoft Edge . It records: Visited URL Page title Date visited System_Search_GatherTime (when the information was recorded in the database) This report can be particularly useful if browser history is missing from primary browser databases. However, it is important to note that InPrivate browsing sessions are not stored  in the Search Index. Activity History Report The Activity History Report provides insights into user activity recorded in the Windows Search Index. It aggregates multiple System_Activity  metadata types, revealing: Files opened by a user Applications used to open those files Start and end times of the activity (providing duration information) A key forensic insight is the Windows Search Index does not delete Activity History information when the original file is removed . This means that even if a suspect deletes or renames a file, uninstalls an application, or attempts other cleanup actions, relevant forensic data may persist in the Search Index. 
------------------------------------------------------------------------------------------------------------- Conclusion The SIDR  tool is designed by investigators, for investigators. It efficiently extracts key forensic data from the Windows Search Index without overwhelming analysts with unnecessary information. Advanced users can further customize the tool by modifying the source code and integrating it with additional analysis tools like Velociraptor  or WinSearchDBAnalyzer . -------------------------------------------------Dean---------------------------------------------

  • Unlocking Windows Search Indexing for Forensics: A Deep Dive

    Windows search indexing has been an integral part of the operating system since Windows 2000, continuously evolving to improve search efficiency. By default, it is enabled from Windows XP onward, silently cataloging an extensive amount of user data, including file names, metadata, and even partial content of certain file types. While this feature enhances the user experience, it also creates a valuable forensic artifact: the Windows Search Database (Windows.edb).

-------------------------------------------------------------------------------------------------------------

Understanding Windows Search Indexing

Windows search relies on the Extensible Storage Engine (ESE) database to store indexed content. This database, known as Windows.edb, contains references to thousands of files, emails, and other indexed data, providing a powerful resource for forensic investigations.

Where to Find Windows.edb

Windows 11: Microsoft continues to surprise us with its choice of database formats. Windows 11 version 22H2 introduced a new database format, SQLite. Two SQLite databases hold the most interesting data:
C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Windows.db
C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Windows-gather.db

Windows Vista - Windows 10: C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Windows.edb

Windows XP: C:\Documents and Settings\All Users\Application Data\Microsoft\Search\Data\Applications\Windows\Windows.edb

----------------------------------------------------------------------------------------------

Why is Windows.edb/.db Important for Forensics?

Windows.edb provides a wealth of information that can help reconstruct user activities. It stores:

- File and folder names with metadata
- Indexed email data (subject, sender, and message body in Outlook)
- Indexed OneNote, SharePoint, and OneDrive content
- Search query history
- References to deleted files

With indexing covering over 900 file types by default, analysts can extract data from an extensive variety of file formats.

----------------------------------------------------------------------------------------------

Tools for Parsing Windows.edb

Historically, analyzing Windows.edb was challenging, requiring manual examination with generic ESE database parsers. Fortunately, several open-source tools now provide efficient ways to extract and analyze its contents:

Windows.edb analysis tools:
- WinSearchDBAnalyzer
- Search Index DB Reporter (SIDR)
- NirSoft ESE Database View

Windows.db analysis tools:
- DB Browser for SQLite

----------------------------------------------------------------------------------------------

Challenges with Windows.edb Analysis

Windows.edb/.db operates differently from traditional database formats. Originally designed for high-performance applications like Microsoft Exchange, ESE databases use a write-ahead model, meaning:

- Data is first written into log files before being committed to the main database.
- Uncommitted data can persist in memory for hours or even days.
- Extracting Windows.edb in a forensic investigation may result in a "dirty" database, one that hasn't consolidated its latest log entries.

----------------------------------------------------------------------------------------------

Handling a Dirty Windows.edb Database

When dealing with a dirty (incomplete) Windows.edb database, forensic analysts have two options:

Work with the database as-is – This means some uncommitted data will be missing, and certain tools may not be able to parse it properly.
Recover the database using esentutl  – This built-in Windows utility allows database integrity checking, defragmentation, and recovery. Using esentutl for Recovery and Repair Windows includes esentutl.exe , a command-line tool to manage ESE databases. Key commands include: Check database status: If the database is dirty , it needs to be repaired. esentutl /mh Windows.edb Attempt database recovery: This applies uncommitted log data to the database. esentutl /r edb /d Repair a corrupted database (as a last resort): ⚠️ This will discard uncommitted data, so use it only if necessary. esentutl /p Windows.edb Best Practices for ESE Database Analysis Always work on a copy  of Windows.edb/.db and its associated log files. Use the same Windows version  for recovery as the system where the database was acquired. If possible, extract log files  along with Windows.edb to ensure data consistency . Avoid modifying original evidence whenever possible —use forensic tools that support read-only parsing Conclusion Windows Search Indexing is a double-edged sword: while it enhances user experience, it also creates an invaluable forensic artifact  in Windows.edb. By leveraging modern parsing tools and understanding ESE database behavior, forensic investigators can extract critical evidence, reconstruct user activities, and even recover deleted data. By mastering Windows.edb analysis, forensic professionals gain access to one of the most overlooked but powerful artifacts in Windows forensics . Whether you're performing an investigation or simply exploring how Windows manages search indexing, understanding this artifact can provide deep insights into user activity on a system. -------------------------------------------Dean--------------------------------------------------

  • Windows Event Logs for USB Activity

    Windows Event Logs are an excellent resource for investigating USB-related activities. These logs provide insights into when devices are connected or disconnected, driver installations, user actions, and more. Let's break this down in simple terms.

-----------------------------------------------------------------------------------------------------

Key Logs to Monitor for USB Activity

System Log (Plug and Play Events)
- When a new USB or Plug and Play device is connected, Windows installs a driver, logging Event ID 20001 (start of installation) and 20003 (completion of installation).
- These events include details like: timestamp (when the installation occurred), device information (Vendor ID, Product ID, iSerialNumber), and installation status (e.g., 0x0 means no errors).
- Limitation: modern Windows versions (10/11) often log only Event ID 20003 by default.
- Example use: correlate timestamps with user logins to identify who connected the device.

Security Log (Audit Removable Storage)
- Event ID 4663 is logged when files or folders on a removable device are accessed, created, or modified.
- Tracks: the user account performing the action, the action type (e.g., file creation, deletion, or read), and the object name (the specific file or folder).
- Challenge: the log does not directly tie file operations to a specific device. Investigators must cross-reference with other logs or artifacts.

Security Log (Audit Plug and Play Activity)
- Event ID 6416 records every time a Plug and Play device is added.
- Provides detailed device information (VID, PID, iSerialNumber, volume name).
- Benefit: unlike System log events, these events are recorded each time a device is connected.
- How to enable: configure the "Audit PNP Activity" option in Advanced Audit Policy Configuration.

Microsoft-Windows-Partition/Diagnostic Log
- Tracks detailed removable device activity, including when a device is connected or disconnected.
- Often used alongside Event IDs 6416 and 4663 for a complete timeline.

-----------------------------------------------------------------------------------------------------

Additional Logs for Device Activity

Microsoft-Windows-DriverFrameworks-UserMode/Operational Log
- Available by default in Windows 7, but must be enabled in later versions.
- Logs connection and removal of devices, allowing you to determine how long a device was connected.

MBAM/Operational Log (Microsoft BitLocker Administration and Monitoring)
- Tracks the mounting and dismounting of removable devices.
- Includes the volume GUID, which can help correlate device activity with registry data.

-----------------------------------------------------------------------------------------------------

Setting Up Auditing for USB Devices

To make the most of these logs, you need to configure Windows to track the necessary events (a command-line sketch follows at the end of this article):

- Enable removable storage auditing: go to Advanced Audit Policy Configuration > Object Access > Audit Removable Storage and enable both Success and Failure auditing.
- Enable Plug and Play activity auditing: under Advanced Audit Policy Configuration > Detailed Tracking, enable Audit PNP Activity.

-----------------------------------------------------------------------------------------------------

Key Takeaways

- Use System logs for identifying the first-time connection of devices.
- Rely on Security logs for tracking file and folder operations.
- Combine Event IDs 4663, 6416, and 20003 to get a complete picture of device activity.
- Cross-reference logs with the registry or other artifacts like Prefetch data to match devices with user actions.
- Enable auditing policies to ensure detailed logs are captured.

By strategically leveraging these logs, investigators can gain valuable insights into USB usage, even in environments with limited historical data retention.

--------------------------------------------------Dean--------------------------------------------------
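As referenced in the auditing-setup section above, here is a minimal sketch for enabling the same policies from an elevated prompt and for pulling the relevant events. The auditpol subcategory names are as they appear on an English-language system; adjust as needed.

```powershell
# Enable the audit policies described above (run as Administrator)
auditpol /set /subcategory:"Removable Storage" /success:enable /failure:enable
auditpol /set /subcategory:"Plug and Play Events" /success:enable

# Driver-installation events for new devices (System log)
Get-WinEvent -FilterHashtable @{ LogName = 'System'; Id = 20001, 20003 } |
    Select-Object TimeCreated, Id, Message

# Removable-storage file access (4663) and device connections (6416) from the Security log
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4663, 6416 } |
    Select-Object TimeCreated, Id, Message
```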

  • USB Device Identifiers and Forensic Insights: iSerialNumber, SCSI Serial Numbers, UASP Devices, and Cleanup in Windows

    USB devices often come with a unique identifier called the iSerialNumber.

Why the iSerialNumber Matters

The iSerialNumber is a hardware-based unique identifier. If you plug the same USB device into multiple computers, each system should log the same iSerialNumber. This makes it incredibly useful for tracking where a device has been used, whether for forensic investigations or enterprise-level monitoring.

Exceptions and Windows-Generated Identifiers

Unfortunately, not all USB devices report an iSerialNumber. If the device lacks this value, Windows generates an identifier for it. You can easily recognize these by looking at the second character: Windows-generated IDs will have an ampersand (&). For profiling on a single system, it doesn't matter whether the identifier is hardware-based or Windows-generated, as both will uniquely identify the device on that system. However, tracking the same device across multiple systems can be problematic if it lacks a unique iSerialNumber, since Windows will assign a different identifier on each system.

--------------------------------------------------------------------------------------------------------

Challenges with Poorly Designed Devices

Low-quality USB devices or adapters can cause confusion. They might report inconsistent identifiers, even on the same system, which can make a single device appear as multiple devices. When this happens, you'll need to rely on other identifiers like the Vendor ID (VID), Product ID (PID), volume names, or the Volume Serial Number to clarify things.

Extracting the iSerialNumber

If you need to retrieve the iSerialNumber from a physical device:

- Using hardware tools: a USB write blocker or similar device is the safest way to extract the iSerialNumber.
- Using software tools: you can also use tools like Microsoft's USBView (part of the Windows Software Development Kit).
- Physical inspection: sometimes USB devices have identifying information engraved on their casing. However, be cautious; this number doesn't always match the actual iSerialNumber stored in the hardware.

-------------------------------------------------------------------------------------------------------------

The SCSI Serial Number: An Alternate Identifier

In addition to the iSerialNumber, USB devices often have another serial number called the SCSI Serial Number. Here's how the two differ:

- The iSerialNumber is used by the USB subsystem and is typically stored in the device descriptor.
- The SCSI Serial Number comes from the device's storage subsystem.

These numbers may not match, and forensic tools can sometimes show one but not the other. This can create challenges when trying to correlate data between system logs and the Windows Registry.

How to Identify Both Serial Numbers

Starting with Windows 10, Microsoft's Partition/Diagnostic event log provides detailed information about connected devices, including both the iSerialNumber and the SCSI Serial Number. Here's how you can access them:

Run this PowerShell command with a USB device plugged in:

Get-WmiObject win32_diskdrive | select-object model, serialnumber, pnpdeviceid, deviceid

Open the Microsoft-Windows-Partition/Diagnostic.evtx log. You'll find:

- The iSerialNumber (under the "ParentId" field)
- The SCSI Serial Number (under "SerialNumber")

You can also cross-reference these with other details like the VID, PID, and device capacity to distinguish devices.
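A minimal sketch for pulling those Partition/Diagnostic records without opening the .evtx manually: on recent Windows 10/11 builds the device-arrival record is typically Event ID 1006 (verify the ID on the system under examination). The ParentId and SerialNumber field names are the ones described above.

```powershell
# Device-arrival records from the Partition/Diagnostic log
Get-WinEvent -FilterHashtable @{ LogName = 'Microsoft-Windows-Partition/Diagnostic'; Id = 1006 } |
    ForEach-Object {
        # Flatten the EventData fields into a name/value table
        $xml  = [xml]$_.ToXml()
        $data = @{}
        $xml.Event.EventData.Data | ForEach-Object { $data[$_.Name] = $_.'#text' }
        [PSCustomObject]@{
            Time         = $_.TimeCreated
            SerialNumber = $data['SerialNumber']   # SCSI serial number
            ParentId     = $data['ParentId']       # contains the USB iSerialNumber
        }
    }
```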
    -------------------------------------------------------------------------------------------------------------

When dealing with USB devices, it's essential to recognize the difference between standard USB devices and USB Attached SCSI (UASP) devices. UASP devices store information under the SYSTEM\CurrentControlSet\Enum\SCSI key, which requires some unique steps to extract useful forensic data.

Profiling UASP Devices: Step-by-Step

1. Identify the device. Look for your device under SYSTEM\CurrentControlSet\Enum\USB. If the Service value references UASPStor and the DeviceDesc mentions UASP, you've found a UASP device. Note the ParentIdPrefix value; it's a key link to finding related data in the SCSI registry key.

2. Correlate data in the SCSI key. Use the ParentIdPrefix value to find the matching entry under SYSTEM\CurrentControlSet\Enum\SCSI. This key will reveal manufacturer details, product information, and additional timestamps for the device. Pay special attention to the DiskID and iSerialNumber. Note: Windows prepends extra characters to the iSerialNumber for UASP devices.

3. Use tools for simplified analysis. Tools like Registry Explorer offer plugins that simplify analysis of the SCSI key, presenting the extracted information in a table format for easier documentation.

-------------------------------------------------------------------------------------------------------------

Handling Windows USB Cleanup Activities

Recent versions of Windows have implemented cleanup mechanisms that can impact USB-related forensic evidence. Here's what you need to know:

Scheduled Cleanup Tasks
- Early versions of Windows 10 (and Windows 8) used the Plug and Play Cleanup task to remove USB-related data for devices not detected in the last 30 days.
- Later versions of Windows 10 removed this specific task but introduced similar cleanup during major updates.
- This cleanup means USB artifacts may only persist until the next major Windows update, especially in keys like USBSTOR, USB, and SCSI, and even in the critical Microsoft-Windows-Partition/Diagnostic log.

Keys and Logs That Survive Cleanup

Some artifacts remain even after cleanup routines, providing critical data for forensic profiling:

- MountedDevices: tracks drive letters and volume information.
- Windows Portable Devices: identifies devices used on the system.
- MountPoints2: logs drive mount points for user-specific activity.
- setupapi.dev.log: records device installation and removal events (though only for a limited time).
- Volume Shadow Copies: store older versions of registry keys and logs, often allowing recovery of deleted artifacts.

-------------------------------------------------------------------------------------------------------------

Recovering Data from Cleanup with DeviceMigration Keys

Windows archives device data during cleanup or updates in the DeviceMigration keys. These keys allow forensic analysts to go back in time and recover information about devices previously connected to the system. Key locations include:

SYSTEM\Control\DeviceMigration
SYSTEM\Setup\Upgrade\PnP\CurrentControlSet\Control\DeviceMigration

What Can You Extract?

While not all original data is retained, these keys store:

- Manufacturer and product information
- VID/PID
- iSerialNumber
- ParentIdPrefix
- DiskID
- LastPresentDate: a 64-bit timestamp showing when the device was last connected.
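A minimal sketch for decoding LastPresentDate on a live system, assuming the value is a 64-bit FILETIME as described above (it may be stored as a QWORD or as raw bytes depending on the build, so both are handled). The exact subkey layout under DeviceMigration varies, which is why the sketch simply walks the key recursively; for offline hives, mount them or use Registry Explorer instead.

```powershell
# Walk the DeviceMigration key and convert any LastPresentDate values to UTC timestamps
$root = 'HKLM:\SYSTEM\Setup\Upgrade\PnP\CurrentControlSet\Control\DeviceMigration'
Get-ChildItem -Path $root -Recurse -ErrorAction SilentlyContinue | ForEach-Object {
    $v = (Get-ItemProperty -Path $_.PSPath -ErrorAction SilentlyContinue).LastPresentDate
    if ($null -ne $v) {
        # Handle either a QWORD or a raw 8-byte value
        $ft = if ($v -is [byte[]]) { [BitConverter]::ToInt64($v, 0) } else { [long]$v }
        [PSCustomObject]@{
            Device          = $_.PSChildName
            LastPresentDate = [DateTime]::FromFileTimeUtc($ft)
        }
    }
}
```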
-------------------------------------------------------------------------------------------------------------
Best Practices for Forensic USB Analysis
Correlate Data Sources
Use the DeviceMigration keys to cross-reference older device data with other longer-lasting keys like MountedDevices or MountPoints2. This helps identify details like drive letters, volume names, and user-specific usage.
Utilize Archived Data
Windows.old Folder – Created during major updates, it contains older versions of registry hives and logs that may still hold critical USB-related data.
Volume Shadow Copies – If enabled, these snapshots allow you to recover older file system and registry data.
Leverage Forensic Logs
Logs like setupapi.dev.log and event logs (other than Microsoft-Windows-Partition/Diagnostic) remain useful even after cleanup, though their retention period is often limited.
-------------------------------------------------------------------------------------------------------------
Conclusion:
By understanding UASP device profiling, cleanup mechanisms, and how to recover deleted artifacts, forensic analysts can still extract valuable information even in challenging scenarios.
-------------------------------------------------------Dean--------------------------------

  • Advanced OneDrive Forensics: Investigating Cloud-Only Files & Synchronization

    Cloud storage has evolved beyond simple local folder synchronization. Newer technologies, like Files On-Demand and Smart Sync, allow users to interact with cloud-stored files without downloading them. This presents new forensic challenges since not all files exist locally, and standard filesystem artifacts may be missing.
We'll cover:
✅ How OneDrive's new sync model affects forensic investigations
✅ Tracking cloud-only files & deleted data
✅ Using OneDrive's forensic artifacts to recover missing evidence
-------------------------------------------------------------------------------------------------------------
1️⃣ Understanding "Hydrated" vs. "Dehydrated" Files in OneDrive
Microsoft OneDrive introduced Files On-Demand in Windows 10 (version 1709), allowing users to view all cloud-stored files without downloading them.
📌 OneDrive File Status Icons:
🌥 Blue Cloud: File is only in the cloud (dehydrated)
✅ Green Check (Hollow): File was opened recently and cached locally
✅ Green Check (Filled): File is fully downloaded and always available locally
💡 Why This Matters:
Some files may have never existed on the local system (dehydrated).
A forensic image may miss cloud-only files unless OneDrive logs or sync databases are analyzed.
-------------------------------------------------------------------------------------------------------------
2️⃣ Where to Find OneDrive Artifacts
Even if files are not stored locally, OneDrive leaves forensic traces in multiple locations:
📍 OneDrive Sync Folder (Locally Stored Files)
%UserProfile%\OneDrive\
💡 Includes only hydrated (downloaded) files. Cloud-only files are missing.
📍 OneDrive Settings & Metadata
%UserProfile%\AppData\Local\Microsoft\OneDrive\settings\Personal\
💡 Contains sync logs, database files, and user metadata.
📍 OneDrive Logs (File Sync History)
%UserProfile%\AppData\Local\Microsoft\OneDrive\logs\
💡 Records uploads, downloads, and file deletions. Stores up to 30 days of logs.
📍 OneDrive Registry Keys (User Account & Sync Details)
NTUSER\Software\Microsoft\OneDrive\Accounts\Personal
💡 Tracks the OneDrive sync folder location and last authentication time.
-------------------------------------------------------------------------------------------------------------
3️⃣ Investigating Cloud-Only Files Using the OneDrive Sync Database
📌 SyncEngineDatabase.db (SQLite) – The Most Important OneDrive Artifact
Since March 2023, Microsoft has migrated OneDrive's file-tracking system to SQLite. The SyncEngineDatabase.db file stores:
✅ Cloud-only file records (even if never downloaded)
✅ File metadata (timestamps, size, folder structure)
✅ Synchronization status (e.g., cloud-only, synced, shared)
✅ quickXorHash values (instead of SHA1) for file integrity
%UserProfile%\AppData\Local\Microsoft\OneDrive\settings\Personal\SyncEngineDatabase.db
Key Tables in SyncEngineDatabase.db
🔹 od_ClientFile_Records (Tracks OneDrive Files)
fileName – Name of the file
resourceID – Unique identifier for each file
lastChange – Last modification time (Unix epoch format)
size – File size
fileStatus – Synchronization status
sharedItem – Indicates if the file was shared
localHashDigest – quickXorHash value for file integrity
📌 File Status Codes:
2 = Available Locally (Downloaded)
5 = Excluded (Ignored by sync)
6 = Not Synced
8 = Available Online Only (Cloud-only)
💡 Forensic Use:
Identifies files that only exist in the cloud (fileStatus = 8); see the query sketch below.
Tracks deleted or moved files by correlating with OneDrive logs.
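To list cloud-only entries directly from a copy of the sync database, a query such as the one below can be used. It is a minimal sketch that assumes the od_ClientFile_Records table and column names documented above (they can vary between OneDrive client versions) and that lastChange holds a Unix epoch value in seconds; always query an exported copy, never the live SyncEngineDatabase.db.

import sqlite3
from datetime import datetime, timezone

DB = r"SyncEngineDatabase.db"  # exported copy of the database

con = sqlite3.connect(DB)
rows = con.execute(
    "SELECT fileName, size, lastChange, fileStatus "
    "FROM od_ClientFile_Records WHERE fileStatus = 8"  # 8 = available online only (cloud-only)
)
for name, size, last_change, status in rows:
    ts = datetime.fromtimestamp(last_change, tz=timezone.utc)  # assumes Unix epoch seconds
    print(f"{ts:%Y-%m-%d %H:%M:%S} UTC  {size:>12}  {name}")
con.close()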
🔹 od_ClientFolder_Records (Tracks OneDrive Folders)
folderName – Name of the folder
resourceID – Unique folder identifier
folderStatus – Sync status (Synced, Not Synced, etc.)
sharedItem – Indicates if the folder was shared
📌 Folder Status Codes:
9 = Synced
10 = Not Synced
11 = Not Linked
-------------------------------------------------------------------------------------------------------------
4️⃣ Investigating Deleted OneDrive Files
When a user deletes a file, it disappears from all synced devices and the cloud. However, OneDrive and Windows keep hidden traces.
💾 Recovering Deleted OneDrive Files
✅ Option 1: Windows Recycle Bin
Locally deleted OneDrive files may still be in: C:\$Recycle.Bin\
✅ Option 2: OneDrive Recycle Bin (Cloud-Based Recovery)
OneDrive Personal: Deleted files stored for 30 days
OneDrive for Business: Deleted files stored for 93 days
URL to check deleted OneDrive files: https://onedrive.live.com/
✅ Option 3: OneDrive Sync Logs & SafeDelete.db
SafeDelete.db (SQLite) stores deleted file records before syncing. Deleted file traces may persist in logs & databases before being purged.
📍 Location: %UserProfile%\AppData\Local\Microsoft\OneDrive\settings\Personal\SafeDelete.db
💡 Forensic Use:
Tracks who deleted a file and when.
Identifies files deleted long ago using SQLite carving techniques.
-------------------------------------------------------------------------------------------------------------
5️⃣ Tracking Shared Files & External Data Sources
OneDrive allows users to sync shared folders from other users, Microsoft Teams, or SharePoint.
📌 Registry Key for Shared Folders (Tenants)
NTUSER\Software\Microsoft\OneDrive\Accounts\Personal\Tenants
💡 Tracks external data sources, including:
✅ Files shared from other OneDrive accounts
✅ SharePoint & Teams folder synchronization
📌 Forensic Use: Investigators must check this key to avoid missing shared folders stored outside the default OneDrive folder (a parsing sketch for this key appears after section 6 below).
-------------------------------------------------------------------------------------------------------------
6️⃣ Locating OneDrive Log Files & Understanding Their Purpose
📍 Log File Location: %UserProfile%\AppData\Local\Microsoft\OneDrive\logs\
OneDrive logs track interactions between the local system and the cloud, recording:
✅ File synchronization events (uploads, downloads, deletions)
✅ File modifications (renames, moves, metadata changes)
✅ Cloud-only file interactions (Files On-Demand downloads, file access timestamps)
📌 Common OneDrive Log File Extensions
.odl – Main log file tracking file activities
.odlsent – Logs of files successfully synced
.odlgz – Compressed logs (older entries)
.aodl – Advanced logging (for internal Microsoft use)
📌 Important Notes:
Log filenames are anonymized (filenames replaced with obfuscated values).
Older OneDrive versions used ObfuscationStringMap.txt to decode filenames, but newer versions encrypt logs with Bcrypt (key stored in general.keystore).
🔍 Forensic Tools to Parse OneDrive Logs:
OneDriveExplorer (by Brian Maloney)
Python scripts by Yogesh Khatri
A big thank you to Brian Maloney for reaching out to me about the issue I had mentioned with the tool not working for me. I must admit, I had forgotten to recheck it. Today, I downloaded the latest version of OneDrive Explorer from GitHub, and it appears to be working perfectly. The tool is now parsing the .odl logs as expected, and OneDrive Explorer is successfully displaying the data.
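Stepping back to the registry side for a moment: the keys from sections 2️⃣ and 5️⃣ can be pulled from an offline hive with a short script. This is a minimal sketch, assuming the third-party python-registry package and an exported NTUSER.DAT hive; value names under the Accounts key differ between OneDrive versions, so it simply dumps whatever values and Tenants subkeys are present.

from Registry import Registry  # python-registry package

HIVE = r"NTUSER.DAT"  # exported user hive, not the live registry

reg = Registry.Registry(HIVE)

# Account details: sync folder location, identifiers, timestamps, ...
accounts = reg.open(r"Software\Microsoft\OneDrive\Accounts\Personal")
for value in accounts.values():
    print(value.name(), "=", value.value())

# Shared folders and external sources are registered as subkeys under Tenants
try:
    tenants = reg.open(r"Software\Microsoft\OneDrive\Accounts\Personal\Tenants")
    for tenant in tenants.subkeys():
        print("Tenant:", tenant.name())
        for value in tenant.values():
            print("   ", value.name(), "=", value.value())
except Registry.RegistryKeyNotFoundException:
    print("No Tenants key present in this hive")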
Parsing ODL logs and getting the output in CSV
------------------------------------------------------------------------------------------------------------
7️⃣ Investigating OneDrive File Activity Using .ODL Logs
📌 OneDrive logs are essential for tracking:
✅ File uploads & downloads (date, time, file size)
✅ File deletions & renames
✅ Cloud-only file access (even if the file never existed locally)
🔹 Recovering Deleted File Activity from .ODL Logs
Even after a file is deleted from OneDrive, remnants remain in .ODL logs.
Look for file delete events (Deleted column in OneDriveExplorer output).
Correlate timestamps with Windows Recycle Bin logs ($Recycle.Bin).
Check the cloud-based OneDrive Recycle Bin (retains files for 30–93 days).
🔍 Cross-reference OneDrive logs with:
Windows Event Logs (track OneDrive file modifications)
Volume Shadow Copies (may store previous versions of OneDrive files)
-------------------------------------------------------------------------------------------------------------
OneDrive's Settings: Important Files
-------------------------------------------------------------------------------------------------------------
OneDrive's Evolving Forensic Challenges
Microsoft OneDrive has transformed digital forensics, requiring investigators to look beyond standard filesystem artifacts. We will explore more about OneDrive in the next article (Investigating OneDrive for Business: Advanced Forensics & Audit Logs), so stay tuned!
See you in the next one.

  • Making Sense of SRUM Data with SRUM_DUMP Tool

    If you're digging into Windows forensic artifacts, SRUM (System Resource Usage Monitor) data is a goldmine. But manually decoding the SRUM database? That's a nightmare. Thankfully, Mark Baggett's free tool, SRUM_DUMP, does all the heavy lifting for us.
-------------------------------------------------------------------------------------------------------------
What is SRUM_DUMP?
SRUM_DUMP processes the SRUDB.dat database and generates an Excel spreadsheet with separate tabs for each table in the database. It also correlates some fields from the Windows Registry, making it easier to identify network connections, system usage, and even user activities.
This tool is a game-changer for forensic analysts. It provides structured Excel templates that can be customized for better data visualization, such as calculating network connection times or applying conditional formatting.
-----------------------------------------------------------------------------------------------------------
How to Use SRUM_DUMP
Let's get straight to the process. After extracting your forensic image or pulling out the SRUDB.dat file and the SOFTWARE registry hive, follow these steps:
Launch SRUM_DUMP and click the Browse button to select the SRUDB.dat file. If you're analyzing a mounted image, you'll likely find it in: E:\Windows\System32\SRU\SRUDB.dat
Choose an output folder where the processed Excel sheet will be saved.
Stick with the default Excel template (unless you have a specific need to change it, which is rare).
Provide the SOFTWARE registry hive to allow SRUM_DUMP to cross-reference network and user names. Since incident response often deals with systems that weren't properly shut down, the registry hive might be in a dirty state. Ideally, use a cleaned-up version for accuracy.
Click OK, and within seconds, you'll have a neatly structured Excel file ready for analysis.
-----------------------------------------------------------------------------------------------------------
Output
-----------------------------------------------------------------------------------------------------------
Understanding SRUM Data
Now, let's break down what kind of forensic insights we can extract from SRUM data.
1. Network Connectivity Usage Table
This table logs when and where a system connected to a network. Here's what you'll see:
Column B – Timestamp of when the connection was recorded.
Column E – Network interface used (e.g., Wi-Fi, Ethernet).
Column F – Network name (SSID of Wi-Fi connections).
Column G – Duration of the connection.
Column H – Start time of the connection.
In some cases, overlapping connections suggest a system went into sleep or hibernate mode between sessions. Investigators can use this data to establish movement patterns or even detect suspicious activities.
-------------------------------------------------------------------------------------------------------------
2. Windows Network Data Usage Table
This table shows:
The application name using the network.
The total bytes sent and received.
The user SID associated with the activity.
-------------------------------------------------------------------------------------------------------------
3. Application Resource Usage Table
Unlike the Network Data Usage table, this one logs all running applications, whether they used the network or not.
It records file paths, execution times, and CPU/memory usage.
It can indicate whether a user was running resource-heavy software or simply had it open in the background.
Foreground/Background bytes read/written can help determine if large amounts of data were copied (e.g., to an external USB device).
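For quick triage of the spreadsheet that SRUM_DUMP produces, the workbook can also be loaded with pandas and summarized, for example totalling bytes per application from the network data usage tab. This is a minimal sketch with assumed names: the file name, the sheet name, and the column headers ("Application", "Bytes Sent", "Bytes Received") are placeholders and must be adjusted to match the template you used.

import pandas as pd

XLSX = r"SRUM_DUMP_OUTPUT.xlsx"   # assumed output file name
SHEET = "Network Data Usage"      # assumed tab name - check your workbook

df = pd.read_excel(XLSX, sheet_name=SHEET)

# Total traffic per application, sorted by volume (column names are placeholders)
totals = (
    df.groupby("Application")[["Bytes Sent", "Bytes Received"]]
      .sum()
      .sort_values("Bytes Sent", ascending=False)
)
print(totals.head(20))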
-------------------------------------------------------------------------------------------------------------
Final Thoughts
SRUM data is an incredibly powerful forensic resource, but making sense of it manually is next to impossible. With SRUM_DUMP, analysts can quickly extract and analyze network activity, application usage, and potential signs of data exfiltration.
Whether you're investigating insider threats, tracking a hacker's movements, or simply auditing system usage, SRUM_DUMP makes life a lot easier. So, if you haven't tried it yet, give it a shot; it might just become one of your go-to forensic tools!
---------------------------------------------------Dean--------------------------------------------------

  • Windows Registry Artifacts: Insights into User Activity

    Updated on 24 Feb 2025
------------------------------------------------------------------------------------------------------
1. Search History:
The "WordWheelQuery" registry key is a valuable artifact found in the Windows registry of Windows 7 to Windows 10 systems. It stores information about keywords searched for from the START menu bar, providing insights into user search behavior and interests.
NTUSER.DAT Hive:
NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\WordWheelQuery
------------------------------------------------------------------------------------------------------
2. Typed Path:
This key will show when you have manually typed a path into the Start menu or into the Explorer bar. This key would be useful in a situation where you are trying to show that the user had specific knowledge of a location.
NTUSER.DAT Hive:
NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\TypedPaths
------------------------------------------------------------------------------------------------------
3. Recent Docs:
To understand this artifact in depth, check out the article below:
RecentDocs: Uncovering User Activity Through Recently Opened Files
https://www.cyberengage.org/post/recentdocs-uncovering-user-activity-through-recently-opened-files
------------------------------------------------------------------------------------------------------
4. Microsoft Office Recent Docs:
To understand this artifact in depth, check out the article below:
Tracking Recently Opened Files in Microsoft Office: A Forensic Guide
https://www.cyberengage.org/post/registry-user-activity-tracking-recently-opened-files-in-microsoft-office-a-forensic-guide
------------------------------------------------------------------------------------------------------
5. Last Visited MRU / Open Save MRU
Have you noticed that when you save or open a file, the drop-down dialog box remembers your previous save or open locations and the files that were opened?
     (i) Open Save MRU
It acts as a repository for a history of files accessed or saved by users, offering a panoramic view of their digital footprint.
NTUSER.DAT Hive: NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\
Through CMD: reg query HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\
     (ii) Last Visited MRU
The Last Visited MRU (Most Recently Used) artifact tracks the specific executable files used by an application to open the files documented in the OpenSaveMRU key. Additionally, each value within this artifact also records the directory location of the last file accessed by that application.
NTUSER.DAT Hive: NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\
Through CMD: reg query HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\ComDlg32\LastVisitedPidlMRU
LastVisitedPidlMRU: tracks the application executable used to open files in OpenSavePidlMRU and the last file path used by that application (program execution).
OpenSavePidlMRU: values under this key show items entered in the Open/Save dialog, organized by extension (file knowledge); the * subkey tracks the most recent files of any extension entered in the Open/Save dialog.
------------------------------------------------------------------------------------------------------
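Most of these NTUSER.DAT keys can be checked offline with a few lines of scripting. The snippet below is a minimal sketch, assuming the third-party python-registry package and an exported NTUSER.DAT hive; it decodes the search terms stored under the WordWheelQuery key from item 1, whose numbered values hold UTF-16LE strings (the MRUListEx value only stores ordering and is skipped).

from Registry import Registry  # python-registry package

HIVE = r"NTUSER.DAT"  # exported hive from the user profile

reg = Registry.Registry(HIVE)
key = reg.open(r"Software\Microsoft\Windows\CurrentVersion\Explorer\WordWheelQuery")

for value in key.values():
    if value.name() == "MRUListEx":   # ordering information only
        continue
    raw = value.value()               # REG_BINARY: UTF-16LE, null-terminated search term
    term = raw.decode("utf-16-le", errors="ignore").rstrip("\x00")
    print(f"{value.name()}: {term}")

The same approach works for keys such as TypedPaths or RunMRU; only the key path and the value decoding change, since those keys store plain strings.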
6. Last Commands Executed:
NTUSER.DAT Hive: NTUSER.DAT\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\RunMRU
Command: reg query HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\RunMRU
-------------------------------------------------------------------------------------------------------
7. Trusted Office Documents
To understand this artifact in depth, check out the article below:
Tracking Trusted Office Documents: A Key to Investigating Macro-Based Malware
https://www.cyberengage.org/post/registry-user-activity-tracking-trusted-office-documents-a-key-to-investigating-macro-based-malwar
------------------------------------------------------------------------------------------------------
8. Installed Applications
To understand this artifact in depth, check out the article below:
Windows Registry: A Forensic Goldmine for Installed Applications
https://www.cyberengage.org/post/windows-registry-a-forensic-goldmine-for-installed-applications
-----------------------------------------------Dean--------------------------------------------------------
To learn more in depth, check out the blog below.
