Cyber Threat Intelligence at Scale

Overview

Holmes Processing was born out of the need to rapidly process and analyze large volumes of data in the computer security community. At its core, Holmes Processing is a catalyst for extracting useful information and generating meaningful intelligence. Its robust distributed architecture allows the system to scale while providing the flexibility needed to evolve.

Extract

Execute static and dynamic analysis techniques and also query 3rd parties for information.


Store

Centrally manage raw data, extracted information, and generated intelligence.


Analyze

Provide the infrastructure required to perform statistical, machine learning, and other analytic methods.

Pillars

Computer security threats are ever growing and constantly evolving, and threat actors deliberately attempt to thwart security practices. To counter this, the Holmes Processing architecture is built upon three core pillars.

Resilient

Failures should be handled gracefully and should not affect other parts of the system.

Scalable

The system should be able to scale easily, both vertically and horizontally.

Flexible

Components should be interchangeable and new features should be easy to add.

Projects

Holmes Processing comprises multiple projects. Each project can be used individually or combined with the others to create a powerful system that spans the analyst's life cycle.

Gateway

Task Validation and Routing

Totem

Static Analysis and Queries

Totem-Dynamic

Dynamic Analysis

Storage

Database Interaction

Investigation

Results Analysis

Website

Multi-User Website

Toolbox

Helpful Tools