Elysium Analytics employs a multi-tier, distributed streaming data-ingestion pipeline to process log data efficiently and securely. The ingestion pipeline includes collectors that can process data from multiple inputs over both TCP and UDP, as well as via active collection from third-party repositories such as S3 buckets.
Once data is collected, it is parsed and enriched with additional metadata. All log data is highly available across multiple data centers and is backed up to ensure data availability.
The analytics layer runs on top of the indexed data and allows for search, aggregation, and customized analyses of the log data through the popular Kibana interface and Elysium Analytics’ extensions. Users can create searches, visualizations, dashboards, and alerts. The system administrator can also invite multiple users to collaborate as they create the relevant analytics tools that they need to run monitoring and forensics processes.
Data Collection Flow
Beats and MiNiFi are supported for simple integration, letting you leverage existing enterprise collection frameworks and integrate with any third-party source.
Connect your sources by leveraging existing connectors from Logstash and Apache NiFi, together with our direct Logstash-to-Snowflake output plugin.
Parse legacy device data sources in Logstash and modern data sources in JSON and Java.
Enrich data in real-time with Identity, Asset, Geolocation, and Threat Intelligence, as well as data from lookup tables built into the storage platform data pipeline.
Collect, aggregate and analyze logs from any cloud application source. Simple setup. Get the whole picture from all your cloud applications, infrastructure, and devices.
Collect all log data from all your security devices, on-premises and cloud implementations, for a consolidated view of all activity across your security solutions, and retain the data for as long as you need.
Collect all your enterprise network and endpoint device logs for full visibility to all activity across all layers of your network.
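As an illustration of the collection inputs described above (TCP/UDP listeners plus shippers like Beats), here is a minimal Python sketch of receiving one log record over UDP on localhost. This is a toy exchange for illustration only; production collectors such as Beats, MiNiFi, and Logstash add TCP/TLS support, framing, and back-pressure handling.

```python
import socket

# Toy collector: bind a UDP socket on an ephemeral localhost port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 0))
port = sock.getsockname()[1]

# A device "ships" a syslog-style log line to the collector.
sock.sendto(b"<34>fw01 sshd: session opened", ("127.0.0.1", port))

# The collector receives and decodes the record.
data, _addr = sock.recvfrom(65535)
record = data.decode()
sock.close()
```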
Collect, Parse, Enrich, Load, Connect
By integrating all your security and log sources, Elysium Analytics automatically collects all the data you need from any source. Easily parse, map, and group your data in the Elysium Analytics Open Data Model for full context and fast analytics. Context and Threat Intel enrichment add event and non-event contextual information to security event data, transforming raw data into meaningful insights.
Collect your data
With integration to all your security and log sources, Elysium Analytics automatically collects all the data you need from any source: cloud, on-prem, or SIEM solution. Collection leverages Kafka, Logstash, Beats, and NiFi.
Parse your data
Parse, map, and group your data in Elysium Analytics Open Data Model for full context and fast analytics.
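As a sketch of the parsing step, the following Python snippet maps a raw syslog-style line into a few normalized fields. The field names here are hypothetical examples, not the actual Elysium Analytics Open Data Model schema.

```python
import re

# Hypothetical pattern for a syslog-style line: timestamp, host,
# application name, and free-text message.
LINE_RE = re.compile(
    r"(?P<ts>\S+)\s+(?P<host>\S+)\s+(?P<app>\w+):\s+(?P<msg>.*)"
)

def parse_line(raw: str) -> dict:
    """Parse a raw log line into normalized key/value fields."""
    m = LINE_RE.match(raw)
    if m is None:
        # Keep unparsed lines rather than dropping them.
        return {"event.original": raw, "event.parsed": False}
    return {
        "@timestamp": m.group("ts"),
        "host.name": m.group("host"),
        "service.name": m.group("app"),
        "message": m.group("msg"),
        "event.original": raw,
        "event.parsed": True,
    }

event = parse_line("2024-05-01T12:00:00Z fw01 sshd: Accepted password for root")
```

Grouping parsed events under a shared schema like this is what enables fast, source-agnostic analytics downstream.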
Enrich your data
Context enrichment adds event and non-event contextual information to security event data in order to transform raw data into meaningful insights. Users typically enrich with geo data, asset lookup data, and more.
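The enrichment step can be sketched as a pair of lookups against context tables. The tables and field names below are hypothetical, in-memory stand-ins; in the platform these lookups run against tables maintained in the storage pipeline.

```python
# Hypothetical in-memory lookup tables standing in for the
# platform's geo and asset context tables.
GEO_TABLE = {"203.0.113.7": {"country": "DE", "city": "Berlin"}}
ASSET_TABLE = {"10.0.0.5": {"owner": "finance", "criticality": "high"}}

def enrich(event: dict) -> dict:
    """Attach geo and asset context to a security event in place."""
    src = event.get("source.ip")
    dst = event.get("destination.ip")
    if src in GEO_TABLE:
        event["source.geo"] = GEO_TABLE[src]
    if dst in ASSET_TABLE:
        event["destination.asset"] = ASSET_TABLE[dst]
    return event

evt = enrich({"source.ip": "203.0.113.7", "destination.ip": "10.0.0.5"})
```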
Add Threat Intel to your data
Enrich your data with Threat Intel to get a broad view of the threat landscape external to your organization, allowing your security team to detect threats more effectively, measure overall relevant risk exposure, and become more effective at mitigation. We have implemented a RESTful API as well as STIX and TAXII support for simple ingestion into our data lake.
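Once indicators have been ingested, matching events against them is straightforward. The sketch below uses a hypothetical local indicator set and field names; actually pulling indicators from a STIX/TAXII feed or the RESTful API over the network is not shown.

```python
# Hypothetical indicator set, as might be ingested from a STIX/TAXII
# feed or a REST API (network ingestion not shown).
THREAT_INDICATORS = {
    "ipv4": {"198.51.100.23"},
    "sha256": {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},
}

def match_indicators(event: dict) -> list:
    """Return the threat-intel indicator types this event matches."""
    hits = []
    if event.get("source.ip") in THREAT_INDICATORS["ipv4"]:
        hits.append("ipv4")
    if event.get("file.hash.sha256") in THREAT_INDICATORS["sha256"]:
        hits.append("sha256")
    return hits

hits = match_indicators({"source.ip": "198.51.100.23"})
```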
Load your data
Loading your data into the data lake is billed by the second and can be configured for continuous or batch loading. Since you are billed for the compute resources you consume, you can configure frequency and capacity based on your needs.
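To see why the loading mode matters under per-second billing, consider this back-of-the-envelope comparison. The rate and run durations are assumed figures for illustration, not actual pricing.

```python
# Assumed per-second compute price for illustration; not a real quote.
RATE_PER_SECOND = 0.0008

def continuous_cost(hours: float) -> float:
    """Compute cost when the loading warehouse runs the whole time."""
    return hours * 3600 * RATE_PER_SECOND

def batch_cost(hours: float, runs_per_hour: int, seconds_per_run: int) -> float:
    """Compute cost when the warehouse runs only during each batch load."""
    return hours * runs_per_hour * seconds_per_run * RATE_PER_SECOND

day_continuous = continuous_cost(24)
day_batch = batch_cost(24, runs_per_hour=4, seconds_per_run=120)
```

Under these assumptions, four two-minute batch loads per hour consume a small fraction of the compute that an always-on continuous load would, which is why batch loading often reduces cost when real-time delivery is not required.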
Connect to your data
Combine all your on-prem IT logs, enterprise network logs, cloud logs, and network traffic data into one scalable Snowflake data lake, unifying your in-cloud and on-prem data silos.
Data Ingestion Monitoring
Data ingestion brings your data into the Elysium Analytics platform. Ingestion can be batch-based (which often reduces your ingestion compute cost) or streaming (which delivers your data to the platform in real time). Although there is no limit to the volume of data you can ingest, and no license limitations on how much data you can ingest, making sure all your data is collected, shipped, and processed is critical.
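A simple way to monitor for ingestion gaps is to compare per-source collected counts against processed counts. The counters and threshold below are hypothetical; a real monitoring job would gather these figures from the collectors, shippers, and indexing tier.

```python
# Flag sources whose processed/collected ratio falls below a threshold,
# indicating records were collected but never fully processed.
def ingestion_gaps(collected: dict, processed: dict, threshold: float = 0.99) -> list:
    gaps = []
    for source, n_collected in collected.items():
        n_processed = processed.get(source, 0)
        if n_collected and n_processed / n_collected < threshold:
            gaps.append(source)
    return gaps

# Hypothetical counters for two sources over some monitoring window.
gaps = ingestion_gaps(
    {"firewall": 10_000, "dns": 5_000},
    {"firewall": 10_000, "dns": 4_700},
)
```

Here the dns source would be flagged (94% of collected records processed), while firewall passes.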