Back when Oracle was considered cool and cutting edge, most enterprises weren’t thinking about data pipelines, big data, and data lakes. Fortunately, we had the foresight in the early 2000s to build a next-gen columnar database designed to scale ingestion and analytics across hundreds of Linux nodes. With this massively parallel system, we tackled the big data problem at the largest enterprises (Intel, AT&T, Yahoo, Goldman Sachs, etc.), who were struggling with the torrent of data from proxies, firewalls, Windows endpoints, NetFlow, and other high-volume sources. Through our experience running multiple clusters at large enterprises, we realized that building everything as proprietary software was limiting our ability to scale and to leverage new capabilities from the open-source communities.
We learned experientially that having too many components in the backend database, as is the case with Elasticsearch and Hadoop, induces complexity that is incompatible with our vision of an easy-to-use, cloud-scale solution. Then the light bulb went on: we needed a hybrid architecture that pairs a proprietary cloud database, to reduce complexity, with open-source components and proprietary software for running analytics. This gave us the best of both worlds: the power of a scalable backend like Snowflake for ingestion, with proprietary and open-source components layered on top for detection. One thing was becoming clear: security today is, to a large extent, a “big data” challenge.
A true cloud-scale security solution has to be capable of collecting all the data in an enterprise, enabling data-driven analytics, and democratizing access to data and intelligence. Fortunately, big data technology has improved vastly over the past few years, and with Snowflake’s data warehouse-as-a-service we are now able to deliver on the promise of true cloud-scale security analytics.
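To make the split between "scalable backend for ingestion" and "separate components for detection" concrete, here is a minimal sketch of what a detection might look like when it runs as a query against Snowflake, using the standard snowflake-connector-python client. The account, table, and column names (PROXY_LOGS, SRC_IP, BYTES_OUT) and the threshold are hypothetical, chosen only to illustrate the pattern, not taken from our product.

```python
# Minimal sketch of the hybrid pattern: Snowflake handles ingestion and
# storage at scale; detection logic runs as SQL pushed down to it.
# All identifiers below are illustrative assumptions, not real resources.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # assumption: your Snowflake account identifier
    user="analyst",
    password="...",         # in practice, prefer key-pair auth or SSO
    warehouse="SECURITY_WH",
    database="SECURITY",
    schema="RAW_LOGS",
)

# Example detection: hosts pushing unusually large volumes through the
# proxy in the last 24 hours. The 10 GB threshold is illustrative only.
DETECTION_SQL = """
    SELECT src_ip, SUM(bytes_out) AS total_bytes_out
    FROM proxy_logs
    WHERE event_time >= DATEADD(hour, -24, CURRENT_TIMESTAMP())
    GROUP BY src_ip
    HAVING SUM(bytes_out) > 10 * 1024 * 1024 * 1024
    ORDER BY total_bytes_out DESC
"""

try:
    cur = conn.cursor()
    for src_ip, total_bytes in cur.execute(DETECTION_SQL):
        print(f"possible exfiltration: {src_ip} sent {total_bytes} bytes in 24h")
finally:
    conn.close()
```

The design point this sketch captures is that the detection layer holds no data of its own: the warehouse does the heavy scan and aggregation, so detection components stay small and can be swapped or extended, whether proprietary or open source.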