Before we can identify the best way to respond to the continuously evolving cyber threat landscape, we need to understand the sources and types of data available from which we can assess, and make decisions about, the health and security of our systems.
Data-at-Rest: The Traditional Basis for Monitoring Infrastructure
The most prevalent, and traditional, method used to gain an understanding of what is happening (or has happened) within the compute infrastructure is data-at-rest: inactive data that is stored physically in any digital form. In the context of an organization's network, data-at-rest often takes the form of log files and/or recorded network traffic. These consist of records of events, changes, traffic, or messages exchanged between systems. Typically generated by the software running on the various components of a system, they provide a static view of events that have already occurred, been captured, and been stored. The vast majority of existing monitoring and cyber defense solutions rely on these types of static data as their primary data source: they aggregate, process, and make decisions about the health and security of the compute infrastructure based on analysis of historical data describing what has already occurred over a given period of time.
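As a minimal illustration of this style of analysis, the sketch below aggregates failed-login events from stored log records. The log format, event names, and IP addresses are hypothetical, invented for this example; the point is that every record describes something that already happened:

```python
from collections import Counter

# Hypothetical auth-log lines: "<timestamp> <event> <source-ip>"
LOG_LINES = [
    "2024-01-05T10:00:01Z login_failed 203.0.113.7",
    "2024-01-05T10:00:02Z login_failed 203.0.113.7",
    "2024-01-05T10:00:09Z login_ok     198.51.100.4",
    "2024-01-05T10:00:11Z login_failed 203.0.113.7",
]

def failed_logins_by_ip(lines):
    """Aggregate historical log records: count failed logins per source IP."""
    counts = Counter()
    for line in lines:
        _, event, ip = line.split()
        if event == "login_failed":
            counts[ip] += 1
    return counts

counts = failed_logins_by_ip(LOG_LINES)
print(counts.most_common(1))  # the noisiest source in the stored log
```

Useful for forensics, but by the time this runs, the events it summarizes are already history.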
The Cons of Data-at-Rest
The problem with relying on data-at-rest as the primary data source is that decisions are made on historical data, not live data. This leaves organizations exposed to significant risk: zero-day cyber attacks, real-time fraud, and operational issues cannot be identified quickly enough to make a difference. In the latest Verizon Data Breach Investigations Report (DBIR), Verizon points out that it takes only seconds for an attacker to compromise a system, and just a few hours to exfiltrate data from it. Yet it often takes organizations months to detect the compromise, let alone contain it. Verizon reported that 68% of data breaches took months to discover (yes, you read that right: months!).
As we concluded in our most recent post, "Strategies That Make Data-at-Rest Obsolete," there has to be a better way than relying on data-at-rest as the only source of situational awareness. Could organizations identify and stop attacks on their systems more quickly if they had access to real-time intelligence? Is there a way to access live network intelligence, rather than analyzing events after the fact? The answer is a resounding "Yes!" So, what is the alternative?
The Opportunity to Use Data-in-Motion to Enhance Cyber Security
The best and most innovative way to understand what is happening within the compute infrastructure is to use data-in-motion: live (real-time) information that actively represents the systems and the information moving between devices. Data-in-motion is not made up of static files and network logs; rather, it consists of dynamically processed samples of live traffic traversing the compute infrastructure.
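One common way to make an unbounded, high-rate stream tractable is to sample it as it flows past. The sketch below uses reservoir sampling, a standard streaming technique chosen here purely for illustration (it is not necessarily how any particular product processes traffic), to keep a bounded, uniform sample of a simulated live stream in constant memory:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Keep a uniform random sample of size k from an unbounded stream,
    using O(k) memory -- each item is seen exactly once, as it flies past."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)  # replace an earlier pick with decreasing probability
            if j < k:
                reservoir[j] = item
    return reservoir

# Simulated "live" packet stream; in practice this would be traffic on the wire.
packets = (f"pkt-{n}" for n in range(100_000))
sample = reservoir_sample(packets, k=32, rng=random.Random(1))
print(len(sample))  # 32 -- bounded memory no matter how long the stream runs
```

The key property: the sampler never stores the stream, only a fixed-size window onto it, which is what makes analysis of live traffic feasible at all.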
The Cons of Data-in-Motion
Historically, this type of information has not been easy to generate, capture, or analyze: it is highly complex, largely unstructured, and moves at ever-increasing speeds. These challenges are why cyber security and operational tools traditionally defaulted to more manageable sources, such as network logs and packet captures, as their primary data. While these sources do not provide a real-time, actionable view, they are easily analyzed and offer a historical, point-in-time view that has value when trying to understand what has already happened.
Technology Advances Have Unlocked the Power of Data-in-Motion
Recent technological developments have largely eliminated the challenges associated with generating and analyzing live traffic (a.k.a. data-in-motion). More specifically, three key developments allow organizations to access and utilize data-in-motion effectively:
- New network programming languages and programmable data forwarding planes provide the foundational capabilities necessary to program (physical and virtual) networks and process the most complex types of live network traffic.
- Advances in in-memory computing technologies support the development of high-speed, real-time (physical and virtual) sensors that can ingest unstructured data-in-motion and convert the information into standardized, open-format streaming data records for follow-on analytic processing.
- Advances in data science tools, specifically streaming analytics combined with machine learning and artificial intelligence, have evolved to the point where organizations can quickly develop and deploy applications that natively ingest data-in-motion, combine it with (historical) data-at-rest, and make informed, sub-second decisions and take action in real time.
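The third development, combining live data with historical data to decide in-line, can be sketched in a few lines. Everything below is a toy illustration with invented numbers: a baseline computed from stored (data-at-rest) measurements, against which each live (data-in-motion) sample is scored the moment it arrives:

```python
from collections import deque
import statistics

class StreamingAnomalyDetector:
    """Toy illustration: score live measurements (data-in-motion) against a
    baseline derived from historical data (data-at-rest), per event, in-line."""

    def __init__(self, baseline, window=20, threshold=3.0):
        # Baseline from stored historical data: mean/stdev of normal traffic.
        self.mean = statistics.mean(baseline)
        self.stdev = statistics.stdev(baseline)
        self.recent = deque(maxlen=window)  # rolling view of the live stream
        self.threshold = threshold

    def observe(self, value):
        """Return True if this live sample deviates sharply from history."""
        self.recent.append(value)
        z = abs(value - self.mean) / self.stdev
        return z > self.threshold

historical_bytes_per_sec = [100, 98, 103, 97, 101, 99, 102, 100]  # data-at-rest
detector = StreamingAnomalyDetector(historical_bytes_per_sec)

live_stream = [101, 99, 100, 950, 98]  # data-in-motion
alerts = [v for v in live_stream if detector.observe(v)]
print(alerts)  # [950] -- the spike is flagged as it arrives, not months later
```

A real deployment would use far richer features and models, but the shape is the same: historical data trains the baseline, live data is judged against it with sub-second latency.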
When it comes to making continuous, real-time decisions, information technology is no longer a limiting factor. Solutions are now available that can provide a live view of networks, continuously monitor data flowing across those networks, and allow decisions to be made in real-time, as events are unfolding.
The Best Cyber Security Strategy
Combining data-at-rest with data-in-motion constitutes a new, more complete, and more powerful form of situational awareness. It enables full visibility and supports a broader range of more effective cyber security, fraud detection, and infrastructure management solutions.
The Tools Required for Execution
MantisNet provides scalable, dynamic sensor technologies that capture and deliver high-resolution data-in-motion to give you the intelligence you need, when and where you need it. Combining data-in-motion with data-at-rest provides the most complete, comprehensive, and accurate view of network activity. More importantly, the addition of data science tools that combine data-at-rest and data-in-motion gives you the unique ability to create solutions that continuously learn from, and adapt to, the evolving threat landscape, ensuring the best possible defense and control of your network infrastructure.