Scaling data security solutions: What you need to know

In this Help Net Security interview, Bruno Kurtic, President and CEO at Bedrock Security, discusses the role of data visibility in enhancing cybersecurity.

He explains that effective data visibility involves discovering, classifying, and contextualizing data, which helps organizations understand and manage data flow and potential threats. Kurtic also addresses common implementation pitfalls and how real-time solutions integrate with existing cybersecurity frameworks.


How does data visibility contribute to an organization’s overall cybersecurity posture?

Data visibility is the foundation of data security, governance, and management. It involves discovering, classifying, and contextualizing data, including its business context, access entitlements, and location. This comprehensive approach allows enterprises to clearly understand the data’s location, users, and importance.

The ability to understand and track data movement across an organization’s systems and networks is critical to overall security posture. For example, by understanding data flow, organizations can identify anomalies and detect potential threats more easily. Visibility into data movement also helps incident response teams contain attacks and minimize the impact of a data breach.
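
As a simple, hypothetical illustration of how visibility into data movement can surface anomalies (the baseline, threshold, and figures below are illustrative assumptions, not a description of any specific product):

```python
from statistics import mean, stdev

# Hypothetical daily outbound-transfer volumes (GB) for one identity,
# built from historical data-movement telemetry.
baseline_gb = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0]
today_gb = 42.0

# Flag today's movement as anomalous if it sits far outside the baseline.
mu, sigma = mean(baseline_gb), stdev(baseline_gb)
z_score = (today_gb - mu) / sigma
if z_score > 3:
    print(f"Anomalous data movement: {today_gb} GB (z-score {z_score:.1f})")
```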

Data visibility also helps organizations gain insight into what data is sensitive and where it resides, and to protect it accordingly, which enables security teams to prioritize security controls and allocate resources appropriately. In addition, data visibility enables organizations to more easily comply with regulations and demonstrate that compliance when required.

What common mistakes should organizations avoid when implementing data visibility and cybersecurity measures?

Given that most organizations have data that will grow, get shared, and change rapidly, it is critical to minimize, as much as possible, the average time to detect potential exposures of sensitive data, known as the mean time to detection (MTTD). MTTD is a KPI that measures how effectively data security posture management reduces the likelihood of an incident. Minimizing MTTD ensures that even though data and its use constantly change, organizations can detect when the security posture of their sensitive data is impacted. There are three common mistakes that organizations need to avoid:

OPEX costs. Organizations often fail to consider the operational overhead and costs involved in using a data security solution, such as a data security posture management (DSPM) solution. Often, using a DSPM forces an organization to make a tradeoff between costs and MTTD. OPEX costs can come in the form of compute costs; the resources and operational overhead needed to deploy, manage, and monitor infrastructure or to train the tool; and the cost of tuning the tool’s accuracy, typically in the form of rules.

Performance. Since a low MTTD requires rapid identification of potential exposure, organizations must evaluate the performance and accuracy of any data visibility solution. If a scan takes a long time, MTTD will be longer. If a scan is costly, it will likely be performed less often, which also lengthens MTTD (the sketch after these three points illustrates the arithmetic).

Scale. Closely tied to performance is scalability. Even a solution that performs well with a high degree of accuracy is useless if it can’t quickly analyze data as it grows and changes. Before choosing a data security solution, organizations must evaluate how much data, which types of data, and which data locations the product can assess.
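
To make the MTTD discussion concrete, here is a back-of-envelope sketch. The timestamps and scan figures are invented for illustration, and it assumes exposures are only found by periodic full scans:

```python
from datetime import datetime

# MTTD: average time from when a sensitive-data exposure begins to when it
# is detected. Timestamps below are made up for illustration.
exposures = [
    (datetime(2024, 5, 1, 9, 0),  datetime(2024, 5, 1, 17, 30)),
    (datetime(2024, 5, 3, 2, 15), datetime(2024, 5, 4, 8, 0)),
    (datetime(2024, 5, 7, 11, 0), datetime(2024, 5, 7, 13, 45)),
]
lags_h = [(found - exposed).total_seconds() / 3600 for exposed, found in exposures]
print(f"MTTD: {sum(lags_h) / len(lags_h):.1f} hours")

# If exposures are only caught by periodic full scans, the worst-case lag is
# roughly one scan interval plus the scan duration, so slow or infrequent
# (because costly) scans directly inflate MTTD.
def worst_case_lag(scan_interval_h: float, scan_duration_h: float) -> float:
    return scan_interval_h + scan_duration_h

print(worst_case_lag(168, 20))  # weekly 20-hour scan: ~188 hours
print(worst_case_lag(24, 1))    # daily 1-hour scan:   ~25 hours
```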

How do real-time data visibility solutions integrate with existing cybersecurity frameworks?

Well-designed data visibility solutions can provide a real-time assessment of risk, a framework for prioritizing action on that risk, and then either perform dynamic enforcement of a policy or provide remediation recommendations. For example, if a real-time data visibility solution is integrated with the MITRE ATT&CK framework, it becomes easier to map the tactics, techniques, and procedures (TTPs) related to data, helping security teams understand the nature of a threat and its possible progression.
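
As a hypothetical sketch of what such an integration might look like, a visibility tool could tag data-related findings with ATT&CK technique IDs; the finding types, field names, and mapping below are assumptions for illustration:

```python
# Illustrative mapping from data-visibility finding types to MITRE ATT&CK
# technique IDs, so downstream tooling can reason about likely progression.
FINDING_TO_ATTACK = {
    "public_bucket_with_pii": ["T1530"],           # Data from Cloud Storage
    "bulk_download_to_unmanaged_host": ["T1048"],  # Exfiltration Over Alternative Protocol
    "mass_copy_to_personal_share": ["T1567"],      # Exfiltration Over Web Service
}

def enrich_finding(finding: dict) -> dict:
    """Attach ATT&CK technique IDs to a data-visibility finding."""
    finding["attack_techniques"] = FINDING_TO_ATTACK.get(finding["type"], [])
    return finding

print(enrich_finding({"type": "public_bucket_with_pii", "asset": "s3://example-bucket"}))
```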

Similarly, the Protect function of the NIST CSF focuses on data security, and real-time visibility supports it by surfacing unusual access patterns or unauthorized data transfers.

Finally, zero trust frameworks typically include data security as a core element, and real-time detection of unauthorized or unusual access helps enforce least-privilege access control.
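
A minimal sketch of the kind of real-time check a zero trust integration might perform, assuming discovery and classification have already produced an entitlement map; all identities and datasets here are hypothetical:

```python
# Hypothetical entitlement map produced by data discovery and classification:
# which identities may read which sensitive datasets.
ENTITLEMENTS = {
    "s3://finance-reports": {"classification": "financial", "allowed": {"svc-billing", "cfo"}},
    "s3://hr-records":      {"classification": "pii",       "allowed": {"svc-hr"}},
}

def evaluate_access(identity: str, dataset: str) -> str:
    """Return 'allow' or 'deny-and-alert' for a real-time access request."""
    entry = ENTITLEMENTS.get(dataset)
    if entry is None:
        return "deny-and-alert"  # unknown or unclassified data store
    return "allow" if identity in entry["allowed"] else "deny-and-alert"

print(evaluate_access("svc-billing", "s3://finance-reports"))  # allow
print(evaluate_access("intern-42", "s3://hr-records"))         # deny-and-alert
```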

For each of these frameworks, knowledge of the data (location, type) and access to it are critical and connect to other aspects of the security lifecycle.

How should companies prioritize their data visibility efforts to align with cybersecurity goals?

The first step for any data visibility effort will always be to identify and classify structured and unstructured data at scale and speed. It’s vitally important to understand data sensitivity, such as personally identifiable information (PII), intellectual property, and financial data. Organizations will also need to prioritize the data that is subject to industry-specific regulations, because non-compliance can result in significant consequences for the business.
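
As a toy illustration of pattern-based classification for a few common PII types; production classification engines combine patterns with context, validation, and machine learning, so these regexes are deliberately simplified assumptions:

```python
import re

# Simplified patterns for a handful of sensitive-data types.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitive-data labels found in a text blob."""
    return {label for label, pattern in PATTERNS.items() if pattern.search(text)}

print(classify("Contact jane@example.com, SSN 123-45-6789"))
# {'email', 'us_ssn'}
```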

And while identifying and protecting sensitive data is always important, organizations must also ensure they understand which data and assets are critical to business operations and evaluate how vulnerable those assets are to threats. Organizations need to understand the threat landscape, particularly whether potential threats or vulnerabilities might target specific data types, and then prioritize the data at increased risk of compromise. Data visibility efforts must also align with internal strategic cybersecurity objectives, which typically also means aligning with common cybersecurity frameworks.

How do organizations measure the ROI of their data visibility investments?

Best practices for measuring the ROI on data visibility investments typically include:

  • Time to value. How fast can a tool be deployed and used effectively?
  • Accuracy. How many false positives or false negatives will the chosen tool produce? How much time will be spent investigating false positives, and what are the consequences of false negatives? How much time is spent tuning rules to maintain accuracy amid constant change in the data?
  • MTTD/MTTR. Mean-time-to-detection/response. Ultimately, a decrease in the MTTD is an excellent indication that the tools are identifying problems more quickly and responding faster, thereby reducing risk and potential damage.
  • OPEX/CAPEX ratio. Apart from CAPEX (capital expenditure, which is typically limited to software or service subscription costs), what are the operating costs (OPEX) of the data visibility solution, including compute, personnel, data storage, professional service support, and so on? A rough illustration of these last two metrics follows this list.
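
The figures below are made up, not vendor benchmarks; the point is only the shape of the calculation:

```python
# Illustrative ROI snapshot for a data visibility investment.
baseline_mttd_hours = 72.0   # before the investment
current_mttd_hours = 8.0     # after the investment
capex_annual = 120_000       # subscription / license costs
opex_annual = 45_000         # compute, personnel time, storage, support

mttd_improvement = 1 - current_mttd_hours / baseline_mttd_hours
opex_capex_ratio = opex_annual / capex_annual

print(f"MTTD reduced by {mttd_improvement:.0%}")    # 89%
print(f"OPEX/CAPEX ratio: {opex_capex_ratio:.2f}")  # 0.38
```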
