cBEYONData’s Dawn Sedgley On the Importance of Data Aggregation

In today’s digital world, data is king. With the power to unlock important insights and inform decisions, data is increasingly critical, and federal agencies are prioritizing it accordingly.

GovCon Wire sat down with Dawn Sedgley, chief operating officer at cBEYONData, during a recent Executive Spotlight interview to learn more about data aggregation and how it could impact federal missions. Read below for the full interview.

GovCon Wire: What is data aggregation and why is it so important right now?

Dawn Sedgley: Data aggregation is the process of gathering, compiling and summarizing data from multiple sources to create a unified, comprehensive dataset. For the federal government, aggregated data enables agencies to make informed, data-driven decisions. By viewing data holistically, agencies can understand broader financial and operational trends, identify gaps and allocate resources more effectively.
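
To make the concept concrete, here is a minimal sketch of aggregation in Python, assuming pandas and two hypothetical CSV extracts from separate agency systems (the file names and columns are illustrative, not drawn from any real system):

```python
import pandas as pd

# Each hypothetical source system exports its own extract.
finance = pd.read_csv("finance_extract.csv")        # program, fiscal_year, obligated_usd
operations = pd.read_csv("operations_extract.csv")  # program, fiscal_year, units_delivered

# Unify the two sources on their shared keys.
unified = finance.merge(operations, on=["program", "fiscal_year"], how="outer")

# Summarize: one row per program, totaled across fiscal years.
summary = unified.groupby("program", as_index=False).agg(
    total_obligated_usd=("obligated_usd", "sum"),
    total_units=("units_delivered", "sum"),
)
print(summary)
```

The unified dataset, rather than either extract alone, is what lets an agency see financial and operational trends side by side.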

Data aggregation also provides the foundation for agencies to leverage advanced analytics, AI and machine learning, all of which are top of mind right now because they serve so many different objectives. For example, connecting the dots between various systems and processes enables accountability, providing the insight and transparency needed to support audits, a perennial priority in government. Data aggregation, along with AI/ML, advances efficiency objectives by automating remedial activities and freeing up human bandwidth. It can also give organizations the ability to develop predictive models that forecast various scenarios, helping agencies take a proactive approach, prepare for contingencies before they arise, mitigate risks and make more effective trade-off decisions.

As the federal government modernizes its business systems, data aggregation becomes easier through APIs, cloud services and advanced data management tools, which simplify the processes of aggregation and distribution. These systems can aggregate data more efficiently than legacy systems, reducing errors and increasing the speed of consolidation. Modernized business systems, particularly those built on cloud or scalable architectures, can also handle larger volumes of data from more diverse sources. That scalability allows for more extensive data aggregation, supporting advanced analytics and insights that drive further value and innovation across the organization.
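
As a rough illustration of that point, the sketch below pulls JSON from two hypothetical API endpoints and consolidates the responses into one provenance-tagged dataset; the URLs, fields and the choice of the requests and pandas libraries are assumptions made for illustration:

```python
import pandas as pd
import requests

# Hypothetical endpoints exposed by two modernized business systems.
ENDPOINTS = {
    "finance": "https://finance.example.gov/api/v1/obligations",
    "logistics": "https://logistics.example.gov/api/v1/shipments",
}

frames = []
for system, url in ENDPOINTS.items():
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    frame = pd.DataFrame(response.json())  # assumes the API returns a list of records
    frame["source_system"] = system        # keep provenance for later auditing
    frames.append(frame)

# One consolidated dataset, tagged by originating system.
consolidated = pd.concat(frames, ignore_index=True)
```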

GCW: What are some of the key opportunities and threats associated with data aggregation?

Sedgley: The examples above show the kinds of opportunities available, and they are vast, spanning efficiency, effectiveness, risk reduction and more. Access to larger aggregated volumes of data is also a double-edged sword for the U.S. national security apparatus, as well as its allies and partners. On one hand, data aggregation, alongside emerging technologies, allows for improved decision-making and more advanced analytics; it can aid auditability and even help improve operations.

However, when data is aggregated and integrated into operations, its classification level may change. Associating data describing one set of information with data describing another (for example, ammunition and location, or research budgets and vendors) can raise the classification level of the aggregate above that of the individual sources, requiring the government to further protect its dissemination and use throughout its lifecycle.
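
One way to picture this dynamic is a simple policy rule in which an aggregate is classified at least as high as its most sensitive source, then escalated when certain field combinations co-occur. The sketch below is purely hypothetical; the levels, field names and escalation rules are invented for illustration and do not reflect any actual classification guidance:

```python
# Hypothetical classification levels, ordered from least to most sensitive.
LEVELS = ["UNCLASSIFIED", "CONTROLLED", "SECRET"]

# Hypothetical policy: these field pairs raise sensitivity when joined.
ESCALATING_PAIRS = {
    frozenset({"ammunition_qty", "location"}): "SECRET",
    frozenset({"research_budget", "vendor"}): "CONTROLLED",
}

def classify_aggregate(source_levels, combined_fields):
    """Classify a dataset built from several sources."""
    # Start from the highest classification among the inputs.
    level = max(source_levels, key=LEVELS.index)
    # Escalate if the aggregate joins a sensitive field combination.
    for pair, floor in ESCALATING_PAIRS.items():
        if pair <= set(combined_fields) and LEVELS.index(floor) > LEVELS.index(level):
            level = floor
    return level

# Two unclassified sources can still yield a classified aggregate.
print(classify_aggregate(["UNCLASSIFIED", "UNCLASSIFIED"],
                         ["ammunition_qty", "location", "fiscal_year"]))  # SECRET
```

The point of the rule is that classification is a property of the combination, not just of the parts.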

The reality is our nation’s enemies and competitors reap the same benefits we do. The same technologies and data strategies increase their own analytic and decision-making capability while allowing them to gain insight into ours. Competitors are taking full advantage of the profound proliferation of our military and intelligence data to understand our intentions and limitations. They achieve this in part because data aggregation is both purposeful and accidental. Modernized systems enrich our data environment, allowing for data linkages and data federation in near real time.

GCW: Can you share some use cases for data aggregation?

Sedgley: Take financials as a use case: following the money holds true for both adversaries and audits! Data aggregation often intentionally surfaces trends and connections, which can also reveal anomalies. Analytics highlighting anomalies can support audit processes by showing where to drill in to identify system control issues or fraud, waste and abuse. Yet the same analytics applied to another entity’s aggregated data can elicit indicators of bad actors.
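
As a rough sketch of that kind of audit analytic, the snippet below flags payments that deviate sharply from a vendor’s own norm using a simple z-score test; the data, threshold and pandas approach are illustrative assumptions, not a prescribed method:

```python
import pandas as pd

# Aggregated payment records from multiple systems (illustrative values).
transactions = pd.DataFrame({
    "vendor": ["A", "A", "A", "B", "B", "B", "B", "B", "B", "B"],
    "amount_usd": [1000, 1100, 950, 2000, 2100, 1900, 2050, 1950, 2000, 45000],
})

# Compare each payment against its vendor's own norm.
grouped = transactions.groupby("vendor")["amount_usd"]
transactions["z_score"] = (
    (transactions["amount_usd"] - grouped.transform("mean")) / grouped.transform("std")
)

# Payments far from the norm are candidates for audit drill-down.
flagged = transactions[transactions["z_score"].abs() > 2]
print(flagged)  # surfaces the 45,000 USD outlier for vendor B
```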

GCW: How do you see data aggregation impacting the global intelligence landscape, particularly as modernization efforts push for more integrated systems?

Sedgley: Data aggregation, driven by modernized, integrated business systems, is important in many respects, but without taking a closer look at what the data reveals once combined, we risk exposing our operations to our adversaries. It is critical for organizations to have visibility into how data aggregation is occurring and the complexity surrounding it.

To do this, agencies need to start evaluating their mission and threat landscape, as well as the business processes, master data and systems that generate and sustain the organization and its capabilities.

GCW: What are the main challenges in ensuring that relevant insights are accessible while maintaining operational security, and how can a solid understanding of data and data architecture help address these?

Sedgley: Inertia, over-consumption, denial, over-simplification: there are so many challenges. For some, it’s overcoming sheer inertia and the unwillingness to do anything differently. For others, it’s over-consumption: grabbing all data from everywhere and anywhere, whether they need it or not, because data is cool and they’ve got to have it so they won’t have to ask someone else to provide it.

Many want to deny the reality. The sheer volume of data, and the ability to integrate, aggregate and visualize that data at speed, is simply incomprehensible to them. The technology was born, matured and scaled so fast compared to prior technological evolutions that, to them, it’s not plausible. Or worse, there are those in denial because they think all the data is out there anyway, so why even try to protect it?

Then there’s over-simplification, where many react to this issue by suggesting increased classification of the data. But that overcorrection does not provide the information leaders need to make decisions, and it’s counterproductive to the modernization of business systems. Because data moves through processes and across systems, this is a system-of-systems issue, which many find complex to unravel and resolve.

One approach our company is taking with our clients is defining minimal viable information, or MVI, for the business processes across the organization. MVI defines the necessary information, translated into data, for a process to be executed in a system. Data across processes and systems can then be aggregated and appropriately classified through analytical products. This provides the right data to the right person at the right time for the right reason. The goal is to safeguard sensitive insights that can be garnered from data shared across platforms. Once we identify the MVI required to meet the intent, we clarify the risk and identify potential risk mitigation mechanisms for decision makers.
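
A minimal sketch of the MVI idea, assuming each business process declares an allowlist of fields and records are projected down to that allowlist before being shared (the process names and fields here are hypothetical):

```python
# Hypothetical MVI registry: the minimum fields each process needs.
MVI = {
    "invoice_approval": {"invoice_id", "vendor", "amount_usd", "due_date"},
    "shipment_tracking": {"shipment_id", "status", "eta"},
}

def project_to_mvi(record: dict, process: str) -> dict:
    """Return only the fields the named process is entitled to see."""
    allowed = MVI[process]
    return {field: value for field, value in record.items() if field in allowed}

record = {
    "invoice_id": "INV-0042",
    "vendor": "Acme Corp",
    "amount_usd": 12500,
    "due_date": "2025-01-31",
    "bank_routing": "021000021",  # sensitive and not needed for approval
    "location": "Depot 7",        # sensitive in aggregate with other data
}

# Only the four fields the approval process actually requires survive.
print(project_to_mvi(record, "invoice_approval"))
```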

After the MVI is defined, we begin re-crafting business processes, enabling systems and the data landscape to fit that MVI standard. To support this initiative, organizations need a solid understanding of their data landscape and system architectures.
