Solutions for GovIT’s Data Challenges

Phil Vincenzes is IntelliDyne’s Senior Vice President and Chief Analytics Officer, spearheading the delivery of advanced analytics solutions such as the DashBlox performance management products, intelligent automated solutions built with Robotic Process Automation (RPA), and real-time visibility into enterprise IT operations and the programmatic execution of contract requirements. Here are Phil’s thoughts on the data and analytics challenges facing GovIT teams.

What are the most common issues GovIT teams have with data?

Ensuring that information systems are secure and sensitive data is protected is the top challenge. Outdated, obsolete legacy systems aren’t just costly to maintain; they pose a serious security risk to any agency’s mission. Add to that the challenge of building a modern, data-savvy GovIT workforce: attracting and retaining skilled IT professionals with data expertise who can drive modernization with up-to-date technology is critical.

Another issue is leveraging and managing enterprise data in authoritative systems with appropriate levels of access, accessibility, and quality. Quality Enterprise Data Management solutions can help deliver a ‘single version of truth,’ minimize confusion over which data is authoritative and which isn’t, and provide the foundation of clean, accurate data so that everyone is operating and supporting decisions with correct, current information.1
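
As a simplified illustration (the source systems, fields, and merge rule below are invented for the example), consolidating duplicate records into a single authoritative ‘golden record’ is the kind of step an Enterprise Data Management pipeline might automate:

```python
# Hypothetical sketch: consolidating duplicate person records from two source
# systems into one authoritative "golden record" per employee ID.
# Source names, fields, and the "most recently updated wins" rule are invented.
import pandas as pd

hr_system = pd.DataFrame({
    "employee_id": [101, 102],
    "email": ["a.smith@agency.gov", "b.jones@agency.gov"],
    "last_updated": pd.to_datetime(["2020-01-15", "2020-03-02"]),
})
badge_system = pd.DataFrame({
    "employee_id": [101, 103],
    "email": ["asmith@agency.gov", "c.lee@agency.gov"],
    "last_updated": pd.to_datetime(["2020-04-01", "2020-02-20"]),
})

# Stack all source records, then keep the most recently updated row per key.
combined = pd.concat([hr_system, badge_system], ignore_index=True)
golden = (combined.sort_values("last_updated")
                  .drop_duplicates("employee_id", keep="last")
                  .sort_values("employee_id"))
print(golden)
```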

How do data science and subject matter expertise best work together?

Our first stop when tackling a business problem is always the user, or subject matter expert (SME). The CRISP-DM (Cross-Industry Standard Process for Data Mining) methodology begins with an understanding of the business, that is, the users’ environment and how solving a specific problem creates value. Up-front collaboration with subject matter experts helps the data scientist or solution developer reveal the meaning, value, and implications of the data in support of a solution or the discovery of findings. By working together throughout the solution lifecycle, the team almost always obtains better and more meaningful results.
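
For readers unfamiliar with CRISP-DM, the sketch below lists its six phases in order; the phase names follow the published methodology, while the simple loop-back behavior is an illustrative simplification:

```python
# The six CRISP-DM phases in order; looping back from Evaluation to Business
# Understanding reflects the methodology's iterative nature (simplified here).
CRISP_DM_PHASES = [
    "Business Understanding",  # start with the users' environment and the value of solving the problem
    "Data Understanding",
    "Data Preparation",
    "Modeling",
    "Evaluation",
    "Deployment",
]

def next_phase(current: str, evaluation_passed: bool = True) -> str:
    """Return the phase that follows `current`, looping back if evaluation fails."""
    if current == "Evaluation" and not evaluation_passed:
        return "Business Understanding"
    i = CRISP_DM_PHASES.index(current)
    return CRISP_DM_PHASES[min(i + 1, len(CRISP_DM_PHASES) - 1)]

print(next_phase("Business Understanding"))                # -> Data Understanding
print(next_phase("Evaluation", evaluation_passed=False))   # -> Business Understanding
```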

How is data analytics interfacing with the cloud?

Cloud technologies embrace and encourage a fast-moving, innovative environment where teams can utilize the cloud to store more data and discover new use cases for that data. Analytics and business intelligence tools can integrate easily with cloud technologies. Ultimately, cloud technologies increase the ease of sharing information and data across an organization, enabling teams to compile data, conduct analysis and act on findings more quickly.

How do data considerations impact the decision for an on-premise, private, or public cloud?

Security, control and cost all factor into this major decision. What works today may not meet your needs tomorrow, and each solution comes with a different cost structure. The cloud allows you to quickly and easily scale storage, memory and computing power as needed. Also, storing data and running applications from the cloud makes it easy for employees to connect and work from anywhere without the complexities of VPNs.

Data sharing needs can also be an important consideration when determining whether to host data on-premise or in the cloud. For example, the COVID-19 pandemic has illuminated the need to share information across agencies, academia, public and private epidemiological research, and even the media. Hosting this information in the cloud can lower traditional barriers to access, which is critical right now as the world works to accelerate the dissemination of information and find a solution to the pandemic.

How can government programs simplify ETL processes to support data engineering and science initiatives?

It’s a fact that the data preparation and curation phase of most analytics initiatives represents a significant effort. Many attempts to weave analytics into the day-to-day fabric of an organization’s business processes are crippled when teams fail to recognize the importance of getting this first step right. We follow an agile analytics process founded in the CRISP-DM methodology, which begins with understanding the business problem and then understanding the data. Only then can we begin to approach the data preparation. Let’s face it: ‘data munging’ isn’t the sexy part of a data science or analytics project, but all too often under-thought or fragile data prep undermines the success of analytics initiatives.

In the rush to show results, it’s tempting to overlook the value of simplification and repeatability in ETL processes, which is ultimately a prerequisite for ‘operationalizing’ analytics. This task is now much easier, and can even be fun, thanks to the introduction of visual, codeless/low-code data-prep engineering tools. These modern tools ease the heavy lift and provide a path for older ‘hand-rolled’ ETL code to become history, allowing citizen data scientists to visually map and create elegant, repeatable processes that can be modularized, shared, and scheduled.
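
As a rough sketch of what a modular, repeatable data-prep pipeline looks like in plain code (the file names, columns, and cleaning rules below are hypothetical; a visual low-code tool would express the same steps as a reusable flow):

```python
# Hypothetical sketch of a modular, repeatable data-prep (ETL) pipeline.
# File names, column names, and cleaning rules are invented for illustration.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    """Read a raw export exactly as delivered by the source system."""
    return pd.read_csv(path)

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the same cleaning rules on every run so results are repeatable."""
    df = df.rename(columns=str.lower)
    df = df.dropna(subset=["contract_id"])
    df["obligated_amount"] = pd.to_numeric(df["obligated_amount"], errors="coerce")
    return df

def load(df: pd.DataFrame, path: str) -> None:
    """Write the curated table where the analytics layer expects it."""
    df.to_csv(path, index=False)

def run_pipeline(raw_path: str, curated_path: str) -> None:
    """One entry point that can be scheduled, shared, and rerun end to end."""
    load(clean(extract(raw_path)), curated_path)

if __name__ == "__main__":
    run_pipeline("raw/contract_spend.csv", "curated/contract_spend.csv")
```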

How can GovIT enrich a modern analytics platform with machine learning and deep learning?

Most government agencies probably jumpstarted their analytics journey by standing up a business intelligence platform to roll out their first dashboards. Those with the foresight to centralize their development effort and create purpose-built analytic applications serving discrete business-unit needs have realized the greatest business value.

Our team is now building the next generation of business intelligence apps, which we refer to as Augmented BI. By developing BI applications that are tightly integrated with Robotic Process Automation, machine learning, or AI, our applications can now take the final leap forward: significantly increasing productivity, supporting better and more informed decision making, or even taking autonomous action.
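
As a simplified illustration of the Augmented BI idea, and not IntelliDyne’s actual implementation (the metric, model choice, and threshold below are invented), a dashboard feed can be enriched with a forecast and an automated flag:

```python
# Illustrative sketch: augmenting a BI metric with a machine-learning forecast
# and an automated flag. The ticket-volume data, model choice, and threshold
# are invented for this example.
import numpy as np
from sklearn.linear_model import LinearRegression

# Monthly help-desk ticket counts (hypothetical historical data).
months = np.arange(1, 13).reshape(-1, 1)
tickets = np.array([410, 425, 433, 452, 470, 468, 490, 505, 512, 530, 548, 561])

# Fit a simple trend model and forecast the next month for the dashboard.
model = LinearRegression().fit(months, tickets)
forecast = float(model.predict([[13]])[0])

# Augment the BI view: show the forecast and, if it breaches a staffing
# threshold, trigger an automated action (here just a printed alert; an RPA
# bot could open a staffing request instead).
THRESHOLD = 575
print(f"Forecast tickets next month: {forecast:.0f}")
if forecast > THRESHOLD:
    print("Automated action: raise staffing request for next month.")
```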

How can agency leaders get actionable insights from their program data in near real time?

The adage, ‘You can’t change what you don’t measure,’ certainly applies when there is a desire to obtain actionable insights from programmatic data. Most government contracts’ program data is locked up in silos — spreadsheets, financial systems, project scheduling apps, homegrown risk registers and more. That makes it nearly impossible for agency leaders to have transparency or obtain situational awareness to support intelligent decision making.

However, by overlaying an Integrated Program Assurance Dashboard to aggregate these disparate data sources, leaders could obtain valuable insight immediately as data is refreshed in each source system. This is exactly what IntelliDyne does for each of its government contracts. In a single Integrated Program Assurance dashboard, IntelliDyne program managers and executive leadership can view all programmatic data across any of our contracts — significantly improving transparency, accountability and effective decision making.
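
A minimal sketch of what such aggregation might look like in code (the source files, columns, and join key below are hypothetical; the actual dashboard refreshes from live systems):

```python
# Hypothetical sketch of aggregating siloed program data into one view.
# File names, columns, and the join key (task_order) are invented; a real
# Integrated Program Assurance Dashboard would refresh from live systems.
import pandas as pd

financials = pd.read_csv("exports/financials.csv")       # spend by task order
schedule = pd.read_csv("exports/schedule_status.csv")    # milestone status
risks = pd.read_csv("exports/risk_register.csv")         # open risks

# Count open risks per task order, then join the silos on the common key so
# leaders see cost, schedule, and risk together.
risk_counts = (risks.groupby("task_order").size()
                    .rename("open_risks").reset_index())
program_view = (financials
                .merge(schedule, on="task_order", how="outer")
                .merge(risk_counts, on="task_order", how="left")
                .fillna({"open_risks": 0}))

# Write the unified table that feeds the dashboard.
program_view.to_csv("curated/program_assurance.csv", index=False)
```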

______________________________

1. https://www.oversight.gov/sites/default/files/oig-reports/CIGIE_Top_Challenges_Report_April_2018.pdf
