December 22, 2020

5 real-world examples of fraud and anomaly detection for government

Dave Kelly

Topic:   Industry Focus

McKinsey reports that close to 20 percent of all U.S. federal funds, about $5 trillion annually, is lost to fraud, waste, and abuse. With the current U.S. deficit at more than $3 trillion, it’s clear that mitigation strategies to proactively detect and prevent this fraud could change the country's fiscal course and enhance services to citizens.

The good news is that modern data platforms can help government agencies spot the data anomalies that are indicative of waste, fraud, and abuse, enabling new insights and proactive mitigation. Whether it’s tax fraud schemes bilking millions or the abuse of federal programs, today’s analytics tools, when applied properly, can make a huge difference in the U.S. budget and the losses the country is experiencing. The same holds true for governments globally.

Here are just a few examples of how governments are using big data analytics today and how they could apply it in the future.

Use Case #1: Fraud Detection Saves Money

The Brookings Institution reports that one of every six dollars owed in taxes simply goes unpaid. That alone accounts for three-quarters of the entire annual budget deficit. Tax evasion can be deliberate or inadvertent, committed by individuals or corporations, but in every case it is a problem of staggering proportions. If local, state, and federal governments invested in AI-driven big data analytics platforms, these tools could apply machine learning and predictive analytics to spot fraudulent tax returns and increase revenues.

These same tools could be applied to fraudulent unemployment claims, which in 2019 amounted to about 10 percent of all benefits paid out, a figure in the billions of dollars. In 2020, the FBI reported that the problem was worsening, particularly in the area of identity theft, where bad actors use someone else’s personal data to apply for unemployment benefits.

In all of these cases, next-generation data platforms could harness the data, analyze it, and spot the anomalies that signal fraud. The State of Texas has illustrated the power of these tools to spot fraud and save money: last year, the state announced it had saved $90 million over four years by using big data to combat unemployment fraud.
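At its simplest, the anomaly spotting described above means flagging claims that deviate sharply from the norm. Here is a minimal sketch in Python using the modified z-score (based on the median absolute deviation, which is robust to the very outliers being hunted) on hypothetical claim amounts; a production system would use far richer features and models:

```python
from statistics import median

def flag_outlier_claims(amounts, threshold=3.5):
    """Return indices of claim amounts whose modified z-score
    (Iglewicz-Hoaglin, based on the median absolute deviation)
    exceeds the threshold -- a robust first-pass fraud filter."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    if mad == 0:  # all values (nearly) identical; nothing to flag
        return []
    return [i for i, a in enumerate(amounts)
            if abs(0.6745 * (a - med) / mad) > threshold]

# Hypothetical weekly unemployment payouts; one claim is wildly out of range.
claims = [450, 470, 460, 455, 480, 465, 9800, 472, 458]
print(flag_outlier_claims(claims))  # -> [6], the $9,800 claim
```

The median-based score is used here instead of the ordinary z-score because, in a small batch, a single extreme claim inflates the standard deviation enough to mask itself.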

Use Case #2: The Energy Sector Uses Anomaly Detection to Save Lives

NASA and the U.S. Forest Service teamed up to improve platform interoperability between their disparate systems in order to detect anomalies in standard weather patterns. The goal was to integrate their data in a way that could better predict catastrophic weather events and reduce the human and financial costs of these crises in advance.

With 2020 turning out to be one of the costliest and most difficult years on record due to hurricanes, wildfires, and the pandemic, efforts to leverage data to help Americans prepare will cut down on the fiscal and human costs of these events. Along the same lines, the Business of Government reports that the National Center for Atmospheric Research is using predictive analytics to determine future energy supply needs as the climate shifts.

Use Case #3: Homeland Security Uses Big Data to Speed Communication

Homeland Security is using big data analytics for government in ways that improve communication across jurisdictions. Like many bureaucracies, the agency struggles with interoperability and its effect on crime-fighting efforts. By integrating data and removing silos, the organization can work more effectively with state and local law enforcement agencies. There are signs these efforts are paying off: less than one day after the 2013 Boston Marathon bombing, the FBI, local law enforcement, and Homeland Security came together to analyze more than 480,000 images from 10 terabytes of unstructured data, an effort that ultimately led to the capture of the bombers.

Use Case #4: FEMA Spots Fraud with AI

As the COVID-19 pandemic took hold in the U.S., the Office of the Inspector General (OIG) released a report showing that the Federal Emergency Management Agency (FEMA) had inadvertently paid out more than $3 billion in improper and potentially fraudulent claims since 2003. According to the report, the agency “relies on applicant self-certifications because no comprehensive repository of homeowner’s insurance data exists.”

The answer to this level of waste lies at the heart of how big data analytics could increase the efficiency of the nation’s bureaucratic systems. Many layers of government data are siloed or simply nonexistent, and they are governed by manual processes that lead to error and waste. AI platforms, by contrast, can use machine learning algorithms to collate and clean structured data and spot anomalies.
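The OIG report points to the lack of a comprehensive repository to check claims against. One simple anomaly that an integrated dataset makes detectable is duplicate payments: multiple claims filed for the same applicant and property. A minimal sketch, with hypothetical field names standing in for real FEMA records:

```python
from collections import defaultdict

def find_duplicate_claims(claims):
    """Group claims by (applicant_id, normalized address) and return
    any group containing more than one claim -- a candidate for review."""
    groups = defaultdict(list)
    for claim in claims:
        # Normalize the address so trivial formatting differences don't hide a match.
        key = (claim["applicant_id"], claim["address"].strip().lower())
        groups[key].append(claim["claim_id"])
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

# Hypothetical records: the same applicant/address appears twice.
claims = [
    {"claim_id": "C-001", "applicant_id": "A-17", "address": "12 Oak St"},
    {"claim_id": "C-002", "applicant_id": "A-22", "address": "9 Elm Ave"},
    {"claim_id": "C-003", "applicant_id": "A-17", "address": " 12 OAK ST "},
]
print(find_duplicate_claims(claims))
# -> {('A-17', '12 oak st'): ['C-001', 'C-003']}
```

Even this naive cross-check is only possible once the underlying records live in one integrated, machine-readable store rather than in self-certified paper forms.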

Predictive algorithms could increase the use of targeted inspections and spot-checking to lower fraud and abuse while also saving labor. These tools could also increase the transparency of government agencies such as FEMA to improve their effectiveness with the Americans they’ve been tasked to serve.

Use Case #5: Local Governments Invest in AI

Not to be outdone by their federal counterparts, local and state governments are increasingly investing in big data to cut costs, improve efficiencies, and reduce waste. Examples include:

  • The State of Louisiana, which uses software and anomaly detection to combat Electronic Benefit Transfer (EBT) fraud in the Supplemental Nutrition Assistance Program (SNAP)
  • The City of Louisville, Kentucky, which reports a five-to-one return on every dollar it spends on data analytics. With big data analytics, the City identified $3.6 million in cost savings in one year and $500,000 in additional revenue opportunities. The City also cut more than 200 days from key administrative functions such as hiring, attributing its success to “intentional efforts to use data and evidence in innovative ways”
  • The City of Cincinnati, Ohio, which saved $3.3 million in one fiscal year through “using data analytics in a very methodical and process-oriented fashion.” The City’s goal was to gather the data, analyze it, and then identify and implement strategies to save money
  • Kansas City, Missouri, which used big data analytics to improve customer satisfaction with snow removal efforts during the winter months. The City leveraged big data to improve government relations with its citizens, using GPS tracking data that allowed residents to see real-time updates on efforts to keep their streets clear of snow. Customer satisfaction increased from 50 percent to 62 percent as a result, at a time when the public’s trust in government is anything but robust

While these examples demonstrate how modern data platforms are helping government agencies proactively respond to waste, fraud, and abuse, the reality is that these data solutions have yet to gain wide adoption across government. The good news is that efficiencies in technology and the adoption of cloud computing have significantly reduced the cost of these solutions, making them a reality for agencies of all sizes and offering a significant return on investment by mitigating a substantial loss of tax dollars.

Dave Kelly, CPT (Ret.), spent 26 years in the Michigan State Police, where he led statewide intelligence, Cyber Command & Technology. A veteran of the United States Marine Corps, Dave currently serves as the vice president for Public Sector Solutions and Strategy at ibi. In this role, Dave helps ibi's federal, state, and local government and public education partners achieve their analytics, data integration, quality, management, and visualization goals. He provides these partners with thought leadership on use-case-driven solutions to public sector technology problems and a strategic vision for what is possible when leveraging the ibi stack of technology.