Toward More Useful Federal Oversight

This post first appeared on IBM Business of Government. Read the original article.

Thursday, January 25, 2024

This is an excerpt of Chapter 11 from our new book Transforming the Business of Government: Insights on Resiliency, Innovation, and Performance.

Well-executed oversight is an invaluable aspect of government operations. It complements program and cross-program implementation by increasing the likelihood that government spending and actions realize their intended benefit.

In the future, federal oversight can be made more useful to more users for more purposes. Continually evolving technologies make it easier and more affordable than ever to collect, analyze, and use oversight data and analyses to anticipate, detect, prepare for, prevent, and respond more quickly, fully, and successfully to problems. Evolving technologies and analytic approaches can deliver more insights not just to federal programs but also to federal delivery partners to help them anticipate, prevent, and address problems and pursue improvement opportunities more proactively1 and more strategically, addressing the most serious problems and opportunities before proceeding to other key actions. In addition, lessons from experience and well-designed trials can reveal better ways to communicate oversight findings and other evidence in order to realize better outcomes and operational quality while building understanding of and trust in government.

Those doing oversight are beginning to tap evolving technologies. What lessons are they learning, technology-linked and otherwise? What barriers are they encountering? What new oversight approaches are worth testing and assessing? Finally, who does, can, and should search for and share lessons learned and build new knowledge and capacity for more useful oversight?

This chapter explores these questions. It seeks to engage others in asking, answering, and acting to adopt more useful approaches to oversight that improve government performance on multiple dimensions.

What is oversight?  Oversight, as defined in this chapter, is work done not by those charged with daily and longer-term program operations, but by others who look for and report on problems and opportunities and make recommendations regarding them. Oversight supplements but does not supplant internal agency evaluation and goal and program management.2 Oversight takes many forms, such as investigation, field observation, hotline calls, and data collection and analyses. It includes the search for promising as well as problematic practices.

Why is oversight needed? An important objective of government oversight is to prevent and penalize fraud and corruption. Fraud and corruption can originate outside government,3 within government,4 and among government contractors.5 Oversight aids the search to find, prevent, and penalize these problems.

Oversight also brings attention to poor work quality and operational problems.6 It can reveal organizational culture problems,7 and spotlight government duplication, fragmentation, and overlap.8 Oversight illuminates issues needing but not getting attention. It reduces the risk that programs run on autopilot instead of continually searching for ways to do better. By identifying relevant peer performance, oversight can play an innovation-encouraging role similar to the one private sector competitors play.

Who conducts oversight? Many conduct oversight in the U.S. federal government: Congressional committees, Government Accountability Office (GAO), agency inspectors general (IGs), IG networks, and program and regional offices. Plus, federal grant recipients receiving $750,000 or more a year must hire private sector auditors to conduct annual “single audits.”9 

Congress has authorized GAO and IGs to conduct oversight. GAO originally focused on savings and efficiency,10 and now looks for other government improvement opportunities. Inspectors General function “as independent government watchdogs who seek out fraud, waste, and abuse and who promote effective management in federal programs.”11 Agency program offices often conduct oversight of those they fund, complementing other program activities such as getting dollars out and helping grant recipients learn from each other and collaborate. The ways program offices conduct oversight often vary. Cross-agency oversight is also done, much of it through the congressionally created Council of the Inspectors General on Integrity and Efficiency (CIGIE).

How are oversight findings used? Oversight findings encourage specific actions to correct specific problems. GAO and IGs both track their recommendations to encourage corrective action. GAO also periodically updates a high-risk list to encourage action in high-need areas. Oversight findings also suggest ways to improve, as when GAO identifies agencies reducing improper payments to help other agencies.12

Different situations call for different types of oversight. Intentionally fraudulent use of government funds obviously warrants severe punishment. Oversight findings of poor implementation practices, however, often warrant assistance, not punishment, except when recalcitrance to making needed change is evident.

Who can and should use oversight findings? Congress, agency, and program leaders are target audiences for oversight information. Oversight can also help those working on and supporting the front line and others. 

More attention needs to be given to “uses” and “users” of oversight findings. Questions then need to be asked about whether those users are aware of and can find oversight findings, and whether they view such findings as useful for anticipating and preventing problems, addressing those that do occur, and improving outcomes. The Office of Head Start (OHS) at the Department of Health and Human Services talks about using monitoring findings to inform OHS but not to help children and families in Head Start programs.13 Presumably, Head Start programs are also an important audience for monitoring findings, including findings from the “Promising Practices Pilot” announced in OHS’s FY2023 monitoring protocols.14

Adopting an agile and user-centered design approach like that used to upgrade USASpending.gov15 can contribute to more useful oversight. Indeed, the Pandemic Response Accountability Committee of CIGIE (PRAC) released a toolkit to support agile oversight.16 New technologies make agile, user-centered design more feasible than ever. Websites can invite interested parties to opt for updates and note their areas of interest. Also, online platforms facilitate fast, iterative feedback that can support continuous improvement.

The following cases describe efforts to make oversight more useful, suggesting how new technologies and sharing lessons learned might contribute to more useful future oversight.

Case 1: Recovery Act. Implementation of the Recovery Act suggests better ways to use and communicate oversight information. A small White House-based Recovery Act implementation office supporting then-Vice President Biden managed Recovery Act implementation. GAO commended this effort for: (1) strong support of top leaders, (2) centrally situated collaborative governance, (3) use of networks and agreements to share information and work toward common goals, and (4) adjustments to, and innovations in, usual approaches to conducting oversight (e.g., increased use of upfront risk assessments, real-time information, earlier communication of audit findings, and use of advanced data analytics).17

Congress also legislated a Recovery Act Accountability and Transparency Board (RAT Board) for oversight. The RAT Board launched Recovery.gov, building and improving on another agency's existing mapping platform. The mapping made it easy to see where Recovery Act funds were initially allocated,18 and it increased public interest in communities. The public also became more aware of how federal spending might affect them. GAO praised this website for its clear purpose, use of social networking tools to garner interest, tailoring of the website to audience needs, and use of stakeholder input during design.19

Future agency and oversight spending maps might go beyond mapping spending to show spending options under consideration, progress made, and post-spending impact in each location. Spending on physical space projects, for example, might link to photos showing interim and final progress and to data and descriptions about spending purposes and impact. 

GAO and others20 have captured some valuable lessons learned from Recovery Act implementation to inform future congressional action and agency implementation, but these studies have missed some important lessons. For example, Congress appropriated $84 million for the RAT Board but did not fund a program implementation office. Most employees of the small Recovery Act implementation office that GAO praised were “detailed” or borrowed from other agencies. It remains unclear whether this is a good precedent.

Also, Recovery.gov disappeared when the RAT Board ceased operations. Government neither sustained nor archived the site, eliminating not only an online platform for other agencies to use and improve but also lessons about the website’s functionality. The White House created the Government Accountability and Transparency Board, which reflected on and issued a set of recommendations.21 However, there appears to be no entity that tracks action on the recommendations made by this board, nor does any entity routinely learn from cross-agency experiences to recommend authorities and resources needed to manage cross-agency implementation well.

Case 2: Federal Emergency Management Agency. The Federal Emergency Management Agency (FEMA) has done pioneering work to make oversight information more useful. A newly created FEMA audit office converted IG, GAO, program, and grant single audit findings stored as PDFs (much of it in the Federal Audit Clearinghouse) into consolidated data, housed in a single, searchable database. FEMA staff used natural language processing complemented by human intelligence to note which regulatory authority auditors cited for problems, allowing FEMA to sort audit findings by keywords in chapters, sections, and paragraphs of regulatory text.
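To make the citation-tagging step concrete, here is a minimal sketch, assuming findings are available as plain text and that auditors cite regulations in standard CFR form (e.g., 2 CFR 200.332(d)). It is illustrative only, not FEMA's implementation: the regex, field names, and sample finding text are assumptions, and a production pipeline would pair this kind of automated extraction with the human review described above.

```python
import re
from dataclasses import dataclass, field

# Hypothetical pattern for citations of the form "2 CFR 200.332(d)":
# title, part, section, and an optional paragraph letter or number.
CFR_CITATION = re.compile(
    r"(?P<title>\d+)\s+C\.?F\.?R\.?\s+(?P<part>\d+)\.(?P<section>\d+)"
    r"(?:\((?P<paragraph>[a-z0-9]+)\))?",
    re.IGNORECASE,
)

@dataclass
class TaggedFinding:
    finding_id: str                    # identifier assigned during PDF conversion
    text: str                          # extracted finding narrative
    citations: list[dict] = field(default_factory=list)

def tag_finding(finding_id: str, text: str) -> TaggedFinding:
    """Attach every regulatory citation detected in a finding's text."""
    citations = [m.groupdict() for m in CFR_CITATION.finditer(text)]
    return TaggedFinding(finding_id, text, citations)

# Invented example; real findings would come from the consolidated database.
example = tag_finding(
    "FY22-0417",
    "Subrecipient monitoring did not meet the requirements of 2 CFR 200.332(d).",
)
print(example.citations)
# [{'title': '2', 'part': '200', 'section': '332', 'paragraph': 'd'}]
```

Storing the citation parts as separate fields is what makes it possible to sort and filter findings by chapter, section, and paragraph of regulatory text, as the FEMA database does.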

FEMA staff also noted date, location, dollar amounts, closure status, and audit teams for each data item. It used these data to create a Compliance Dashboard succinctly visualizing current and historic compliance patterns.22 The dashboard includes a bubble chart showing the most common (but not necessarily the most serious) compliance problems, trend graphs for different subsets, and color-varying maps suggesting persistent and unresolved problems. This enables comparisons across time, location, and audit teams. The figures trigger focused discussions to decide appropriate follow-up actions. FEMA leaders and those running FEMA training efforts receive the dashboards; a publicly shared version of the dashboard shows results including problems trending downward, suggesting that this information is being used to prevent and reduce noncompliance.23
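The aggregations behind such a dashboard can be simple once findings live in a consolidated, structured database. The sketch below uses hypothetical column names and records (not FEMA's data or code) to show how counts by year and region could feed a trend graph and how the share of findings still open by region could feed a map of persistent, unresolved problems.

```python
import pandas as pd

# Hypothetical extract from a consolidated findings database; real records
# would carry the date, location, dollar amount, closure status, and audit
# team noted for each item.
findings = pd.DataFrame(
    {
        "fiscal_year": [2019, 2019, 2020, 2020, 2021, 2021, 2021],
        "region":      ["IV", "VI", "IV", "IX", "IV", "VI", "IX"],
        "closed":      [True, True, True, False, False, True, False],
    }
)

# Counts by year and region feed a trend graph; the share of findings still
# open in each region feeds a map highlighting unresolved problems.
trend = findings.groupby(["fiscal_year", "region"]).size().unstack(fill_value=0)
open_share = 1 - findings.groupby("region")["closed"].mean()

print(trend)
print(open_share.round(2))
```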

FEMA has not yet tried to access other data sources to consider if and how its compliance requirements align with real-world risks. Nor has the agency made the dashboard public. Nonetheless, this approach to oversight information suggests a path all federal agencies can take: consolidating historic findings into a searchable database and developing better ways to collect future information to facilitate mapping, trend analyses, noncompliance analyses, and other action-informing visualizations.

Case 3: Pandemic Response Accountability Committee. Congress created the PRAC when it passed the Coronavirus Aid, Relief, and Economic Security (CARES) Act. PRAC’s Pandemic Analytics Center of Excellence (PACE) recently embraced a big data approach. Its projects suggest the enormous potential of tapping data from outside the implementing agency. PACE analyzed over 33 million applications for Small Business Administration (SBA) CARES Act funding together with publicly available Social Security Administration (SSA) information to identify suspected invalid or unassigned SSNs. PACE then asked SSA to verify those SSNs. SSA informed the PRAC that over 221,000 of those SSNs were not issued by SSA nor did they match applicant-provided birth information—suggesting potential identity fraud.24 This PACE analysis suggests one kind of analysis that oversight bodies can do to help federal agencies and their state, local, territorial, and tribal delivery partners anticipate and prevent problems.
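The PRAC fraud alert does not spell out PACE's matching logic in enough detail to reproduce here, but the sketch below illustrates the general pattern of screening application data against publicly known issuance rules before asking SSA to verify the remainder. Column names and records are hypothetical; the structural checks (SSA does not issue numbers with area 000, 666, or 900-999, group 00, or serial 0000) are only a first screening step and would not by themselves establish fraud.

```python
import pandas as pd

def structurally_invalid(ssn: str) -> bool:
    """True when a nine-digit SSN could not have been issued by SSA."""
    digits = ssn.replace("-", "")
    if len(digits) != 9 or not digits.isdigit():
        return True
    area, group, serial = digits[:3], digits[3:5], digits[5:]
    return (
        area in {"000", "666"}
        or area >= "900"          # 900-999 reserved, never issued
        or group == "00"
        or serial == "0000"
    )

# Hypothetical application extract; a real pipeline would read the SBA data
# and send only the flagged subset to SSA for verification.
applications = pd.DataFrame(
    {
        "application_id": ["A1", "A2", "A3"],
        "applicant_ssn":  ["123-45-6789", "000-12-3456", "666-01-0001"],
    }
)
applications["flag_for_ssa_review"] = applications["applicant_ssn"].map(structurally_invalid)
print(applications[applications["flag_for_ssa_review"]])
```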

PRAC had difficulty accessing SSA data in a timely way to do this early-warning analysis.25 This suggests a need for an “after action” review to understand the kinds of data-sharing arrangements needed to enable more timely, useful future oversight.

Related questions warranting attention include:

  • Who does these sorts of “after action” reviews and recommends action to Congress and Executive Branch leaders?
  • Who tracks and encourages follow-up on recommendations?
  • What funding is needed to do this well?

Looking Forward.  Given oversight purposes and what they imply about uses and users of oversight findings, these three mini cases suggest the following future actions:

  • Treat oversight findings as data for generating greater insights to anticipate and more strategically prevent problems, address problems, and pursue opportunities. Careful thought is needed on how to store and share oversight findings to make analyses easier. Lessons can come not just from those doing oversight but also from the many government programs that collect and analyze data to aid planning, preparedness, prevention, response, and recovery. Thought also needs to be given to whether and how to convert past oversight findings to a more analyzable form, data standards, data dictionaries, and data-sharing agreements. More thought is also needed about how to tag collected information to facilitate searching and sorting across oversight findings by categories such as outcome objectives, process types, populations served, time, location, and incentive structure.
  • Collect and share oversight data in as close to real time as possible. In addition to PRAC’s agile oversight toolkit, a White House memo on the Infrastructure Investment and Jobs Act strongly urges more proactive approaches.26 Timelier and more geographically granular data tend to support more proactive approaches.
  • Analyze oversight findings and other data within and across agencies. Data analyses to assess oversight findings across programs and agencies for patterns, similarities, variations, relationships, clusters, trends, positive and negative outliers, and anomalies across time and subsets can help to prevent and more strategically respond to problems that occur; such data can also reveal opportunities for improvement, including opportunities for cross-program scale economies. These kinds of data can also aid identification of causal factors and of precursor events useful as warning signs (for a simple illustration of outlier screening, see the sketch following this list).
  • Look for better and promising practices along with problems and risks. Oversight should include the search for practices associated with progress within and across agencies. These should include finding better products and services likely to help those served, regulated, and protected as well as better internal practices, such as useful metrics, report generators (old-fashioned and AI-supported), and incentive structures. The search for better practices also requires looking beyond averages to variation to understand how different situations affect efficacy. And cross-program analyses of risks and contextual changes affecting multiple programs promise efficiencies. Oversight bodies can do these cross-program analyses themselves or, if not, suggest how to get them done.
  • Continually learn from experience and well-designed trials and build capacity to learn within and across programs. Oversight can support continuous learning about effective practices within and across programs, including among programs that share outcome objectives or use similar implementation processes such as benefits processing. Oversight can also support the creation of shared evidence libraries to house information on shared outcomes and similar processes. It could encourage iterative trials to improve the functionality and cut the costs of evidence libraries (as NASA uses NIH’s PubMed platform) and other knowledge-sharing tools.27 Where appropriate, oversight can encourage shared learning agendas.
  • Successfully communicate oversight findings, lessons learned, and needs with important users. CIGIE’s Oversight.gov is a promising step forward in communicating oversight findings. More useful oversight also requires attention to target users, and whether they know about and are easily and affordably able to access, understand, and apply relevant oversight and other needed information. This, of course, requires careful thought as to key users, their relative priority, and their information needs. Oversight can share relevant information about users of information and other evidence and effective ways to communicate with those users. It can encourage shared trials to find better ways to communicate with different users, and to strengthen understanding of the skills needed to communicate findings and lessons learned successfully.
  • Sufficiently resource data design, collection, analyses, communication, and active management within and across agencies. The six activities recommended above need to be encouraged and actively resourced by Congress and by leaders within and across agencies; oversight itself can inform those resourcing decisions.
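As a concrete, if simplified, illustration of the outlier screening mentioned in the third bullet above, the sketch below flags agencies whose rate of a given finding type sits well above or below their peers. Agency names, rates, and the z-score threshold are all hypothetical; in practice a screen like this would prompt the kind of focused follow-up discussion described in the FEMA case rather than serve as a verdict.

```python
import statistics

# Hypothetical rates of one finding type (findings per 100 audits) by agency,
# pooled across several years of consolidated oversight results.
finding_rates = {
    "Agency A": 4.1,
    "Agency B": 3.8,
    "Agency C": 12.6,
    "Agency D": 4.5,
    "Agency E": 1.2,
}

mean = statistics.mean(finding_rates.values())
stdev = statistics.stdev(finding_rates.values())

for agency, rate in sorted(finding_rates.items()):
    z = (rate - mean) / stdev
    if z > 1.5:
        print(f"{agency}: unusually high rate ({rate}), candidate for assistance")
    elif z < -1.5:
        print(f"{agency}: unusually low rate ({rate}), possible promising practice")
```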

Moving forward on actions suggested above will require tackling specific challenges, including:

  1. Data standards confusion. Data standards strengthen improvement-informing insights, so Congress has frequently mandated data standards—in the DATA Act of 2014, GREAT Act of 2019, Financial Data Transparency Act of 2022, and other laws. How these efforts fit together, what the planned next steps are, and how to engage remains confusing. The Federal Data Strategy’s Action 2 indicated its intent to align multiple data standard efforts.28 The status of this strategy is, however, unclear. Alignment, public information about planned next steps and why they were chosen, and external engagement would be helpful.
  2. Unclear responsibility for cross-agency learning. It is unclear which entities are responsible for learning across agencies and for finding and sharing lessons about shared risks and priority users. The three cases discussed above suggest the value of more systematically searching for and successfully sharing oversight findings and other evidence, across agencies and time. Congress ought to consider and decide who should lead cross-agency learning and the resources needed to do continuous learning and improvement well. CIGIE members and GAO, sufficiently resourced, would obviously need to be involved, as should other potential oversight and evidence users, including frontline workers.
  3. Barriers to user-centered design and data sharing. More useful oversight and evidence requires better understanding of user needs and experience. Unfortunately, barriers exist to gathering useful feedback about government’s knowledge-sharing efforts such as webinars and evidence libraries. Real and perceived barriers, such as the Paperwork Reduction Act, may also impede data sharing.29 Government needs frequent feedback about ongoing oversight to act in an agile and effective way; it also needs to identify barriers to user-centered design and recommend ways to remove them.

There is reason for optimism despite these challenges. This chapter seeks to foster ideas about how to make government oversight more useful for more users. This requires finding new ways to do and use oversight and other information—in more effective, efficient, and fair ways, to improve outcomes, operational quality, and public understanding of and trust in government.

Endnotes

  1. White House, Memorandum for the Heads of Executive Departments and Agencies re Advancing Effective Stewardship of Taxpayer Resources and Outcomes in the Implementation of the Infrastructure Investment and Jobs Act, M-22-12, https://www.whitehouse.gov/wp-content/uploads/2022/04/M-22-12.pdf.
  2. Honorable Greg Friedman, U.S. Department of Energy Inspector General, 1998-2015.
  3. U.S. Pandemic Response Accountability Committee, Fraud Alert: PRAC Identifies $5.4 Billion in Potentially Fraudulent Pandemic Loans Obtained Using Over 69,000 Questionable Social Security Numbers, https://www.pandemicoversight.gov/sites/default/files/2023-01/PRAC%20fraud%20alert%20on%20potential%20SSN%20fraud_1.pdf.
  4. McWhirter, Cameron, “Former Mississippi Human Services Director Pleads Guilty to Charges Related to Welfare Scandal,” Wall Street Journal, September 2022, https://www.wsj.com/articles/former-mississippi-human-services-director-pleads-guilty-to-charges-related-to-welfare-scandal-11663887132.
  5. Associated Press, “General manager of electrical company pleads guilty to defrauding MBTA operator, Keolis,” WBUR, June 14, 2023, https://www.wbur.org/news/2023/06/14/john-rafferty-fraud-boston-mbta-rail.
  6. Johnson, Charles A. and Kathryn E. Newcomer, U.S. Inspectors General. Brookings, 2020. pp. 184-193.
  7. Sparrow, Malcolm, Handcuffed. pp.12-17 (Kindle). Brookings 2016.
  8. U.S. Government Accountability Office, 2023 Annual Report: Additional Opportunities to Reduce Fragmentation, Overlap, and Duplication and Achieve Billions of Dollars in Financial Benefits, June 14, 2023, GAO-23-10608, https://www.gao.gov/duplication-cost-savings.
  9. Code of Federal Regulations, Subpart F, Audit Requirements, https://www.ecfr.gov/current/title-2/subtitle-A/chapter-II/part-200/subpart-F.
  10. https://www.gao.gov/about.
  11. Johnson and Newcomer, p. xv.
  12. U.S. Government Accountability Office, Improper Payments: Programs Reporting Reductions Had Taken Corrective Actions That Shared Common Features, June 30, 2023, GAO-23-106585, https://www.gao.gov/products/gao-23-106585.
  13. U.S. Office of Head Start, Fiscal Year (FY) 2023 Head Start Monitoring Protocols, https://eclkc.ohs.acf.hhs.gov/federal-monitoring/article/fiscal-year-fy-2023-head-start-monitoring-protocols.
  14. U.S. Office of Head Start, FY23 Monitoring Kickoff, September 29, 2022, https://eclkc.ohs.acf.hhs.gov/sites/default/files/video/attachments/fy2023-monitoring-kickoff-slides.pdf.
  15. National Academy of Public Administration, DATA Act Implementation: The First Government-Wide Agile Project, April 30, 2020, https://napawash.org/grand-challenges-blog/data-act-implementation-the-first-government-wide-agile-project.
  16. Pandemic Response Accountability Committee, Agile Products Toolkit, https://www.pandemicoversight.gov/media/file/agile-products-toolkit2022pdf.
  17. U.S. Government Accountability Office, Recovery Act: Grant Implementation Experiences Offer Lessons for Accountability and Transparency, GAO-14-219, https://www.gao.gov/assets/gao-14-219.pdf.
  18. Although not where sub-grantees and sub-contractors eventually spent those funds, a functionality that needs to be developed.
  19. U.S. Government Accountability Office, Recovery Act: Grant Implementation Experiences Offer Lessons for Accountability and Transparency, GAO-14-219, https://www.gao.gov/assets/gao-14-219.pdf.
  20. Fine, Glenn, Fighting fraud, waste, and abuse—the 2009 Recovery Act, Brookings, February 11, 2022, https://www.brookings.edu/articles/fighting-fraud-waste-and-abuse-the-2009-recovery-act/#:~:text=The%20Recovery%20Board%20was%20led%20by%20a%20full-time,in%20funding%20and%20employed%20approximately%2035%20staff%20members.
  21. U.S. Government Accountability and Transparency Board, Report and Recommendations to the President, December 2011, https://obamawhitehouse.archives.gov/sites/default/files/gat_board_december_2011_report_and_recommendations.pdf.
  22. Federal Emergency Management Agency, Compliance Dashboard presented to National Academy of Public Administration Grants Management Symposium, https://s3.us-west-2.amazonaws.com/napa-2021/Grants-Management-Symposium/FEMA_COD_Handout.pdf.
  23. For more background on this example, see Metzenbaum, Shelley, Federal Grants Management: Improving Operational Quality, IBM Center for The Business of Government, 2021, pp. 11-15, https://www.businessofgovernment.org/sites/default/files/Improving%20Operational%20Quality.pdf#page=11.
  24. U.S. Pandemic Response Accountability Committee, Fraud Alert: PRAC Identifies $5.4 Billion in Potentially Fraudulent Pandemic Loans Obtained Using Over 69,000 Questionable Social Security Numbers, https://www.pandemicoversight.gov/sites/default/files/2023-01/PRAC%20fraud%20alert%20on%20potential%20SSN%20fraud_1.pdf.
  25. Buble, Courtney, “‘These were not normal times’: A former watchdog reflects on COVID-19 oversight,” Government Executive, June 30, 2023, https://www.govexec.com/oversight/2023/06/these-were-not-normal-times-former-watchdog-reflects-covid-19-oversight/388102/.
  26. White House, Memorandum for the Heads of Executive Departments and Agencies re Advancing Effective Stewardship of Taxpayer Resources and Outcomes in the Implementation of the Infrastructure Investment and Jobs Act, M-22-12, https://www.whitehouse.gov/wp-content/uploads/2022/04/M-22-12.pdf.
  27. Chief Financial Officers Council, Managing for Results: The Performance Management Playbook for Federal Awarding Agencies, April 2020, p. 35, https://www.cfo.gov/wp-content/uploads/2021/Managing-for-Results-Performance-Management-Playbook-for-Federal-Awarding-Agencies.pdf.
  28. Federal Data Strategy, Leveraging Data as a Strategic Asset, 2020 Action Plan, https://strategy.data.gov/action-plan/#action-20-develop-a-data-standards-repository.
  29. Womer, Jonathan and Kathy Stack, Blending and Braiding Funds: Opportunities to Evaluation Capacity in Human Service, SSRN, April 2023, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4403532.

Previous Blog Posts on Transforming the Business of Government: Insights on Resiliency, Innovation, and Performance 

Introduction

Chapter One – Emergency Preparedness and Response 

Chapter Two – Cybersecurity 

Chapter Three – Supply Chain

Chapter Four – Sustainability 

Chapter Five – Workforce 

Chapter Six – Eight Areas for Government Action – Insights on Resiliency 

Chapter Seven – AI Literacy: A Prerequisite for the Future of AI and Automation in Government

Chapter Eight – Design Principles for Responsible Use of AI to Enhance Customer Experience Using Public Procurement

Chapter Nine – Quantum Technology Challenge: What Role for the Government?

Chapter Ten – Using Linked Administrative Data to Advance Evidence-Based Policymaking
