National Academy of Sciences (US), National Academy of Engineering (US), and Institute of Medicine (US) Committee on Science, Engineering, and Public Policy. Implementing the Government Performance and Results Act for Research: A Status Report. Washington (DC): National Academies Press (US); 2001.

4. Conclusions and Recommendations

Over the last 4 years, federal agencies that support research in science and engineering have moved by stages toward full implementation of GPRA. The central objective of the act is to elicit from the agencies a regular accounting of the planning, performance, and results of their research activities.

Agencies have spent substantial time and effort in devising ways to implement the act. However, both the agencies and oversight bodies still need to refine how they interpret and implement GPRA and how they communicate with each other about it.

To assist in the complex processes of implementing GPRA, this report has attempted to summarize and interpret the experiences of agencies and oversight bodies. In particular, its major sections examine the current process and recommend the most appropriate methods of evaluating basic- and applied-research programs, the criteria that agencies can and should use to perform their evaluations, and the experiments and difficulties experienced by agencies in communicating their evaluation results internally and externally.

After its study of GPRA with agency and oversight personnel, the present panel has concluded that the manner of planning and evaluating research programs carries great importance. It is apparent that inappropriate methods and inadequate communication can harm the programs that the law seeks to strengthen. We hope that the general observations, conclusions, and recommendations in this report help agencies and oversight groups as they continue to take the incremental steps necessary to implement GPRA for the country's federal research programs.

Chapter 2 and Chapter 3 each contain specific recommendations for agencies and oversight bodies that are designed to expedite the implementation of GPRA. This chapter offers a brief set of more-general conclusions and recommendations that consolidate the major themes of the preceding text.

4.1. General Conclusions

The panel offers the following 10 conclusions:

Conclusion 1: All five agencies have made a good-faith effort to develop reporting procedures that comply with the requirements of GPRA. Some agencies stated that GPRA compliance has added substantially to the cost of their planning and evaluation activities in the form of staff time and resources. Others report that they have been able to integrate GPRA with their traditional budget and planning processes, although at some cost in time and effort.

Conclusion 2: Some agencies are using the GPRA process to improve their operations. These agencies report benefits in strengthening program management and in enhancing communication about their programs to the users of research and the general public. The need for such communication depends on the agency's goals and on the degree of concern about a given field of research or about new and emerging programs.

In promoting greater accountability, the act calls for firmer alignment of research programs with overall strategic planning. These agencies report progress on both counts: in strengthening the management of their programs and in enhancing their ability to communicate the value of their programs to the users of research and the public.

However, while some agencies report that they have been able to derive their GPRA requirements from the same management processes that they traditionally use for internal control and budgeting, others see GPRA requirements as extra burdens that add to the planning and reporting workload and divert staff time and resources from other activities.

Conclusion 3: The most effective technique for evaluating research programs is review by panels of experts using the criteria of quality, relevance, and, when appropriate, leadership. Agency approaches to GPRA research programs demonstrate the utility of expert review using the same criteria of quality and relevance as outlined in COSEPUP's original report. The international-leadership criterion is not yet applied by most federal agencies, although several are interested in such a measure. However, given the diversity in mission, complexity, culture, and structure of federal agencies that support research, it is not surprising that their approaches to GPRA have varied. One size definitely does not fit all.

Conclusion 4: Oversight bodies and some agencies need clearer procedures to validate and verify agency evaluations. In particular, oversight bodies expressed a desire for better understanding of the methodology and results of expert review evaluations.

Conclusion 5: Agencies choose to aggregate their research programs at different levels. Some agencies provide evaluations on a field-specific or program-specific basis; others do so for the research program in its entirety. Aggregating at a high level can make it difficult for oversight bodies to clearly see and understand the methods and programs that are the focus of the analyses.

Conclusion 6: The development of human resources as an agency objective sometimes does not receive explicit emphasis or visibility in GPRA plans and reports. When this objective is explicit, it not only affirms the value of the US tradition that includes graduate students in the research programs of their advisers but also shows how reductions in research funding can jeopardize the preparation of the scientists and engineers the nation will need in the future.

Conclusion 7: Agencies often receive conflicting messages from oversight bodies about the desired format, content, and procedures to be used in GPRA compliance. For example, one agency made an effort to tie its GPRA reports more closely to its annual budget, as required in the act, only to be told by a congressional committee to return to a previously used format; another agency was told the reverse.

Conclusion 8: Due to timing requirements built into the legal guidelines of GPRA, agencies find that they must begin work on performance plans before the relevant performance reports are complete. As a result, the potential benefit of GPRA in providing a mechanism for incorporating performance results of previous years into performance plans for later years is limited.

Conclusion 9: Communication between agencies and oversight groups is not sufficiently regular, extensive, or collaborative. During focus groups, the workshop, and interviews, it was consistently clear that improved communication between these two sectors could reduce the difficulties and misunderstandings experienced by some agencies.

Conclusion 10: The degree to which the GPRA results of research programs are being used by oversight groups for programmatic decision-making is uncertain. Are the results of the “results act” being used? In particular, agencies have not yet seen the use of their reports in the congressional decision-making that determines the size and priorities of their budgets.

4.2. General Recommendations

On the basis of these observations, the panel offers the following general recommendations:

Recommendation 1: Federally supported programs of basic and applied research should be evaluated regularly through expert review, using the performance indicators of quality, relevance, and, where appropriate, leadership.

The language of the act strongly urges agencies to evaluate their programs annually through the use of quantitative measures so that progress can be followed with clear numerical indicators. The panel reaffirms COSEPUP's earlier assertion that research programs, especially those supporting basic research, cannot be meaningfully evaluated this way annually. Instead, these programs can be evaluated over a somewhat longer term through expert review, which has a long tradition of effectiveness and objectivity.

Recommendation 2: Agencies should continue to improve their methods of GPRA compliance and to work toward the goals of greater transparency, more-realistic reporting schedules, clear validation and verification of methods, and the explicit use of the development of human resources as an indicator in performance plans and reports.

Transparency refers to the ability to readily see how and why an agency decides to emphasize or de-emphasize a particular program or area of research. When an agency describes its performance plans and reports from an agencywide point of view (for example, an agency might describe its efforts to reduce global warming as though it were a single program), it is difficult for oversight bodies or the public to understand the process of priority-setting. Although oversight bodies or agents of the public would not be expected to review the thousands of subentities that perform their own planning and reviewing within agencies, they can reasonably expect access to documents that help them to answer specific questions.

Although GPRA requires annual reporting on all programs, basic research often does not produce useful results in a single year and must be monitored over several years before outcomes become apparent. Agencies should experiment with alternative reporting forms, as permitted by GPRA, that provide realistic evaluations of long-term research.

Although expert review has long been the accepted method for evaluating research in the science and engineering communities, some aspects of its performance are unclear to outside observers. Agencies should make clear how they validate their research-evaluation methods, such as the means by which they select expert reviewers and choose to aggregate research programs for review.

Agencies have a large stake in the education and training of scientists and engineers, but this objective might not receive explicit emphasis or visibility in GPRA plans and reports. The objective must be explicit not only because it affirms the value of educating young scientists and engineers in the context of research, but also because it demonstrates how reductions in research funding could weaken the corps of human resources that are essential for the nation's future.

Recommendation 3: Agencies and oversight bodies should work together as needed to help agencies integrate their GPRA requirements with their internal planning, budgeting, and reporting processes. In addition, they should work together to adjust the timing of GPRA reporting to capitalize on the value of the planning process.

Whenever possible, agencies should use procedures already in place without adding steps. GPRA should not add unnecessarily to the workload of agencies, and oversight bodies should help agencies to ensure that this does not happen. At the same time, effective linkage of GPRA reporting with budgets may help agencies explain their needs to Congress and justify funding levels during periods of restrained budgets.

Recommendation 4: Agencies should strive for effective communication with oversight groups on the implementation of GPRA. For their part, oversight bodies should clarify their expectations and meet more often among themselves to coordinate their messages to agencies.

A principal purpose of GPRA is to improve how agencies communicate their results to oversight groups, the “users” of research, and the general public. More-effective communication will enhance the value of the act to all constituents.

As indicated in COSEPUP's first report, GPRA is potentially useful because it “provides an opportunity for the research community to ensure the effective use of the nation's research resources in meeting national needs and to articulate to policymakers and the public the rationale for and results of research.” However, the act will not fulfill its intended objectives unless the Senate and House Operations committees, working with OMB, identify and respond to agency concerns through open discussion. Unless the agency responses to GPRA are useful to Congress in the urgent task of setting priorities and budgeting, the value of the act might not warrant the time and effort it requires of the federal government.

4.3. Specific Recommendations

The specific recommendations presented throughout this report are collected below:

4.3.1. Agency Methods

Recommendation M-1: Agencies should continue to take advantage of their existing expert review panels but should review the balance of their membership, particularly the need to include user groups, and should ensure that the time panel members devote to GPRA, as opposed to other topics, is not excessive. In addition, they should review the degree to which internal versus external reviewers are used.

Recommendation M-2: Agencies should continue to use peer review to evaluate the quality of their research programs.

Recommendation M-3: Agencies should clarify their use of relevance as a criterion in evaluating their research programs. User groups should be a part of the relevance evaluation process, and their role should be described clearly in performance plans and reports.

Recommendation M-4: Agencies should use international benchmarking to evaluate the leadership level of research programs, as described in COSEPUP's earlier Goals and International Benchmarking reports, especially for emerging fields of research and those of national importance.

Recommendation M-5: The development of human resources should be emphasized as an explicit objective of GPRA performance plans and reviews.

Recommendation M-6: Agencies that choose to aggregate their research-program activities at a high level should endeavor to make clear the decision-making processes that lie below this level.

Recommendation M-7: Agencies should devise ways to describe how they validate their research-evaluation methods, describing, for example, how they select expert reviewers and choose to aggregate research programs for review.

4.3.2. Communication

Recommendation C-1: Agencies and oversight groups should strive to communicate more effectively with each other so as to improve agencies' progress in implementing GPRA.

Recommendation C-2: Agencies should seek to demonstrate more clearly to users and the public how they prioritize and evaluate research programs.

Recommendation C-3: Oversight groups should provide more clarity and consistency in their expectations of agencies that are striving to comply with the requirements of GPRA.

Recommendation C-4: Agencies, especially large mission agencies, should seek to improve internal communication about GPRA so that the evaluation of research activities is not hidden within the agency's overall GPRA reporting.

Recommendation C-5: Agencies should work with oversight bodies to create more-realistic GPRA reporting schedules. Such schedules should recognize the important differences among research programs with differing goals and time frames. While yearly reporting may be appropriate for applied research, a 3-year (or longer) performance schedule for basic research would usually be more suitable and valuable.

4.4. Summary

Much has been learned about the procedures of planning, evaluation, and management in the last several years, and some value will have been gained by the agencies from their own discussion of accountability. However, one key remaining question is the degree to which oversight groups are using the results of the “results act” for programmatic decision-making. Unless the agency responses to GPRA are useful to Congress in the urgent task of setting priorities and budgeting, the value of the act might not warrant the time and effort it requires of the federal government. But by working more closely together than they have in the past, the federal agencies and the oversight bodies can implement the letter and spirit of GPRA in ways that lead to greater efficiency, lower cost, and more effective research programs that are demonstrably conducted in the national interest.

Copyright © 2001, National Academy of Sciences.
Bookshelf ID: NBK44121