NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

National Academy of Sciences (US), National Academy of Engineering (US), and Institute of Medicine (US) Committee on Science, Engineering, and Public Policy. Implementing the Government Performance and Results Act for Research: A Status Report. Washington (DC): National Academies Press (US); 2001.


3. Communication Issues

One of the challenges to agencies and oversight bodies is to interpret the law itself in ways that achieve the desired results of accurate reporting and accountability. The language of the law is general; it addresses the agencies as a single population without distinguishing among them. As mentioned above, however, the variations in structure and function of agencies are considerable. For the most part, individual agencies have been left with the task of working out for themselves the best way to interpret such terms as “program activity,” “performance indicator,” and “program result” within their particular structures.

In the absence of detailed, continuing discussions among the creators of the law, oversight bodies, and agencies, the agencies have little guidance on the best ways to apply such terms to existing agency procedures and research programs. In addition to its comments on these communication difficulties, the panel offers several specific observations about the issue of oversight.

The issue of communication is fundamental to the implementation of GPRA. The essential objective of the law, in the context of research, is to improve the management of government services. A key element of that management is communication between federal agencies and oversight bodies about the scientific and engineering research needs of the nation as a whole. To achieve that objective, long-standing terms and customs of the agencies must be translated and communicated so that they are clear outside the agencies. If this can be achieved, internal “language” and GPRA “language” can be reconciled toward the dual goals of facilitating congressional budgeting activities and enhancing the effectiveness and efficiency of agency management.

This chapter examines the communication between the agencies and the primary audiences for their GPRA reports: oversight groups, the users of research, and the public. As was the case with the previous chapter, the observations here are based on the panel's focus groups and workshop, in which agency and oversight group representatives discussed agency responses to the following questions:

  • How is the result communicated to different audiences (e.g., S&T community, advisory committees, agency leadership, administration, Congress)?
  • How is the result used in internal and external decision-making?

The agency responses are summarized in Appendix C and the workshop discussion is provided in Appendix D.

3.1. Communication Between Agencies and Oversight Groups

The viewpoints of Congress, GAO, OMB, and other entities interested in the implementation of GPRA vary with their specific charges. In general, however, all of them have expressed a desire to know more about:

  • What procedures the federal agencies use to comply with GPRA;
  • How successful those procedures are; and
  • How the GPRA planning and reporting processes can serve agency missions and the public interest better than the existing documentation does.

The panel's discussions with agency and oversight representatives made it clear that communication between agencies and oversight groups must be more regular, extensive, and collaborative to facilitate agency responses to GPRA. Participants in the focus groups and workshop emphasized that improved communication about methods could hasten the implementation of GPRA, increase its value as a planning and accountability tool, and reduce the cost of compliance.

One common complaint from agencies is that oversight bodies are quicker to criticize shortcomings than to suggest improvements or specify desired outcomes. The most constructive course would be for oversight bodies to suggest how agencies can use procedures already in place without adding steps. GPRA should not “make extra work,” and oversight bodies should be willing to work with agencies to ensure that this does not happen. The public benefit of such a course is the elimination of unnecessary cost and duplication of effort.

As mentioned above, one objective of GPRA and of the oversight bodies is to clarify the mechanisms used by agencies to validate and verify their evaluation procedures. For example, oversight representatives would like to be assured that the reviewers of research programs are objective, experienced, and expert—again, a communication issue. Oversight bodies have expressed an inability to see or understand how those qualities are validated by agencies, and they have asked for improved communication about the procedures.

In addition, agencies sometimes receive conflicting signals from oversight groups. Even from a single oversight entity, they might receive different guidance from different staff members. One agency revised its accounting to link GPRA reports more tightly with its budgets, for example, only to be told by a congressional oversight committee to return to the previous format to which the committee was accustomed.

Further, different congressional committees prefer different levels of aggregation. Some prefer a high level, in which many or all research programs are considered together; others prefer a disaggregated approach. This presents a confusing picture to agencies.

Most important, agencies have not yet seen their reports used in the congressional decision-making that determines the size and form of their budgets. That absence could reduce the incentive of agencies to integrate their own planning and budgeting functions with the requirements of GPRA. Without such integration, agencies duplicate their reporting efforts to serve internal budgeting functions and GPRA requirements separately.

Recommendation C-1

Agencies and oversight groups should strive to communicate more effectively with each other so as to improve agencies' progress in implementing GPRA. During focus groups, the workshop, and interviews, it was consistently clear that improved communication between these two sectors could reduce the difficulties and misunderstandings experienced by some agencies. Agencies should provide brief, clear summaries of the procedures by which they perform expert review, aggregate programs, validate evaluation methods, set research priorities, and include user groups and other members of the public in planning and reporting. That simple step would allow a clearer view of the links between GPRA documents and agencies' internal procedures.

3.2. Communication by Agencies with User Groups and the Public

User groups, as described above, represent important segments of the public that are served by publicly funded research. They have important roles to play in planning and evaluating research programs. Some agencies make good use of such groups on panels and review committees, but their activities are seldom made clear in GPRA documents or to oversight groups. Agencies can demonstrate the value and operation of their review processes better by publicly describing them to oversight groups, the potential users of research results, and the general public.

Recommendation C-2

Agencies should seek to demonstrate more clearly to users and the public how they set priorities for evaluating research programs.

3.3. Communication by Oversight Groups

Good oversight need not impede agencies that are making a sincere effort at compliance. Oversight activity is likely to be lowest for agencies that have advanced in their efforts to comply with GPRA and greatest for those still struggling. Agencies, for their part, should strive to make their research clear to the nonscientists in oversight bodies.

Recommendation C-3

Oversight groups should provide more clarity and consistency in their expectations of agencies that are striving to comply with the requirements of GPRA. They should consult, as requested, on practical ways to integrate agencies' internal planning and reviewing practices with GPRA requirements. They should also meet more often among themselves to coordinate their expectations of agency practices.

3.4. Communication Within Agencies

One objective of the law is to encourage the integration of program activities and strategic planning. With or without GPRA, in fact, each agency can benefit from reviewing its research programs in the light of how they and their individual projects serve the broader strategic plan. The strategic plan, in turn, should evolve year by year in view of the changes that are made in individual programs and projects. More effective use of strategic planning can allow oversight groups to understand the contribution of individual programs to an agency's mission and hence improve agency-oversight communication.

Recommendation C-4

Agencies, especially large mission agencies, should seek to improve internal communication about GPRA so that the evaluation of research activities is not hidden within the agency's overall GPRA reporting.

3.5. The Issue of Timing

Although agency representatives expressed enthusiasm for using the criterion of quality to evaluate research on a regular basis, they voiced concern over the requirement to provide annual reports for their basic research programs. As explained earlier, basic research often does not produce useful results in a single year and must be monitored over several years before outcomes become apparent. A particular concern on the part of agency representatives is that programs and individual researchers might feel pressured to produce evidence of annual achievement in the form of “extra” publications or other meaningless metrics. Such activities would waste valuable resources and distract researchers from productive work.

Because the value of investments in basic research can be evaluated only over long periods, retrospective methods might be more effective than annual reports. For example, NSF is experimenting with “rolling” assessments whereby one-third of the portfolio is evaluated each year. Every research project is thus evaluated every 3 years, a reasonable period in which to expect results.

NSF suggests that that method could be useful for other agencies and that the 3-year focus be applied to performance plans, as well as performance reports. Thus, an agency would set 3-year performance targets, rather than annual performance targets, on research goals. Performance reports would still be annual, but they would cover one-third of the portfolio each year. They would also describe trends in the direction of basic research, the rate of progress of research, and the productivity of special initiatives. Management goals and short-term objectives in applied-research programs, where targets are more easily calibrated and predicted, would still be described annually in performance plans and reports.

A potential benefit of GPRA is the ability to strengthen agencies' planning procedures by making available the research results of previous years. Because of timing requirements built into the legal guidelines of GPRA, agencies find that they must begin work on future performance plans before the most recent performance reports are available. For example, in November 2000, one agency was beginning its performance plan for 2002 before it had finished its performance report for 2000.

One reason for the difficult timing is that the act was designed to enable oversight groups to connect each performance plan and performance report directly with its corresponding annual budget. The timing is unfortunate for several reasons. Agencies and researchers need the flexibility to change the course of a research project if change is warranted by previous and current results. And neither agencies nor the public receive a benefit when agencies create detailed performance plans before they have sufficient recent information on the performance of current programs.

Recommendation C-5

Agencies should work with oversight bodies to create more-realistic GPRA reporting schedules. Such schedules should recognize the important difference between research programs of differing goals and time frames. Although yearly reporting may be appropriate for applied research, a 3-year (or longer) performance schedule for basic research would usually be more suitable and valuable. The schedules should allow agencies to use previous results when preparing performance plans. Agencies should also continue their efforts to integrate GPRA planning and evaluation procedures into current agency processes.

Specifically, the panel suggests that agencies engaged in basic research make 3-year performance plans and set 3-year performance targets for research goals in their performance plans, rather than targets that refer to particular fiscal years. Management goals and short-term objectives in applied-research programs should still refer to a 1-year period. Performance reports should be annual and stress trends and indicators of the direction of basic research and the level of progress and productivity of special initiatives.

3.6. Summary

Communication between agencies and oversight bodies is essential to making the GPRA process work. So far, the communication process has been flawed from the viewpoint of both sides. If agencies are clearer regarding their methodology and oversight groups are clearer and more consistent regarding their expectations, a better and more useful product will result.

Copyright © 2001, National Academy of Sciences.
Bookshelf ID: NBK44124
