ExpectMore.gov: Endocrine Disruptors (2023)

Program Code: 10002280
Program Title: Endocrine Disruptors
Department Name: Environmental Protection Agency
Agency/Bureau Name: Environmental Protection Agency
Program Type(s): Competitive Grant Program; Research and Development Program
Assessment Year: 2004
Assessment Rating: Adequate
Assessment Section Scores
Program Purpose & Design: 80%
Strategic Planning: 70%
Program Management: 91%
Program Results/Accountability: 26%
Program Funding Level (in millions)
FY2007: $19
FY2008: $19
FY2009: $16
  • Ongoing Program Improvement Plans
  • Completed Program Improvement Plans
  • Program Performance Measures
  • Questions/Answers (Detailed Assessment)

Ongoing Program Improvement Plans

Year Began | Improvement Plan | Status | Comments

Improvement Plan: By end of CY, collect data for second year of contracts and compare to baseline of the efficiency measure.

Status: No action taken

Completed Program Improvement Plans

Year Began | Improvement Plan | Status | Comments

Improvement Plan: Maintain funding at approximately the FY 2005 President's Budget level.


Improvement Plan: Articulate clearly R&D priorities to ensure compelling, merit-based justifications for funding allocations.

Status: Completed. The program's endocrine disruptors research priorities are clearly articulated in the Endocrine Disruptors Research Plan (EDRP) and an even more detailed Multi-Year Plan (MYP), in which priorities are specifically detailed from 2000 to 2012. In its independent, expert review of the program in December 2004, the BOSC Subcommittee on Endocrine Disrupting Chemicals noted that "goals... to address the underlying science needs for risk assessment and management of EDCs continue to be appropriate."

Improvement Plan: By the end of CY 2006, develop baseline data for an efficiency measure that compares dollars/labor hours in validating chemical assays.

Status: Completed. By October 30, 2006, collect data based on existing contracts.

Improvement Plan: By the end of CY 2007, collect data for first year of new contracts and compare to baseline efficiency measures.


Program Performance Measures


Measure: Determination of the extent of the impact of endocrine disruptors on humans, wildlife, and the environment to better inform the federal and scientific communities. (Targets and baseline under development).

Explanation: This is an Office of Research and Development (ORD) and Office of Prevention, Pesticides, and Toxic Substances (OPPTS) shared goal. The measure explicitly links the research program to the screening program's decisions and to environmental outcomes. Scientific progress of the research will be determined through external independent expert panels that will assess the appropriateness of the measure and the extent to which it has been met.


Measure: Reduction in uncertainty regarding the effects, exposure, assessment, and management of endocrine disruptors so that EPA has a sound scientific foundation for environmental decision-making.

Explanation: ORD measure. This long-term measure is a short-term outcome that explicitly links endocrine disrupting chemical (EDC) research to OPPTS decisions and environmental outcomes. Progress in reducing scientific uncertainty will be determined qualitatively through the use of external independent expert panels that will assess the appropriateness of the measures and the extent to which they have been met.


Measure: Improved protocols for screening and testing.

Explanation: ORD measure. Provides an annual picture of research progress to develop screening and testing protocols for OPPTS to use. Additional annual milestones for 2007 and 2008 are described in the EDC Multi-Year Plan (MYP).

Year  Target  Actual
2001  9       9
2002  10      10
2003  8       8
2004  3       3
2005  2       2
2006  1       1
2007  6       3
2008  2

Measure: Assessment Milestones Met

Explanation: ORD measure. The targets include products such as guidance for assessing endocrine disruptors. Additional milestones for 2007 through 2012 are described in the MYP.

Year  Target  Actual
2001  1       1
2002  0       0
2003  1       0
2004  1       1
2005  0       0
2006  1       0
2007  0       0
2008  0

Measure: Risk Management Milestones Met

Explanation: ORD measure. Targets include products such as a Risk Management Evaluation of EDCs and a report on optimizing wastewater treatment plant operations to remove certain EDCs, to be used by the Office of Water. Additional milestones for 2007 through 2012 are described in the MYP.

Year  Target  Actual
2001  2       2
2002  0       0
2003  3       3
2004  5       5
2005  5       5
2006  3       3
2007  3       2
2008  1

Measure: Effects and Exposure Milestones Met

Explanation: ORD measure. Targets below include products that will help determine the extent of ED impact, such as reports identifying androgenic compounds in paper mill effluent; assessing children's exposure to pesticides, EDCs, and other persistent organic pollutants; and potential effects of flame retardants on human thyroid function. Additional milestones for years 2007 and 2008 are described in the MYP.

Year  Target  Actual
2001  22      22
2002  22      22
2003  18      18
2004  5       5
2005  5       5
2006  9       9
2007  4       5
2008  5

Measure: Cumulative number of screening assays that have been validated.

Explanation: The proposed targets are lower than the original targets as a result of a number of factors that influenced the rate at which assay validation could be completed and that were outside the control of EDSP. The word "screening" is deleted from the original text because this measure includes more than just the screening assays.

Year  Target  Actual
2005  11/20   0
2006  11/20   2/21
2007  8/20    3/20
2008  13/20
2009  14/19
2010  19/19

Measure: Contract cost reduction per study for assay validation efforts in the Endocrine Disruptor Screening Program.

Explanation: The average cost per study was calculated based on contract costs. A laboratory study was defined as the conduct of an assay with a single chemical in a single lab, and represents standardized study costs based on a mix of in vitro and in vivo studies, as well as detailed review papers. The baseline average cost per study is $62,175 in 2006. The measure of efficiency will be judged against a target of a 1% cost reduction per year for three (3) years.

Year  Target  Actual
2007  1%      63%
2008  1%
2009  1%
2010  1%
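The 1%-per-year target can be made concrete with a small worked example. The sketch below is illustrative only; it assumes the annual reduction compounds against the 2006 baseline of $62,175 per study, since the measure does not state whether the 1% applies to the prior year or the fixed baseline:

```python
# Illustrative calculation of the EDSP efficiency targets: a 1% contract
# cost reduction per study per year, compounded from the 2006 baseline.
BASELINE_2006 = 62_175.00  # average contract cost per validation study, FY2006

def target_cost(years_after_baseline: int, annual_reduction: float = 0.01) -> float:
    """Implied target average cost per study after compounding the annual cut."""
    return BASELINE_2006 * (1.0 - annual_reduction) ** years_after_baseline

for year in (2007, 2008, 2009):
    print(f"{year}: ${target_cost(year - 2006):,.2f}")
# 2007: $61,553.25
# 2008: $60,937.72
# 2009: $60,328.34
```

Under this reading, meeting the target for all three years would bring the average cost per study to roughly $60,328 by 2009.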

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design

Is the program purpose clear?

Explanation: The Endocrine Disruptors Program provides EPA with the scientific information necessary for the Agency to reduce or prevent potential unreasonable risks to human health and wildlife from exposures to chemicals that adversely affect the endocrine system, called endocrine disrupting chemicals (EDCs). The program develops and uses validated test systems and other scientifically relevant information to determine whether certain substances may have an effect in humans and wildlife similar to naturally-occurring estrogen. The program is comprised of two components: the screening program, which is mandated under the Food Quality Protection Act and the Safe Drinking Water Act, and the research program, which provides the scientific information and tools for the screening program to fulfill its purpose. These programs are carried out, respectively, by EPA's Office of Prevention, Pesticides, and Toxic Substances (OPPTS) and EPA's Office of Research and Development (ORD). Under statute, the program is to provide testing for all pesticide chemicals, at a minimum.

Evidence: Office of Research and Development Strategic Plan, pp. 32-33, (www.epa.gov/osp/strtplan/documents/final.pdf). Research Plan for Endocrine Disruptors, pp. 6-8, www.epa.gov/ORD/WebPubs/final/ORD-EDR-Feb1998.pdf. Multi-Year Plan for Endocrine Disruptors, pp. 4-6, (www.epa.gov/osp/myp/edc.pdf). Federal Food, Drug, and Cosmetic Act [21 U.S.C. 346a(p)]. Endocrine Disruptor Screening and Testing Advisory Committee (EDSTAC) Final Report (EPA/743/R-98/003), www.epa.gov/scipoly/oscpendo/edspoverview/edstac.htm. Authorizing legislation to conduct research: Section 346a(p) of the Federal Food, Drug, and Cosmetic Act, as amended by the Food Quality Protection Act of 1996 (P.L. 104-170); Safe Drinking Water Act, Section 1442, as amended, Public Law 93-523; Toxic Substances Control Act, Section 10, as amended, 15 U.S.C. 2609; Federal Insecticide, Fungicide, and Rodenticide Act, Section 20, as amended, 7 U.S.C. 136r.


Does the program address a specific and existing problem, interest or need?

Explanation: Congress required creation of the program due to the growing awareness over the last two decades of the possible adverse effects to humans and wildlife from exposure to chemicals that can interfere with the endocrine system. These effects can include developmental malformations, interference with reproduction, increased cancer risk, and disturbances in immune and nervous system function. Some chemicals cause these effects in wildlife, but more research is needed on the potential of chemicals to cause these effects in humans at environmental exposure levels. EDCs are suspected causes of the reported declines over the last four decades in male reproductive health (e.g., decreases in the quality and quantity of sperm production, increases in certain malformations of male reproductive organs) and of increases in certain endocrine-related cancers, such as breast, prostate, and testicular cancer. Very few chemicals have been tested for their potential to interfere with the endocrine system, and current standard test methods do not provide adequate data to identify potential EDCs or to assess the potential risks to humans and wildlife. FQPA required EPA to set up the EDSP using validated test methods; in addition to developing assays and determining which chemicals may affect the endocrine system, EPA is responsible for putting in place protections from potential harm from these chemicals.

Evidence: Special Report on Environmental Endocrine Disruption: An Effects Assessment and Analysis (EPA/630/R-96/012). www.epa.gov/ORD/WebPubs/endocrine/endocrine.pdf. National Research Council, Hormonally Active Agents in the Environment, NAS Press, Washington, DC, (www.nap.edu/books/0309064198/html/). World Health Organization/International Programme on Chemical Safety, Global Assessment of the State of the Science of Endocrine Disruptors, 2002, pp. 1-3; 131-132, www.who.int/ipcs/emerg_site/edc/global_edc_TOC.htm. Committee on Environment and Natural Resources (CENR), The Health and Ecological Effects of Endocrine Disrupting Chemicals: A framework for planning, National Science and Technology Council, Office of Science and Technology Policy, November 22, 1996, Washington, DC., pp. 14-15 (www.epa.gov/endocrine/frametext.html)


Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: EPA made special efforts when creating the program to consider other ongoing efforts, but it is not clear that some elements of this program are not currently duplicative or redundant of other federal efforts. An interagency working group (IWG) on ED research has been in place under the Administration's Committee on the Environment and Natural Resources (CENR). EPA chairs the IWG, which includes fourteen agencies. Over the years, the CENR working group: 1) developed a framework for federal research related to the human health and ecological effects of EDCs; 2) developed an inventory of ongoing federally funded research on EDCs; and 3) overlaid the framework with the inventory to identify high-priority research gaps in the federal portfolio. CENR, the IWG, and EPA, however, have not been diligent about reviewing what type of research is conducted by the member agencies to make sure that no duplication is occurring unless deemed necessary. CENR has recently solicited from agencies a detailed accounting of ED research being pursued. While some degree of duplication is acceptable, the explanation must justify that the program provides value beyond that of similar efforts, including those at other federal agencies. The screening program is the only US program statutorily required to develop assays and require testing of pesticides for their potential to disrupt the endocrine system, and therefore it does not duplicate any other entity's efforts.

Evidence: In the absence of this most recent CENR inventory, which the agencies have not yet filled out, it is not clear that duplication of some ED research is not occurring, particularly health effects research, and whether this duplication is appropriate. Committee on Environment and Natural Resources (CENR), The Health and Ecological Effects of Endocrine Disrupting Chemicals: A framework for planning, National Science and Technology Council, Office of Science and Technology Policy, November 22, 1996, Washington, DC, pp. 1-2 (www.epa.gov/endocrine/frametext.html).


Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The research and screening programs were developed with input from external scientists and stakeholders to ensure the most appropriate design. EPA solicited input from scientists nationally and internationally in identifying research questions prior to developing its ED Research Plan. The Research Plan was then reviewed by an external peer review panel (1997) to ensure that the research program addressed the most critical scientific questions. The research program drafted a multi-year plan (MYP) in 1999 and updated it three times to take into consideration scientific progress. The screening program was designed by EDSTAC, a Federal advisory committee with broad stakeholder and scientific input, on the basis of the best available science. The Science Advisory Board (SAB) and the Scientific Advisory Panel (SAP) reviewed the proposed design of the program in 1999. The SAB/SAP cautioned EPA about taking on too much too soon in this complex area and recommended that EPA implement the program in stages, with 50-100 chemicals for screening in the initial stage followed by a thorough review of the program before proceeding to a second batch of chemicals.

Evidence: Endocrine Disruptor Screening and Testing Advisory Committee (EDSTAC) Final Report, chapter 7, epa.gov/scipoly/oscpendo/history/finalrpt.htm. Review of the EPA's Proposed Environmental Endocrine Disruptor Screening Program, Joint Committee of the Science Advisory Board and Scientific Advisory Panel, EPA-SAB-EC-99-013, July 1999, pp. 1-3 (epa.gov/sab/pdf/ec13.pdf). Endocrine Disruptor Screening Program, Proposed Chemical Selection Approach for Initial Round of Screening; Request for Comment, December 30, 2002, www.epa.gov/scipoly/oscpendo/docs/12-02-frnotice.pdf.


Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: The research program is relevant to the specific national need of learning more about the potential exposures and effects of EDCs as well as assisting OPPTS, its customer, to create and refine assays for validation for OPPTS's screening of chemicals for ED potential. In at least one example, the research program works effectively with EPA's program offices, its other customers. For instance, the research program is conducting research on wastewater treatment operations to remove certain EDCs as well as research on identifying androgenic compounds in paper mill effluent, both to aid EPA's Office of Water. This question does not apply to the screening program at this time because the program is in the process of determining assays to validate for future screening; it does not have beneficiaries in the traditional sense of equity or disbursement of services.

Evidence: Endocrine Disruptor Screening Program, Proposed Chemical Selection Approach for Initial Round of Screening; Request for Comment, December 30, 2002 (67FR 79611) (www.epa.gov/scipoly/oscpendo/docs/12-02-frnotice.pdf). List of assays currently undergoing validation: www.epa.gov/scipoly/oscpendo/assayvalidation/status.htm. Research Plan for Endocrine Disruptors, pp. 7-8, www.epa.gov/ORD/WebPubs/final/ORD-EDR-Feb1998.pdf. Multi-Year Plan for Endocrine Disruptors, Fig. 4 (www.epa.gov/osp/myp/edc.pdf).

Section 1 - Program Purpose & Design Score: 80%
Section 2 - Strategic Planning

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The research and screening programs share a long-term outcome measure: to determine the extent of the impact of EDCs on humans, wildlife, and the environment to better inform federal and scientific communities. The research program has another long-term outcome measure, which is to reduce the uncertainty of the effects, exposure, assessment, and management of EDCs so that EPA has a sound scientific foundation for decisionmaking across the Agency. The research program also has a long-term output measure of conducting science that supports OPPTS's screening and testing program. The screening program has one long-term output measure, which is the number of chemicals screened for potential ED effects. An outcome measure, such as concentrations of EDCs in the environment or risk reduction in humans and/or wildlife, is not appropriate at this time, because the program has not yet determined which chemicals may cause adverse effects. Once the program has screened chemicals, it can then develop regulations to protect the public, and it will be able to collect data for more appropriate outcome measures at that time.

Evidence: See Measures tab.


Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Not all the measures have targets and timeframes. The targets for the long-term measures for the research program are not quantitative, and targets and progress will be verified by ratings from an expert scientific panel. For the research measures to be successful, the panel will look at the progress EPA has made on its "ten questions of uncertainty" from its research strategy and then determine appropriate qualitative targets (i.e., defining "success"). Each of the three measures, including the shared measure, addresses at least one of the ten uncertainty questions. The research program, however, is still in the process of developing a charge for external review and has not yet defined its LTG targets. The screening program's long-term measure is new, and its targets and timeframes are under development. The targets and timeframes are highly dependent on the future availability of screening assays. The targets and timeframes for both the research and screening programs are tied to each other's measures, because the programs depend on each other to accomplish the purpose of the program as mandated by Congress.

Evidence: See Measures tab.


Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The research program's EDC MYP lists annual measures, or milestones, that tie into the long-term measures, including the long-term measure shared with the screening program. Due to the variety of annual measures proposed, research conducted, and work products developed, it is not appropriate at this time for the research program to have an efficiency measure. The development of an efficiency measure for the research program should be evaluated in the future when both ORD and OPPTS have made more progress on the long-term performance measures. The screening program has annual measures to support its long-term measures, including the long-term measure shared with the research program. In addition, the screening program is developing an efficiency measure of cost per labor hour of contracted validation studies. The program is moving away from a large mission support contract to a new multiple-award contract, which will encourage competition for work assignments among several vendors. The efficiency measure will enable the program to determine costs to accomplish a specified amount of effort in assay validation and whether increased efficiencies were realized via a contract mechanism of vendor competition under each work assignment.

Evidence: See Measures tab. Multi-Year Plan for Endocrine Disruptors, pp. 18-33, Figures 1-3. (www.epa.gov/osp/myp/edc.pdf). Research Plan for Endocrine Disruptors, pp. 14-32. www.epa.gov/ORD/WebPubs/final/ORD-EDR-Feb1998.pdf.


Does the program have baselines and ambitious targets for its annual measures?

Explanation: The screening program has baselines, but not adequate targets. The baseline for the screening program is zero in 1996, because no ED screening and testing existed when FQPA was enacted and no prevalidation or validation studies had been completed. The baselines for the research program are represented by the state of science at the initiation of the integrated research program. For example, in 1998, EDSTAC identified a few assays to use as starting points but stated that no assays were considered to be 'validated' at the time. A description of the state of the science served as the basis of discussion and input at two international workshops EPA convened to identify key critical research needs before developing a Research Plan. The state of the science related to ED has also been evaluated in other documents subsequent to finalization of the Research Plan. The research program has annual milestones for its long-term goals, which are included in its MYP.

Evidence: See Measures tab. Multi-Year Plan for Endocrine Disruptors, pp. 18-33, Figures 1-3. (www.epa.gov/osp/myp/edc.pdf). Research Plan for Endocrine Disruptors, pp. 14-32. www.epa.gov/ORD/WebPubs/final/ORD-EDR-Feb1998.pdf.


Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Screening program: contractors and subcontractors commit to the program's long-term and annual goals, which are defined in the scope of work. The new Endocrine Disruptor Screening Validation Support Services contract will require periodic reviews to ensure that the newly developed annual performance measures and long-term measure are adhered to. Other federal agencies cooperate with EPA in standardization and validation of assays through Interagency Agreements (IAGs), which include performance goals. The program also has a commitment with the Organisation for Economic Co-operation and Development (OECD), with which it works, expressed via a statement in its objectives. Partners for the research program include EPA's program and regional offices, federal agencies, and extramural grantees. All partners commit to the research program's long-term and annual goals. Grant proposals are developed consistent with the LTGs and APGs of the research program and supportive of EPA's long-term goals. Annual reports from extramurally funded projects are reviewed for consistency with stated goals to ensure funded research supports long-term Agency goals. Periodic program reviews that bring together intramural and extramural scientists supported through the program are held to monitor progress toward the long-term goals.

Evidence: For the screening program: Annual reports, monthly invoices, and monthly progress reports from the contractor are reviewed for consistency with the stated objectives found in the scope of work. Regular meetings and communication with various workgroups (e.g., Priority Setting Workgroup and Regulatory Assessment Group) and partners such as OECD, to monitor progress toward the LTGs of the program. For the research program, EPA's program and regional offices commit to the annual and long-term goals of the research program through ORD's Research Planning Process (RPP), in which they participate via an ED Research Coordination Team. Scope of Work for contracts and IAGs (representative example - main support contract given). Standard Instructions for Submitting a STAR Application, pp. 10-11, (www.epa.gov/ncer/rfa/forms/standinstr_03-04a.pdf).


Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The screening program does not have independent reviews of sufficient scope that examine how well the program is accomplishing its purpose and meeting its long-term goals. A review of the proposed program was completed in 1999, but it focused on design and plans for implementation rather than performance. EPA cites reviews by its advisory committee (EDSTAC), which has met regularly in the past three years, and its stakeholder group (EDMVS). Because both committees are made up of stakeholders, they are not independent. The screening program must conduct another independent, high-quality review of its program. For the research program, the 1997 peer review of the research plan was an evaluation of design and implementation, not results. Components of the research program as well as the labs have been peer or expert reviewed, but a review of the performance of the research program is more appropriate. ORD is initiating a formalized mechanism for conducting independent external expert reviews of the quality, relevance, and performance of EPA research programs, including the ED research program; these reviews will take place every four to five years and are intended to provide a qualitative assessment of the progress of EPA research in advancing the science and reducing uncertainty. In addition, ORD's Board of Scientific Counselors (BOSC) will assess progress toward long-term goals.

Evidence: Review of the EPA's Proposed Environmental Endocrine Disruptor Screening Program. Joint Subcommittee of the Science Advisory Board and Scientific Advisory Panel. EPA-SAB-EC-99-013, July 1999. (www.epa.gov/sab/pdf/ec13.pdf). National Risk Management Research Laboratory, Scientific Written Peer Review of the National Risk Management Research Laboratory's Risk Management Evaluation (RME) of Endocrine Disrupting chemicals (EDCs), June 30, 2003. National Center for Environmental Assessment, Scientific Peer Review of An integrated human health and ecological species effects assessment: A case study of bisphenol A, 2003.


Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: EPA estimates and budgets for the full annual costs of operating its programs, taking into consideration any changes in funding, policy, and legislation. EPA managers use up-to-date financial, policy, and regulatory information to make decisions on program management and performance. EPA's financial information is integrated with performance and other program data to support day-to-day decisionmaking by managers and executives. The screening program's budget is transparent inasmuch as it has since inception been a "key program" accounted for separately in EPA's Congressional Justification. In FY 2004, the program was assigned a unique project ID tied to a single GPRA goal and objective in EPA's new accounting structure, which will enable the program to relate funding changes to performance and vice versa. The research program requires that all short- and long-term impacts to critical research paths described in the multi-year plans be identified in order to inform all investment or disinvestment decisions. Specifically, disinvestment proposals must be accompanied by information on impacts to existing annual and long-term goals, and investment proposals must be accompanied by meaningful performance information that is consistent with the research directions outlined in the relevant multi-year plan(s).

Evidence: Annual Congressional Justification, Budget Automation System (BAS) reports. EPA was selected as a government-wide finalist for the 2002 President's Quality Award in the area of budget and performance integration. Program/Projects for 2004-2005 sorted by PRC code.


Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The screening program is working to adopt a limited number of long-term and annual performance measures with ambitious targets and timeframes. The research program is implementing plans to conduct regular independent reviews of its research programs.

Evidence: Plans to review programs and ongoing effort to adopt adequate measures.


If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: As noted in 1.3, using the federal and global endocrine disruptors inventories of research, coordinated by EPA, it is possible to compare potential benefits of programs in other federal agencies, other countries (including EU and Japan), academia, and industry that address related topics associated with ED effects, exposure, assessment and management to ensure that work is not duplicated and to fill research gaps. CENR has initiated a data call to its ED workgroup members to report on activities and research being undertaken in the federal agencies so that a comparison can be made. Comparisons like these should occur more frequently in the future.

Evidence: CENR.


Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: ORD's Strategic Plan identified EDC research as one of six high-priority topics. ORD's Research Plan identifies and prioritizes key research areas within the broad topic of endocrine disruption, and the MYP identifies how and when EPA will address specific areas over the next ten years. The priorities are refined with each update of the MYP and consider input from the Agency's program and regional offices, progress to date, and awareness of research ongoing in other organizations. The priorities inform ORD's risk-based contingency planning process, in which: the program and regional-office members of the multi-media Research Coordination Teams (RCTs) identify their problem-driven research needs; RCTs prioritize the needs; and, ORD and EPA management base resource decisions on the prioritized information. In developing the priorities for the Research Plan and MYP, the degree of emphasis for research areas and long-term goals was based on the following: 1) the importance of the research to EPA program/regional offices; 2) magnitude of uncertainties in the knowledge base; 3) sequence of research needed for a final answer; 4) possibility that research would result in significant product(s) for hazard identification, risk characterization, or risk management; 5) technical feasibility of conducting a successful project; and 6) statutory timeframes.

Evidence: Strategic Plan of the Office of Research and Development, pp. 13-18, 30-31. (www.epa.gov/osp/strtplan/documents/ord96strplan.pdf). Office of Research and Development, 1997 Update to ORD's Strategic Plan, pp. 11-15. (epa.gov/osp/strtplan/documents/1997rev1.pdf). Research Plan for Endocrine Disruptors, p. 10, Appendix VI. www.epa.gov/ORD/WebPubs/final/ORD-EDR-Feb1998.pdf. Multi-Year Plan for Endocrine Disruptors, pp. 13-14. (www.epa.gov/osp/myp/edc.pdf). ORD Planning Guidance, Office of Science Policy Intranet Site (http://intranet.epa.gov/ospintra/Planning/fy05guid.pdf).

Section 2 - Strategic Planning Score: 70%
Section 3 - Program Management

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The screening program receives monthly progress reports from contractors on the progress of assay validation. Meetings with the research program's scientists are conducted weekly, as needed. The Endocrine Disruptor Methods Validation Subcommittee (EDMVS), established and administered by the screening program, provides technical and scientific advice on each of the assays as they progress through the validation process. The endocrine disruptor priority-setting workgroup, composed of members from multiple Agency program offices, meets weekly to develop Agency policy. The EDSP meets regularly, occasionally holds retreats, and briefs management several times per year to keep all abreast of progress and to chart future directions of the program. The research program reports progress toward achieving the performance measures quarterly, and the results are summarized yearly within ORD. This information is used to inform the annual planning process as well as to update the MYP. Contractors and holders of cooperative agreements are monitored on a regular basis to ensure their progress is compatible with the overall aims of the MYP. STAR grantees are required to report annual progress and final results, including significant accomplishments, which are posted on a public website. The ORD research program has been adjusted based on the data generated by ED research.

Evidence: Monthly Progress reports from contractors. STAR Web Site on Terms and Conditions (http://es.epa.gov/ncer/guidance/tscs99.html). Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM); evaluation of in vitro test methods for detecting potential endocrine disruptors: Estrogen receptor and androgen receptor binding and transcriptional activation assays, page xxv-xxvi, preface, National Toxicology Program, Research Triangle Park, NC, 2003 (http://iccvam.niehs.nih.gov/methods/endodocs/edfinrpt/edfinrpt.pdf).


Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: For the screening program, annual performance of managers and staff is evaluated considering their success in achieving planned accomplishments as well as the impact of unforeseen events. As a technology development and validation program, the program encounters setbacks that cannot be planned for. What is critical is how the program and staff respond to those setbacks and their success in moving forward despite them. The research program incorporates program performance into personnel performance evaluation criteria. Through mid-year and end-of-year performance reviews conducted by ORD's Deputy Assistant Administrator for Management, senior managers are held accountable for specific performance standards relating to program goals, including progress toward the targets and timeframes described in the multi-year plans. The research program also monitors progress against GPRA targets, including mid-year reviews with the Deputy Administrator. Some research is conducted extramurally. For both programs, contracts and grants, statements of work, deliverables, costs, and schedules are written into award terms. OPPTS and ORD project officers and Work Assignment Managers are responsible for ensuring that agreements are awarded and managed according to government regulations to provide value to the government and the public.

Evidence: EPA encourages that contracts and grant management be reflected in individual performance criteria as applicable. Manager's personnel performance evaluation criteria.


Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: The screening program has consistently obligated all of its funds in a timely manner since its inception. Specifically, it has always obligated its funds within the two-year period they are available for obligation, following an Operating Plan developed each year with the proposed budget and modified according to the Congressional appropriations. The funds are spent for the intended purpose. Obligations and expenditures are tracked in EPA's Integrated Financial Management System (IFMS) against the Operating Plan. The research program monitors obligations and spending via monthly IFMS status of funds reports. As of March 31, 2004, the research program had obligated 90% of its total two-year FY2003/2004 resources. EPA works with grantees to ensure that recipient spending is consistent with the approved workplan. Each program office and grants management office conducts post-award monitoring of assistance agreements, including monitoring the draw-down of funds against grantee progress on workplan tasks and deliverables. This monitoring ensures that recipients are spending the funds designated to each program area for the intended purpose. All grantees are required to submit annual or more frequent financial status reports.

Evidence: Budget Automation System (BAS) data. EPA Records Schedule 299-Budget Automation System (BAS) (www.epa.gov/records/policy/schedule/sched/299.htm). Environmental Protection Agency, Annual Reports and Financial Statements (www.epa.gov/ocfo/finstatement/finstatement.htm).


Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The screening program uses existing infrastructure and procedures in other programs to improve efficiency and reduce overall operating costs. The program uses competitive sourcing procedures for obtaining contracts. The program is drafting a Request for Proposal (RFP) for the future contract on assay validation as a performance-based contract rather than the current level-of-effort contract, to improve performance per program dollar. The program utilizes international partnerships, where possible, to share the costs associated with assay validation and to bolster the assays scientifically. Looking ahead, to help the screening program identify efficiency improvement opportunities for its three major program components more effectively, the program has begun implementing Microsoft Project, a project-tracking software tool. For the research program, ORD has a Total Cost of Ownership (TCO) initiative that builds upon guidance from the federal Chief Information Officer's (CIO) Council and best industry practices to develop more effective, efficient ways to provide quality desktop and server management. TCO's objectives are to reduce administrative expenses and increase efficiency while maintaining existing service levels. Through TCO, ORD has realized significant cost savings while enhancing the efficacy of conducting its science and research.

Evidence: The three components of ORD's TCO initiative that helped it realize cost savings are Desktop Replacement, Network Operations Center, and Consolidated Call Center. Infrastructure [Contract support (OPPT), IT supplies (OPPT), Grant support(OPPT SEES), FR Notice office (OPPTS), docket (OPPT)].


Does the program collaborate and coordinate effectively with related programs?

Explanation: The screening and research programs coordinate and collaborate extensively on the direction of the overall research program and on the development and validation of the assays. The ED program collaborates with the Office of Pesticide Programs (OPP), Office of Pollution Prevention and Toxics (OPPT), Office of Water (OW), and Office of General Counsel (OGC) on the endocrine program's priority setting. The ED program collaborates with federal agencies through the EDC IWG of CENR, which has resulted in two joint solicitations and the funding of 24 research projects (14 of them by EPA). Through the Federal Advisory Committee Act process, EPA scientists participate with stakeholders (e.g., environmental advocacy groups, the chemical industry, animal welfare organizations, academia, and other federal and state agencies) in a public forum to solicit advice as the program progresses toward the goals of the EDSP as mandated by Congress. These collaborations led EPA to jointly sponsor a series of workshops with the chemical industry and an environmental advocacy group. Internationally, the ED program collaborates and coordinates through the OECD on developing validated assays, particularly in the ecotoxicity testing areas. EPA also collaborates with the European Union, Japan, and the WHO to jointly sponsor ED workshops.

Evidence: CENR, The Health and Ecological Effects of Endocrine Disrupting Chemicals: A framework for planning, National Science and Technology Council, Office of Science and Technology Policy, November 22, 1996, Washington, DC., pp. 14-15 (www.epa.gov/endocrine/frametext.html) Proceedings for joint CMA and WWF workshops. Japan National Institute for Environmental Studies-US EPA International Workshop on Endocrine Disrupters, February 28 - March 3, 2001. Joint Research Centre European Commission, National Institute of Environmental Health Sciences, Environmental Protection Agency 1999. Expert Panel Meeting on Opportunities for Collaborative EU/US Research Programmes on Endocrine Disrupting Chemicals. World Health Organization/International Program on Chemical Safety Global Assessment of the State of the Science of Endocrine Disruptors (who.int/pcs/emerg_site/edc/global?edc_TOC.htm)


Does the program use strong financial management practices?

Explanation: The program follows EPA's financial management guidelines for committing, obligating, reprogramming, and reconciling appropriated funds. Agency officials have a system of controls and accountability, based on GAO and other principles, to ensure that improper payments are not made. At each step in the process, the propriety of the payment is reviewed. EPA trains individuals to ensure that they understand their roles and responsibilities for invoice review and for carrying out the financial aspects of program objectives. EPA received an unqualified audit opinion on its FY 2003 financial statements and had no material weaknesses associated with the audit. EPA is taking steps to meet the new accelerated due dates for financial statements. The ED program has no material weaknesses as reported by the Office of the Inspector General (OIG) and has procedures in place to minimize erroneous payments. Since October 2003, the screening program has instituted new procedures for identifying and distributing screening program funds in IFMS more in alignment with the program's goals and objectives.

Evidence: Budget Automation System (BAS) reports. Unqualified audit opinion on EPA FY02 financial statements. Fiscal Year 2002 Advice of Allowance Letter. 2002 Integrity Act Report, resource policies at: http://intrasearch.epa.gov/ocfo/policies. EDSP Annual Budget (Tracking Sheets). GAO 'scrub' of the EDSP FY04 budget completed in spring 2003.


Has the program taken meaningful steps to address its management deficiencies?

Explanation: In FY 2003, no Agency- or program-level material weakness was identified for the EDSP or ED research program by the Federal Managers Financial Integrity Act (FMFIA) annual review process. In addition, any recommendations to the research program that may result from the PART review will be addressed through ORD's MYP team and its National Program director, who oversees the management of the program.

Evidence: FMFIA Annual Review Process. MYP development/review process. Memo Announcing National Program Director, December 17, 1999.


Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The research program's extramural research grants are awarded through EPA's Science to Achieve Results (STAR) grants program, which uses external scientific peer review to rate applications based on scientific merit. Only applicants rated as excellent or very good, typically 10 to 20 percent of proposals, are considered for funding based on relevance to EPA's programmatic priorities.

Evidence: EPA National Center for Environmental Research website: RFA announcements (http://es.epa.gov/ncer/).


Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The research program designates grant project officers to monitor grantee performance, including submission of annual progress reports and compliance with federal requirements. Grantees provide a list of publications, presentations, and other activities on an annual basis and at the end of their grant period. Three workshops have been held that brought together all the grantees to discuss their work, one of which integrated grants' results with results from the intramural program. The research program also has one cooperative agreement with CIIT Centers for Health Research in RTP. ORD's oversight practices for cooperative agreements are similar to those for grants in that project officers monitor performance through progress reports and are provided lists of publications, presentations and other activities annually and at the end of the agreement. Because EPA researchers work much more collaboratively with cooperative agreement partners than with grantees, there is a significantly greater amount of involvement in, and oversight of, their activities.

Evidence: Grant Project Officers. Cooperative agreement with CIIT Centers for Health Research


Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: An annual progress report is submitted by each grantee and posted on the EPA National Center for Environmental Research website. Reports are distributed to EPA staff to disseminate to interested parties. These reports include summaries of progress in relation to project objectives as well as publications of research results. Grantees also present results at the multitude of ED-related national and international scientific conferences held annually. Project officers monitor cooperative agreement performance through annual progress reports. Results of cooperative agreements are made available through publication in scientific journals. Progress reports and publication information are posted on the NCER web site.

Evidence: Grantee Annual Progress Reports. EPA National Center for Environmental Research website: RFA announcements (http://es.epa.gov/ncer/). Results of cooperative agreements published in scientific journals.


For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: The research program does not provide sufficient evidence that its allocation of funding, including intramural funding, is based on a merit-based competitive process. The program cites that internal funding is allocated to high-priority project areas as determined by ORD's planning process and internal programmatic reviews. The program also notes that it has a comprehensive quality assurance program and that division-level peer reviews take place every three to four years. Guidance states that allocating funds through the selection of research proposals based solely on peer review is not equivalent to a merit-based competitive process, so peer review alone would not satisfy this question.

Evidence: Evidence as required in the guidance is not provided.

Section 3 - Program Management: Score 91%
Section 4 - Program Results/Accountability

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The research program has demonstrated progress in achieving its long-term performance goals, including the measure shared with OPPTS. In 1998, few assays were identified as starting points, and none were considered "validated" at the time. The research program refined these assays and developed new ones when the starting-point assays were found to be unreliable or inadequate. The research program is developing not only assays for near-term use through the screening program but also the subsequent generation of assays using molecular and computational approaches (such as EPA's Computational Toxicology research program). The screening program's long-term goals are new or under development, but historical information has shown that the program has been slow to make progress toward its LTGs. Although the program was mandated by FQPA in 1996, to date it does not have any validated assays and therefore cannot begin to screen chemicals. Public sentiment appears to reflect this concern: in 1999, the Natural Resources Defense Council (NRDC) sued EPA, contending in part that EPA had not moved expeditiously to implement the screening program.

Evidence: Historically, ORD's EDC research program has developed performance indicators that monitor research activities and outputs, which is appropriate for a research program in the early stages of its development. ORD has since changed its performance management framework to implement recommendations from EPA's IG, OMB, OSTP, and the NRC as is appropriate for a more mature research program. Because some of the starting point assays failed during the process of development and demonstration, the program had to set even more ambitious targets to develop assays that were reliable, relevant, and reproducible - targets that it has met.


Does the program (including program partners) achieve its annual performance goals?

Explanation: The screening program has not made as much progress as expected for the assays targeted for inclusion in the program, which are its APMs. To be fair to the program, the original performance goals were overly ambitious and assumed that all scientific studies would be performed perfectly, would not require significant coordination with groups outside EPA's control, and would give unambiguous, ideal results. A number of unexpected hurdles, listed in the evidence section, hindered expected progress for all the original measures. FQPA mandated the screening program in 1996, and to date, no assays have been validated; however, all assays in the Tier 1 battery are in the latter phases of validation, and Tier 2 assays are progressing through pre-validation. For the research program, all annual performance targets supporting the LTGs, except one, have been completed on schedule. Several protocols developed through the ED research program are being validated either for use by the screening program or by the OECD. To date ORD has produced at least eight protocols, prepared detailed review documents for three protocols, and was a major contributor to another four detailed review documents. In addition, ORD is working on the next generation of assays. Another indicator of progress in ED research, which should not be taken alone as a measure of success, is that almost 400 articles have been published in peer-reviewed journals under the ED research program.

Evidence: Screening: The original annual goals did not take into account many complexities in a validation activity of this scale. Additional time costs associated with contract performance (time developing acceptable study plans, QAPPs, etc.), experimental success (some experiments fail requiring modification of test method or otherwise provide unexpected results), lab capacity (limited number of experiments that can be performed at any given time), biological limitations (time required for animals to be available at proper life stage, acclimation time, in-life phase of the experiment), FACA consultation (time to consult with advisory group, assimilate feedback, make modifications, and adjust experimental time frames), and international coordination (time to coordinate with international partners) were not duly accounted for in the original goals. EPA is in the process of revising its official annual targets to consider the recognized complexities and make the new targets both ambitious and realistic. In view of the constraints encountered, program accomplishments have progressed as rapidly as technically feasible and have been remarkable.


Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The screening program has used available resources efficiently, establishing worksharing relationships with national and international partners in a number of areas, including international validation efforts for several key assays. A proposed new multi-award contract will be implemented to encourage competition and improve efficiencies. The contract will be performance-based and adaptive to further promote efficiency. In conjunction with these changes, the program is developing a 'dollars/labor hour' efficiency measure for its validation contract efforts. The screening program has begun, or will begin in the near future, implementing improvements such as a new contracting mechanism and Microsoft Project, as referenced in Question 3.4, which will help it realize gains in this area. The research program has also achieved cost savings from its Total Cost of Ownership initiative as well as from international collaboration on science and research.

Evidence: Multi-award contract package. OECD Endocrine Disrupter Testing and Assessment homepage www.oecd.org/document/62/0,2340,en_2649_34377_2348606_1_1_1_1,00.html. TCO initiative actions to realize cost savings: Desktop Replacement, Network Operations Center, and Consolidated Call Center.


Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: There are federal programs conducting similar research into the exposure and effects of EDCs; however, no comparison has been made. CENR is undertaking an information-collection effort, but its results will not be available in the near future.

Evidence: CENR.


Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Components of the research program have undergone different levels of review. Periodic peer reviews have indicated the research is consistent with the program's goals, and the reviews have been supportive of the program's direction. The National Academy of Sciences reviewed the EDC extramural grants as part of its review of EPA's Science to Achieve Results (STAR) grants program as a whole, and noted that "[STAR] research in endocrine disruptors . . . has resulted in groups of peer-reviewed publications of immediate use in understanding causes, exposures, and effects of environmental pollution." Four of EPA's labs that conduct ED research convened external panels to review those programs and adjusted the research program in response to recommendations from each of these panels. The research supporting the screening program has resulted in protocols that were proposed for validation. It is expected that the research program's implementation of future regular, independent reviews against the R&D Investment Criteria, as well as the BOSC review of the MYP, will inform the program and the public of the research program's performance in meeting its long-term goals, including the one shared with the screening program. Because the screening program has not had an independent review of its results since 1999, the ED program as a whole can receive only partial credit for this question.

Evidence: Review of the Endocrine Disruptor Screening Program by joint subcommittee of the Science Advisory Board and Scientific Advisory Panel (www.epa.gov/science1/pdf/ec13.pdf) (1999; did not focus on performance). National Academy of Sciences review 'The Measure of STAR', pp. 20-63 (www.nap.edu/books/0309089387/html/). National Risk Management Research Laboratory, Scientific Written Peer Review of the National Risk Management Research Laboratory's Risk Management Evaluation (RME) of Endocrine Disrupting Chemicals (EDCs), June 30, 2003. National Center for Environmental Assessment, Scientific Peer Review of An integrated human health and ecological species effects assessment: A case study of bisphenol A, May 28, 2003.

Section 4 - Program Results/Accountability: Score 26%
