On behalf of the Evaluation Officer Council, we are thrilled to share the winners of the Annual Evaluation Community Awards Program! These awards celebrate Federal civil service leaders committed to uplifting evaluation and evidence-based policymaking across the Federal Government, work that far too often goes unrecognized. Individuals from across the Federal evaluation community, as well as those who work alongside Federal evaluators, nominated 26 individuals and teams representing 15 different agencies. A cross-functional panel reviewed the submissions and recommended recipients in each category. OMB presented the awards at the October 2023 Evaluation Officer Council (EOC) meeting.
The winners of the 2023 Evaluation Community Awards profiled below reflect the commitment and ingenuity of the Federal evaluation community. The excerpts from their nominations tell compelling stories of how Federal evaluators develop high-quality evidence that program and policy leaders rely on, contributing to positive change across Government. Each recipient's profile showcases excellent work that we hope inspires others to connect with and celebrate the Federal evaluation community in all it does to improve Federal programs and enable evidence-based policymaking and decision-making.
EOC Distinguished Contribution Award
Megan Kays, OPM
Evaluation and Evidence Team of the Year Award
COVID-19 Public Education Campaign Team, HHS/DOC/EOP
Evaluation in Action Award
Enoh T. Ebong, USTDA
Evaluation Officer Council Distinguished Contribution Award
For Evaluation Officer Council members who make exceptional contributions to the Federal evaluation community.
Megan Kays, Evaluation Officer
US Office of Personnel Management (OPM)
Meg leads the Office of Personnel Management's (OPM) evaluation and evidence-building initiatives under the Foundations for Evidence-Based Policymaking Act, including implementation of OPM's Learning Agenda and Annual Evaluation Plans. Meg also coordinates initiatives to strengthen the agency's use of evidence in strategic initiatives and to build capacity for research and evaluation.
"Meg Kays has played a leading role in building a stronger culture of learning and continuous improvement at OPM. She led the development of OPM’s first Learning Agenda and first Research, Analysis, Statistics, and Evaluation Capacity Assessment. The Learning Agenda includes questions related to top OPM and Administration priorities, and Meg is actively leading OPM-specific and Government-wide studies on these and other topics.
Meg has played a critical role in filling knowledge gaps related to OPM’s strategic plan and equity action plans. She established OPM’s first Research Community of Practice to foster cross-organizational connections and knowledge transfer among OPM’s research, evaluation, and statistics experts. She also established and led a cross-organizational Data Sharing Working Group to determine how to share data externally with research partners. In a short period of time, Meg has become recognized within OPM as a trusted advisor to leaders, managers, researchers, evaluators, and analysts throughout the agency on matters related to research and evaluation."
Evaluation and Evidence Team of the Year Award
For Federal teams who collaborate across disciplines and silos to conduct evaluation(s) that generate critical evidence for decision-making in or across agencies.
COVID-19 Public Education Campaign Team
Department of Health and Human Services, Department of Commerce, and Executive Office of the President
- Sarah Trigger, HHS/FDA
- Katherine Margolis, HHS/FDA
- Kathleen Yu, HHS/FDA
- Daphney Dupervil, DOC/Census
- Maysoon Malik, HHS
- Monica Vines, DOC/Census
- Elizabeth Petrun Sayers, HHS/FDA
- Morgane Bennett, HHS/FDA
- Lynn Sokler, HHS/CDC
- Allison Kurti, HHS/NIH
- Jessica Weinberg, HHS/FDA
- Joshua Peck, EOP
- Trinidad Beleche, HHS
- Nicholas Holtkamp, HHS
- Lok Wong Samson, HHS
- Aaron Kearsley, HHS
"The HHS COVID-19 Public Education Campaign is a national initiative to increase public confidence in and uptake of COVID-19 vaccines and educate the public about the availability of COVID treatments while reinforcing basic prevention measures. The Campaign is led by a team of government employees on detail to HHS, including those who specialize in research and evaluation as well as those in other disciplines who led efforts on partnerships, media placement, and strategic planning.
For one of the largest public education campaigns in U.S. history, evaluation was critical to demonstrating success and securing funding. The Campaign relied on a multi-method approach to evaluate effectiveness, including social media analysis, in-depth interviews, focus groups, and longitudinal and cross-sectional nationally representative surveys. This methodology reflects a novel way to simultaneously evaluate short- and long-term campaign effectiveness.
As COVID-19 was an evolving pandemic, the team demonstrated agility and efficiency to quickly analyze data and tailor messaging. The evaluation findings show that the Campaign helped the public make informed decisions about their health and COVID-19. Specifically, exposure to the campaign increased vaccine confidence and likelihood of first-dose vaccination."
Evaluation in Action Award
For Federal executives or program leaders who use evaluation results to drive program improvement.
Enoh T. Ebong, Director
US Trade and Development Agency (USTDA)
Enoh T. Ebong is the Director of the U.S. Trade and Development Agency. Nominated by President Biden to serve as USTDA's Director, she was confirmed by unanimous consent of the U.S. Senate. As Director, Ms. Ebong leads USTDA in its efforts to develop sustainable, clean infrastructure and foster economic growth in emerging economies, while also supporting U.S. jobs through the export of U.S. goods and services.
"Director Ebong has a long history of using and elevating evaluation and evidence to help make evidence-informed decisions. The examples are too numerous to count, but some recent examples include:
- Insistence on using evidence from past climate obligations to improve a strategic objective goal in the 4-Year Strategic Plan.
- Use of findings [from] pilot projects to inform how resources can best be prioritized during consideration of new pilot project activities. This led to changes to a program’s design to support effective and responsible use of U.S. government resources.
- Use of multiple evaluation findings relating to stakeholder responsiveness and its resulting impacts on USTDA's ability to understand and document outcomes. Director Ebong uses those findings to discuss with stakeholders' senior leadership the importance of providing information during the evaluation process. Her practice of always including Monitoring & Evaluation representatives in the room during these discussions helps underscore the importance to USTDA of monitoring, evaluation, and evidence-based learning."
Excellence in Program Evaluation Award
For evaluation leaders or staff who keep initiatives on track and deliver high quality evaluations that uphold the standards of relevance and utility; independence and objectivity; rigor, transparency, and ethics.
Marsha Silverberg, Associate Commissioner
Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance
US Department of Education
Marsha leads the evaluation division at the U.S. Department of Education, working with a talented team of researchers and analysts to design and conduct rigorous studies of federal programs and federally supported strategies. Drawing on nearly three decades of experience, she has helped the Department develop its evidence agenda, supporting program offices and grantees in both using and generating evaluations to improve program outcomes.
"Marsha truly exemplifies each of values found in the Department’s evaluation policy, modeling rigor, utility, objectivity, transparency, and ethics. Her tireless work across the Department garnered senior leader (Assistant Secretary-level) buy-in to design and conduct the Department’s first-ever randomized controlled trial of a federal student aid program, rigorously evaluating the impact of access to Pell Grant funds for short-term programs. The resulting study yielded evidence crucial to Executive and Legislative branch policymakers.
Marsha's fair, open, and forthright style has allowed her to build productive working relationships with stakeholders both within and beyond ED, including staunch advocates who were wary of rigorous program evaluation. Several evaluations, both recently completed and underway, would not have been possible without her collegial spirit and commitment to learning for the sake of program improvement, including impact studies of the DC Opportunity Scholarship Program and the 21st Century Community Learning Centers Program. As she nears her 25th year with the Department, Marsha has been a role model for a generation of program evaluators known government-wide for their commitment to excellence."
Federal Evaluation Innovator Award
For evaluation mavericks who think up creative and outside-the-box evaluation ideas, help design them, and see them through to execution.
Paul O’Leary, Senior Economist
Office of Retirement and Disability Policy, Office of Research, Demonstration, and Employment Support
US Social Security Administration
Paul leads SSA's development of data products to support disability and return-to-work analyses. These products transform complex administrative data into easy-to-use research files that allow SSA to answer complex research questions quickly and efficiently. Paul supports a cadre of analysts in using these data and conducts rapid-cycle analyses to support policy development and program evaluation.
“Paul O’Leary, a Senior Economist at the SSA, has led the development of an innovative evaluation infrastructure that includes the creation of the Disability Analysis Files (DAF) and National Beneficiary Surveys. As part of his work leading the agency’s Ticket to Work (TTW) evaluation, Paul had the foresight to identify the need for these data for broader evidence-building and evaluation. … These files serve as a key source of information on program experiences and return-to-work outcomes for SSA disability beneficiaries. The data files have been used to conduct evaluations to support major regulatory changes to the Ticket to Work program, as well as by the Department of Education, GAO, and numerous external researchers.
In addition to developing the data and mentoring junior members of SSA’s evaluation team, Paul’s own evaluations using these data have set the standard by which the TTW program has been assessed. His rigorous analytical approach to the TTW evaluation has often corrected the naive analyses from external stakeholders that created misperceptions about the program. Paul’s work is foundational to the agency’s efforts to fulfill the promise of the Evidence Act.”
Outstanding Evaluation Mentor Award
For evaluation leaders or staff who go above and beyond to provide mentorship, support, and guidance to others in the Federal evaluation community.
Maureen Wilce, Evaluation Team Lead
Centers for Disease Control and Prevention
U.S. Department of Health and Human Services
Maureen leads a multidisciplinary team in building evaluation capacity within the National Asthma Control Program and its partners. Key to this is developing and providing technical support through a suite of evaluation tools grounded in the CDC Framework for Program Evaluation and the principles and standards established by evaluation professional organizations.
“Maureen Wilce has inculcated in countless evaluators - novice and seasoned alike - an appreciation for an evaluation approach rooted in the CDC Evaluation Framework. She mentors the multitudes: her team members, who never really leave Maureen’s team even when they’ve moved on; solo evaluators in her center and beyond who need a thought partner (or therapist); evaluation fellows whose supervisors aren’t grounded in the Framework; and staff in funded state programs, whether they’re the epidemiologist-turned-evaluator (yesterday) or the evaluator with decades of experience.
Maureen truly practices what she preaches; she brings a practical approach to evaluation, emphasizing utility and promoting efficient and effective use of evaluation resources (and by extension, programmatic resources). Maureen’s research on evaluative thinking illustrates her awareness that we make great progress by helping colleagues create space to reflect on their work, project its impact, bring multiple perspectives to their work (even dissenting ones), and draw on evidence in decision making. Many, many of us have learned this from Maureen. Her influence is deep, wide, and will persist long past her 2024 retirement.”
Linda Vo, Knowledge Translation Program Coordinator, Rehabilitation Program Specialist
Administration for Community Living
U.S. Department of Health and Human Services
Linda was the Evaluation Lead in the Performance and Evaluation Office at the Centers for Disease Control and Prevention, where she led agency-wide evaluation and capacity-building activities and oversaw the CDC Evaluation Fellowship Program. In June 2023, she joined the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR) within HHS's Administration for Community Living. She currently serves as the Knowledge Translation Program Coordinator and is responsible for developing and managing NIDILRR's knowledge translation program and for monitoring NIDILRR-funded knowledge translation grants and contracts.
“Linda Vo has been a mentor and a mentor’s mentor. In an agency filled with interns and fellows, Linda has shepherded many new public health professionals as they navigate the challenges a federal bureaucracy presents. She has been patient and persistent, provided encouragement, given thorough and direct feedback on work products and on personal presentation, and tailored her approach to the individual professionals. More importantly, she has been a sounding board and fierce advocate for newer evaluators in the agency, especially people of color. She has listened, she has believed, she has acted.
In her role with CDC’s Evaluation Fellowship, Linda was a confidante to many fellows. ... For more seasoned evaluators in the agency, Linda has provided guidance not only on how to be a good evaluator (though she does that, too), but on how to be a good mentor.
For all of us, Linda has modeled the courage, honesty, and vulnerability that are perhaps the hardest part of the interpersonal domain in the evaluator competencies. She has been a cheerleader, an ally, a teacher to many.”