Research and Evaluation

The CES Research and Evaluation Department offers consulting services and expertise for schools, coalitions, municipalities and community-based organizations. We can provide brief consultations or more extensive planning and evaluation work.


Our Approach

CES Research and Evaluation staff have the skills and experience to customize our work to meet individual client needs. We foster capacity building so that clients develop the structures and skills they need to internalize evaluative functions, and we ensure that evaluations are useful for program improvement.

Integrating evaluation throughout the program development and implementation process promotes outcome-oriented planning and implementation — a backward planning process that ensures that program design is strategically connected to the intended outcomes.

Existing programs can also find value in evaluation assistance, including consultation on strategic planning, data-informed program improvement, evaluation design and implementation, outcome data analysis, and reporting.

Experience

We have provided evaluation support across the state and nationally to programs, organizations, and school districts for their work in early childhood, special education, vocational-technical education, positive behavioral intervention and supports, social-emotional learning, district-determined measures, professional development, and educator licensure programs. We have also worked with schools and community organizations promoting healthy youth development and healthy communities using the Strategic Prevention Framework, including South Hadley, Northampton, and Easthampton school districts and their Prevention Coalitions; Cooley Dickinson Hospital; Easthampton Health Department; and Casa Latina. Support for this work has included technical assistance, survey development and administration, data analysis, development of social norms campaigns, and outcome evaluations.


Services

Program Planning, Design, and Improvement

We work with clients in the early stages of program design or planning to help clarify the desired program outcomes, define the program’s strategy, and determine which processes will best achieve the program’s goals. We can also guide clients in using data and research to develop or improve their programs, and help develop a plan for program evaluation. We help clients think through both process and outcome evaluations, incorporating both qualitative and quantitative data.


Evaluation Design and Implementation

Evaluation design involves deciding which evaluation or research questions you want to answer, and determining how to answer them. A careful evaluation design helps program managers assess progress, identify barriers and facilitators, determine whether the evidence of progress is valid and reliable, and make adjustments to enhance future effectiveness. We offer support in both the design and the implementation of evaluation plans, as well as in documenting outcomes and creating reports that may be required for stakeholders, partners, and funders. An effective grant application often requires a well-designed evaluation plan, and we are happy to consult with you during the grant writing process to develop a plan for evaluating the proposed program.

Survey Design, Data Collection, and Analysis

Data collection throughout the life of the program is crucial to determining the program’s success. We offer support with designing and implementing methods of data collection, and with analyzing the information they yield, including:

  • Paper or online surveys
  • Focus groups
  • Telephone or in-person interviews
  • Impact data such as test scores, graduation rates, program completions, etc.

Building Your Internal Evaluation Capacity

If your goal is to build internal evaluation capacity, we are ready to assist with training and support for:

  • Funders on results-oriented funding
  • Grantees on outcome-oriented program design, evaluation planning, and data collection and use
  • Educators and district administrators on data use, student growth measures, and educator evaluation

Networking and Resources

Through contact with other evaluators locally and nationwide, the CES staff have an understanding of what other organizations may be able to offer. We create specialty networks and can link clients to appropriate research and evaluation resources. We maintain memberships in the American Evaluation Association and the American Educational Research Association, and participate in forums, webinars, and other venues for sharing information.

Equity Teams

Addressing inequities in schools requires that adults and young people work together to explore, assess, and transform school culture. CES provides a highly skilled and racially diverse co-facilitation team to walk your equity team through this important, complex, and intersectional work. 

Working with CES evaluators as thinking partners, we have identified strengths to build upon (such as the value of mentors to support student centered work) and challenges to address (e.g., building structured time for student-centered work). Often missing from public school initiatives, this evaluation has been critical to the understanding of the work we have done thus far, developing clarity on our remaining challenges, and how to approach the work beyond this grant period.

— Manchester CT Public Schools

Kate Lytton, M.S.

Kate Lytton brings over 20 years of experience in social research, including needs assessment, strategic planning, evaluation design, survey research, and qualitative methods, to her program evaluation work at CES. She has designed and led studies of educator professional development, teacher preparation, child abuse prevention, interagency and community collaborations, and truancy prevention initiatives, among many other education, social service, and community health projects.

Kate brings a passion for participatory approaches that engage stakeholders in identifying and addressing questions that are critical for program improvement and that keep family and child needs at the center. She facilitates collaborative efforts that focus on collecting and using data to understand an educational challenge and to assess program effectiveness and outcomes. Kate has a BS in mathematics from Williams College and an MS in Science and Technology Studies from Rensselaer Polytechnic Institute.

Position: Director of Research and Evaluation

Email: klytton@collaborative.org

Phone: 413.586.4900 x5977

Catherine Brooks

Catherine Brooks provides research and evaluation support to internal CES programs as well as to consulting clients. She has extensive experience with designing and managing evaluations; developing and conducting surveys, interviews, and focus groups; analyzing data in Excel and SPSS; and developing analytical reports that summarize and illustrate the results of an analysis. Projects she has managed at CES have included internal evaluation of CES systems and programs as well as external contracts to evaluate state, federal, and privately funded programs. Catherine has a bachelor’s degree in political science from Swarthmore College and a master of public policy degree from Duke University.

Position: Program Evaluation Manager and Survey Research Specialist

Email: cbrooks@collaborative.org

Phone: 413.586.4900 x5902

Karen Auerbach

Karen Auerbach provides support to CES research and evaluation projects. She has almost 20 years of experience in education, social science, and public health research and evaluation. She has expertise in research methodology design, development of data collection tools, data management, statistical analysis, and reporting. Karen has a bachelor’s degree in psychology and women’s studies from Oberlin College, a master’s degree in developmental and educational psychology from Boston College, and master’s and doctoral degrees in human development and family studies from Penn State University.

Position: Research Associate

Email: kauerbach@collaborative.org

Phone: 413.586.4900 x5955

Rebecca Mazur

Rebecca Mazur, Ph.D., has extensive experience in educational research and evaluation and uses qualitative, quantitative, and social network approaches. She has designed and implemented studies investigating a variety of educational phenomena including system factors that support or constrain student learning outcomes, educator professional development, teacher support networks, instructional interventions for adolescent learners, and organizational collaboration. Much of her research involves working directly with public schools that primarily serve a diverse and/or economically disadvantaged student population. Her scholarship has been published in a number of respected peer-reviewed journals including Educational Administration Quarterly, Evaluation and Program Planning, and Educational Management Administration and Leadership. Rebecca has also led or served in an advisory capacity on numerous high school reform/redesign efforts aimed at increasing student learning outcomes, improving school climate, and enhancing curricular rigor and relevance. She is a former high school teacher-librarian and a certified principal. Her international work includes serving as the Academic Director for the Instructional Leadership Institute for Pakistani Educators, funded by the U.S. State Department. Rebecca has a B.A. in history from Mount Holyoke College, a master’s degree in library and information science from Simmons College, and a Ph.D. in educational research, policy and administration from the University of Massachusetts.

Position: Research and Evaluation Specialist

Email: rmazur@collaborative.org

Phone: 413.586.4900 x5938

Contact Us
