Comprehensive Evaluation Planning: Creating a Foundation Marissa Berg, Quality Control Manager, Resource Associates / Capacity Builders Inc.
Marissa Berg, Quality Control Manager, Resource Associates/CBI • 10 years with RA/CBI • Reviews over 200 applications a year • Federal Reviewer, HHS (CED, DFC), AmeriCorps • Contributor to some of the highest-scoring federal proposals in the country • Background in technical writing and journalism • Bachelor of Science in Journalism from California Polytechnic State University, San Luis Obispo
• Created in 1995 by an accomplished grant writer who needed the assistance of other grant writers to keep up with demand • Provides grant writing services across the country and in US territories • Develops Federal, State, and Foundation proposals • Provides strategic planning sessions, grant writing trainings and partnership development services
• 501(c)(3) non-profit organization • Provides capacity building services to organizations across the country • Provides valuable evidence-based programming to Native American populations within and near the Navajo Nation Reservation • Headquartered in Farmington, NM • Offices located in California, Arizona, Colorado, Washington, and Texas
• Third-party evaluation services provider • Evaluation experts nationwide • Provides valuable support to examine strengths and weaknesses of varying grant elements, including interventions, policies, and protocols • Helps demonstrate that specific outcomes are achieved for funder-required reporting • Long-term sustainability planning
What is Program Evaluation? Evaluation: A systematic investigation of the worth of an object. Grant Evaluation: A systematic investigation of the worth of grant activity.
Why is it important to the funder? • Grant dollars most often have strings attached. • Funders give grants because they want to see the money go toward achieving a particular initiative or policy. • A good evaluation will produce objective information on how well and to what degree you accomplished the funder’s initiatives. • This allows the funder to report to its constituents or stakeholders on its overarching accomplishments.
Why is it important? • Evaluation is important because it reduces wasteful spending by both grantees and funders. • Evaluation data – if collected, analyzed, and reported correctly – can support such stakeholders in making resource distribution decisions. • When that grantee moves forward in applying for additional funding, these data will help to convince this or other funders that the proposed initiative has a good chance of replicating success.
Formative and Summative Evaluation • Two different types of evaluation • One is progressive in nature (formative), while the other is conclusive in nature (summative) • Both will track and assess your grant’s progress from start to finish as well as measure the effectiveness of your grant work in meeting goals and outcomes.
Formative Evaluation Formative evaluation assesses ongoing grant activities.
Formative Evaluation • Concentrates on examining and changing processes as they occur. • Provides timely feedback about grant services. • Allows you to make adjustments to your grants “on the fly” to help achieve grant goals.
Summative Evaluation Summative evaluation assesses a mature grant’s success in reaching its stated goals and outcomes.
Summative Evaluation A summative evaluation tells you: • Whether you met your grant objectives • Whether you will need to improve and modify the overall structure of the grant or various interventions • The overall impact of the grant • What resources you will need to address the grant’s weaknesses
Frequency of Data Collection • Baseline established at the beginning of the grant • Quarterly • Annually • Post Project completion as needed
Frequency of Data Collection • Established benchmarks as defined in the grant application will help you with your formative evaluation.
Quantitative/Qualitative Data • Different types of data • Quantitative data is the most commonly collected • Qualitative data is more open ended
Quantitative Data • Can be measured using a number • X number of students agree that drinking and driving is dangerous • 82% of students are proficient in math • Just think of quantity (How many? How much?)
Qualitative Data • Descriptive in nature and more open ended • Described using words • Captures the quality of something • Example: Students say that the reason they smoke is because it’s easy to get tobacco.
Data Collection Tools • You can create your own or use those previously created • Surveys and standardized testing are the most common • It is best not to reinvent the wheel • Some data collection already takes place (meeting minutes, attendance sheets, etc.)
Threshold of Evidence • Depending on the grant program, some projects require a higher threshold of evidence • Some funders are interested only in quantitative data for reports (head counts) • Others are looking for evidence that can be justified using statistical methods • Most evaluations fall under the category of a quasi-experimental study when comparing two groups
Threshold of Evidence • Quasi-experimental: you apply the intervention to one group (experimental) and do not apply it to the other group (control), but the groups are not random (Class A vs. Class B) • Randomized control trials are the most rigorous because the individuals assigned to each of the groups (experimental/control) must be truly random.
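The quasi-experimental comparison above can be sketched with a few lines of arithmetic. This is an illustrative example only, with made-up pre/post survey scores for two intact classrooms (not randomly assigned); the variable names and numbers are hypothetical, not from any actual grant.

```python
# Illustrative sketch of a quasi-experimental comparison: two intact
# classrooms, one receiving the intervention, measured before and after.
from statistics import mean

# Hypothetical survey scores (assumed data for illustration).
experimental_pre  = [62, 70, 65, 68, 71]   # Class A: receives the intervention
experimental_post = [74, 80, 72, 79, 83]
control_pre       = [63, 69, 66, 67, 70]   # Class B: no intervention
control_post      = [65, 70, 67, 69, 72]

# Average gain within each group from baseline to follow-up.
exp_gain = mean(experimental_post) - mean(experimental_pre)
ctl_gain = mean(control_post) - mean(control_pre)

# The difference in gains is a rough estimate of the intervention's effect,
# since the control group's gain reflects change that would happen anyway.
estimated_effect = exp_gain - ctl_gain

print(f"Experimental gain: {exp_gain:.1f}")
print(f"Control gain: {ctl_gain:.1f}")
print(f"Estimated intervention effect: {estimated_effect:.1f}")
```

Because the classes were not randomly assigned, any pre-existing differences between them can bias this estimate, which is why randomized control trials are considered more rigorous.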
Evidence Based Programming • A program intervention that has shown a positive change on a target population after implementation • Evidence Based Programming allows you to set quarterly and annual benchmarks with confidence
Evidence Based Programming • When an evaluation is measuring an intervention that has not been previously tested, your goals/objectives become a hypothesis, which is more research-based in nature.
Setting Benchmarks • At the start of the project you should have either identified or scheduled the baseline measures that will be collected for your evaluation. • Based on the baseline, you should set, at minimum, annual benchmarks of achievement toward your final goal.
Setting Benchmarks • Allows you to use the data collected from your formative evaluation most effectively, supporting a continuous cycle of improvement.
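One simple way to set the annual benchmarks described above is to spread progress evenly between the baseline and the final goal. This is a minimal sketch with assumed numbers (a hypothetical 40% baseline and 70% goal over a three-year grant), not a prescribed method.

```python
# Illustrative sketch: evenly spaced annual benchmarks between a
# hypothetical baseline measure and a final goal.
baseline = 40.0   # e.g., 40% of students proficient at project start (assumed)
goal = 70.0       # target proficiency by the end of Year 3 (assumed)
years = 3

# Divide the total desired improvement into equal annual steps.
step = (goal - baseline) / years
benchmarks = [baseline + step * year for year in range(1, years + 1)]

print(benchmarks)  # one benchmark per project year
```

Comparing each year's formative-evaluation data against these benchmarks shows whether the project is on track or needs mid-course adjustments.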
Evaluation Planning • Identify who will be conducting your third-party evaluation. • Provide their qualifications for conducting the evaluation. • Identify what data collection tools will be used to measure each of the goals/objectives. • Determine your baseline, or when the baseline will be collected.
Evaluation Planning • Establish quarterly/annual benchmarks • Identify when data will be collected • Identify who will collect and analyze the data, and how. • Describe what reports will be presented and how changes to project activities will be implemented based on findings from the formative evaluation.
Evaluation Planning • Identify how project outcomes will be used to support program sustainability • Describe how an advisory committee will provide feedback on summative reports • Describe how project outcomes will be distributed to a broad audience.