Taking Stock: A Practical Guide to Evaluating Your Own Programs

Development of this manual, Taking Stock: A Practical Guide to Evaluating Your Own Programs, was made possible with the support of the DeWitt Wallace-Reader's Digest Fund, which sponsors Science Linkages in the Community (SLIC), a national program designed and coordinated by the American Association for the Advancement of Science (AAAS).

Table of Contents

Acknowledgments
Introduction
Chapter One: Why Use This Manual?

Program Evaluation Basics
Chapter Two: Why Evaluate?
Chapter Three: Getting Started: Framing the Evaluation (Documenting Context and Needs)
Chapter Four: What Are You Trying To Do? (Defining Goals and Objectives)
Chapter Five: Finding the Right Mix (Using Quantitative and Qualitative Data)
Chapter Six: Finding the Evidence (Strategies for Data Collection)
Chapter Seven: Making Sense of the Evidence (Interpreting and Reporting Your Data)

Examples Featuring Fictional CBOs
Chapter Eight: Applying This Manual (How One CBO Did It)
Chapter Nine: Applying This Manual in a Bigger Way (Expanding the Evaluation Design)

Appendices: Sample Reports Using Evaluation Data
Appendix A: Final Evaluation Report
Appendix B: Proposal for Expanding a Program
Appendix C: Annual Progress Report

Glossary of Terms
Index of Key Concepts

Acknowledgments

Taking Stock was prepared at the request of the American Association for the Advancement of Science (AAAS) for its Science Linkages in the Community (SLIC) initiative. The work was carried out with the financial support of the DeWitt Wallace-Reader's Digest Fund. We gratefully acknowledge the vision and support of these two organizations in the development of this manual and its companion workshop.

For many years AAAS has worked with national and local community-based organizations to design and deliver enrichment activities for use in out-of-school science programs. Beginning in 1993, the SLIC initiative focused these efforts in three diverse U.S. cities: Chicago, IL; Rapid City, SD; and Rochester, NY. Recognizing that evaluation is essential for quality programming and continued funding, AAAS commissioned Horizon Research, Inc. to develop materials to guide community-based organizations through the evaluation of their informal science activities. However, in the course of writing the manual it became clear that the need for a fundamental grounding in program evaluation is not limited to organizations that provide out-of-school science activities.

And so this manual evolved into what we hope is a more broadly useful guide to program evaluation. To the extent that we have succeeded, we owe much to our reviewers, who include members of the SLIC National Planning Council and their colleagues as well as grantmakers from national and local foundations: DeAnna Beane (Association of Science-Technology Centers), Janet Carter (Bruner Foundation), Stacey Daniels (Kauffman Foundation), Catherine Didion (Association for Women in Science), Hyman Field (National Science Foundation), Sandra Garcia (National Council of La Raza), Maritza Guzman (DeWitt Wallace-Reader's Digest Fund), Barbara Kehrer (Marin Community Foundation), Roger Mitchell (National Science Foundation), Nancy Peter (Academy of Natural Sciences), Annie Storer (American Association of Museums), Ellen Wahl (Education Development Corporation), and Faedra Weiss (Girls Inc.).

Other important reality checks were provided by AAAS and SLIC staff, most notably Betsy Brauer (Rochester SLIC), Yolanda George (AAAS), Gerri Graves (AAAS), Michael Hyatt (Chicago SLIC), and Margie Rosario (Rapid City SLIC).

Finally, we would like to thank Shirley Malcom, head of the AAAS Directorate for Education and Human Resources Programs, who recognized the need for this resource and provided us with the time and space to do it right.

Chapter One
WHY USE THIS MANUAL?

Do you want information that will help improve your organization's programs? Are your sponsors asking about the quality and impact of the programs they fund? Are you applying for a grant that requires an evaluation plan?

If you answered "Yes" to any of these questions, then this manual can help. It is a practical guide to program evaluation written for community-based organizations (CBOs). It provides information that you can put to use now to help improve your programs.

This manual focuses on internal evaluation, that is, program evaluation conducted in-house by CBO staff. We have taken this approach for one simple reason: many CBOs cannot afford to hire someone outside the organization to evaluate their programs, but they still need the kinds of information that evaluation can provide.

The information in this manual should better prepare you to design and carry out a program evaluation. And because the field of evaluation is now putting greater emphasis on participatory evaluation, a middle ground between internal and external evaluation, you will be able to apply your knowledge either within your own organization or in working with an external evaluator. This manual will also help you recognize when you might need an external evaluator, and the advantages of using these services should your CBO be able to afford them at some point.

Here are some assumptions that we made about you as we wrote this manual:

- You care about kids and communities.
- Your organization is committed to providing the best services possible.
- You have some experience running or participating in a CBO program, so you have an idea of how to get things done.
- You want to evaluate a program, not the people who run it or participate in it.

These shared qualities aside, we realize that CBOs come in all shapes and sizes. Some have full-time staff and annual program budgets exceeding $100,000; others spend less than $5,000 per program and rely almost entirely on volunteers. Community-based organizations also range widely in their goals, from teaching new information or skills, to strengthening families, to enhancing students' educational and career options. This manual is designed to help a wide variety of organizations, whatever your goals or resources.

What's In This Manual

Chapters 2-7 include basic information on evaluation concepts and techniques. Ideally, everyone who picks up this manual will read these chapters for some background in program evaluation.

- Chapter 2 talks about what evaluation can do for your programs and describes two types of evaluation: formative and summative.
- Chapter 3 discusses the importance of documenting needs and context and identifies some important first steps in planning your evaluation.
- In Chapter 4 we distinguish between program goals, objectives, indicators, and outcomes, and their role in evaluation.
- Chapter 5 talks about using quantitative and qualitative data to evaluate progress and impact.
- Chapter 6 describes how to collect information to evaluate your programs through document review, observations, interviews, and surveys.
- Chapter 7 provides tips for organizing, interpreting, and reporting the evaluation data that you collect.

Overview of the Evaluation Process

- Framing the Evaluation: identifying needs; documenting context; taking stock of available resources; designing program strategies.
- Defining Goals and Objectives: choosing goals that are consistent with needs; defining objectives; generating evaluation questions; selecting indicators and outcomes.
- Finding the Evidence: expanding the evaluation plan; looking at records and documents; observing program activities; interviewing people; conducting surveys.
- Making Sense: looking for themes; interpreting data; putting it together; reporting your results.

The remaining chapters of this manual show how to apply this information, with examples of how evaluations might differ for programs with varying levels of resources. Chapter 8 takes you through a simple evaluation of a small program in a fictional CBO. Chapter 9 describes how the same CBO enlarged the evaluation when the program was expanded. We have also included sample evaluation plans and instruments that can be adapted for use in your own programs.

The appendices include examples of three types of reports that present evaluation information:

- Appendix A is an example of a final evaluation report that describes the impact of the small-scale program described in Chapter 8.
- Appendix B illustrates a proposal for expanding the scope of this program, as described in Chapter 9.
- Appendix C models an annual progress report that describes the formative evaluation of the multi-year program described in Chapter 9.

A Glossary of Terms is included at the end of the manual. Throughout the manual, words and terms that are shown in bold italics are defined in this glossary.

Finally, we have tried to make this manual accessible to a wide range of audiences. As an overview, it takes a relatively traditional approach to evaluation, providing information on fundamental concepts and activities and how these can be applied. However, in practice the field of evaluation is far more complex than we have described it here. Using this guide as a basic introduction, we recommend the following resources to help you expand your knowledge and understanding of program evaluation:

Assess for Success: Needs Assessment and Evaluation Guide (1991). Girls Incorporated, 30 East 33rd Street, New York, NY 10016.

Leadership Is: Evaluation with Power (1995), by Sandra Trice Gray. Independent Sector, 1828 L Street NW, Washington, DC 20036.

Measuring Program Outcomes: A Practical Approach (1996). United Way of America, 701 North Fairfax Street, Alexandria, VA 22314.

User-Friendly Handbook for Project Evaluation: Science, Mathematics, Engineering, and Technology Education, by Floraline Stevens, Frances Lawrenz, and Laure Sharp. National Science Foundation, 4201 Wilson Blvd, Arlington, VA 22230.

Chapter Two
WHY EVALUATE?

To evaluate something means, literally, to look at and judge its quality or value. A CBO might evaluate individual employees, its programs, or the organization as a whole. When you evaluate a person's performance, you try to find out how well she carries out her responsibilities. When you evaluate a program, you want to know how far the program went in achieving its goals and objectives. And when you evaluate an organization, you ask how well it operates to achieve its larger organizational mission. Evaluation involves the collection of information that helps you to make these judgments fairly.

This manual focuses exclusively on program evaluation. Why is program evaluation so important? First, it generates information that can help you to improve your programs. Second, it can demonstrate to funders and others the impact of your programs.

In the past, evaluation was often used only to measure performance. Based on information gathered in a final summative evaluation, further funding decisions were made. Programs were continued or discontinued depending on the results of the evaluation.

Luckily, program staff and funders have begun to expand their view of evaluation and appreciate its potential for program improvement. Through ongoing formative evaluation, you and your sponsors can gain insight into how well your program is performing and what adjustments may be necessary to keep it on track.

More about Formative Evaluation

Formative evaluation can help you determine how your program is doing while it is in progress, or taking form. The information you collect can help you make changes in your program and correct problems before it's too late. Formative evaluation can also help you identify issues of interest that you might not have thought about when planning your program. And it can help shape and refine your data collection activities.

Formative evaluation:

- Provides information as a program takes form
- Monitors progress toward objectives
- Provides information to improve
- Helps identify issues of interest
- Helps refine data collection activities
- Helps clarify program strengths and limitations

Information from a variety of sources, such as participants, instructors, and parents, can tell you how a program is progressing. For example: Do students like the program? Are staff and participants satisfied with the activities? What changes are needed to improve the program?

The people involved with your programs should be consulted during the evaluation planning stage, and as often as your resources permit during program implementation. Let participants know that their opinions are important, and provide them with opportunities to share their views. With their input you can improve your programs and increase the likelihood that you will achieve positive results. Even programs that have been successful…

Pinpointing Problem Areas: Getting Formative Feedback

Youth Action Today was in the third year of providing three-day summer camps for middle school students and their high school mentors. Interest in the camp had steadily increased among sixth and seventh graders, with enrollment rising each year. But pre-registration this spring showed fewer eighth graders were signing up. Thinking fast, program staff met with several small groups of eighth graders who had attended the camp when they were younger to see if they knew what the problem was. Students told the staff that word was out that camp activities were "babyish" and that the camp wasn't "cool" enough for older kids. With this feedback, program…
