Source: Sarena D. Seifer and Stacy Holmes, Community-Campus Partnerships for Health, May 2002
Updated: Julie L. Plaut, Campus Compact, May 2009
Julie Elkins, Campus Compact, September 30, 2009
Introduction
As with any pedagogy or program, a variety of tools and methods can be used to evaluate service learning. What follows is a discussion of issues to consider, a summary of websites, and a list of publications that provide background, tools, and resources. Remember, new tools and resources are being developed all the time; use the resources below to get started and check your favorites for updates regularly.
Evaluation is essential not only for documenting a program’s effectiveness and demonstrating accountability, but also for informing a program’s continued development and improvement. Evaluation is not the end of an initiative; it is an essential part of an ongoing process that provides an opportunity to be deliberative, reflective, and creative. An inclusive and ongoing evaluation process can also contribute to stronger relationships and a shared vision among partners both on and off campus.
The Joint Committee on Standards for Educational Evaluation has recommended four basic attributes for any program evaluation:
- Utility, including identification of stakeholders, credibility of evaluators, pertinence of information, and clarity and timeliness of reporting.
- Feasibility, including practicality of procedures, political viability, and cost effectiveness.
- Propriety, including service to participants, community, and society, respect for the rights of those involved, and provisions for complete and fair assessment.
- Accuracy, including program documentation, use of valid and reliable procedures, appropriate analysis, impartial reporting, and justified conclusions.
Palomba, Catherine A., and Banta, Trudy W. 1999. Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco: Jossey-Bass Publishers.
These guidelines can be helpful as you plan your evaluation efforts. Keep in mind that you need not meet scholarly expectations unless your goal is to conduct and publish rigorous research. It may not be feasible, for instance, to undertake a longitudinal study measuring whether students’ participation in a service-learning course increases their volunteering in the community or their donations to the campus after graduation. In an evaluation project, however, you can ask students predictive questions such as “How likely are you to volunteer in the community or give to the college after you graduate?” This type of inquiry captures students’ expectations of their future behavior and can inform practice. Efforts to document results both for students and for communities in such ways are an integral part of any high-quality service-learning program.
Issues to Consider
The following is a sample of questions that you might consider as you search for the most appropriate tools and resources to evaluate your service-learning program. Clarifying these issues from the beginning of your program will allow you to implement a process that accurately reflects your evaluation objectives and makes the best use of everyone's time and resources.
Who is this information for and what do they want to know?
Are you reporting program evaluation information to a private foundation that provided funds for your service-learning program and is interested in the number of service hours and the impact on the community? Or are you trying to justify ongoing institutional support for service learning to college administrators or trustees primarily concerned about student retention and development? Even the best-designed evaluations fall short when they do not deliver the type of information the intended audiences want. Implementing a participatory evaluation process can help alleviate this concern by involving relevant stakeholders early on. Funders, community partner staff, faculty, campus administrators, and students each bring a different set of strengths to the evaluation process and can help clarify the issues your evaluation should focus on.
Helpful Tools and Resources:
Participatory Evaluation, by Ann Zukoski, DrPH, MPH, and Mia Luluquisen, DrPH, MPH, RN
What are you trying to measure?
Are you interested in student learning objectives or student attitudes towards their surroundings? Do you want to measure the impact of service learning on community agency partners, neighborhood residents, or the academic institution? You might have several evaluation goals and each of those goals might have several measurable indicators. For example, an evaluation goal might be to enhance your students' cultural competency. Measurable indicators for this goal might include an increase in knowledge about the demographic composition of the city and state, an increased appreciation for the beliefs and values of other cultures, and an ability to describe strategies for increasing the accessibility of city services to recent immigrants from Mexico who do not speak English. Another measurable indicator may be performance on a reliable and valid tool designed to measure one's cultural competency.
Clarifying both the goals and the indicators from the beginning of your program will help you determine if you are measuring what you think you are measuring. For example, a change in student attitude does not necessarily indicate increased competency in the course topic. However, attitudinal change might be the primary interest of both you and your service-learning stakeholders (funders, deans, community groups), and as a result the most appropriate focus of your evaluation efforts.
The following is a list of outcomes that could be impacted by service learning and therefore evaluated. This list is not comprehensive, but is intended to illustrate a range of outcomes that service learning has been shown to influence.
- Student attitudes, satisfaction, experiences, learning, competence, civic engagement, career plans
- Alumni professional development, civic engagement
- Faculty attitudes, satisfaction, experiences, learning
- Faculty professional development and teaching competence
- Community site attitudes, satisfaction, experiences, learning
- Client attitudes, satisfaction, experience, behavior
- The value of partnerships, service availability, quality of life
- Administrators' attitudes, satisfaction, experiences, learning
- Academic mission, priorities, curriculum, civic engagement
Who will be involved in the evaluation process?
In addition to the professionals on campus and in partner organizations discussed earlier, students in research methods courses might be able to work on your project as a service-learning assignment. Student teams might, in successive semesters, develop an evaluation plan and test instruments, revise and implement the instruments, and analyze the results. It might also be possible to involve work-study students; the University of Minnesota, for instance, has trained student staff to conduct “pulse” surveys of undergraduates 20 times a year to gather their views.
Community partners or community members may also be glad to work on evaluation projects in exchange for training, wages, tuition waivers, or other benefits; in some cases local residents have welcomed the opportunity to participate in important evaluation activities and build their own skills.
Faculty may also find participating in evaluation activities helpful to their own scholarship, tenure, and promotion. If their research interests overlap with the issues addressed by your evaluation, collaborations may serve your needs and result in new publications for faculty.
Some programs hire evaluation consultants, and some grants require that external consultants be hired to design and administer an evaluation or analyze its data. Benefits include:
- an outside perspective;
- a greater appearance of objectivity;
- relevant expertise;
- less work for your office or organization; and
- perhaps more respect for the findings from the community or funders.
On the “con” side, consultants:
- cost more;
- may not know as much about your office or organization;
- give you less control over the process; and
- may keep you from developing internal capacity to conduct evaluation because of your reliance on them.
If you decide to hire a consultant, clearly define what will be evaluated and what the roles of the evaluator and organizational staff will be before moving ahead, and make sure the consultant’s background and expertise fit your needs well.
What sources of data (both statistics and stories) already exist?
There is already data out there. Not all of it will be directly relevant to the questions you want answered, but some of it may be raw material for rich evaluation, or at least a benchmark against which you can measure future progress. It may also help you structure evaluation methods or specific quantitative or qualitative tools. The following are some existing data resources to consider.
Higher Education Institutions:
- Academic Plan
- Course catalog, other key documents
- Mission
- Policies
- Publicity—website, magazine, media coverage, etc.
- Strategic plan
Organizations:
- Campus Compact annual survey
- Carnegie community engagement classification application
- NCA accreditation self-study
Students:
- AlcoholEdu
- Alumni surveys
- Applications
- Core Alcohol and Drug Survey
- Course evaluations
- First College Year Survey (Higher Education Research Institute)
- NSSE- National Survey of Student Engagement
- Retention statistics
- Student newspaper
Campuses that are Campus Compact members or American Democracy Project participants can choose to add questions on civic engagement. While the Core survey and AlcoholEdu focus primarily on alcohol, both include questions on volunteer activities, and each school can add its own questions to either survey.
Faculty:
- Annual activity reports
- Faculty Senate records
- Papers and publications
- Promotion and tenure files
Community:
- Community-Campus Partnership missions, goals, meeting minutes, websites
- Community parent organization activities
- K-12 reports, data, or historical trends
- Partner organizations’ administrative records and/or assessment activities
- Publications by partners or local media
How will you collect additional data?
As you think about your goals for an evaluation and research what others have done in their service-learning programs, you will note that several data collection methods are available. It is a good idea to plan this process before implementing your service-learning program so that data collection can be built into the overall program. For example, information collected on administrative forms filled out by your students and community partners can become part of the data you use in your evaluation. One useful approach is to connect evaluation to already planned events and activities, for example, adding questions to an existing course evaluation form or asking community partners to fill out a brief pre-service-learning survey when they attend an orientation. Community site visits can be built into the process as a way not only to keep the program running smoothly but also to collect qualitative data, for example, through structured interviews or focus groups. If you are working on a grant-based program, grant applications often ask you to identify an evaluation plan and submit it in writing as part of the application process. The extensive evaluation literature and body of service-learning research offer a whole spectrum of methods for collecting evaluative information, such as:
- Achievement and attitude testing
- Content analysis of student reflection journals, course email discussion lists, community site evaluations, course syllabi, and faculty curricula vitae
- Curriculum evaluation
- Focus groups
- Impact evaluations
- Interviews (in person, by phone, via Skype, etc.)
- Learner assessments
- Needs assessments
- Objective structured examinations (e.g., trained individuals role-play a client and then rate the student's communication skills)
- Observation (passive, participant) (in-person or via technology)
- Personnel evaluation
- Pre/post tests on specific competencies and/or attitudinal measures
- Written and on-line surveys (including students, faculty, community partners, administrators, service recipients)
There are often creative ways to get the data you want. For example, you may want to convey change over time even though no baseline data is available from preexisting records or evaluations. You can simply include retrospective questions on a survey or in interviews, such as “How committed to service were you before taking this service-learning class?” alongside questions about respondents’ current state, such as “How committed to service are you now, after taking the service-learning class?” Some evaluators actually prefer this kind of retrospective “post-then-pre” design, in which individuals compare their own commitment to service before and after taking a service-learning course, because the answers reflect the respondents’ thinking at a single point in time and are thus more valid. (Here is an illustration. With the standard pre-test/post-test method, students might answer the question “How committed are you to service?” by rating themselves on a scale of 1 to 10. Students who considered themselves deeply committed to service at the start of a service-learning class might realize by the end that they had actually treated service as a marginal, though important, part of their lives, and now feel drawn to a service-oriented career as well. They may have answered the survey question the same way at the start and end of the class, yet their understanding of what a deep commitment to service means changed significantly, and a standard pre/post comparison misses that change entirely.)
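To make the contrast concrete, here is a minimal sketch (not from the original fact sheet; the ratings and field names are hypothetical) of how change scores from the two designs can be computed and compared:

```python
# A minimal sketch comparing change scores from a traditional pre/post design
# with a retrospective ("post-then-pre") design. All data are hypothetical.
import statistics

# Each record holds one student's 1-10 ratings of commitment to service:
#   pre       = rating given at the start of the course
#   post      = rating given at the end of the course
#   retro_pre = end-of-course rating of where the student now thinks they started
responses = [
    {"pre": 8, "post": 8, "retro_pre": 5},
    {"pre": 6, "post": 9, "retro_pre": 6},
    {"pre": 7, "post": 8, "retro_pre": 4},
]

standard_change = [r["post"] - r["pre"] for r in responses]
retro_change = [r["post"] - r["retro_pre"] for r in responses]

print("Mean change, standard pre/post design:", statistics.mean(standard_change))
print("Mean change, retrospective design:    ", statistics.mean(retro_change))
# If students recalibrated their rating scale during the course (as in the
# illustration above), only the retrospective design registers the change.
```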
How will you analyze your data?
After you have collected your data using one or several of the methods mentioned above, your next step will be to analyze your data. Several issues to consider include:
- Database and computer software needs: several statistical and qualitative applications are available. If your organization or institution does not already use a particular software package (e.g., SPSS, SAS, and/or Excel), there are several resources online. For example, the American Evaluation Association maintains a list of applications for analyzing qualitative data (http://eval.org/EvaluationLinks/QDA.htm) as well as a list of products for developing and analyzing surveys, some of which are free and/or offer a free trial download. For a simple illustration of a first-pass analysis, see the sketch after this list.
- Data entry and analysis assistance: consider tapping into graduate students at your institution, especially those in fields related to your service-learning course and/or students in a program evaluation or research methods course, who can provide a service to you that ties back into their own course objectives.
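The following sketch (hypothetical file and column names, assuming the pandas library is installed) shows what a first pass over survey data exported as a CSV file might look like:

```python
# A minimal sketch of a first-pass survey analysis using pandas.
# The file name and column names below are hypothetical.
import pandas as pd

# Assume each row is one student's survey response.
df = pd.read_csv("service_learning_survey.csv")

# Descriptive statistics for a hypothetical 1-5 Likert item.
print(df["civic_commitment"].describe())

# Compare mean scores across community-partner sites.
print(df.groupby("partner_site")["civic_commitment"].agg(["count", "mean"]))
```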
Will quantitative or qualitative data best capture the picture?
There are many factors that may drive your decision to use quantitative, qualitative, or mixed methods. While this is an age-old debate, some of the premises have shifted with the advancement of qualitative software that can quantify “soft” data. Questions to guide this decision include:
- Are you interested in numbers (quantitative) or in rich words, descriptions, details, and depth on the subject (qualitative)?
- What are your primary goals in evaluation? Research, program assessment, program planning, funding, increased institutional support?
- Are you collecting the data for an intended audience? What methods do they require or prefer?
- If you have more than one audience to share the data with, would varied methods meet this goal?
- Do you have access to the tools the method requires, such as software packages or skilled interviewers?
How will you use and disseminate your data?
It is possible to go through the motions of evaluation simply because someone else requires it, but that approach misses the real value of evaluation, and it is not what we are advocating. Perhaps the most important step is simply having an inclusive planning process, with multiple stakeholders invested in benefiting from the evaluation results in various ways (guidance for improving programs, data to use in reports or proposals to funders, fodder for stories in an alumni magazine or local newspaper, etc.). All of you may get caught up in your day-to-day responsibilities, but you can remind each other of the value of reflecting on, sharing, and applying the lessons of your evaluation. That kind of gentle, mutual accountability, grounded in a shared commitment to ongoing improvement of your work together, is powerful.
Once you have the results in hand, remember to share them with your intended audience and with the stakeholders who were instrumental in your evaluation (community partners, students, etc.). It is also important to let the deans and administrators of the academic institution know the evaluation outcomes. Evaluation findings can be shared in a number of formats, depending on the audience and the goal. For example, you might consider:
- Preparing a short article for the campus newspaper, accompanied by photographs of students in the community.
- Writing a memo to your dean that is accompanied by a brief executive summary of the evaluation findings.
- Delivering a PowerPoint presentation on the evaluation findings for discussion at an advisory board meeting.
- Compiling the positive and critical student comments about a particular community partner and meeting with the partner in person to discuss how the service-learning experience at that site can be improved.
How will you protect confidentiality and rights of human subjects?
Generally speaking, routine course and program evaluations do not require institutional review board (IRB) review if they are to be used internally and not published. However, we advise checking with your institution's IRB to be sure. Whether IRB review is required or not, it is important to consider carefully how the evaluation will protect the confidentiality and rights of participants. For example, focus groups should begin with a set of "ground rules" that include a commitment by all participants not to divulge information discussed in the group to anyone outside the room. Any form on which participants provide contact information for receiving the final evaluation report should be kept separate from the evaluation survey itself.
People may respond more openly if you use methods that allow anonymity (where even the person administering the evaluation does not know respondents’ identities) or confidentiality (where the evaluation team knows identities but does not make them public). Confidentiality is appropriate when there is in-person contact, such as interviews or focus groups, or when you need to track information over time; even then, records should be kept as discreet as possible, assigning people numbers or codes instead of listing everything by name.
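As one small illustration of that last point, the sketch below (not from the original fact sheet; the participant names are invented) assigns each participant a stable code so that evaluation records never list responses by name. The code book linking names to codes would be stored separately and securely:

```python
# A minimal sketch of assigning participants stable codes (P001, P002, ...)
# so that evaluation records are stored by code rather than by name.
# The code book itself must be kept separate from the data and secured.
code_book = {}  # maps participant name -> assigned code

def assign_code(name):
    """Return the participant's existing code, or assign the next one."""
    if name not in code_book:
        code_book[name] = f"P{len(code_book) + 1:03d}"
    return code_book[name]

for participant in ["Jordan Lee", "Sam Rivera", "Jordan Lee"]:
    print(assign_code(participant))  # -> P001, P002, P001
```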
Getting Started: General Evaluation Resources
The following list of resources focuses on the basics of evaluation. Most of these websites include links to many other useful websites and resources. Let us know if you have found a useful resource that should be listed!
American Evaluation Association
The American Evaluation Association's mission is to improve evaluation practices and methods; increase evaluation use; promote evaluation as a profession; and support the contribution of evaluation to the generation of theory and knowledge about effective human action. The Association's website includes a number of useful links, publication lists, and online handbooks. The association also has Topical Interest Groups (TIGs) that include: Assessment in Higher Education; Collaborative, Participatory and Empowerment Evaluation; and International and Cross-Cultural Evaluation. Visit the TIG website to view a list of useful resources and links.
The Evaluation Center at Western Michigan University
The Evaluation Center's mission is to advance the theory, practice, and utilization of evaluation. The Center's website includes links to reports, tools, checklists, a directory of evaluators, and other resources that are very helpful for planning evaluations.
Free Resources for Program Evaluation and Social Research Methods
This site, created by applied sociologist Gene Shackman and hosted by the International Consortium for the Advancement of Academic Publication, covers evaluation research and the methods used: surveys, focus groups, sampling, interviews, etc. It provides extensive lists of links to other useful resources.
The Online Evaluation Resource Library
The Online Evaluation Resource Library (OERL) was developed for professionals seeking to design, conduct, document, or review project evaluations. OERL is funded by the National Science Foundation (NSF) and includes sample evaluation plans, instruments, and reports covering such topics as curriculum development, student attitudinal assessment, access for under-represented populations, and faculty development.
Practical Assessment, Research and Evaluation
Practical Assessment, Research and Evaluation (PARE) is a peer-reviewed electronic journal supported entirely by volunteer efforts. Its purpose is to provide access to refereed articles that can have a positive impact on assessment, research, evaluation, and teaching practice. Manuscripts published in the journal are scholarly syntheses of research and ideas about methodological issues and practices.
Statistics Every Writer Should Know
This website, created by journalist Robert Niles, provides a simple guide to basic statistics. It includes definitions of concepts such as mean and median, information on “not getting duped,” and answers to frequently asked questions about sample sizes and other topics.
Getting Started: Evaluation Resources Specific to Service-Learning
The evaluation resources listed here are specific to service-learning programs and/or service-oriented programs that can act as models. Most of these websites include links to many other useful websites and resources. Let us know if you have found a useful resource that should be listed!
America’s Civic Health Index
The Civic Health Index is a cooperative effort of the National Conference on Citizenship, the Center for Information and Research on Civic Learning and Engagement (CIRCLE) at the Jonathan M. Tisch College of Citizenship and Public Service at Tufts University, and Harvard University’s Saguaro Seminar: Civic Engagement in America, as well as members of a Civic Health Index Working Group. An annual report measures a wide variety of civic indicators in an effort to educate Americans about our civic life and to motivate citizens, leaders and policymakers to strengthen it.
Campus Compact has a variety of resources, including a rubric to assess service-learning reflection papers, useful links, and publications that cover the evaluation of service-learning and other campus-based programs, such as:
Gelmon, Sherril B., Holland, Barbara A., Driscoll, Amy, Spring, Amy, and Kerrigan, Seanna. 2001. Assessing Service-Learning and Civic Engagement: Principles and Techniques. Rev. 3rd ed.
Online resources for civic engagement and service-learning programs, including a paper that summarizes the findings of service-learning research over the past several years, with an annotated bibliography, may be found at http://www.compact.org/initiatives/service-learning/
Community-Campus Partnerships for Health
The Community-Campus Partnerships for Health (CCPH) website includes tools to assess service-learning and students, faculty, community partners, and institutions, many of them excerpted from CCPH publications:
- Methods and strategies for student assessment: http://www.ccph.info/depts.washington.edu/ccph/pdf_files/tools-students.pdf
- Methods and strategies for community partner assessment: http://www.ccph.info/depts.washington.edu/ccph/pdf_files/tools-partner.pdf
- Methods and strategies for faculty assessment and reflection: http://www.ccph.info/depts.washington.edu/ccph/servicelearningres.html
- Methods and strategies for institutional assessment: http://www.ccph.info/depts.washington.edu/ccph/pdf_files/tools-institution.pdf
- Self-assessment tool for service-learning sustainability: http://www.ccph.info/depts.washington.edu/ccph/servicelearning.html
The CCPH website also includes a service-learning bibliography with a section on evaluation methods: http://ccph.info/depts.washington.edu/ccph/servicelearningres.html
The Compendium of Assessment and Research Tools
The Compendium of Assessment and Research Tools (CART) is a database that provides information on instruments that measure attributes associated with youth development programs. CART includes descriptions of research instruments, tools, rubrics, and guides and is intended to assist those who have an interest in studying the effectiveness of service-learning, safe and drug-free schools and communities, and other school-based youth development activities.
Corporation for National and Community Service (CNCS) and Project STAR
Project STAR, the Corporation's contractor for assessment and evaluation, provides resources for data collection, as well as information about assessment instrument development.
The Corporation for National and Community Service Resource Center
The Resource Center is a clearinghouse of more than 5,000 materials designed to strengthen national service and volunteer programs – including free downloadable forms, documents, and tools; a specialized collection of books, videos, and other publications on loan; the Effective Practices Collection, now 800 strong; a growing catalog of self-paced online courses; and more. It’s a one-stop website with content generated by the CNCS network of training and technical assistance providers and experts to serve your program needs.
International Association for Research on Service-Learning and Community Engagement (IARSLCE) holds an annual research conference from which the Advances in Service-Learning Research series is produced. A list of titles in the series may be found at http://www.researchslce.org/publications/.
The National Service-Learning Clearinghouse
The National Service-Learning Clearinghouse links section includes additional links on Assessment and Evaluation (some overlap with this Fact Sheet), as well as a Research Primer to assist scholars of many types at all stages of the research process.
NSLC also has a fact sheet entitled "The Evidence Base for Service-Learning in Higher Education," which provides information relevant to service-learning evaluation as well.
The National Society for Experiential Education
The National Society for Experiential Education website includes a list of publications for sale, including the Program Evaluation Handbook by Robert Serow (written specifically to introduce experiential educators to the basic elements of program evaluation) and the Combining Service and Learning: A Resource Book for Community and Public Service series, which includes a section on evaluation. NSEE members have access to additional online resources and can join a Special Interest Group on Assessment, Evaluation and Research.
RAND
RAND is a nonprofit institution that helps improve policy and decision-making through research and analysis. The RAND website includes several evaluation reports that explain the methods and tools used in their research, such as the 1999 report “Combining Service and Learning in Higher Education: Evaluation of the Learn and Serve America, Higher Education Program,” which presents results of a three-year evaluation of LSA-funded service-learning.
The Shumer Self-Assessment for Service-Learning (PDF, 122K)
The Shumer Self-Assessment for Service-Learning (SSASL) is designed as a self-reflective system for professionals in the service-learning and experiential learning fields. Although most applicable to K-12 service-learning programs, it offers a series of instruments and analysis worksheets arranged to help individuals evaluate and improve their current service-learning programs. (Copyright December 2000 by the Center for Experiential and Service-Learning, Department of Work, Community, and Family Education, College of Education and Human Development, University of Minnesota, St. Paul, Minnesota).
VALUE (Valid Assessment of Learning in Undergraduate Education) Project
The VALUE project, sponsored by the Association of American Colleges and Universities, is developing rubrics for evaluation of e-portfolios and other student work related to the essential learning outcomes defined in the LEAP (Liberal Education and America’s Promise) initiative. These outcomes include civic knowledge and engagement, critical thinking, ethical reasoning and action, intercultural knowledge and competence, problem solving, and teamwork. Campuses can sign up to test and provide feedback on the draft rubrics; the final versions will be publicly available in September 2009.
The W. K. Kellogg Foundation
The W. K. Kellogg Foundation (WKKF) supports children, families, and communities as they strengthen and create conditions that propel vulnerable children to achieve success as individuals and as contributors to the larger community and society. Its website offers the WKKF Evaluation Handbook and a variety of other useful publications, including:
- Methods of Assessing the Quality of Public Service and Outreach in Institutions of Higher Education
- How Service Works, a Kellogg evaluation of higher education service programs funded between 1985 and 1995
Bringle, Robert, et al. 2004. The Measure of Service Learning: Research Scales to Assess Student Experiences. Washington, DC: American Psychological Association.
Gottlieb, Karla and Robinson, Gail. 2006. A Practical Guide for Integrating Civic Responsibility into the Curriculum. Washington, D.C.: Community College Press.
This curriculum guide from the American Association of Community Colleges (AACC) includes useful tools and examples for assessing civic outcomes according to a range of stated objectives, including a student pre/post test and a community partner evaluation. Note that this is a large (94-page) PDF document.
Palomba, Catherine A., and Banta, Trudy W. 1999. Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco, CA: Jossey-Bass Publishers.
More Assessment Resources
The Center for Information and Research on Civic Learning and Engagement (CIRCLE) conducts and shares research on the civic and political engagement of young people (ages 15-25) and offers assessment tools.
Assessment tools from the American Democracy Project
The Higher Education Research Institute (HERI) at UCLA has conducted an annual survey of college freshmen since 1966, including research specifically on the effects of service-learning.
National Survey of Student Engagement obtains, on an annual basis, information from hundreds of four-year colleges and universities nationwide about student participation in programs and activities that institutions provide for their learning and personal development. The results provide an estimate of how undergraduates spend their time and what they gain from attending college. Survey instruments may be found at http://nsse.iub.edu/html/survey_instruments_2010.cfm
