Vanderbilt University Assessment Website

Best Practices for Assessment

Process

Goal-directed

  • Good assessment always begins with the specification of clear, specific, and measurable objectives based on a program’s goals. Assessment involves translating those goals into a series of objectives that can be clearly and precisely stated.

Be minimally invasive.

  • Any assessment process should carefully consider both the needs of the program as well as how the process will affect the participants. Ideally, any assessment procedure will be as minimally intrusive and invasive to the participants as possible.
    • Try not to ask participants to provide information that is readily obtainable from other sources like student or personnel records.
    • When acquiring such data for assessment purposes, be respectful of individuals’ privacy and be professional and responsible with all collected data. Review all measures and data collected to ensure that they are necessary for the assessment process (i.e., only ask personal questions that are critical to the assessment objectives).

Evaluate the assessment process.

  • It is important to evaluate the assessment process itself. The purpose of assessment is to produce results that can be used to improve a program or to answer a specific question.
    • Evaluate an assessment process by asking how the results were used, or how they will be used.
    • Be certain to consider how often assessment activities need to be repeated in order to accurately reflect the changes made in the program, as well as changes in the population to which the assessments are directed.
    • Finally, always be on the lookout for better and more effective ways to implement an assessment process.

Methods

Direct and indirect measures.

  • Assessment measures are usually divided into two broad classes: direct and indirect measures.
    • Direct measures are those in which actual behavior is observed or recorded and the measure is derived from that observation.
    • Indirect measures include those in which participants report their attitudes, perceptions, or feelings about something, usually in the form of a survey.
    • Direct measures are generally preferable for the assessment of specific objectives, but some objectives may only be measurable with more indirect methods.

When developing assessment items, try not to “reinvent the wheel.”

  • Before conducting the assessment, research the topic to determine what work has already been done.
    • Research into the area might reveal one or more published instruments already available for use.
    • Make sure that others are not already collecting (or have not already collected) the data you want.

When constructing an assessment measure be aware of semantic issues.

  • Use inclusive and culturally sensitive language.
  • Keep assessment questions simple and to the point.
  • Avoid leading or “double-barreled” questions, which produce responses that are not focused on the relevant issue.

Pilot test when possible.

  • It is generally best to pilot test survey and interview questions so that any confusion about how a question is interpreted can be discovered and corrected before full deployment.

Keep the length of the assessment measure as short as possible.

  • Be cognizant of survey, focus group, and interview length. Surveys or focus-group interviews that are too long lose their effectiveness and result in higher nonresponse or incomplete-response rates.

Minimize over-surveying.

  • Try to avoid “over-sampling” or over-surveying the population.
  • Be aware of other assessment activities occurring at the same time.
  • Sample a proportion of the population instead of the entire population.
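For programs that draw survey samples programmatically, a simple random sample of a fixed fraction of the population can be sketched as follows. This is a minimal illustration using Python's standard library; the student roster and the 20% fraction are hypothetical, not values prescribed by this guidance.

```python
import random

def draw_sample(roster, fraction, seed=None):
    """Return a simple random sample covering `fraction` of the roster."""
    k = max(1, round(len(roster) * fraction))
    rng = random.Random(seed)  # seeding makes the draw reproducible
    return rng.sample(roster, k)

# Hypothetical roster of student IDs; survey 20% instead of everyone.
roster = [f"student_{i:03d}" for i in range(200)]
sample = draw_sample(roster, 0.20, seed=42)
print(len(sample))  # 40 of 200 students
```

Drawing a fresh random sample for each assessment cycle, rather than reusing the same respondents, also helps spread the survey burden across the population.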

Check with IRB to ensure your assessment activities are in compliance.

  • The IRB (Institutional Review Board) protects the rights of human subjects who participate in research, ensuring that they are not harmed in any way by their participation.
  • Anyone conducting human-subjects research whose results will be made public must have the research approved by the IRB before the research begins.

Results

Interpretation of the data

  • The results should be considered in the greater context of the program and the participants. Be mindful of external factors that may have influenced the results. Where possible, benchmark your results against other institutions or programs.

Reporting of the data

  • Report the data in summarized form to as wide an audience as possible. When reporting the results, describe the methodologies employed, include information about response rates where relevant, and describe the data analyses used as well as any limitations. Be certain that individual responses are kept confidential and that such security and privacy is assured by any external vendors employed in the data collection process.
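The response rate mentioned above is simply the number of completed responses divided by the number of invited participants. A minimal sketch, with hypothetical counts:

```python
def response_rate(completed, invited):
    """Fraction of invited participants who completed the assessment."""
    if invited == 0:
        raise ValueError("no participants were invited")
    return completed / invited

# Hypothetical example: 128 completed surveys out of 400 invitations.
rate = response_rate(128, 400)
print(f"{rate:.0%}")  # 32%
```

Reporting the rate alongside the raw counts (128 of 400) lets readers judge for themselves how representative the results are.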

Use of data

  • Once assessment results are collected, the most important step involves using the data to evaluate how well the program is meeting its stated goals.
    • Evaluate the results to ensure that they answer questions about how well the program’s objectives are being met.
    • Based on the data, decide what improvements might be made to the program to help it better meet its goals and find ways to implement those changes.
