
Stage 2

Develop an Assessment Strategy

  1. Identify Assessment Needs
    • What are you trying to measure or understand? Everything from artifacts of student learning to program efficiency to administrative objectives.
    • Is this skill or proficiency a cornerstone of what every graduate in my field should know or be able to do?
  2. Match Purpose with Tools
    • What type of tool would best measure the outcome (e.g., assignment, exam, project, or survey)?
    • Do you already have access to such a tool? If so, where and when is it collected?
  3. Define Use of Assessment Tool
    • When and where do you distribute the tool (e.g., in a capstone course right before graduation)?
    • Who uses the tool (e.g., students, alumni)?
    • Where will the participants complete the assessment?
    • How often do you, or will you, use the tool (e.g., every semester or annually)?

Measures

Direct & Indirect Measures of Student Learning

Direct measures:
  • Standardized exams
  • Exit examinations
  • Portfolios
  • Pre-tests and post-tests
  • Locally developed exams
  • Papers
  • Oral presentations
  • Behavioral observations
  • Thesis/dissertation
  • Simulations/case studies
  • Videotaped/audiotaped assignments
Indirect measures:
  • Surveys or questionnaires (How to develop surveys)
  • Student perception
  • Alumni perception
  • Employer perception
  • Focus groups
  • Interviews
  • Student records

Program and Administrative Efficiency Tools

  • Database information
  • Surveys or questionnaires
  • Focus groups
  • Interviews
  • Student records
  • Tracking logs
  • Quantities of the item/service measured
  • Timelines

When, Where, and How Often?

When measuring outcomes, it is essential to have a clear plan for where, when, and how often the assessment tools will be used to collect data. You can collect data at three levels: the course, the program, and the institution. Each level produces different types of data and affects the quality and specificity of what you can collect and analyze.

Example:

Course Level:
  • Essays
  • Presentations
  • Minute papers
  • Embedded questions
  • Pre-post tests
Program Level:
  • Portfolios
  • Exit exams
  • Graduation surveys
  • Discipline-specific national exams
Institution Level:
  • Institutional research data (e.g., in-house surveys, focus groups, databases)
  • National surveys
  • Standardized national exams

Articulating an Assessment Methodology

Assessment Instrument:
  • When assessing learning outcomes: what student work will be used?
  • When assessing program or administrative outcomes: what tools will you use to measure the outcome?
  • Describe the assessment activity/artifact/student work and state when and how it will be completed/collected (e.g., during senior year in the capstone course).
Assessment Procedures:
  • How will you collect and evaluate the artifacts?
  • Explain how the assessment activities/artifacts/student work will be evaluated.
  • Who will participate in the assessment process if using a rubric or checklist?
  • If you are using a rubric, describe the rubric and state how many members will be on the review/evaluation panel.
Sampling:
  • How many artifacts will be assessed?
  • Clearly state your sample size and sampling technique, or explain that you are assessing all of the students (see the sketch below).
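
For instance, simple random sampling can be documented and reproduced with a few lines of code. The following is a minimal illustrative sketch, not part of any FIU system: the names artifact_ids and SAMPLE_SIZE, the seed, and the counts are all hypothetical.

    # A minimal sketch of simple random sampling (names and numbers are hypothetical).
    import random

    artifact_ids = [f"ART-{n:03d}" for n in range(1, 121)]  # e.g., 120 capstone papers
    SAMPLE_SIZE = 30                                        # state this number in your plan

    random.seed(2024)  # fix the seed so the sample can be reproduced and documented
    sample = random.sample(artifact_ids, SAMPLE_SIZE)
    print(f"Reviewing {len(sample)} of {len(artifact_ids)} artifacts")

Recording the seed alongside the sample size makes the selection auditable from one assessment cycle to the next.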
Minimum Criteria for Success:
  • What score or rating do you expect students to achieve in order to demonstrate they have learned the outcome for this activity/artifact?
  • If you are using a rubric, then you could state: "Students will achieve a 3 or better on a 4-point rubric."
  • If you are using a test, then you could state: "Students will answer 75% of the test items (15 out of 20) correctly."
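
Once scores are collected, the results can be checked against the stated criterion directly. The sketch below is a hypothetical illustration: the scores, the 3-or-better threshold, and the 70% target are assumptions, not FIU requirements.

    # A minimal sketch of checking rubric scores against a minimum criterion.
    scores = [4, 3, 2, 3, 4, 3, 1, 3, 4, 2]  # hypothetical: one rubric score per artifact

    THRESHOLD = 3   # "3 or better on a 4-point rubric"
    TARGET = 0.70   # hypothetical share of students expected to meet the threshold

    met = sum(score >= THRESHOLD for score in scores)
    rate = met / len(scores)
    print(f"{met}/{len(scores)} artifacts ({rate:.0%}) met the criterion")
    print("Criterion met" if rate >= TARGET else "Criterion not met")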

Continue to Stage 3