Performance testing versus knowledge testing

30 Jan 2013

Art Stark is an instructor at the United States Coast Guard National Search and Rescue School – and a longtime Questionmark user.
He will team up with James Parry, Test Development/E-Testing Manager at the Coast Guard’s Performance Systems Branch, to share a case study at the Questionmark Users Conference in Baltimore, March 3–6.
I’m looking forward to hearing about the Coast Guard’s progress in moving from knowledge-based tests to performance-based tests. Here’s how Art explains the basic ideas behind this.
Tell me about your experience with performance-based training at the Coast Guard.

All Coast Guard training is performance-based. At the National Search and Rescue School we’ve recently completed a course rewrite and shifted further from knowledge-based assessments toward performance-based assessments. Before coming to the National SAR School, I was an instructor and boat operator trainer on Coast Guard small boats. Everything we did was 100% performance-based: the boat was the classroom, and we had standards and objectives we had to meet.
How does performance testing differ from knowledge testing?
To me, knowledge-based testing is testing to the lowest common denominator. All through elementary and high school we have been tested at the knowledge level and only infrequently at a performance level. Think of a test you crammed for: as soon as the test was over, you promptly forgot the information. Most of the time, that test was measuring knowledge alone.
Performance testing means actually observing and evaluating the performance while it is occurring. Knowledge testing is relatively easy to develop; performance testing is much harder and much more expensive to create. With budget reductions, it is becoming harder and harder to develop the types of facilities we need for performance testing, so we need to find new, less expensive ways to test performance.
It takes a much more concerted effort to develop knowledge-application test items than simple knowledge test items. When a test is geared to knowledge only, it does not give the evaluator a good assessment of the student’s real ability. An example would be applying for a job as a customer service representative: the interview often includes questions that actually test the application of knowledge, such as “You are approached by an irate customer; what actions do you take…?”
How will you address this during your session?
We’ll look at using written assessments to test performance objectives, which requires creating test items that apply knowledge instead of just recalling it. Drawing on Bloom’s Taxonomy, I focus on the third level, application. I’ll be showing how to bridge the gap from knowledge-based testing to performance-based testing.
What would you and Jim like your audience to take away from your presentation?
A heightened awareness of using written tests to evaluate performance.
You’ve attended many of these conferences. What makes you return each year?
The ability to connect with other professionals and increase my knowledge and awareness of advances in training, and the chance to meet and spend time with good friends in the industry.
Check out the conference program and register soon.
