Posted by John Kleeman, Executive Director and Founder
There is a famous quote from a few years back: “The half-life of a learned skill is 5 years”. This suggests that half of what you learned 5 years ago is obsolete. I suspect that skills are currently changing even faster, with technology advances and the pandemic. Much of what we learned 2 or 3 years ago has changed or is obsolete, and the world continues to change rapidly. So, what skills will we need in the future, and what does this mean for assessment?
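To make the half-life idea concrete, here is a small illustrative sketch of the exponential decay it implies. The five-year figure comes from the quote above; the decay model itself is just the standard half-life formula, not anything from the skills research.

```python
# Illustrative only: treat skill currency as exponential decay,
# using the quoted five-year half-life as an assumption.
def remaining_fraction(years: float, half_life: float = 5.0) -> float:
    """Fraction of a learned skill still current after `years`."""
    return 0.5 ** (years / half_life)

for years in (2, 5, 10):
    print(f"After {years} years: {remaining_fraction(years):.0%} still current")
# After 5 years, 50% remains; after 10 years, only 25%.
```

On this model, even two years is enough for roughly a quarter of a skill set to go stale, which is why continuous learning (and continuous assessment) matters.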
One of the best reports I’ve seen on skills is the recent Future of Jobs report from the World Economic Forum. This suggests that the top six skills for work in 2025 will be:
- Analytical thinking and innovation
- Active learning and learning strategies
- Complex problem solving
- Critical thinking and analysis
- Creativity, originality and initiative
- Leadership and social influence
This makes some sense – certainly when I interview people for job roles, I’d love it if they had all of these. The report also includes the graph below on the relative importance of different skill groups. As you can see, it shows critical thinking/analysis and problem solving as the two fastest-growing skill groups. Essentially, future workers will need to be good thinkers and problem solvers.
The question I’ve been asking myself is, as the skills needed in the workplace adjust, how do we use assessments to best measure these skills? I don’t have all the answers, but here are some ideas.
1. Test above knowledge.
For most job roles, facts are only a small part of job knowledge, and people who know things but cannot understand, apply and synthesize them into work-relevant actions are much less competent and useful than those who can. So work-related tests should mostly test “above knowledge”. There are many ways to test beyond recall in systems like Questionmark; for how to do this, you can see a recording of a past webinar, “Beyond Recall: Taking Competency Assessments to the Next Level”.
2. Test critical thinking and problem solving directly.
There is longstanding research suggesting that results of general mental ability tests (often in the form of “verbal reasoning” or “numerical reasoning” tests) correlate well with job performance. But there is clearly a difference between simple reasoning and more analytical thinking. We’ve recently partnered with Cambridge Assessment (a division of Cambridge University in the UK) to adapt their Thinking Skills Assessment, originally designed for university admissions, for corporate pre-hire and development use as “Questionmark Thinking Skills”. Our test genuinely measures critical thinking and problem solving, and this could be useful. See https://www.questionmark.com/platform-services/qm-thinking-skills/ for more information.
3. Consider testing for data literacy.
With the huge volumes of data available now, the ability to analyze data and interpret meaning from it is becoming more important than simple numeric ability. I’m seeing increasing interest in testing for “data literacy”, which is the ability to read, work with, analyze and argue with data. Questionmark is looking at providing assessments in this area – watch this space.
4. Observational assessments.
In an observational assessment, an instructor or supervisor watches someone perform a practical task and rates it (see our blog on Scheduling Observational Assessments). Although these are often used to assess practical tasks like operating a machine or performing a medical procedure, they can also be used to measure interpersonal skills (e.g. a salesperson working with a prospect) or other real-world activities. Observational assessments provide a flexible way of fairly measuring almost any skill.
5. Situational judgment.
Finally, consider situational judgment assessments – these present a dilemma in a question that participants must answer by exercising judgment as to the right choice. Situational judgment questions can help measure people’s choices in real-life situations, which are often ambiguous and where you can’t just follow the rule book but need to use judgment. They could be a good way to measure leadership, initiative and other interpersonal skills. I think they have a lot of mileage for wider use, especially in certification and post-hire assessment. Questionmark has a white paper, “Assessing for Situational Judgment”, that explains a lot about situational judgment questions and how to create them.
I hope these ideas may spark some inspiration or ideas as we adapt to new circumstances. For information on how Questionmark can help your organization assess the skills of the future, request a demo today.
John is the Founder of Questionmark. He wrote the first version of the Questionmark assessment software system and then founded Questionmark in 1988 to market, develop and support it. John has been heavily involved in assessment software development for over 30 years and has also participated in several standards initiatives.