Six things I learned at the ATP Global Virtual Conference last week

Posted by John Kleeman, Founder and Executive Director

The Association of Test Publishers (ATP) has several conferences and face-to-face meetings each year, including in the US, Europe and India. This year, due to Covid-19, these were replaced with a single virtual, global conference, which happened last week. The conference was huge: over 200 sessions in five days, many in parallel, and over 1,100 registrants. Sessions started at 5am Eastern and ended around 11pm Eastern.

Questionmark is a longstanding member of the ATP; I’m a director and will be its chair in 2021. My Questionmark colleagues and I presented five sessions at the conference:

  • “How SAP Has Digitally Transformed Their Certification Program” (jointly with our customer SAP);
  • “Delivering Occupational Assessments under ISO 10667: Gaining an International Advantage” (jointly with experts from Sweden and the US);
  • “If a Computer Algorithm Detects a Security Anomaly, What Reactions Are Appropriate?” (jointly with Caveon);
  • “ATP Guidance on Video Surveillance. What is Safe? What is Sensible? What is Unwise?” (jointly with an expert from Prometric and ATP general counsel Alan Thiemann);
  • “Best Practices for Translating and Adapting Computerized Tests and Exams in the 21st Century” (jointly with cApStAn).

I attended as many other sessions as I could, and here are some things I learned.

1. A virtual conference can be valuable.

I look forward to the return of in-person events, but virtual conferences do work. The two most successful session formats were:

  • Recorded video presentations with live Q&A. The presenters pre-record a 30-minute video, which is played and then followed by a live 10-minute Q&A session. This gives a good mix of prepared content and live immediacy.
  • Coffee conversations, where a group of people gets together in a moderated Zoom call to discuss a single subject (e.g. accessibility issues in remote proctoring, situational judgement questions). The nice thing about these is that you get to meet and talk to peers.

2. Digital transformation in assessment is advancing very rapidly.

In many countries and sectors, tests that used to be delivered on paper have moved online this year because of Covid-19. And tests that used to make people travel to physical test centers are now offering remote options – where a test taker is observed remotely by a proctor over video.

3. As we digitally transform, equity and access for all are critical.

I chaired a panel with CEOs or senior leaders of major assessment organizations in the US, UK and Japan, and this point came out clearly from that and other sessions. As we make assessments more digital, it’s essential to think about people who don’t have access to computers or sufficient bandwidth, or who have accessibility needs. Assessment software needs to be robust and able to deal with connection failures, and we need to think through accessibility accommodations in remote assessments. For example, one issue we need to address is how to handle people who, for accessibility reasons, need “scribes” to type for them when taking a remotely proctored test in their home.

4. Artificial intelligence (AI) in assessment is much talked about but currently more artificial than intelligent.

Many vendors claim to use AI, for example to detect cheating by analyzing video data, but most systems seem algorithmic and mechanistic rather than genuinely using AI techniques such as machine learning or natural language processing. I’m sure this will improve.
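To make that distinction concrete, here is a minimal, purely hypothetical Python sketch (not any vendor’s actual system; the feature names and thresholds are invented for illustration). The first function is the kind of fixed-threshold rule that often gets marketed as “AI”; the second trains a genuine machine-learning model on labelled past sessions, assuming scikit-learn is available.

    from dataclasses import dataclass

    @dataclass
    class SessionFeatures:
        face_absent_seconds: float    # total time no face was detected on camera
        multiple_faces_events: int    # frames where more than one face appeared
        gaze_off_screen_ratio: float  # fraction of frames with gaze away from screen

    def rule_based_flag(f: SessionFeatures) -> bool:
        # "Algorithmic/mechanistic": fixed thresholds chosen by hand.
        return (f.face_absent_seconds > 30
                or f.multiple_faces_events > 0
                or f.gaze_off_screen_ratio > 0.2)

    # A genuinely machine-learned approach instead fits a model to labelled
    # past sessions (scikit-learn assumed installed):
    from sklearn.linear_model import LogisticRegression

    def train_ml_flagger(X, y):
        # X: rows of [face_absent_seconds, multiple_faces_events, gaze_off_screen_ratio]
        # y: 1 if the session was later confirmed as misconduct, else 0
        model = LogisticRegression()
        model.fit(X, y)
        return model  # model.predict_proba(new_rows) gives a risk score, not a hard rule

The learned model returns a risk score rather than a hard pass/fail rule, which is closer to what machine learning genuinely offers in this area.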

5. Biometrics are promising for authentication but not viable for privacy reasons in many markets.

Facial recognition and other biometrics offer a good technical approach to identifying test takers, but they have practical challenges. In Europe, facial recognition is only legal if the test taker gives genuine consent, and a growing number of US states and localities are restricting its use.

6. There is a balance between convenience and security in remote proctoring.

The best-executed session I saw at the conference was a clever demonstration by Chris Foster and Andrew Marder of Caveon of how well one could hide cameras to take pictures of the screen when taking tests. Clearly this poses a risk to the security of test content.

But although security is vital, so are dependability and convenience for test takers. Another speaker shared that they planned for a 10% failure rate in online proctoring due to connectivity or technical issues. It’s critical that remote tests work dependably and that candidates find the process convenient. One important area for all of us to improve is making assessment software work with whatever systems test takers have at home. Software must also be reliable and robust in the event of connection issues.
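As one illustration of what “robust in the event of connection issues” can mean in practice, here is a minimal Python sketch of saving answers with retries and exponential backoff. The endpoint URL is hypothetical, and this is just one possible approach, not a description of any particular product.

    import time
    import requests  # third-party HTTP client, assumed installed

    SAVE_URL = "https://assessment.example.com/api/answers"  # hypothetical endpoint

    def save_answer_with_retry(answer: dict, max_attempts: int = 5) -> bool:
        """Try to store an answer server-side, backing off when the connection drops."""
        delay = 1.0
        for attempt in range(max_attempts):
            try:
                response = requests.post(SAVE_URL, json=answer, timeout=5)
                if response.status_code == 200:
                    return True  # answer safely stored server-side
            except requests.RequestException:
                pass  # connection dropped; fall through, back off and retry
            time.sleep(delay)
            delay *= 2  # exponential backoff: wait 1s, 2s, 4s, ...
        return False  # caller should queue the answer locally and retry later

The key design point is that a dropped connection never loses the candidate’s work: the client keeps the answer and keeps trying, rather than failing the whole test.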

I hope this article shares a little of the buzz of the conference for those of you who couldn’t attend.

Please contact us with your questions or comments.