
Ep. 38 – Assessment Luminary, Mark Lynch, Learnosity

14 Jul 2023

Host John Kleeman is joined by Mark Lynch, co-founder of Learnosity, a leading assessment technology company. Mark shares his accidental entry into the field of education and assessment, starting with his work on an assessment engine for the New South Wales Board of Education. He discusses the growth of Learnosity, highlighting their focus on solving hard problems and their early success in delivering highly accessible assessments.

Mark emphasizes the importance of tackling the hardest problems in assessment, as solving them paves the way for addressing other challenges. He also talks about Learnosity’s evolution and their work with governments and schools worldwide to deliver equitable assessments.

The conversation delves into the future of assessment, particularly the role of AI in personalization and content creation. Mark acknowledges the potential of AI to improve accessibility and productivity in assessment, but also highlights the challenges associated with evolving standards and conflicting accessibility needs.

He offers advice for those starting out in ed tech and much more!

Full Transcript

John Kleeman:

Hello everyone and welcome to Unlocking the Potential of Assessments, the show that delves into creating, delivering, and reporting on fair and reliable assessments. In each episode, we chat with assessment luminaries, influencers, subject matter experts and customers to discover and examine the latest in best practice, guidance and all things assessment. I’m your host, John Kleeman, founder of Questionmark and EVP of Industry Relations and Business Development at Learnosity, the assessment technology company.

Today, we’re really pleased to welcome Mark Lynch. Mark is a co-founder of Learnosity. He’s got a degree in electronic engineering from University College Dublin. Mark’s originally from Ireland, but has worked in Sydney, Australia for over 20 years. Prior to founding Learnosity, Mark was a software developer at a major accounting firm and at an Australian e-commerce platform.

Hi, Mark. I should say good evening, though. I think it’s morning for you. It’s close to midnight in England, but breakfast time in Australia.

Mark Lynch:

Absolutely. Good to see you, John, and nice to be here.

John Kleeman:

The question I ask everybody is how did you get into assessment?

Mark Lynch:

Okay. Completely by accident.

I remember back in the early 2000s, maybe 2003, I met Gavin Cooney through a web standards group. He was doing some work with the New South Wales Board of Education, and I ended up helping him out with some coding for some assessments.

I had no background in education, just a background in technology and software development, so I helped to build basically an assessment engine back in the day, which we scaled up from there; it was basically practice for the School Certificate and HSC and would do multiple choice questions. So that was the kind of accidental entry into education and assessment.

John Kleeman:

How did Learnosity come about and grow?

Mark Lynch:

Yeah. My perspective on that is, having met Gavin, I literally started doing 10 hours a week of contracting. This was right after my son, the first of my children, was born, and it was literally consulting work to make ends meet. So I started doing assessment, as I mentioned, multiple choice assessment.

As we were doing that, at the same time, New South Wales was looking to do the School Certificate Computing Skills Test. Of the roughly 70,000 students, another team had built a platform that handled it for 65,000, but for any students who had what are called special provisions, so they needed accessibility, large fonts, different color backgrounds, things like that, and there were about 5,000 of those, the platform couldn’t handle it. So they were going to have to go back and, ironically, do the Computing Skills Test on paper. We were tasked with that problem and built a platform that would allow us to deliver highly accessible assessment.

Ironically, at the time that was on Flash, which later became the least accessible piece in the world, but back then it was highly accessible because we had the ability to control the graphics, scale the fonts, scale the images, recolor stuff, things that just were not available in the web of early 2004.

John Kleeman:

I know I’ve talked to you before and I think you think that one of the reasons for Learnosity’s success is that you solved hard problems. Can you tell me about that?

Mark Lynch:

Yeah, I think that’s probably a key point and, again, that one is a good example of it, where it was easy to solve it for the 65,000 people who were fully able to use a computer. To solve it for the 5,000 who have challenges, that’s the hard problem.

And from us solving that hard problem the first time, after we successfully delivered that, the next year they kind of turned around to us and said, “So that one that you’ve built, when you turn off all the accessibility provisions in it, it looks a lot like the,” let’s call it the real one, “the main assessments.” We’re like, “Yeah, and it could do the whole thing.” So the next year we ran the entire system for all 70,000 students.

But yeah, I think that was very much a key lesson: tackle the hardest problem, because all of the other problems then become easy after that. And it’s a pretty standard piece now in development, tackle the hard problem and the easy ones follow. I think there’s probably more, I could keep going; there’s more later in the story as we evolved.

John Kleeman:

I mean, what other kinds of hard problems are there in assessment that we’ve solved, or you’ve solved?

Mark Lynch:

Yeah. Well, actually let me talk through our story and how it evolved from there.

Having built this assessment platform, we then went back and approached other governments. We’d have governments come in to the New South Wales government and say, “How did you do this?” And we’re like, “Well, we just kind of built it.” So we went and approached other governments to see if they wanted to do it.

Back in Ireland, the Irish government, they were like, “Yeah, we really like what you’ve done there, but we’re not ready to go and do that yet. But what we’d really like to do is help keep the Irish language alive and improve the teaching of that.”

So what we looked at then, and again, you’ve got to remember this is back in, say, 2005, 2006, when the state of computers in Irish schools was not fantastic, was building a phone system using Asterisk that hooked into the existing assessment engine. You would dial in on a mobile phone, a Nokia mobile phone, put in a code, get asked questions and respond as you would on a phone. Again, that was one of those hard problems: deliver with the right media and the right tools, because we knew at the time we wouldn’t be able to do that on computers.

Probably a key piece around that as well, and this is back to the RPs, is that we were given an option of two schools to work with at the time. One was, let’s call it a normal school, called [inaudible 00:06:43], and the other was down the road, a couple of miles, the Microsoft School of the Future. So we had this choice: which school do we want to go with? We picked [inaudible 00:06:56], and I’m delighted we did, because it was a much harder problem. Proving it in the Microsoft School of the Future, with all the resources, with all of the technology, wouldn’t have actually proved anything. Whereas proving it in a regular school with regular resources really showed us how we could move it forward.

Really proud of that project as well because it brought students up two grade levels in kind of six to 12 weeks, which was amazing.

John Kleeman:

I think what you’re suggesting is if you’re working in ed tech, don’t go for the easiest solution, go for the hard solution. Because if you do the hard solution, you’re more likely to be able to scale it up and make it more widely used?

Mark Lynch:

I think so, yeah. Again, that’s not necessarily go for the technically hard, just choosing the hard road for its own sake. That’s not it. But it is choose the, let’s call it, higher mountain. And if you can scale that, you know all the other ones are very achievable. So yeah, focus on the appropriate hard challenges.

John Kleeman:

Well, I mean, that’s an interesting angle because I think a lot of people nowadays sort of go for an MVP, which is sort of an easy thing to do, but I think what you’re suggesting is that that isn’t always the best way to go.

Mark Lynch:

Well, I think part of it is understanding what the MVP is. It’s more a subtlety around that. After you’ve delivered your MVP, what are the next challenges after that? And yeah, proving the minimum piece of it is valuable, but look at the hard problem and then choose what piece you’re actually trying to prove.

John Kleeman:

Talking about hard problems, one of the big challenges with assessments online is the sort of scalability challenge. You’ve got thousands of people taking a test at the same time, maybe even starting a test at the same time. Do you want to tell us a little bit about that kind of scalability challenge and how you’ve perhaps in the past solved that?

Mark Lynch:

Yeah, yeah. Let me loop back on that. After doing that audio assessment stuff I mentioned earlier on the phones, we actually ended up moving back to computers, as sound cards had not been standard on every machine back then. They’ve obviously since become standard, and that allowed us to make it a lot more scalable.

We were also looking at the history we’d learned from New South Wales, where doing literally once-a-year exams was the most stressful two days of the year. It’s a bit like the Olympics: every four years you’ve got a 10-second sprint, and if you mess that up, it’s another four years. It wasn’t quite that bad, but it was the same, 364 days of practice for one day. And what we found was that it was challenging, it was hard, so we actually decided to focus on: if it’s hard, do it a lot.

We ended up focusing Learnosity on the education use case, assessment in education, so that we’d get practice on it every day. So every day of the year when students are using our system, we’re getting [inaudible 00:09:58]. When I think back, we would be excited when we’d do a million assessments in 30 days, and now we do a million assessments in 30 minutes; it’s just changed and evolved. And it’s that practice of doing it, getting a lot of practice and continuing to learn and continuing to evolve. Any system will require re-engineering for every order of magnitude increase in scale. So we continue to re-engineer our systems, learn what the bottlenecks are and work around them. A big piece of being able to scale is to have lots of small pieces that can scale out.
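
To make the “lots of small pieces that can scale out” idea concrete, here is a minimal TypeScript sketch, not Learnosity’s actual architecture: a stateless web tier that only validates and enqueues student responses, with a separate worker draining the queue at its own pace. The /responses endpoint and the in-memory queue are illustrative assumptions; a real deployment would use a durable, external queue service.

```typescript
// Minimal sketch of a stateless web tier plus an independent worker.
// Because the web tier holds no per-student state, any number of copies
// can be run behind a load balancer when exam-day traffic spikes.
import * as http from "http";

interface ResponseEvent {
  sessionId: string;
  questionId: string;
  answer: unknown;
  receivedAt: number;
}

const queue: ResponseEvent[] = []; // placeholder for a durable, external queue

// Front end: validate, enqueue, acknowledge. Nothing heavy happens here.
const server = http.createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/responses") {
    res.writeHead(404).end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    try {
      const { sessionId, questionId, answer } = JSON.parse(body);
      queue.push({ sessionId, questionId, answer, receivedAt: Date.now() });
      res.writeHead(202, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ accepted: true }));
    } catch {
      res.writeHead(400).end();
    }
  });
});

// Separate "small piece": a worker that scores/stores responses asynchronously,
// so a 9:00 AM spike only lengthens the queue instead of overloading the web tier.
setInterval(() => {
  const event = queue.shift();
  if (event) {
    console.log(`processing ${event.sessionId}/${event.questionId}`);
  }
}, 100);

server.listen(3000);
```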

John Kleeman:

What are some of the challenges of that sort of everybody starting a test at the same time?

Mark Lynch:

Some of the big challenges, and you see this very much in the migration from paper exams, where the logistics of a paper exam are to get everybody in a hall and do it at the same time, and as long as you have capacity for that, that’s fine.

Doing that online actually works against the general scalability of a system. We’ve seen so many projects around the world that have had this challenge of everybody bottlenecking at once. Look at the current issue where Taylor Swift touring has taken down websites all over the world when people go to try and buy tickets.

But, yeah. What we’ve seen on that is to continually, iteratively work through to find where the bottlenecks are. So, tips on that: cache everything as much as you can. Clearly you can’t cache everything, because you’ve got to balance that with security, and it’s not just the capacity of the system; the Learnosity system is only one part of the capacity. Where we’ve got involved is in understanding, the whole way through, what the weakest link in the chain is. A lot of that sits outside of us, as in internet providers.

We actually did a project in Egypt where it wasn’t just the… it was literally the internet provider, the Egyptian internet backbone, getting clogged by some of the assessments. We were seeing the entire country’s network slow down because we had 600,000 students starting an exam at 9:00 AM. So we kind of iteratively continued to work on it. A lot of our stuff was in there and just fine, but it was about splitting that out, separating it out to different areas, having multiple paths, getting stuff cached as early as you can. And yeah, using CDNs is a big piece, where if you can cache part of it a lot closer to the end user, even within country, you have it on much higher bandwidth networks.

John Kleeman:

For those of our non-technical listeners, the caching you’re talking about is storing a copy of something and then reusing it, and a CDN would be a content delivery network that does a similar sort of caching.

Mark Lynch:

Good point. Yes. Yeah. Basically, how can you get the results, the code, and the related assets closer to people so that everyone doesn’t have to fetch them separately.
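
As a rough illustration of that split, the sketch below (TypeScript, illustrative only, not any specific Learnosity behaviour) marks shared exam assets as cacheable so a CDN edge node can serve them close to students, while per-student session data is marked as never cacheable. The URL paths and max-age value are assumptions.

```typescript
// Minimal sketch of the cache/no-cache split for exam delivery.
import * as http from "http";

const server = http.createServer((req, res) => {
  if (req.url?.startsWith("/assets/")) {
    // Safe to cache: identical for every student, so let the CDN and the
    // browser keep it and reuse it instead of re-fetching from origin.
    res.writeHead(200, {
      "Content-Type": "image/png",
      "Cache-Control": "public, max-age=31536000, immutable",
    });
    res.end(); // asset bytes would be streamed here
  } else if (req.url?.startsWith("/session/")) {
    // Never cache: this is a specific student's exam state, and serving a
    // cached copy to anyone else would be both wrong and a security problem.
    res.writeHead(200, {
      "Content-Type": "application/json",
      "Cache-Control": "private, no-store",
    });
    res.end(JSON.stringify({ status: "in-progress" }));
  } else {
    res.writeHead(404).end();
  }
});

server.listen(3000);
```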

John Kleeman:

I mean, what sort of advice would you give to other people setting up big projects, either in assessment or related areas?

Mark Lynch:

I think, yeah, basically the point I made earlier: practice it every day, figure out how you can do it every day. It is hard. There are different pieces, and the piece at scale, like we often talk about, one-in-a-hundred errors, one-in-a-thousand errors, one-in-a-million errors. When you’re doing stuff at scale and you’re doing it for millions of students, you see one-in-a-million errors every day. So it’s all about continuing to track, improve, manage, and tighten that up, and there’s a whole load of, let’s say, real-world edge cases that will end up coming back to bite you.

Another example from projects we’ve done: as part of syncing content down, getting it closer to the students, it was designed to sync content at night, until we realized that in the real world, in some of the really remote places we were doing this, the systems they were syncing down to were literally turned off at night because the rest of the village needed the electricity. That’s not something that ever came up in the design criteria, and it was only when you’re on the ground that you see, “Oh, these machines disappear every night. What’s going on there?”

We also see network constraints and things like that that you don’t see in daily testing. I suppose the best analogy is, I don’t know if you’ve ever been at a rock concert or at a stadium where your mobile phone has a signal but it can’t actually download anything. It’s basically because of the congestion of 80,000 people having their phones on; the network can just about keep up with being connected, but it doesn’t have the bandwidth to handle that. Obviously 5G will hopefully solve all that. Those are the kind of challenges I see.

John Kleeman:

Very interesting. And I know one of the things, as well as scalability, that sort of powered Learnosity’s success was accessibility, which it sounds like was one of your very first projects in any case.

A lot of people in the community are interested in accessible assessments. What’s the key to delivering accessible assessments with technology?

Mark Lynch:

Yeah. Look, I think the key to accessible assessments is that it’s not a bolt-on. It’s a key piece. I also think it’s very similar to security, in that security is not a tick box, yes or no. It is a journey, a constant evolution.

Accessibility on a computer, some people see it as a hassle, but it’s actually a core part of it. Computers can actually give people way more access; people who would not be able to do certain things can do a lot more on computers. So it actually opens up these assessments to the world, which is great. It’s key to learn, understand, be curious about it, and then build a team around it who are passionate about it, who can understand and be empathetic to what all these needs are.

And they’re sometimes conflicting so it’s not a simple problem. But by following the best practices, by working with the vendors who build accessibility tools, by continually testing with real world scenarios, you can continue to iterate and improve. But yeah, it’s not just a hard problem, it’s a really hard problem, which takes time, effort, and energy.

John Kleeman:

Can you give a sort of example of one of the conflicts or how you might solve a conflict?

Mark Lynch:

Yeah. In terms of the accessibility challenges, it’s that things are continuing to move. We’ve been through scenarios, and this is a really great example of it, where the standard says one thing, you implement that, and then the standard evolves, so you end up having to change, iterate and evolve on that. Things like alt text through to ARIA labels, and which you’re using, which is right. Different screen readers or different software might be reading the old ones, for example old alt text, versus reading new ARIA labels. Or the advice on how visible to make certain things: decorative stuff, in particular, may not be worth annotating for accessibility because it ends up just being stuff that gets in the way. So it’s kind of an ongoing evolution.
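
For readers who want a concrete picture of the alt text versus ARIA trade-off, here is a small, hypothetical browser-side sketch, not taken from Learnosity’s code: a content image labelled with classic alt text, an icon-only control named with aria-label, and a decorative image hidden from assistive technology entirely. Which attribute a given screen reader honours varies, which is why the “right” markup keeps shifting as standards evolve.

```typescript
// Illustrative accessibility labelling; assumes a browser/DOM environment.
function labelImages(): void {
  // Meaningful content image: classic alt text is still the baseline.
  const diagram = document.createElement("img");
  diagram.src = "/assets/circuit-diagram.png"; // illustrative path
  diagram.alt = "Circuit diagram with two resistors in series";

  // Interactive control with no visible text: an aria-label gives screen
  // readers a name, but older software may only announce alt or title text.
  const zoomButton = document.createElement("button");
  zoomButton.setAttribute("aria-label", "Zoom in on the diagram");
  zoomButton.textContent = "+";

  // Purely decorative image: empty alt plus aria-hidden tells assistive
  // technology to skip it entirely, so it doesn't "get in the way".
  const flourish = document.createElement("img");
  flourish.src = "/assets/border-flourish.svg";
  flourish.alt = "";
  flourish.setAttribute("aria-hidden", "true");

  document.body.append(diagram, zoomButton, flourish);
}

labelImages();
```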

John Kleeman:

Okay, that’s great. I know that one of the things about Learnosity, and one of the things that has driven you and Gav, is the sort of desire to make assessment available to lots of people, that education’s a human right. Where did that come from and what drives you to be in this space?

Mark Lynch:

Yeah, I think part of that is almost the, let’s call it Irish, upbringing, where in Ireland there was very much a mentality that we didn’t have the natural resources many other countries had, but when I was growing up, the country was investing in education as a way to propel itself forward. And I think that stuck with me, where I was, yeah, it’s just so amazing. If you can educate people, it gets rid of conflict, it gets rid of… We can solve all of the world’s problems with education. So it’s, yeah, get on and tackle that problem rather than waiting for somebody else to do it, is kind of where we ended up.

John Kleeman:

What big challenges do you see coming now or in the future?

Mark Lynch:

Yeah. I think there’s obviously a lot. AI is clearly going to be one of the biggest disruptors over the next while with the generative text pieces, so that’s going to be a positive for generation of content. It’s going to be a challenge for generation of student essays and managing that. And there’s definitely going to be a whole evolution in how education and work evolve.

The simplest analogy on that is the calculator and the fear that people will forget how to do math or basic arithmetic. To some level, that may be true. To some level, it may not be as relevant as we fast-forward five, 10 years. I don’t have the answers on what it’s going to be, but I’m very interested to watch and observe and be part of it, to see how we can help humanity go forward.

I think the reality is it’s not going to replace everything. It’s going to just be an evolution of skillsets that people need to understand and evolve, much like the Industrial Revolution.

John Kleeman:

What sort of things do you think AI can help in assessment?

Mark Lynch:

I think it will end up getting a lot more into personalization over the long term. I don’t think we’re going to be there just yet, but being able to have more interactive tutoring pieces, where something that is not cost-effective for a lot of people right now, having a tutor, having somebody who can give you direct feedback, could become a lot more distributed and available to a lot more people globally, which is really exciting.

John Kleeman:

What about using AI to help authors of questions be more productive? What do you think?

Mark Lynch:

Oh, I think that’s definitely going to evolve and accelerate, which will allow creation of more content, better content, and deeper research into it. It obviously comes with many caveats at the moment, but we’re seeing it continue to get better and better. Allowing managed content, rather than the entire internet, allowing particular knowledge bases, more trusted or more specialized ones, to be used for the creation of content could be incredibly valuable and will allow really good content creation.

John Kleeman:

Any other challenges outside AI?

Mark Lynch:

I think one of the biggest challenges, having gone through the last 20 years of globalization, is what I would call the balkanization of the internet. As a company set up in Ireland and Australia, we were able to build a global company where a huge amount of our customers are in the US, and over the last 10 or 15 years we’ve been looking at, oh, China will be a great market. We’re kind of watching that and we’ve done some stuff there, but it seems to have closed down more. I fear there are more and more places that will end up controlling and managing access. So you end up not with one solution that you can scale globally, but with something that needs to be managed in each region.

There are some good things, well, let’s say there are some good things around managing the data correctly, and that’s super important. But I think what I’m talking about is more the, let’s call them political, reasons for splitting the internet apart for control and management, which runs counter to this entire mission of continuing to educate. It’s being used to dis-educate, that’s not the right word for it, but to manage the content with disinformation and control access to it.

John Kleeman:

I think we know what you mean, yes.

Mark Lynch:

So I think that’s definitely going to be an ongoing risk and challenge.

John Kleeman:

Maybe another hard problem that people need to work out how to solve?

Mark Lynch:

Yeah, I think it is. I think there are definitely some interesting pieces on that with Starlink and access to the internet. But yeah, it will always be possible to get around it. To me, the challenge is whether it will be possible for the majority of the population to get around it, which ends up leading to a kind of two-tier system where, of the 8 billion plus people in the world, there’s a small number who have access and a lot who don’t have access to quality education and quality information.

John Kleeman:

I mean, I think you’re suggesting maybe there are two forces here: one force that’s trying to encourage education to be for the masses, with AI maybe helping that become realistic, and another force that may be balkanizing the internet or cutting it down. Who’s going to win?

Mark Lynch:

Yeah, that’s a good question. I don’t know. I think the reality is it depends; it’s very much a political landscape piece. I’m optimistic that, shall we say, the forces of good will win and that knowledge and truth are the right way to go. This has got more political than I was thinking, but the dictators and those pieces, in the end, won’t win.

John Kleeman:

Oh, I think most of us will hope that. So perhaps we could just close. What sort of advice would you give other people who may be starting out in technology-based assessment or ed tech? Any advice you’d give them on how they should set up a company, grow their company, or grow their career?

Mark Lynch:

Yeah, yeah. Well, I’ll kind of talk about how I think about it: you need to be in it to figure out what the problems are. So step one is get involved in some way. And when you go into that, keep your eyes open for problems or inefficiencies or things that are not working, and then start looking at those.

And there’s the tricky balance where you have to have the confidence that the problem you’re solving is the right problem, without being overconfident and solving a problem that’s not really a problem. So it’s a really careful art: is the problem big enough? Is it painful enough? Are people, what’s the word? It’s that vitamin or pill piece. Are people willing to pay for it, or are they just going, “Oh, it’d be nice if it could do that, but I wouldn’t pay for it. I wouldn’t care to solve it that much.”

Look, in our history, we solved lots of problems that people were actually willing to pay for, but that weren’t scalable, that weren’t repeatable. Take the phone project I mentioned earlier: that was an amazing project that delivered results, did everything right, but it was hard to roll out because of the stigma of mobile phones in schools and things like that, so it was never going to roll out at a large scale. So yeah, it’s being in there looking for problems and being ready to commit, but also having enough humility to be able to turn around and go, “Okay, it was wrong. Let’s look at the next problem, or let’s entirely change the direction we’re going.”

That reminds me, one of our biggest decisions in Learnosity was to effectively discard our product to build an API product. At the time, probably 5% of what we had built was what we built the API product around. But the big difference was that it would fit into other people’s systems and allow it to scale. So yeah, knowing when you’re right and knowing when you’re wrong is a key challenge.

John Kleeman:

Okay. Well, I think that’s intriguing, so let’s leave it there; there’s a lot for people to think about. Thank you, Mark, really appreciate you giving us this time.

And to all our listeners, thank you for listening to us today. We really appreciate your support. Don’t forget, if you’ve enjoyed this podcast, to follow us on your favorite listening platform. Also, please feel free to reach out to me directly at john.kleeman@learnosity.com with any questions or comments, or if you’d like to keep the conversation going. You can also visit the Questionmark website at www.questionmark.com or the Learnosity website at learnosity.com. And we host our best practice webinars monthly. Thanks again, and please tune in for another exciting podcast discussion we’re releasing shortly.
