
Proving skills in a world reshaped by AI

09 Feb 2026

As AI reshapes the workplace, organizations need better ways to prove people have the skills required to do the job, not just that they’ve completed training. In this episode of Beyond the Score, Mike Bollinger of Cornerstone OnDemand joins us to explore how assessment, skills, and AI are converging to redefine workforce readiness. 

Full Transcript

John Kleeman

Welcome to the Learnosity podcast, Beyond the Score. I’m your host, John Kleeman, an executive at Learnosity, an assessment industry pioneer. The premise of the podcast is that for a century or so, assessment has been slow to evolve, perhaps even stagnant. Now, with the availability of highly advanced technologies like AI, it’s finally ready to make perhaps a quantum leap into the space age, giving us new opportunities to reshape how we think about assessment and how we use it. In this podcast series, I talk to experts about emerging technologies, ideas, and enduring responsibilities that will recast assessment now and into the future. This episode, I’m really pleased to welcome Mike Bollinger, who’s Global VP of Strategic Initiatives at Cornerstone OnDemand, the huge learning and talent company. Mike has been with Cornerstone as a senior executive for 10 years. Prior to that, he was a chief information officer at a group of U.S. schools and had senior human capital management roles at both Oracle and SAP. He’s a well-respected thought leader in human capital management. Welcome, Mike.

Mike Bollinger

Thank you. And I love the title, Beyond the Score. That’s so catchy.

John Kleeman

Yeah. Well, I guess that is really what it’s about. So, tell us a little bit about your background in particular. It’s really interesting that you started from a school’s IT background and then made the jump to the commercial world.

Mike Bollinger

Yeah. I’m the embodiment of those people that have eight careers in their lifetime; I’ve had like 12. I started out in finance and then worked in technology in a variety of roles, including an old place that you may or may not remember called EDS, at U.S. Customs. Then, kind of by accident, I got into schools, K-12 in particular, and worked in K-12 in Wyoming and then ended up being a CIO in a large school district in Wisconsin, and actually served on a school board for a while. So, I’ve seen it from both directions, both as an administrator and as a governance individual. In 2001, I made the leap from being on the customer side, if you will, to SAP, and went to work for SAP in an HR capacity. SAP said, “You’d make a great Basis guy,” which is their technology stack.

And I said, “Well, no, look, I really like learning. I really like HR and people.” I’d done several initiatives, and they said, “Ah, you can do that too.” So I made myself into an HR individual, if you will, in 2001, and then did two tours of duty each at SAP and Oracle, making a little more money, going back and forth in a variety of roles, including value and thought leadership, and doing a lot of speaking and those kinds of things. Then, a little over 10 years ago, I came to Cornerstone, and I’ve had three jobs at Cornerstone. I founded our thought leadership group, moved that into the field, and started the Cornerstone People Research Lab, which we still run in a large variety of ways. I served directly under our chief marketing officer, I work in corp dev as well as in partnerships, and I was aligned with our chief product officer for many years running a team there. So, I’m that guy that’s had 12 careers. I’m proud of it. And I’m a proud grandpa on top of it.

John Kleeman

Sounds like you’re doing about 12 different things at Cornerstone at the moment as well. So, look, I think most of our listeners will know who Cornerstone OnDemand are, but there might be some who don’t. Do you want to give a sort of potted description of what it is that Cornerstone does?

Mike Bollinger

Sure. Cornerstone’s a software company that specializes in people, full stop. It’s used by over 7,000 organizations to manage learning, skills, and workplace capability. We think of it as workforce agility, because agility matters, and that’s actually at the heart of what we do. It’s a large organization, one of those numbers you may never have heard: 150 million users. We like to say somebody takes a course in Cornerstone more often than Starbucks sells a cup of coffee, but we can’t prove it because they don’t know how many cups of coffee they sell. Cornerstone’s flagship platform is Galaxy. It’s unique: a unified enterprise platform that brings learning, skills, talent intelligence, and workforce intelligence into a single system. With AI embedded at the core, Galaxy connects data, insights, and workflows across the people development process. We’re a specialist tool. We focus very much on that, and we integrate specifically with specialist tools like Questionmark.

The point of Galaxy is practical. Instead of stitching together a bunch of point tools, you get a platform that helps you understand workforce capability, align people to their roles, close gaps in learning and development, and plan for what’s coming next, on a common AI data foundation, which everybody says, but in this case, it’s true.

John Kleeman

So, if I understand right, you can either use Cornerstone standalone as your sort of learning platform or you hook it into one of the big enterprise systems and use it with that.

Mike Bollinger

Absolutely.

John Kleeman

Yeah.

Mike Bollinger

So, think of it as a platform that focuses, we’re never going to do payroll, we’re never going to do time, we’re never going to do benefits. We’re very much focused on the talent development life cycles, the people development life cycles, and keeping people agile as the world changes.

John Kleeman

And I think that you are by far the biggest LMS company, if I understand right, and that far more people and far more companies use Cornerstone than any other LMS-

Mike Bollinger

Adam Miller founded the company on this notion of teach the world, and this is where we are. Here we are 25 years later. Yep.

John Kleeman

And what kind of parallels are there between schools IT, which I guess is also teaching the world, and Cornerstone, or the commercial teaching area? Do you think they’re very similar, very different, or?

Mike Bollinger

No, I think there are strong parallels. And to me, they’re becoming even more obvious. In schools and higher education, assessments in particular have traditionally marked important turning points, right? Those moments existed to prove mastery and readiness, not just participation; none of those participation ribbons. What began a few years ago is now clearly showing up in the workplace. Employers increasingly need evidence that people can actually apply what they’ve learned on the job. In both environments, the focus is shifting from time spent to demonstrated competence. And it’s important that assessments don’t just certify progress. They’re now starting to guide things like remediation, showing what needs to happen next instead of just retraining and retesting. So, across education and work, the thinking is really this notion that we have to prove competency.

And I think that’s as important as it gets in terms of visibility into readiness across populations and confidence that standards are being met. The convergence between learning, assessment, and workforce readiness is a keen area of focus for us. At the end of the day, that’s the heart of what we do and where we live. That’s where real capability gets built.

John Kleeman

So, what do you see as the key… I mean, we covered some of them just then, but what are the key reasons that people use assessments in the workplace? I guess checking competence is a key one.

Mike Bollinger

So, when we’re thinking about Beyond the Score, right?

John Kleeman

Yeah.

Mike Bollinger

And we think about it from a perspective of proving competence that stands up to scrutiny. And in many cases, regulatory assertion. I hear all the time about the need for traceable records that show people understand the rules, the procedures, the risks: proof that holds up in audits. And that auditability matters because it reduces risk to the organization. I think I told you at one point that I worked in a value capacity; I created what we call value engineering around working with software and ROI and so on. I’ve built entire business cases specifically on mitigating risk. At any point in a business, what matters is lower cost, increased revenue, or lower risk. Those are the big three. And assessments help not just with the costs and so on, but are also directly tied to risk.

So, when assessments are tied to the regulations and the standard operating procedures, you can catch gaps early, before they turn into violations or safety incidents, and candidly, you could reduce your risk budget if you’re doing it well. There’s also an important legal and operational dimension here: assessments create defensible evidence. As you make decisions around hiring and promotions, you can show you’re basing them on real, job-relevant criteria. And in the last few years, that’s been important. So, if someone isn’t meeting a standard, you’re not just flagging failure, you’re enabling remediation and making decisions based on that instead of blanket guesswork. This becomes about staying current, and I think that’s the key takeaway here.

John Kleeman

I think there are a few angles there. One angle is probably that if you are assessing people, you’re less likely to make mistakes, and that’s less likely to lead to both business problems and regulatory problems. And also, you’ve got evidence that you’re promoting the right person, which is going to help you both promote the right person and defend that decision.

Mike Bollinger

Defensible evidence, in an era when things can be challenged. And that’s an important point. You’re making it based on job performance and competency, and assessments are a big part of that.

John Kleeman

Yeah, no, no. I think that’s great. And also if something does go wrong, you’ve got the proof that at least you didn’t just train your people, but you checked that they understood things and that the reason wasn’t lack of knowledge or lack of skill, it was something else.

Mike Bollinger

The other thing is, if you think about it, let’s just follow that thread for a minute: there’s data. Advanced assessment platforms don’t just score. They support multiple methods. How do you analyze question quality? How do you generate insights across various cohorts and subgroups? These are all important insights for the company as it moves forward, whether it be an initiative in a new area, a new region, a new product, or anything else where they want to know that people know what they’re doing. The win here is that they close gaps fast when they appear and maintain a workforce that’s demonstrably ready, not just trained. And I think that’s the important key: we’re not just training people; we’re making them demonstrably ready.

John Kleeman

That sounds good. And we see a lot of interest in that: the different objectives or subscores, as well as the pass/fail. And also, obviously, the different demographics. You can break results down by job role, vocation, and department of the organization, and identify gaps. So, say for example you’ve got a digital literacy or an AI literacy challenge in your organization. If you assess people, you can identify which candidates you need to remediate or improve. So why did Cornerstone partner with Questionmark? Can you talk a little bit about that?

Mike Bollinger

Sure. So, look, one of the things that we know about Cornerstone, that we’re very clear about, is that it’s not a walled garden. We used to say we’re opinionated out of the box and configurable under the hood. But if we think about why we would choose Questionmark, it’s the specialist part of it. Remember, we’re talking about specialty. So, when we think about an assessment and certification capability across the platform, we’re thinking about the rigor and the defensibility that we just talked about. As this is intentionally not a walled garden, we want to bring the best that we possibly can in a variety of different areas. With 150 million users, we’re not arrogant enough to believe that we’re going to serve every one of them exactly correctly. So, when we had this conversation around Questionmark and creating this partnership, we wanted something that fit cleanly into the Cornerstone ecosystem.

I think that value shows up in four areas, and we can drill into these if you want, the first being advanced and authentic assessment. Questionmark supports high-stakes exams, certifications, and scenario-based, performance-based, and adaptive assessments. What I really like is there’s this wide range of question types, including multimedia, simulations, and video or observational responses. That’s how organizations can assess real-world skill application, not just theoretical knowledge. Take that in concert with strong authoring tools and secure item banks. And remember, I’m a little bit of a learning geek, so this is where I’m going: version control as well. Let me give you a prime example of version control. In many regulated industries, you have to certify that the items are the latest version, and you have to track auditability from that version-control perspective. So that’s the first real key takeaway for Questionmark: it’s those things that matter to me.

John Kleeman

Yeah. And I think some other things that people like are the strong robustness (if somebody’s computer crashes or they have to start again, they can resume the assessment), the security capabilities that make it harder for people to cheat or otherwise compromise the test results, and also very strong reporting.

Mike Bollinger

Well, and there’s some of that proctoring that you do as well. But if you think about it, yes, agreed on crash and start over, but when you’re doing assessment, it has to have the defensibility we’ve talked about, and you also have to have some psychometric rigor. You all deliver deep item-level analytics, reliability, and benchmarking. So there’s an assurance in there that the assessments are valid and defensible, but also fair. This notion of security and compliance for high-stakes testing is an important part of that as well. Not only do you support pick-up-where-you-left-off and so on, but you have some significant proctoring options and audit trails. And those things matter in industries like financial services, healthcare, and life sciences.

John Kleeman

It’s absolutely the case. And really, if you sort of cut Questionmark in two, at our heart would be validity, reliability, and fairness. Those are really what we’re about and why people use us. And I think the partnership with Cornerstone should be really good because obviously people have been able to buy them separately in the past, and quite a lot of organizations do, but the fact that Cornerstone is selling it will make it a lot easier for a lot of organizations to get Questionmark, and hopefully over time we’ll also align capabilities and make the integration stronger. So, we’re really excited to have you as a-

Mike Bollinger

I’d add one thing for the listeners, which is: look, we use the terms enterprise grade, enterprise scale, and so on, and those are words that get bandied about a lot. But from a Cornerstone perspective, we think about it from a global scale. We support darn near every language in the world. We support the global nature of cultural differences; language is just a part of it. So, when we think about scale, we think about volume, we think about globality, and we think about the ability to work across industries. Cornerstone has assessments, but when we went to look for an enterprise-scale assessment partner, Questionmark became that partner because of its ability not just to do those global things, but to integrate seamlessly with Cornerstone using standards and APIs and the things that you would expect.

So, in short, we’re providing an open enterprise workforce platform, and Questionmark complements it as a specialist. Now remember, Cornerstone’s a specialist, and so is Questionmark. They complement us as a specialist, delivering the rigor, flexibility, and globality that we’re looking for.

John Kleeman

So, I share that, and that sounds good. Let’s do a little bit of a pivot and go on to the thing that everybody’s talking about: AI and learning. There are a lot of questions out there. Jobs change with AI, but how will learning change with AI? How will compliance change with AI? Maybe let’s start with how jobs will change with AI, because everything else really follows from that. What are your thoughts on that?

Mike Bollinger

So, I get asked this question a lot, because when you brand yourself as a thought leader, the first thing people ask is some forward-looking thing, in this case, AI. The short answer is we really don’t know. I do know, though, that jobs won’t disappear overnight. It’s a shift. If we go back 100 years, the farmer who mastered the scythe didn’t become obsolete when the combine arrived, but the one who refused to climb into the cab certainly did. The skill wasn’t in swinging the blade; it was in understanding the field, managing the machine, knowing when human judgement mattered. We wrote a paper on this from a planning perspective. We call it Build, Buy, Borrow, or Bot. The notion is pretty straightforward: where do you automate, where do you develop, and where do you fill in the gaps? Those become business decisions. And I think it was Ethan Mollick who said AI isn’t coming to take your job, but people with AI are going to take the jobs of people without AI, right?

John Kleeman

No, that’s my favorite quote on AI too.

Mike Bollinger

So, AI will take on routine and analytical work, and we want to focus on the human roles.

John Kleeman

Yeah, I totally agree with that. Critical thinking is going to become really, really important, so teaching that and assessing that is going to be important. And also practical skills. AI, at least at the moment, has a lot of weaknesses in real life, so evaluation and judgement just become super important.

Mike Bollinger

We just released a skills report. We do it annually because we do a lot of labor market data analysis and so on. What we found is this notion of human skills. They’re not soft skills, they’re human skills, which I also call durable skills: accountability, creativity, human communication, and those things. In the roles that are changing, we’re finding this fifty-fifty hybrid in the report. The report is absolutely free to the community; feel free to go out and get it. It’s got a great number of things in it. It talks about which roles are changing and which roles are developing and moving forward, but also this notion that coexistence with AI is candidly how the roles are evolving, if that makes sense.

John Kleeman

One of the things that I think some people are scared about and does worry me slightly is entry-level jobs, the risk that AI can do some of these entry-level jobs in white collar work, and what’s that going to mean for the future of graduates and I guess companies, any thoughts on that?

Mike Bollinger

Yeah, I wrote a piece on that not too long ago. It’s real. Okay? It doesn’t mean those jobs aren’t available, but there are three things going on. And look, I start off by saying I’m a grandpa. Everything is a cycle; those cycles move in one direction, then the other. So, with that as the backdrop, there are three things happening at the current point in the cycle. One is that because there have been some layoffs, you’re finding senior people taking some of those roles. Secondly, we do know that AI is undercutting some of what you would normally call entry roles. But the real caveat, and the biggest driver, is that employers are expecting people to come prepared. It’s just the nature of the shortness of the cycle; the expectation is that you should come in hitting the ground running.

So, the idea of an internship or something along those lines: there are fewer and fewer of those. The last driver is demographic, in that we have a lot of college graduates coming through the system right now competing for what would normally be considered entry-level jobs. So the takeaway is that there’s a selective nature on the employer side for those particular jobs, and not as many of them. But the real opportunity for people in those areas is to start now with getting skills that are demonstrable and understandable, and finding ways to talk about them, rather than expecting to be developed. Go develop yourself and bring that to the table. If you bring to the table that you’re ready, that’s the kind of thing employers are looking for.

John Kleeman

And of course, AI is an obvious area there, because employers will be looking for people who know about AI, and younger people tend to be more digitally literate than older people.

Mike Bollinger

100%. But look at the way they represent themselves (and yes, I’m a Gen Z as well… not). If you look at the representation from that cohort, their resumes and so on, they’re representing the digital skills, but they’re not necessarily representing those human skills that we talked about. And what we’re finding in the job postings is that those human skills appear, in aggregate, two to three times more often than the technical skills themselves. So, think about durable, adjacent, transferable kinds of skills development, and you’ll always have an opportunity somewhere.

John Kleeman

No, that’s so true. I’m sure anybody who’s listening can think about the good young people who’ve joined their organization and been effective. Their technical skills are part of it, but their ability to communicate and to work with other people is so important. And that’s not something which schools or colleges always teach as well as they could.

Mike Bollinger

Hey, John, can we assess for that too? Just checking.

John Kleeman

Yeah. Well, that’s a good question, but I think teamworking assessment is actually a weak area of the whole industry and practice. There are so many tests for individuals; it’s much harder to do team testing. And I think that’s definitely an area for the future. But of course, there are individual assessments for all sorts of things like critical thinking or communication skills. And also, AI could be… Sorry, go ahead.

Mike Bollinger

I’m thinking about the simulations, right? You can create simulation environments and then start to assess against that. And one of the things is, the old joke, you only get one chance to make a first impression. If you get a few chances to practice before that first impression, you’re ahead of the game, right?

John Kleeman

Yeah. We’re going off slightly on a tangent here, but I think AI for scoring these things can be really interesting. Say, for example, you want somebody to learn to present. An AI can watch them presenting and give feedback, commentary, and also a score. We’re certainly experimenting with that for sales training: somebody gives a sales presentation and the AI scores it. Maybe that can be used for sales certification or sales assessment reasonably soon.

Mike Bollinger

The other big one is customer conflict. It’s very hard to be good at that in the moment, whether it’s front of house in a restaurant, or on the phone as a CSR, or anything else customer-facing. Practice matters there as well, because then you can remain calm, which is the most important criterion in a situation like that. Look, I’m going to get geeky on you. Mastery is an asymptotic curve, which means you never quite touch it. You get close, but you never quite touch. So, you’re always learning more, and that’s the whole point. That’s what assessments can help you do: not just assess, but help you grow into that particular area.

John Kleeman

So, that was where we were supposed to be going before the tangent: what is it that AI can do differently in assessment, and perhaps learning? I think we’ve talked about some of those things. Any other thoughts on how AI is going to change the world of assessment?

Mike Bollinger

And I think this is the beauty of the relationship we’re talking about here. Assessments can identify a gap and also suggest remediation, but then it’s up to the platform to insert the remediation and create that mastery curve we’ve talked about. So, there’s a personalization aspect that didn’t really exist before. For 10 years, we’ve been talking about personalized learning. Now we can do it. Learning shifts from content delivery to demonstrated competence, and that’s a win for employers and employees. I love that promise. I do. I love the promise that it brings.

John Kleeman

Yeah, I agree. There are a lot of different ways that AI can be used in assessment, but there are two things we’re seeing at Questionmark. The first is creating questions. It’s really hard to write a good question, and AI is pretty good at producing drafts that need human review, so that’s a good area. But the thing that I’m really most excited about is AI scoring, because it opens up the ballpark to all sorts of question types that aren’t just multiple choice or right/wrong answers. It allows unstructured tasks: presentations like I mentioned before, practical tasks, managing a client, things that you just couldn’t do before. I mean, they’re very expensive to score manually.

Mike Bollinger

So, let’s take that to a human-in-the-loop kind of conclusion too. AI is writing the questions, we’re delivering them, it’s doing some scoring. All good, right? But the other thing that Questionmark brings, which I think is often underrated, is item analysis and validity. Not only are those questions being written, but are they serving the purpose they were written for? To have somebody with expertise in the area being assessed (not necessarily a test maker) review some of that analysis and data and come back to the overall platform with those results, that’s an advantage that Questionmark brings.

John Kleeman

No, I million percent agree, which probably isn’t a very fair thing to say. For those listeners who are not familiar with item analysis: it allows you to look at the whole range of answers to a question. You can identify which questions are hard and which are easy and what that means, but also which questions don’t correlate well with the overall assessment results. So, say for example, you have a question that people who perform well on the other questions don’t do well on. That is an amber flag that maybe the question has some issues: it could be that the question isn’t relevant to that skill, or other things. There’s a lot in item analysis that really can add value and make your assessments more valid and reliable.

Mike Bollinger

Words matter; simulations matter. And the fact that you can not only do that item analysis but also do version control on the questions, which gives you a gradient view of item analysis over time (we’re geeking out as researchers now), that’s important when it comes to proving that the assessment is actually measuring the competency that you’re trying to get after.
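To make the item-analysis discussion above concrete, here is a minimal sketch of the classical calculation John describes: item difficulty (proportion correct) and item-rest discrimination (the correlation between an item and the rest of the test). The function name and data are invented for illustration; this is not Questionmark’s actual implementation.

```python
# Classical item analysis sketch: difficulty and item-rest discrimination.
# Responses are scored 1 = correct, 0 = incorrect.
from statistics import mean, stdev

def item_analysis(responses):
    """responses: one row per candidate, each a list of 0/1 item scores."""
    n_items = len(responses[0])
    results = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        # "Rest" score excludes the item itself, so it cannot inflate its own correlation.
        rest = [sum(row) - row[i] for row in responses]
        difficulty = mean(item)  # high value = easy item, low value = hard item
        if stdev(item) == 0 or stdev(rest) == 0:
            disc = 0.0  # undefined when everyone scores alike; report as 0
        else:
            mi, mr = mean(item), mean(rest)
            cov = sum((x - mi) * (y - mr) for x, y in zip(item, rest)) / (len(item) - 1)
            disc = cov / (stdev(item) * stdev(rest))
        results.append((difficulty, disc))
    return results

# Four candidates, three items. Item 3 is answered correctly only by the
# candidates who do badly on everything else: the "amber flag" case.
data = [[1, 1, 0],
        [1, 1, 0],
        [0, 0, 1],
        [0, 0, 1]]
for i, (p, d) in enumerate(item_analysis(data), start=1):
    print(f"item {i}: difficulty={p:.2f}, discrimination={d:+.2f}")
```

Item 3 in this made-up data comes out with a strongly negative discrimination: the candidates who do well overall get it wrong, which suggests the item may be mis-keyed or measuring something other than the intended skill.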

John Kleeman

So, look, we’ve covered quite a lot of angles here. In terms of takeaways, what do organizations need to do in this rapidly changing world? What skills do they need to have, and what should people in HR, IT, learning, or compliance teams be doing that they’re maybe not doing now, or should be doing more of?

Mike Bollinger

Good question. If we step back, the question isn’t whether AI will change work. We’re trying to create an environment where we can adapt at pace, because the work is changing at pace. Anybody who says the pace of change is accelerating is wrong; it already has. Item development and assessment development need to match that pace. But from an organizational perspective, analysis and assessment aside, organizations need three core capabilities. First, digital and AI literacy, and not just for specialists. We just finished another piece of research, and I did a video on it, so feel free to look it up on LinkedIn. People are using AI, but they’re not necessarily talking about it. And that’s not because they’re ashamed of it, but because the organization has said, “Go use it,” without necessarily giving them direction on how.

So people, and the organization, need to understand how AI supports decision-making processes and where the limits are: no blind trust. Secondly, critical thinking, and we talked about that. As AI takes on more routine work, human value shifts to interpretation, decision making, and applying those skills in real situations. I talked a little about that fifty-fifty thing and every role becoming hybrid, but the important point here is the focus on higher-order tasks, because every time there’s been a productivity boost, it’s created the ability for us to move up the chain, if you will, strategically. You want to create an environment where people feel confident in understanding what’s coming back to them from that AI, and can take a big swing and take some chances, in a way that supports those individuals. And then, finally, you have to be able to validate competence.

At some level, “I have the skill, I can assert that” is fine. But we have to prove people are ready for the way work works, and it’s changing. Whether it’s safety and compliance or risk or otherwise, that’s important. And in many ways, that applies to both white-collar and blue-collar jobs. I have a son who works in roofing, doing analysis and estimations. He can climb on the roof and do all the measurements, but he’s using lasers to measure, and he’s also using drones. So how do you take those skills and make yourself even more effective? For each organization, the question is, “How does it make me more effective?” You have to assess what people can do today, map critical roles to skills, and look at the four Bs: build, buy, borrow, or bot. Where is work changing? Where am I automating? What do I want to do next? And don’t do that in a silo.

John Kleeman

And it sounds like it’s almost going back to the quote that you and I both like: it’s not AI that’s going to take your job; it’s somebody working with AI that’s going to take your job. So similarly, learning and assessment need to focus on how people can work with AI and other digital tools. The key things to do are to encourage people to get digital and AI literacy, and then to work out as an organization how you use AI and how you oversee that AI, because we all know that AI makes mistakes and is biased, among other things. But humans and AI together can be stronger than humans on their own.

Mike Bollinger

Situationally, 100%. Go back to that report where people are using AI but not talking about it. It’s because they haven’t necessarily been given direction: “Here, use it this way. Give us feedback. Tell us how it’s working.” Those things matter, and we’re back to culture, human culture.

John Kleeman

Sounds great. And if people want the reports or videos you’ve been mentioning, is it best if they look at your LinkedIn profile, Mike Bollinger, or the Cornerstone website, or either?

Mike Bollinger

Well, you know what? Why don’t I get you some links and you can put those in post-production. Certainly, look me up. You may or may not know that Bollinger’s a very famous French champagne, the champagne of choice of James Bond. No relation, so I work for a living. But it’s @Bollinger on X and Mike Bollinger on LinkedIn. Feel free to reach out, and we’ll get you the links so that they’re readily available to the audience.

John Kleeman

Thank you very much, Mike. It’s been really good to speak with you. And thank you also to the audience for listening to this episode of Learnosity’s podcast Beyond the Score with me, John Kleeman, and my guest, Mike Bollinger of Cornerstone OnDemand. Hopefully the conversation was as good as some champagne, or at least almost. We appreciate your support. And don’t forget, if you’ve enjoyed this podcast, why not follow it on your favorite listening platform and check out our back catalogue. Please reach out to me directly at john@learnosity.com with any questions or comments, or if you’d like to keep the conversation going. Thanks again, and please tune in for our next podcast.
