The world of learning and development is changing at an unbelievably rapid pace. From artificial intelligence to augmented reality, some of the concepts and mechanisms that might seem "out there" on the surface are already gaining traction in today's learning environments. And, much like the warning on a rearview mirror, the innovations of the future are closer than they appear.
It’s an exciting time to be working on innovations around learning, to say the least. One area that we’ve been focused on in particular is the future of thinking-based employee assessments and how they fit into the process of building insights that can be applied to everyday business issues. Our London-based innovation lab is looking at everything from the way people communicate to how they use social media to the way they play games and use technology.
We recently connected with Ann Herrmann-Nehdi (CEO of Herrmann International / Herrmann Global), Karim Nehdi (Global Head of Innovation), and Danny Stanhope (Applied Data Scientist and Psychometrician) for a wide-ranging discussion about the evolution and future of assessments in light of new directions in learning, new technological advances and new learner expectations. Here are some highlights from that conversation.
What are some of the things you’re hearing from learners and practitioners about assessments in general?
Ann: There’s assessment fatigue. Surveys are everywhere you look, and many of the social media surveys are more interactive and interesting than what employees are getting in the workplace. So there’s a weariness and a skepticism about taking yet another assessment. The classic paper-and-pencil, ask-a-question, get-a-profile survey just isn’t going to cut it any more.
Danny: People do feel like they’re getting surveyed to death, and then when they don’t see any impact or have any insights from the experience, they feel like it was a waste of their time.
Karim: Novelty and engagement are becoming more and more important. At the same time, the amount of data being captured is growing significantly, which means we have a perfect opportunity to evolve assessments from a point-in-time, "out of the ordinary" exercise to ongoing diagnostic tools that continuously collect data and use the power of machine learning and artificial intelligence to unlock far more value from that "big" data.
Big Data gets a lot of attention in business today. Tell us a little more about how you see it applying in the world of assessments as a whole and to the HBDI® assessment specifically.
Karim: We all leave “breadcrumbs” of data behind in everything we do. You’ve probably seen how that data is being used to individualize the experience on the web. That data can be used in other ways, too.
Your thinking underlies so many of the things that you do, and we know your thinking can change. Maybe not overnight, but it does change; that's what neuroplasticity is all about. So as you leave those breadcrumbs of data, the vision is that the HBDI® can give you information about your thinking in real time. Instead of a static experience, it's dynamic and integrated.
Ann: Like much of what we’re seeing in the learning world in general, this is a move towards a real-time, self-managed approach, where the person taking the assessment has the control. With real-time feedback you can get the information as you go, take it into account and then apply it in the moment. So the learner drives this, not the person administering the assessment.
Karim: Think of it like a Fitbit. You can tweak and make adjustments based on the data and then immediately see improvements in what you’re doing.
Ann: For the business, that real-time element is critical, because it means people can move immediately from insight to application rather than having to go through a cumbersome learning process first.
How are all these changes impacting L&D?
Ann: L&D needs to function more like a coach on the sidelines, providing experiences that drive insights. I've talked about this on the learning side, and it applies to the assessment side as well. They have to be driving the value chain. But most don't think of assessments in that way. They think of an assessment as a tool you put out, and then that's it. Delivered as a standalone, an assessment is a waste of time. It must be driving insight and application, or you shouldn't be doing it.
Karim: For example, if you look at what we’re doing with our Text Profiler project, which gives you real-time feedback on your written communication in the context of thinking preferences, it’s not just about “here’s your information”; it’s “here’s what you can now do with this.” This is what we talk about in our value creation model. You have diagnostic tools like the HBDI® based on the Whole Brain® Model, and they’re meant to support the creation of insights through information and learning experiences. Those experiences then have to be translated into application-focused tools and programs. It’s based on how the brain works and how people learn—and it’s how you get value out of the process.
Danny: The “so what” piece—why am I doing this assessment—is also more important than ever. People need to know how it relates to what they do or you’ll have trouble getting buy-in. Especially with all the surveys people are being asked to respond to, they’re already overwhelmed.
Ann: There’s a learner expectation now that the process should be interesting, challenging, rewarding and fun. L&D can’t get away with putting learners through a tortuous process. Assessments have to be relevant, applicable and driving insight along the path to growth and results. And to Danny’s point, the “why” is essential, particularly when you consider the importance of context to the brain. If people don’t know why they’re taking it or what the goal is, the veracity of the data will go down. And if the data’s bad, why bother?
You’ve hinted at a few ongoing developments with the HBDI® and new application tools. What can you tell us about some of the new products that are on the horizon?
Karim: We’re working on HBDI® v2 as we speak. We have a focus group today looking at the changes we’re implementing. What’s great is that our new technology platform, Axon, allows us to make this a dynamic, living, evolving tool. We’re able to measure, evaluate and tweak without disrupting the validity or experience for the learner and the facilitator.
Danny: Some of the other things we’re working on include building up the richness of the assessment experience, particularly in the “insight” phase of learning, where we’ll be incorporating gaming and other interactive methods.
What about issues of validity as the HBDI® continues to evolve?
Karim: We deliberately built Axon in such a way that we can continue to evolve the assessment while maintaining its validity. In fact, the psychometric measures of validity get stronger with each evolution. It's kind of like Google's search algorithms: every time they make a change, the results get more precise.
Danny: People don’t always realize this, but validity is an ongoing effort, and the updates we’re making continue to support and strengthen our validity arguments.
Any closing thoughts for practitioners who are looking at how to use assessments and evolve their strategy going forward?
Ann: First, the assessment and learning processes will continue to mesh together, which means there has to be more integration and “building in.” As real-time feedback loops become part of the process, the steps won’t be linear any more, so integration is key.
Second, I can't emphasize enough the importance of having a model and context for understanding your assessment data, rather than just a laundry list of facts. We get tons of data, but it has no value without a framework for making sense of it. And people have less tolerance than ever for tools that have no clear context and no clear application to the work.
Danny: Assessments can no longer exist in a vacuum. You have to answer questions from all perspectives. The learner wants to know, Why should I care? L&D should be asking, Why are we spending money on it if we can’t drive application from it? And from a design perspective, an assessment is irrelevant if you can’t link it to meaningful and personal—not just generic—insights and application.
Ann: Ultimately, I think the focus needs to shift from the assessment itself and the self-awareness it brings to creating insights and applying those insights to create new value and solve problems. This isn't (or shouldn't be) new, but we see so many companies that stop short of getting the true value out of assessments. An assessment should be a catalyst and "golden thread" for ongoing learning experiences!
Thank you to Ann, Karim and Danny for sharing! Now it's your turn. Let us know your thoughts on the future of employee assessments in the quick survey below.