
Designing learning experiences in an evidence-informed way

Updated: Jun 18, 2020

Covid-19 has brought learning budgets across organisations under the microscope. Going forward, L&D professionals will need to justify their spend more carefully than ever before. Nevertheless, virtual learning will play a crucial role in the development of employees in the weeks and months ahead. Your challenge, as a learning professional, will be to keep employees productive and relevant without wasting time, money, or effort on misguided initiatives. One way of doing this is to use evidence-informed learning design. This approach ensures you design effective, efficient, and enjoyable virtual learning experiences. International learning expert and author of Evidence-Informed Learning Design, Mirjam Neelen, offers some insights on what evidence-informed learning design is and how you can use this approach to design more effective learning experiences for your organisation…


What you need to know about evidence-informed learning design


In a nutshell, evidence-informed learning design means using evidence from scientific research (not ‘learner data’) to make design decisions. It’s important to note that the term evidence-informed, not evidence-based, is used for a reason.


Evidence-based practice is an interdisciplinary approach to clinical practice, grounded in medicine. It’s “traditionally defined in terms of a “three-legged stool” integrating three basic principles: (1) the best available research evidence bearing on whether and why a treatment works, (2) clinical expertise (clinical judgment and experience) to rapidly identify each patient’s unique health state and diagnosis, their individual risks and benefits of potential interventions, and (3) client preferences and values”. Generally speaking, if a decision is made about the intake and working of a medicine, it means the medicine was tested and approved for a specifically defined population (e.g., an adult with a certain BMI and with specific symptoms and pathologies). The instruction to take a pill in the morning on an empty stomach, followed by food intake, allows for a wide range of specific circumstances (at home, in the car, on the beach; whenever and wherever, as long as it’s on an empty stomach in the morning).


Evidence-informed still means ‘based on scientific research’, but we’re in the field of the learning sciences, which is an interdisciplinary science (see image). And here, lots of muddy, mucky real-life factors influence an intervention’s effects.



Learning and development doesn’t usually deliver the quality of evidence that clinical practice does, simply because there are so many variables that are extremely hard to control. What worked with a class today at 9 AM won’t necessarily have the same effect on a different class at 3 PM, and ‘disruptive’ Johnny’s absence will lead to a completely different situation than when Johnny is present. Hence, when we use evidence, we need to acknowledge that what works in one context doesn’t necessarily work in another. We usually rely on more qualitative data, and so the evidence is weaker.


It’s useful to know what the levels of ‘scientific evidence’ are. Gorard’s table below gives an excellent overview of the levels of ‘design quality’ in research:

The lowest rating determines the overall trustworthiness of a study. Even when a study is honest and large-scale, with hardly any dropout and with standardised results, if the intervention is described in a wishy-washy manner (i.e., you really don’t know or understand what exactly the intervention was), or if the conditions are not equivalent (e.g., the intervention group spent twice as much time on the learning experience as the control group), then that study, overall, has low trustworthiness and still only gets 2 stars.
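If you like, this ‘weakest link’ rule can be sketched in a few lines of Python. The criterion names here are illustrative placeholders, not Gorard’s exact column headings: the point is simply that the overall star rating is the minimum across the criteria, never the average.

```python
# Sketch of a Gorard-style trustworthiness score (criterion names are
# illustrative assumptions, not taken from the original table).
# Each criterion gets a star rating from 0 to 4; the overall rating
# is the LOWEST criterion score, not the average.

def overall_trustworthiness(ratings: dict) -> int:
    """Return the overall star rating: the minimum across all criteria."""
    return min(ratings.values())

study = {
    "design": 4,           # e.g. randomised, well-controlled
    "scale": 4,            # large sample
    "dropout": 4,          # hardly any attrition
    "outcome_quality": 4,  # standardised results
    "fidelity": 2,         # intervention described in a wishy-washy manner
}

print(overall_trustworthiness(study))  # prints 2, despite four strong criteria
```

Four 4-star criteria can’t compensate for one 2-star criterion, which is exactly why a vague intervention description drags the whole study down.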


So keep an eye on which category the provided resources fall into!


How to start working in an evidence-informed manner

These steps will enable you to make more informed decisions about what/who to believe when unpacking the latest L&D research:


Step 1: Strip it and flip it


The first part, ‘strip it’, means that you take a critical look at the language used in, for example, a statement like the one below: “Living online is changing our brains. There is an increase in people with autistic spectrum disorders.” - Professor Susan Greenfield


Is the language vague?

Yes. What does ‘changing our brains’ refer to? What kind of change? And what does ‘living online’ mean? We don’t know anybody who ‘lives’ online!


Is the language emotional?

Well, not necessarily the language, but people might have strong emotions when it comes to ‘autistic spectrum disorders’.


Is it ‘hyped-up’?

The topic is ‘popular’ because the prevalence of autistic spectrum disorders is rising, although we must also take into account that the expansion of diagnostic criteria plays a role here as well. There’s a lot of ‘buzz’ around what ‘digital’ does for (or to) us as humans, so in that sense the claim taps into a hype.


The second part, ‘flip it’, means that you try to turn the argument upside down. For example: IF it’s true that ‘living online’ changes our brains in a concerning manner (no one likes the idea of an increase in autistic spectrum disorders), then wouldn’t it be just as likely that these changes could be utterly positive, such as an increase in brilliantly innovative people?


The idea of learning styles intuitively and emotionally makes sense to people, but if you flip it and ask, “How do you feel about pigeonholing people?” (which is what you do when you sort people into learning style categories), then suddenly the idea sounds way less appealing.


Willingham recommends writing down the following statement: “If I do X, there is a Y percent chance that Z will happen.”


So, in the above example: “If I ‘live online’ there is Y percent (???) chance that I will get an autistic spectrum disorder.” Hm… how does the statement sound now?


Step 2: Trace it


This comes down to: don’t just trust what people say because they’re an authority or a (self-proclaimed?) expert. This doesn’t mean you have to research everything extensively, but you do need to dig a bit deeper and ask yourself what kind of evidence there actually is for the claim. What kind of sources has someone used? Just take a critical look.


The British psychologist Dorothy Bishop asked a simple question in response to Greenfield’s claim: “Where’s the evidence?” So far, the silence is deafening.


Step 3: Analyse it


This step requires some basic statistical knowledge, but a critical eye can take you quite a long way as well. If a claim sounds very strong, too generic, or too dramatic, then it probably needs more nuance!


Find people who do high-quality research-to-practice work. In our field, people like Pedro de Bruyckere, the Learning Scientists, Daniel Willingham, Carl Hendrick, Tom Bennett, Blake Harvard, Dylan Wiliam, Will Thalheimer, and Patti Shank do a really good job. This doesn’t mean you should believe these people blindly, BUT it will make it easier for you to trace and get a feel for the research that’s out there.


Step 4: Should I do it?


Most of the time this is about questions such as: should I apply this method, implement this strategy, buy this tool? And so forth…


Use the evidence to increase your knowledge and expertise so that you can have conversations with clients, partners, colleagues, and directors on WHY you recommend certain design decisions. It will improve your expertise, your value in organisations, and, most importantly, your designs, so that your learners can learn more effectively, efficiently, and enjoyably!


Bottom line

Adopting an evidence-informed approach to learning design will help you make decisions based on evidence instead of intuition, beliefs, and trends, increasing the impact of your work. It will also improve the design of your virtual learning interventions and increase the overall value of your work.


Note: Mirjam Neelen is the co-author of ‘Evidence-Informed Learning Design’ and a Learning Advisory Manager with over 10 years of industry experience, having worked at companies such as Houghton Mifflin Harcourt, Google, the Learnovate Centre, and now Accenture. Based in Dublin, she’ll be hosting a two-hour online seminar on 26 June on Evidence-Informed Design for Virtual Learning Experiences. She’ll take you through some of the strong evidence from the learning sciences and show you how to apply it to support effective, efficient, and enjoyable virtual learning experiences. You can learn more here.



Don't forget: as a member of our L&D Community, you qualify for a 20% discount on all KR online conferences, seminars, and workshops!
