[Webinar recording] – The future of instructional design

 

Transcript - The future of instructional design

Welcome to the on-demand version of Sprout Labs' webinar on the future of instructional design. I'm Robin Petterd, the founder of Sprout Labs and principal consultant with Learned. At Sprout Labs, we work with organisations around tools and platforms for building learning ecosystems. Learned works with organisations at the strategic level, thinking about how to use learning ecosystems to really accelerate performance. In this session I'm gonna be talking about these topics. This is the on-demand version of the webinar. Our webinars are actually really interactive experiences. I use the whiteboard a lot, and what I do during these particular sessions is give a little bit of a summary of what actually happened on those whiteboards. There's so much great participation in those that it's really nice to be able to capture that knowledge here.

We're gonna touch on the 70:20:10 learning model and instructional design, and do a little bit of exploration around the terms learning designer and instructional designer. Those were interactive activities, so in some ways I'm doing a bit of a summary of those as well. Then a deep dive into learning analytics through some scenarios, touching on emerging technology, and finally finishing on a bit around cloud-based authoring.

Now, in some ways I feel really odd about talking about the future of instructional design. How am I meant to be an expert on that? It's something I do every day and I have a group of people who are doing work around it, something I spend lots of time thinking about. But I'm particularly not a futurist, so what I've done in this webinar is take this quote from William Gibson, who is one of my favourite writers ever, and this is one of my favourite quotes from him: "The future's already here, it's just not evenly distributed. It's at the edge of things". A couple of edges that I'm looking at in this recording and in this presentation are data, and particularly machine learning. I'm gonna try to give you a bit of a definition of what machine learning is and an overview of what the possibilities are later as well. And new interfaces, in particular to do with virtual reality and augmented reality. It's partly because these are incredibly tied to communication and the rich media that we use in learning. Now instructional design is also tied to the future of learning and the future of work.

I haven't actually done a deep dive into the future of work in this session. Another interesting way of looking at it might be to do a whole "what's the future of work? And then what does that mean for instructional design?". That's probably a bit of a different slant to the one I've taken during this session.

Now I opened up with an interactive activity around: does technology change the way we learn? Or does the way we learn really stay the same? This is me being a little bit provocative, because in some ways my personal opinion is that technology changes the way we access information; learning's an active process. It doesn't really change how we wire and rewire our brains. We're getting a richer understanding of how we learn, but what's really changing is how we access information. Information isn't learning, it's just one part of it. Sometimes, in actual fact, being able to access information really quickly is one of those spots that replaces the need for a learning experience, because so many learning experiences are knowledge dumps when really they need to be performance focused instead.

So to pick up on that notion of information and knowledge. That's probably really what the first stage of the internet and the first stage of technology has been all about: digitalising knowledge and information, storing it, and making it easier to access. Literally, we talk about browsing the internet. It's a passive activity that's really around accessing information. Social networking in our personal lives is starting to change this in some ways. Now a lot of our dominant activity online is actually about social communication. But the first generation of the net, which I think we've still got a legacy of in our organisations, was really around information.

Now what's really exciting is that there now exists technology to make things smarter. To use the data we store in new ways, to be able to start to make decisions. The machines actually start to be smarter. I wouldn't go as far as to say intelligent, but smarter. They start to help us make decisions or guide decisions in new ways. There are some predictions that up to 40% of jobs could be automated in the near future because of this ability of machines and code to make different types of decisions.

So it's a really interesting time to live in. It's one of those things where the trends are going to be towards humans doing less grunt work, sometimes doing less decision making, and actually doing more work at that real creative end of things. So I opened up to the group: what does the term instructional design mean to you? And these were the sorts of answers I got back, with a bit of a summary of some of the words people used.

There was a strong focus on instructional designers delivering content. And sequencing content and organising content, organising learning materials and thinking about the flow. I also quite liked the notion that learning theory was really important to instructional designers, and that the application of it was really what they were doing.

Then I switched this to, "what does the term learning designer mean to you?" Someone who's involved in the whole experience was a really nice line. And this notion that learning design moves from the learner and what they're doing, rather than thinking about the content, was a really nice sentiment. And the other thing that really happened through this particular slide was a lot of talk and a lot of activity around people using the term designing learning experiences.

Learning experiences is a term that I'm hearing people use more often. Having discussions with people about, rather than building a learning management system or a learning portal, wanting to build a learning experience platform. So essentially sitting there going: actually, we don't want people to just be in a spot where they're accessing information, or compliance is being pushed out to people using a learning management system; we really want to build an experience that's interactive, that's really learner centred.

So it's just this really interesting thing, and it got a bit complicated here as well, that in some ways this terminology discussion between instructional designer and learning designer doesn't actually mean a lot. So it then went into actually asking: which term should we be using?

The group really went for learning designer, right at the top; that's what most of the people actually chose. We didn't have any people who came back and said, "Oh, look, actually I think we should be using a radically different term."

Now, the terminology and the words we use around different activities are really important. At Sprout Labs a couple of times we've worked with organisations to realign thinking around learning and performance by doing things like changing role names from supervisors to coaches, or not just talking about performance management but talking about performance coaching. These sorts of terminologies, and the sorts of discussions that are triggered around them, can fundamentally shift how people think about things.

In actual fact, one of our instructional designers, when I first met him, was working in a TAFE, and he was working in a way where he was trying to realign people towards this notion of being more about designing learning experiences. One way he was doing that was really pushing the notion that people were learning designers, not instructional designers. It's really interesting because now he works at Sprout Labs as an instructional designer, and it's been a bit of a wrestle with him. But what we decided is that people have enough trouble understanding what an instructional designer is, and it's a more universally used term, so it's actually easier to work with a term that more people know rather than being in a spot where you're trying to rework that.

That's maybe also the spot that we're in as a ... we've been working with organisations from the outside. If you're inside an organisation and you want to rework how you are doing your instructional design, I actually think moving to these terms around learning designer or learning experience designer would be a really powerful way to help make some of those shifts happen.

Now we're gonna look at a bit of a scenario. Meet Mary. A few of you have been to some of the recent webinars this year and have come across Mary a couple of times. She's an L&D person in a financial technology company. We're looking at her organisation in the near future, and we're gonna look at what three of her instructional designers do as a way of exploring what the near future of instructional design looks like. We're gonna look at Oliver's day, and he focuses on ecosystem design; Chloe, who focuses on performance support; and Ruby, who focuses on data-driven design.

Now, Oliver's team is really focused on using the 70:20:10 learning model, and on working with it in a way where they're thinking of it as multiple components that work together. Ecosystems are very much part of what he does: designing learning ecosystems whose parts work together.

He's in a spot where he's working across things that might not normally be seen as learning activity. Things like innovation projects. So he works quite often with visual tools. You can see the first thing in his day is mapping and building up a picture of an ecosystem. Then he's working in a collaborative way with a group of people using some design thinking processes. And then, because he works at a spot that's really beyond courses, he's working on a learning campaign that's going out to some managers as part of a programme. Then last but not least, he's working on an actual simulation, it's a virtual reality simulation, and doing the learning design around the edge of that as well.

I just hinted at a couple of things around his life. Essentially, as a learning ecosystem designer, he's moved beyond courses. He's really designing things that are multilayered and working in a really rich way. And design thinking lets him generate new ideas, not just copy what other people are doing. So he's working in a way where he's moved beyond the simplistic solutions that might just be one-dimensional learning interventions, to really rich possibilities.

At Sprout Labs you'll see us talk quite often about ecosystems. And for me, ecosystems are really about sets of components that work together, and by their relationships of working together become more than the sum of their parts. And specifically, the 70:20:10 learning model is a really powerful way of being able to think about all three dimensions over time and in different ways as well. Our model that we worked up around the learning dimension of a learning ecosystem is this. At Learned we're doing a slightly different thing with the model that we're using: it focuses a lot more on the technology and the L&D capability around learning ecosystems, and looks at the linkages between the technologies of learning and the capabilities of L&D as well.

So this is really just the learning layer of that particular model from Learned.

Unfortunately, so much of our learning in organisations has become focused on compliance, on what the organisation needs. The first shift that happens in learning ecosystems is moving to a spot where it's learner centred. What the learner does, and what the learner needs for their own development and for the future of the organisation, is put at the centre. They're then given pathways that guide them through the ecosystem towards particular goals. They have knowledge support, job aids and information ready for them to access. So essentially, learning experiences are not used for giving information as a one-dimensional thing; the information is readily available at their fingertips.

The learning spaces actually become spots to practise new skills. So there's still learning, it's not an ecosystem where learning or training has totally disappeared, but the training experiences become a lot more focused on actually practising the skills, and quite possibly, at the same time, really learning from each other as well.

It's interesting to think about what a learning ecosystem designer does. They design more than courses, that's a bit of a given. And then, because there are multiple bits, they're thinking about the relationships between those things. One of the things that happens in blended learning, when learners are being moved from one medium to another, is that sometimes they can get confused. And that's one of the things to really think through.

One of the nice metaphors I like with ecosystems is that their components work over a period of time. Especially if you're building performance supports and knowledge and information things, they're actually really designed for the long term. Things that might have a life beyond the actual learning experiences. So there might still be something that comes in and runs as a campaign for a period of time, but there are artefacts that have a long-term role in helping people.

A learning ecosystem designer really takes in the current landscape. They do what I think of as pacing. They think about what the existing learning culture is. Have we been in a spot where digital learning has only been used for compliance, so that all of a sudden moving to something radical like a social learning experience such as a design jam is a really new thing for people, and not quite where people's pace is at?

We also had a situation with one particular blueprint we did at one stage. The organisation said, "Look, don't worry about our learning technologies, just dream. Just build what we need for this particular thing. We'll figure out the learning technologies, we're looking at replacing everything in the next six months anyway." Now, it actually took them three years to get to the spot where they had the learning technologies they really needed to deliver that programme. And in the meantime they were working with different bits and pieces cobbled together to try to make it happen.

So the thing is we've just gotta take into account where people are at in terms of the technologies and the learning culture that exists as well.

In some ways, and this is what was interesting about the definition of what a learning designer does, the learning designer designs experiences. Which might be those pathways, and designing more than just the courses. It means designing the things beyond the actual experience, which might be things like workplace tasks. And this leads into what the project deliverables become for an instructional designer on a 70:20:10 learning project.

In a traditional learning project, really it's about the course material. So what disappears in the 70:20:10 ... actually, I shouldn't say disappears. Especially if it's a blended 70:20:10 learning model, quite often the learning experience might start with something a little more formal in the 10 and then expand into the 20 and the 70. But what actually happens is that instructional designers need to pay more attention to designing the supports and the processes in those 70s and 20s. This might be something like the guide for the person who's facilitating the community of practice. Or guides for the managers. And if it's a learning campaign happening over a period of time, the learning experiences might be something sent out to the employee, but timed in a way where the manager's getting something else at much the same time.

Then a similar thing's happening with workplace learning tasks. The things that expand people's current responsibilities and current expertise to do new things, because essentially behaviour change is the end outcome of learning. So that gets built in and guided, and quite often it's really key to make sure that someone's line manager is able to let them take on those new things. Humans are goal driven, so in this 70:20:10 model around ecosystems it's important that the instructional designer is really getting hold of the management tools and the management strategies, and building evaluations that go beyond just getting learner reactions to things, really thinking about how we are gonna measure this and what flows from it.

I recently recorded a podcast, a really exciting conversation with Megan Torrens around how instructional design changes with xAPI, and this was one of the things she talked about: that if you start to talk about measurement really early in the conversation, it really shifts what the design is and what happens after that.

Now, Oliver's finishing his day working on a VR programme for the sales team, putting them in a spot where they're actually having conversations with potential clients, so they're seeing and practising those skills. I think VR's really exciting. It could be the best thing that's happened to learning in years. It has nothing to do with content, nothing to do with webpages. It has nothing [inaudible 00:19:14] you can't convert a PowerPoint presentation into a voiceover like this. So it's a really, really big shift that can possibly happen: moving to a spot where we're starting to design experiences that people are literally immersed in, and then building learning around the edge.

One of the things that I'm trying to do in this particular session, when I talk about a couple of emerging technologies, is to give you a couple of tools to look at as well. Because VR is one of those things that seems incredibly scary, and complicated, when you start to get into it.

So A-Frame is a really simple web-based system for VR. Essentially, if you can edit HTML, you can probably build an A-Frame experience. Unity is a platform that was originally developed for gaming, likewise the Unreal Engine. Both of them have the ability to bring in bits and pieces and build interactive learning experiences.
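To make that "if you can edit HTML" claim concrete, here's a minimal sketch of an A-Frame scene, written out to a file from Python. The entity tags (`<a-scene>`, `<a-box>`, `<a-sky>`) follow A-Frame's standard primitives, but the CDN version number is an assumption, so check aframe.io for the current release.

```python
# Write out a minimal A-Frame VR scene as an HTML page.
# The version number below is an assumption -- check aframe.io for the latest.
AFRAME_VERSION = "1.5.0"

def build_scene() -> str:
    """Return a complete HTML page containing a tiny A-Frame scene."""
    return f"""<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/{AFRAME_VERSION}/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Each entity is just an HTML tag with position/colour attributes -->
      <a-box position="-1 0.5 -3" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
"""

if __name__ == "__main__":
    with open("scene.html", "w") as f:
        f.write(build_scene())
```

Opening the resulting `scene.html` in a browser is enough to get a navigable 3D scene, which is what makes A-Frame such a low barrier to entry compared with Unity or Unreal.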

I think it's really important to think through VR from a learning design point of view. In the experiential learning cycle, it really sits somewhere between testing new concepts, playing with new ideas, and having a concrete experience. Just because you experience something and sit there and go, "actually, yep, that possibly wasn't the right way to do that", unless you're really observing, making reflections, building up your own concepts, and then trying those concepts back out in a different way, you're not going through the full experiential learning cycle.

So there's a really rich set of possibilities for instructional designers around VR. The VR experience is almost there to be something to debrief on. In the armed forces, they've been using simulations for a long time for learning and practising. In some ways, with that debriefing culture, it's incredibly rich to then be able to build learning experiences from it.

I then opened up to the group with a quick question around how their team could use VR. Really the conversations and ideas came down to being able to build scenarios and role plays on steroids, and to be able to simulate conversations. This is possibly also because I planted that seed of an idea with thinking about the sales conversation as well.

Work health and safety simulations are a really good example of what VR can be used for. You can put people in high-risk, dangerous situations in a fairly safe way and then work with that learning experience to practise things, and possibly get things wrong. It might actually be life threatening if they were doing it face to face.

Now onto Ruby's day. Ruby's working with data-driven learning design, so her day's very different to Oliver's. She's working with an SME to start with, looking at some performance problems and then looking at new ways to impact them, thinking about it from a performance and data point of view. She's in a spot where she wants to get more activity streams into the learning record store so she can feed them into her analytics systems. And it's not just learning things she's looking at; she's looking at other activities they can bring in, so they can look at the relationship between performance and learning as well.

Ruby takes a really test-driven approach to what she's doing, so the last part of her day is seeing how some tests are going, and sitting there going, "Ah! Did that work the way we expected? Let's make a small tweak to that." And she's doing that based on the data that's constantly coming back.

Ruby uses one of my favourite learning words. She thinks about the impact of learning first. And then she thinks about how she's gonna measure that impact. Then she also takes this constant approach to testing. Constantly being in a spot where everything's fluid, where tests are almost being adjusted automatically. This happens constantly in marketing: there's a process called A/B testing where, say, different words are put on buttons on a website, and then depending on how many click-throughs you get on particular words and particular buttons, the test automatically adjusts.

In our learning experiences we've got some great activity-based ways to do similar things. You could have two types of explanations of a particular concept, followed by the same activity to practise it, and then, depending on which explanation got the better result, you'd sit there and go, "Actually, with explanation B, people were always getting the activity right." So you'd then be able to say that explanation B is the more effective learning experience.
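As a sketch of how that comparison might look on the data side, here's a toy A/B calculation. The variant names and counts are invented, and a real test would also check statistical significance before declaring a winner:

```python
# Toy A/B comparison of two explanations of the same concept,
# judged by how many learners then got the follow-up activity right.
# All counts below are made up for illustration.

def success_rate(passed: int, attempts: int) -> float:
    """Fraction of learners who got the follow-up activity right."""
    return passed / attempts

def pick_winner(results: dict) -> str:
    """Return the variant with the highest activity success rate."""
    return max(results, key=lambda v: success_rate(*results[v]))

# variant -> (learners who passed the activity, learners who attempted it)
results = {
    "explanation_A": (52, 100),
    "explanation_B": (71, 100),
}

print(pick_winner(results))  # explanation_B has the higher success rate
```

An automatically adjusting version, like the marketing tools described above, would go one step further and route more new learners to the leading variant as results accumulate.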

In some ways our platforms don't allow us to do this, we don't have this sort of strong culture of testing, and we're not in a spot where everyone's working with cloud-based platforms that make it really easy to change and shift things constantly as well.

We lead a measured life. There was an example that came up recently in a podcast where Yet Analytics had hooked up a heart monitor to someone during a flight simulation training session, and then looked at the relationship between their performance and their heart rate. Now, every time I mention this to someone, they've turned to me and gone, "oh, I've got my heart rate monitor here!" They're wearing a heart rate monitor as they work, as part of their measured fitness life. This is not quite as common in workplaces, and there's a whole set of ethics and privacy things to be thought through as well.

In our workplaces, the key to this from a learning point of view is the Experience API, or xAPI, also known as Tin Can. It allows information to come in from multiple different sorts of spots in a really standard way that's easy to work with.
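To give a sense of that standard, here's an xAPI statement sketched as a Python dict: every statement is essentially "actor, verb, object", plus optional results. The `completed` verb URI is one of the common ADL-published verbs; the actor, activity ID and score here are hypothetical:

```python
import json

# One xAPI ("Tin Can") statement as a Python dict. Only the overall
# actor/verb/object shape follows the spec -- the names and IDs are invented.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Mary",
        "mbox": "mailto:mary@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/sales-conversation-sim",
        "definition": {"name": {"en-US": "Sales conversation simulation"}},
    },
    "result": {"success": True, "score": {"scaled": 0.85}},
}

# Statements are POSTed to a Learning Record Store (LRS) as JSON.
payload = json.dumps(statement)
```

Because any system can emit statements in this shape, activity streams from simulations, CRMs or performance tools can all land in the same learning record store, which is exactly what lets Ruby relate performance data to learning data.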

The thing with learning analytics is really interesting. It's in a spot where it's really about collecting the data to start with, and that's the first step we're all working on in terms of learning, but the analysis is really around data visualisation. Now, data visualisation is really interesting because our brains can see relationships extremely quickly visually. By putting things into visual forms, we can actually get insight quicker.

The next stage into the future is machine analysis of data. I'm going to try to give a really quick crash course on what machine learning is. We have a little bit of a debate at Sprout Labs about how we explain this, so I'm going to try to explain it in a classic instructional design way: conceptually first, and then with a solid example as well.

So my layman's way of explaining machine learning is that it's about systems that look at data, crunch lots of numbers and lots of relationships to find patterns, and then look for relationships in those patterns. Essentially it's doing two things: classifying the data, and then clustering similar data together. This gets really powerful because the systems can then predict whether someone fits into a particular cluster or classification. Now, the other thing about machine learning that I find fascinating is that you can use these classifications and clusters to build decision trees. One particular project we're working on at the moment is a study planning tool for doctors. We're working with a medical education expert who has developed a huge amount of expertise around analysing where a doctor in training is at, what their weaknesses are, and then what topics they should be working on. We got her to codify that into a set of factors, and essentially ran one of these algorithms across it to very quickly build up the very complicated decision trees that she normally uses to make these decisions.
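As an illustration of that last idea, here's a minimal sketch of inducing a decision tree from codified factors. It's pure Python in the spirit of algorithms like ID3, but it splits on whichever factor leaves the fewest misclassified examples rather than using information gain, and the trainee factors and study topics are entirely invented:

```python
from collections import Counter

def majority(labels):
    """Most common label in a list."""
    return Counter(labels).most_common(1)[0][0]

def build_tree(rows, features):
    """Recursively grow a decision tree from (factors, label) pairs."""
    labels = [label for _, label in rows]
    if len(set(labels)) == 1 or not features:
        return majority(labels)          # leaf: nothing left to split on

    def errors(f):
        # How many rows would be misclassified if we split on factor f
        # and predicted the majority label in each branch?
        total = 0
        for value in (True, False):
            subset = [lab for feats, lab in rows if feats[f] == value]
            if subset:
                total += len(subset) - Counter(subset).most_common(1)[0][1]
        return total

    best = min(features, key=errors)     # pick the most discriminating factor
    rest = [f for f in features if f != best]
    branches = {}
    for value in (True, False):
        subset = [(feats, lab) for feats, lab in rows if feats[best] == value]
        branches[value] = build_tree(subset, rest) if subset else majority(labels)
    return (best, branches)

def classify(tree, feats):
    """Walk the tree for one trainee's factors and return the recommendation."""
    while isinstance(tree, tuple):
        feature, branches = tree
        tree = branches[feats[feature]]
    return tree

# Hypothetical codified factors -> recommended study topic
training = [
    ({"weak_anatomy": True,  "weak_pharma": False}, "anatomy"),
    ({"weak_anatomy": True,  "weak_pharma": True},  "anatomy"),
    ({"weak_anatomy": False, "weak_pharma": True},  "pharmacology"),
    ({"weak_anatomy": False, "weak_pharma": False}, "revision"),
]

tree = build_tree(training, ["weak_anatomy", "weak_pharma"])
```

The point of the sketch is the workflow: the expert supplies labelled examples, and the algorithm derives the branching rules from them.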

Now what's really interesting from a developer point of view, this was a lot easier than developing a whole lot of if-then statements. So essentially the system develops those if-then statements based on the data that you actually give it.

Now, an example. You have a group of learners who haven't gone particularly well at their job. Into essentially a black box, a black circle in the middle here (we're not going to worry too much about what's actually in the middle), you feed that data. Then you feed in some examples of people who don't have those problems, who are going okay. Then, as someone new comes through the programme with their own data, the system is able to really quickly predict whether they're going to fall into the group of people who struggle or the group of people who succeed, based on what it's seen in the past. So it's essentially learning from the data as it goes, and that's where part of the "learning" term comes from.
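One very simple way to sketch that black box is a nearest-centroid classifier: average the past "struggled" and "succeeded" learners into two centre points, then predict whichever group a new learner sits closer to. The features and numbers here are invented for illustration:

```python
import math

# Toy "black box": learn from past learners, predict for a new one.
# Each learner is represented as (quiz_average, modules_completed).

def centroid(points):
    """Average point of a group of learners."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def predict(new_learner, struggled, succeeded):
    """Label a new learner by whichever group's centroid is nearer."""
    c_struggle = centroid(struggled)
    c_succeed = centroid(succeeded)
    if math.dist(new_learner, c_struggle) < math.dist(new_learner, c_succeed):
        return "struggle"
    return "succeed"

# Invented historical data fed into the box
struggled = [(0.40, 3), (0.35, 2), (0.50, 4)]
succeeded = [(0.85, 9), (0.90, 10), (0.80, 8)]

print(predict((0.45, 3), struggled, succeeded))  # closer to the struggling group
```

Real systems use far richer models than this, but the principle is the same: the prediction comes entirely from the examples the system has already seen, which is the "learning" in machine learning.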

It really makes Ruby's life a whole lot easier. She can see the root causes of problems really quickly. She also has a system that helps guide her decision making around learning strategies. It's one of these branching, decision-tree machines that I just talked about a moment ago, but it's also been calibrated to bring in data from the past, from [inaudible 00:29:59] organisation, to be able to see what's working really well as well.

To touch quickly on some tools for doing this. BigML is a fairly easy-to-use platform that you can send Excel-style spreadsheets to. Amazon's systems for this sort of machine learning are available as application programming interfaces, so you do need to be a coder to use those. And one of the Sprouts is working on a platform called Bit Squirrel that's really designed for doing this sort of work in marketing, but we're looking at trying to use it in learning as well.

Now, I opened up to the group around barriers to becoming more data driven. And this was fascinating, because I actually expected it to be totally around technology. There was one theme around technology, out-of-date systems or lack of support from IT, but there was also a theme around L&D itself not being capable. Essentially, traditionally L&D's been so people focused; L&D normally sits in an HR area. And there was also the cultural rejection of the idea.

I've had these fascinating conversations with my sister, who's ended up in finance, and she sees the work that I do around learning cultures and the technologies around them as fluffy compared to her world of numbers. And I think this is really interesting: there's a bit of me that says in learning we need to keep the people side of it, but we just need to pick up some of the bits and pieces that are happening in other parts of the organisation as well.

Now Chloe's day. Chloe's really about performance support. She works beyond courses, a little bit like Oliver, and really uses courses as possibly a last resort. So her day is doing some root cause analysis work with a marketing team, looking at some ways they can integrate some personalised performance support into the CRM, the customer relationship management system, and then working on planning for an augmented reality system, which I'll talk a little bit more about.

'Cause in some ways, being able to have an augmented reality system where information is essentially layered over people's world is possibly the ultimate form of performance support: information available just in time, as you're looking at something. Now, one of the important things about Chloe is she's actually made a real shift. She really sees herself as a performance improvement person first. She doesn't see herself as someone who designs courses, or someone who designs content, or even really experiences. She's really on about performance.

One of the most powerful things in my move to thinking in a more performance-focused way was actually learning more about lean manufacturing. Why? Because lean manufacturing has a whole set of tools and processes that are really about streamlining activities and processes, and really focusing on performance. These are things you can then apply outside of manufacturing to most processes as well. One of the important things that Chloe does as part of that lean thinking is trying to make sure the learning is designed into the work. Some people who've been to multiple webinars or recordings have seen me talk about this version of the fishbone chart from Paul Matthews. This is a sort of modified version, and I think it's one of the most powerful tools for doing root cause analysis. It doesn't have to be complicated; it just has to be a moment where you sit down with someone who might have come in with what they think is a training problem and say, "Look, can I just draw a bit of a diagram on the whiteboard, or a piece of paper? I just want to look at all the aspects of this particular problem, and once we've thought it through from these angles, we might be able to look back at the cause in a different way."

What I particularly like about this is it still says, yes, there are some aspects around skills and knowledge and mindsets (the people side of things, the traditional learning), but it's also looking at the processes and resources. And last but definitely not least, in terms of Ruby's life, looking at what the measurements are as well. It's a simple tool, but I think it's one of the most powerful tools we can use for adopting a different way of working and learning.

Now, Robin's crash course in what augmented reality is. Augmented reality in today's terms is two things. The first is phone-based systems, where you might hold your phone up in front of a QR code or another object, and the phone recognises what it is and layers information over the top. Pokemon Go was really the mainstreaming of this type of mobile phone based augmented reality. The more sophisticated augmented reality that we haven't quite started to see as much of yet is headset based. It's more like VR: you put on a set of glasses and information shows up in front of your eyes, literally. Google Glass is a really good example of a low-end version, and Microsoft HoloLens is one of the ones that's actually starting to happen as well.

So Chloe's working on an augmented reality system for her coaching staff. It's based on voice recognition and it can be used with either a headset or a phone. When I say voice recognition, I mean it's doing things like putting performance supports over the top of the conversation, so it might say, "Actually, I haven't heard anyone talk about budget yet. I haven't heard anyone say, 'Do you have a budget?'", or it might trigger different types of feedback based on what's actually happening in the conversation. Voice interfaces combined with some of these technologies really change what the possibilities are.
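To make the mechanism concrete, here's a rough sketch in code of the idea behind that kind of voice-triggered performance support. This is purely illustrative, not Chloe's actual system: it assumes a running transcript of the conversation is available as text, and it simply checks which expected topics haven't come up yet and surfaces a prompt for each.

```python
# Topics a coach is expected to raise, with the prompt to surface if a
# topic hasn't been mentioned yet (both are made-up examples).
EXPECTED_TOPICS = {
    "budget": "I haven't heard anyone talk about budget yet.",
    "timeline": "Has the timeline been discussed?",
}

def missed_topic_prompts(transcript: str) -> list[str]:
    """Return prompts for expected topics absent from the transcript so far."""
    spoken = transcript.lower()
    return [prompt for topic, prompt in EXPECTED_TOPICS.items()
            if topic not in spoken]

# As the voice recognition transcribes more of the conversation, the list
# of prompts shrinks.
prompts = missed_topic_prompts("Let's review the timeline for the project.")
# "budget" hasn't come up, so its prompt is still shown.
```

A real system would work on streaming speech-to-text output and use something smarter than keyword matching, but the trigger logic, compare what's been said against what should be said, is the core of it.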

Now, I opened the last question of the session up to the group: what needs to happen for L&D to be more focused on supporting performance? We had some really great ideas here. I think it's one of those things that some people in the audience had wrestled with and come up against in their own organisations. People talked about problems like access to performance metrics, and the move to being a strategic partner. I'm probably going to pick this up in some later webinars and content: what happens if people keep wanting you to be the training centre, or the training organisation, or the people who do the training? How do you move to a spot where they trust you to become more of a strategic partner, to really make things happen?

I think the last comment down the bottom, about the shift to putting the learner at the centre rather than the content at the centre, is a really powerful one. One of the risks with performance support is that it can put people back in what I call the content ghetto, because what they're doing is only producing job aids, procedures and just-in-time resources, but not necessarily from the point of view of actually making the job better. So unless there's an analysis step tied into it, and the instructional designer is doing that really well, you're not making the full shift or realising the full potential.

Now, one of the other real shifts happening in learning authoring at the moment is the move to cloud-based systems, and to HTML5 cloud-based systems in particular. This is something that's very much happening now, and it means there are some real shifts in instructional design and learning development processes. I mean things like Adapt Builder, which is an HTML5 system, and Sprout Labs' own Glasshouse system as well. Just like many other things that have moved to the cloud, these tools make it really easy to collaborate, and especially to manage change. You just go into the system, make the change and publish it. You might need to go through a review process, but there's none of that uploading and downloading. The other thing that's really fascinating is that because these technologies come from more of a web background, they act more like websites. One of the things we're starting to see is people sitting down with learning management systems and saying, "I want it to be more like a website." In a blog post I recently wrote around weaknesses in LMSs, drawing on Elliott Masie's observations, some of those weaknesses really came down to the fact that learners were expecting LMS experiences and learning experiences to be more like the rest of the web. And quite often they're not.

Now, the other big thing about moving your e-learning authoring to the cloud is that your tools can move beyond just content. One of the exciting things, I think, is being able to build what are sometimes called campaigns, spaced learning experiences or subscription learning: things that push learning experiences out to people via email or push notifications on their mobile phones. They're spaced out over a period of time to counter the forgetting effect, so people can keep clicking in and reconnecting with what they're meant to be doing, because behavioural change is not easy. At this point these platforms move from being just about content to being about learning experiences as well.

Now, the other thing about these types of platforms and authoring tools is that because they're based on web technologies, instructional designers and e-learning developers, especially people doing both instructional design and development work, quite often need to be really fluent in web design skills like HTML and CSS.

At Sprout Labs we've been working away at our own digital authoring platform, Glasshouse. In the past we've only used it with clients on client projects, and now we're just about at the spot where we're opening it up to more people. You can go to getglasshouse.com and sign up for early access.

Now, thank you so much for taking the time to listen to this recording. I hope you found it really useful.