Wednesday, December 4, 2013

Automatically determining mobile or computer-based delivery


Moving into a world of mobile versus PC-accessed content doesn’t always mean dramatic rewrites of web content.  Responsive web design helps the content scale to meet the device’s abilities.  However, when working with rapid eLearning development tools like Adobe Captivate and Articulate Storyline, a one-size-fits-all approach isn’t always possible. 


The Challenge
Currently, there is no standard for delivering multimedia across platforms.  Flash still dominates the PC environment, but is absent on mobile devices.  HTML5 is progressing, but support isn’t universal (especially if you are delivering to a corporate environment using browsers from a few years ago).  Rapid development tools are beginning to introduce HTML5 output, but it lacks many of the features that are available when those same files are published as Flash.

There’s a need to deliver the best possible learning experience based on the device the learner wants to use to access the material.  If a user is sitting at a PC with Flash and looking for interactive eLearning, it can be delivered via Flash.  If the user is on the road and looking to access the same content from their mobile phone, it can’t.  In those cases, the device sets up a few more constraints, but they shouldn’t keep learners from getting to the information they need in the manner they want to access it. 

A Solution
This is clearly not the only solution, as there are untold other options out there, but it is one possible approach that leverages some simple JavaScript, similar to what we used earlier to determine the appropriate Captivate output size.  Essentially, you’ll provide two versions of the material – one for PC-based users and one for mobile.  You’ll then create a launch file that determines how learners are accessing the material and points them to the appropriate version.

Decision File
In the body of an HTML launch file, you can place the following JavaScript code that will determine how the learner is accessing the page:
<script>
// navigator.appVersion includes "Mobile" on most handheld browsers
if (navigator.appVersion.indexOf("Mobile") != -1) {
  window.location = "mobile.htm";
} else {
  window.location = "pc.htm";
}
</script>

Essentially, navigator.appVersion pulls a string of information about the browser.  The .indexOf portion of the line searches for the word Mobile in that string.  If it is present, indexOf returns the position in the string where it begins (anything other than -1), so we open the mobile version of the content.  If it is not present, indexOf returns -1 and we open the PC version of the content.
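
If you want to cast a slightly wider net, a variation on the same idea can also look at navigator.userAgent and the screen width.  This is just a sketch – the device keywords and the 768-pixel breakpoint are assumptions you would tune for your own audience:

<script>
// Treat the visitor as mobile if the user agent mentions a handheld
// device or the screen is narrower than an assumed 768-pixel breakpoint.
var ua = navigator.userAgent;
var isMobile = /Mobile|Android|iPhone|iPad/.test(ua) || screen.width < 768;
if (isMobile) {
  window.location = "mobile.htm";
} else {
  window.location = "pc.htm";
}
</script>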

PC File
If you are using a rapid development tool, this one is the simplest – you merely point to the page that is created by the tool.  This will allow you to launch the Flash version with no problems (assuming the PC has Flash installed). 
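
As an illustration (the file names here are hypothetical – substitute whatever your tool actually publishes), the launch file could even skip an intermediate pc.htm and point straight at the published page:

<script>
// "MyCourse/MyCourse.htm" is a placeholder for the page Captivate publishes
if (navigator.appVersion.indexOf("Mobile") != -1) {
  window.location = "mobile.htm";
} else {
  window.location = "MyCourse/MyCourse.htm";
}
</script>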

Mobile File
This one is a bit trickier, as you likely have to make concessions about the delivery compared to what you would have created in Captivate.  There are a couple of options that can make this simple for the developer, but the focus should instead be on creating what is best for the learner on their mobile device, which can often mean duplicative development effort.

Potential output formats include rendering Captivate files as mp4 videos (once you’ve removed all interactivity), publishing an HTML5 output version from the rapid development tool, providing alternate content in PDF form, or even leveraging standard HTML development.  The mobile version doesn’t have to be a lessened experience; it just has to be one tailored to the device.  For example, a PDF with buttons to navigate works brilliantly on an iPad because it feels so natural, but on a PC, it seems a bit lackluster. 
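
If the mobile version ends up as an mp4 rendered from the Captivate file, the mobile page can be as simple as a standard HTML5 video element.  A minimal sketch, assuming the rendered video is named module.mp4:

<!DOCTYPE html>
<html>
<body>
  <!-- module.mp4 is a placeholder for the video rendered from Captivate -->
  <video src="module.mp4" controls width="100%">
    Your browser does not support HTML5 video.
  </video>
</body>
</html>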

Overall
Unfortunately, at this point, there is still a bit of duplicative effort required of the developer (and designer).  However, it provides a much better situation for the learner by allowing them to access the content they are interested in through the device they prefer, without ever knowing there is another possibility.

Wednesday, November 20, 2013

Reflections from #Learning2013


Earlier this month, I had the privilege of attending Elliott Masie’s Learning 2013 conference in Orlando.  In addition to learning a great deal, I also facilitated two breakout sessions during the conference – an enjoyable experience in the moment and an enlightening one in the preparation stages.  Here are some of my thoughts on topics that seemed to continuously emerge during the event.
Steve Hudson facilitating 'Hate to Great...'
70 : 20 : 10
The 70 : 20 : 10 model came up repeatedly in conversations about where learning occurs.  You can learn more about the model here, but essentially, 70% of learning takes place on the job and through performance support, 20% is social learning including coaching and mentoring, and 10% comes from courses and readings. 

It’s tough to digest that statistic as a learning professional.  My gut reaction is that organized learning is much more valuable.  However, as I reflect on my own learning, it certainly begins to pass the gut check test.  Rarely do I attend classes to learn something.  Typically I Google how to do something, ask someone in my network, or just figure it out as I go along. 

The prime takeaway is that there is a huge, untapped opportunity for learning professionals in the other 90%.  This can come from organizational constructs that are put in place, such as social media platforms or groups within them, or from leveraging experts as mentors throughout an organization.  Additionally, as you look at workplace learning, where is all the money spent?  Is it the 10%?  Is that the best bang for the buck?  Likely not.

Measure Impact or Don’t Measure Anything at All
From both an interesting session from Nigel Paine as well as a lunchtime conversation with Elliott Masie, there seemed to be a lot of emphasis on metrics.  Instead of stressing the importance of measuring everything or measuring everyone, they focused on measuring what the learning is actually trying to accomplish. 

The goal of learning isn’t to fill classrooms or to measure online enrollments – it’s to change behavior.  So, it isn’t appropriate to measure the success of your learning design by counting the participants who attended or collecting smile sheets afterward.  While those things may be important to learning teams, they don’t matter at all to the business.  Rather, you should be looking at metrics of on-the-job behavior beforehand and comparing them to the same metrics after the learning intervention.

If you are creating online learning content to help educate employees about keeping laptops safe in transit, the only thing that matters is a reduction in the number of laptop thefts and losses.  The number of people who took the course or enjoyed it doesn’t matter.  The end goal is to have an impact on behavior.


Take criticism seriously, but not personally
A quote from Hillary Clinton, paraphrased by Jane Pauley, talked about living a life where you are criticized publicly.  In addition to having skin “as thick as a rhinoceros,” there is another important aspect to dealing with criticism.  It should be understood and appreciated through the lens in which it was created, but never taken personally. 

Political talking heads will often insult those who don’t agree with them.  That shouldn’t be internalized.  However, there is likely some nugget of insight in the perspective being shared, if you can get through the nonsense.  Seriously examining the core of what they are saying can often be beneficial.

While most of us are not famous and don’t face this same criticism, it is likely there for everyone in smaller doses.  With rivalries and cliques in a workplace, it’s important to understand that you are all there for the same purpose (helping others, making money, etc).  If there is criticism that you hear, being able to take a step back and examine the core of it without taking it personally can be the key to building more productive relationships that will ultimately elevate all parties.


Micro-reflections
In attending many other sessions, there was a good deal of material, quotes, insights, and tidbits that were picked up.  While they didn’t resonate with me or connect to other presentations as strongly as the thoughts above, I thought it important to note them:
  • We’re doing pretty well – As I attended sessions on implementing social media into learning and a bit on mobile, I couldn’t help but appreciate the strides that we have made in my own company in these areas.  It isn’t that everyone is doing all of these things, but there are certainly pockets that are incorporating these technologies or are poised to do so at a very high level. 
  • There’s a lot of interpretation to ‘mobile learning’ – Attending sessions on mobile allowed me to see several perspectives on what people consider mobile learning.  From websites that were mobile friendly to full-fledged apps, there was a wide variety of mobile learning.  Ultimately my takeaway is that when working with mobile, design for mobile.  It’s a different device, it’s used in different places, and it has different capabilities.  While some may use it as a second screen while their primary machine is crunching numbers or rendering videos, oftentimes what makes mobile beneficial is that it can be used anywhere.  Am I going to take a 300-slide click-through module at a traffic stop?  No.  However, I certainly check my Twitter feed.  If your learning is tailored to the device and the need, mobile can be successful and more than just a gimmick to appease leaders who want to be able to say they have mobile learning.
  • xAPI (Tin Can) – At a session from individuals in the ADL, it was enlightening and refreshing to hear them say that xAPI works great for non-traditional content.  Things like videos, social, games, and other elements that can benefit from learning tracking work really well with xAPI.  If your learners always consume your content from a network-connected computer that is hooked to an LMS, there is really no need to go to xAPI as opposed to SCORM, if it is meeting your needs.  It was nice to hear someone finally stop talking about xAPI as if it is the only standard to use in the future and a full replacement of SCORM at all times.
  • The grass sometimes looks browner – In my session “From Hate to Great: Making the Best of Your LMS,” we discussed what we have implemented in our organizations to bridge the shortcomings of our LMSs.  For example, we use an LMS for which you can’t readily Google support questions, so we built an internally supported wiki where admins share instructions and help with using the system.  During the conversation, there was an overwhelming amount of discussion of the pain points people experience with their own systems.  Many in the room seemed to leave feeling better that they didn’t have to deal with all the problems other people had.  The grass certainly appeared greener (despite the weeds) where they were currently standing.


Wednesday, October 30, 2013

Development Process - Creation


The follow-up to the Development Process – Initiation post focuses on the creation of the learning originally investigated during the first four steps of the process.  While not every project follows this process, I find that it includes the primary elements necessary for a project. 
Creation
Following the initiation stages, creation focuses on the development of actual materials to create the final product.  After the work that was done identifying the various aspects of the request in the first stage, creation includes: Content Outline, Brainstorm, Editable Draft, and Final Product.

Content Outline
The first step of the creation phase is to work with the identified SMEs to gather the appropriate materials.  These materials can be existing eLearning content or PowerPoint decks from previously held synchronous sessions, but most likely they are a jumble of presentations that have been given to leadership or decks that have been used to communicate with the teams involved.  Of course, this makes the assumption that materials exist and you aren’t working from scratch.

Once you’ve gathered the materials from the SMEs (or sat down with them long enough), you can begin taking their rough material and making it into an outline.  The outline isn’t designed to be an outline of the finished product, but rather an outline of the current content.  The outline serves the dual purpose of limiting the scope of what’s to be included (through a confirmation with the identified stakeholders) as well as ensuring that you understand the content well enough to organize it on paper.  Likely, you’ll identify many questions and gaps in your knowledge that can be plugged before you get too far into the process.

Brainstorm
Now that you’ve organized the content into a readable format in order to limit the scope and ensure you understand it, it is time to begin brainstorming about the best way to deliver the material.  Based on information gathered from the initiation stage, you work with the stakeholders, designers, valued others, and potentially even members of the potential audience group to think through how the material can best be presented.  Similar to the Delivery Medium step, you can get an idea of what people have been picturing by talking through the content, assessments, tracking, and audience. 

Based on your knowledge of learning delivery and development as well as their knowledge of the material, you can ideate about possible creative solutions for conveying the material.  Having just gone through the Content Outline stage, you’ll be able to identify with learners who start out knowing nothing and need to develop a deep understanding of the content. 

While the output of this stage is primarily an agreement on ideas and concepts, it can be helpful to have everyone on the same page as to what the final product will likely look like.  For example, there’s a big difference between a virtual synchronous class, a video, and an infographic.  Not only can you get focused on a single delivery medium, you can also ideate about the large, interactive elements within that output. 

Editable Draft
This is the part where people start arguing about storyboards (ADDIE) versus rapid prototyping (SAM).  However, we’ll focus on the desired outcome and leave the defining of what gets us there to others.  Ultimately, at the end of this step, you want to have something to share with your stakeholders that is in as editable a format as possible.

Typically, I create elements in Word, Excel, and PowerPoint during this stage.  While I may do other work in Photoshop, Illustrator, or Flash to showcase some of the final look or functionality, I want to keep it simple to allow stakeholders to make changes (or make them easily myself).

Exactly what is created here depends on the output, but the focus is on communicating as clearly as possible.  If there’s a script that needs to be approved, it can be written in Word, laid out in Excel, or included in the slide notes of PowerPoint.  To show the visual look and feel of a module, PowerPoint can help walk them through.  If you are using a rapid development tool, you can get your deck pretty close to final, as you ultimately will have to work in PowerPoint before moving to Captivate or another tool.  If you are creating an infographic, you can move immediately to Illustrator, or you can use Excel to talk about the data, including identifying what’s missing, in order to get buy-in that this is the whole of what you need.

Once you have your materials in a format that can be shared, they go through the approval and polishing that happens with the stakeholders.  You may have them do the editing or collect their feedback and make the changes yourself – it’s up to you.  I find the best approach depends greatly on the type of stakeholder they have proven themselves to be during the process; the goal is to get feedback or approval from them as quickly as possible.

Final Product
Now that you have your different, approved assets including scripts, look and feel, and organization, you can begin assembling and massaging those into the final product.  Narration can be recorded, final visuals can be created, and everything can be buttoned up into a final deliverable.  Likely there is some alpha and beta testing that occurs here before the final delivery, but hopefully you have conveyed to the stakeholders that you are merely looking for final approval or identification of things that are wrong.  This really isn’t the place to discuss font sizes or other elements that will delay the final delivery.


This is a high-level process that has a bit of inherent flexibility to ensure that it can adapt to the needs of the development.  Is it perfect?  No.  For example, there’s a lot of overlap between the Delivery Medium stage and the Brainstorming stage.  However, rarely is there a development cycle that is perfectly laid out either.  Sometimes timelines dictate faster work, sometimes SMEs aren’t invested in providing feedback, and sometimes there are others who work through the process with you (which can really be a blessing if you can get them to knock out things like the content outline and get you started on the right foot).  In general, though, this process captures what I’ve found helpful for successful development along the way.