Virtual Reality for Soft Skills...Maybe


There you are, eating your morning bowl of cereal. You glance at your social media feed. Up pop these images:


Screen captures of infographics and images made from the 2020 PwC VR for Soft Skills Training Report. In the lower right-hand corner, a person is wearing a headset and looking amazed.

Wow. Even the infographic people seem amazed.

But if it seems too good to be true, it probably is, at least where research and marketing intersect. Not to worry! I'm here to tear this apart and see what's inside. Can virtual reality (VR) teach soft skills? Verdict: Maybe.

My Executive Summary:


Heather's summary of the PwC analysis: 2 results are garbage (confidence and focus), 1 result is just okay (faster), 1 result is good (cost-effective), and 1 result was buried: that the learning had no significant difference.


Let's start with the nuts and bolts.

Citation & Report

Mower, Andrea. "The Effectiveness of Virtual Reality Soft Skills Training in the Enterprise: A Study." PricewaterhouseCoopers, 2020. Online. Accessed 21 Aug 2021. Available: https://www.pwc.com/us/vlearning

Despite that academic-y looking citation, the writing professes itself to be a report, not research. That's good, because reports are not held to the same standards of rigor as research.

What you find at that web link, however, is NOT the full report. You are looking at the corporate summary. Remember that technically, PwC is not in the VR business (a plus), so they are not selling you something about VR. They are only telling you how some VR training went at their company.

I had seen the report quoted as "73 pages," but the website is not 73 pages long, so I had to find the actual report. That took a little more digging, but I found it here: https://www.5discovery.com/wp-content/uploads/2020/09/pwc-understanding-the-effectiveness-of-soft-skills-training-in-the-enterprise-a-study.pdf

Experimental Design

The disclosure is right up front (first sentence!) that the study was:

"supported by Oculus for Business and Talespin,"

Good disclosure; it's good practice. While I'm not loving that the study is sponsored by a VR headset manufacturer and a training creator, knowing this lets me view it with the appropriate amount of critical thinking.

Their study started in 2019 and ended in February 2020, so they indicated when it was run.

Note: Every piece of research that touches 2020 and beyond should clearly mention WHEN the study was run, because the COVID pandemic is impacting every part of our lives. COVID impacts "reports" too, and we need to know whether these are 'at-home-stressed-but-sent-a-headset' users.

They had a hypothesis: "Our hypothesis was that training using VR is more effective in achieving learning outcomes than traditional training methods (classroom or non-VR digital experiences)."

Remember that a hypothesis in experiments is good. Hypotheses guide us to our data and results. Bias in experiments is bad. Bias makes us ignore our data and results.

And PwC defined “more effective”  with:

  • Employee satisfaction
  • Learner flexibility
  • Comfortable learning environment
  • Improved attention
  • Higher information retention
  • Confidence building

That definition of "more effective" is a little murky. Usually time plays a very definitive role in "effective" measurements, for example: widgets produced over time. Here, time is not actually mentioned, and yet time is prominent in the infographic stats later. Hmm...

In their experimental design, PwC appears to think that they have made comparable training: 

  • classroom,
  • e-learn,
  • v-learn (VR).

Oo, bust here. While I tip my hat to the innovative thought process expressed here, taking advantage of what VR can offer...

“The classroom and e-learn course experiences were linear: A video was shown, the learners asked some questions, then the next scenario was presented.”

“However, we determined this linear approach would not leverage any advantages of the VR modality. We hypothesized that placing the learner directly in the scenarios covered in the curriculum and giving them the ability to act as they might in real life would be more rewarding for them.” (p. 16)

... but, ouch, non-comparable methods!! If you literally taught the information differently, you cannot compare the methods, and thus you cannot compare the results. The key to making a good design that includes VR (and I'm giving you the $64,000 answer here for anyone designing "compare" research with VR) is to put it up against something very, very cognitively similar. Right now, that technology is 360 video with branching decisions built in. And add haptic bodysuits & controllers. But that convo is for another day.

Now, to parse out the difference between 360 video and VR, you'll need thousands of users (I'll sketch that math right below). Not many investors right now have the willingness to spend thousands of dollars in equipment and time to collect that much data. And even when you get there, you probably won't find much of a difference. Why? Well, take a look at what you designed. If you make a cognitively similar experience and run humans through it, it actually makes sense that no significant difference in the data will arise. Remember, at this point I'm talking about learning outcomes ONLY, no other characteristic.

Also, go back and look at the hypothesis. They already thought that the learning outcomes would be the same (READ: same scores on tests) but that they could achieve those learning outcomes "more effectively." That's interesting. I wonder how much of this report was written in hindsight and how much was written before the study started. (This is why you write your hypothesis first, before your study.) That hypothesis is now possibly showing a bias... did they *think* it was going to turn out... the way it did?
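About those "thousands of users": here's a minimal power-calculation sketch in Python, under MY assumptions (nothing here is from the report). If the true difference between two cognitively similar modalities is a small effect (I'm assuming a Cohen's d of 0.1), a standard sample-size calculation says you need on the order of 1,600 learners per group:

```python
# A minimal power-calculation sketch (my assumptions, not PwC's data):
# how many learners per group are needed to detect a SMALL difference
# between two cognitively similar modalities (e.g., 360 video vs. VR)?
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.1,  # assumed small effect (Cohen's d); a guess, not a measurement
    alpha=0.05,       # conventional false-positive rate
    power=0.8,        # conventional 80% chance of detecting a real effect
)
print(f"~{n_per_group:.0f} learners per group")  # roughly 1,571 per group
```

That's over 3,000 learners for a simple two-group comparison, which is exactly why almost nobody funds that study.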

Number of participants: 1,600 possible. Good. But they never disclose how many learners they actually had in each group (their Ns). Therefore, I'm going to call this a strong negative, because they could have included that number (I don't see a business reason to conceal it).

Experiences: 5-7 minutes long

Did use Oculus Quest (ahem)

Did use Oculus for Business for remote device management

Honorable mention in the report: Using the phrase “not the most gratifying” when talking about tagging and inventorying what must have been more than 300 pieces of equipment.

One more small note: The report does a nice job explaining how they decided to buy 100 headsets and what the cost and time of developing the v-learning training were. It is outside my scope to analyze that, but I recommend it as a good read. Remember that if the cost of v-learn is dropping, it becomes a better and better choice over time. They do those calculations and find that it becomes the better choice at 3,000 users/learners.
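To make that break-even logic concrete, here's a back-of-the-envelope sketch with MADE-UP numbers (the report has the real figures; mine are only for illustration). VR front-loads its cost into development and hardware and then gets cheap per learner, while classroom costs scale with every learner:

```python
# Break-even sketch with HYPOTHETICAL costs (illustration only; the
# report's actual figures differ). VR = big fixed cost, small per-seat
# cost. Classroom = small fixed cost, big per-seat cost.
def total_cost(fixed: float, per_learner: float, n: int) -> float:
    """Total cost of training n learners."""
    return fixed + per_learner * n

VR_FIXED, VR_SEAT = 150_000, 10       # dev + 100 headsets; cheap per seat
CLASS_FIXED, CLASS_SEAT = 5_000, 60   # facilitators, travel, seat time

n = 1
while total_cost(VR_FIXED, VR_SEAT, n) > total_cost(CLASS_FIXED, CLASS_SEAT, n):
    n += 1
print(f"With these made-up numbers, VR becomes cheaper at {n} learners")
# -> 2900, the same ballpark as the report's 3,000-learner figure
```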

Let’s see what the data shows:

Screen capture of the 4 key metrics that were made into the infographic, claiming that VR learners are: 4x faster to train than classroom learners, 275% more confident than classroom learners, 3.75x more emotionally connected than classroom learners, and 4x more focused than their peers.

4x faster to train than in the classroom

I might have to cede this one right off the bat, because VR does provide a 1:1 experience that most classrooms cannot beat. How many classrooms can provide, minute for minute, the same 1:1 attention from teacher to student? Erps. Few.

Fly in the ointment? VR can cost a great deal of money for development and for the equipment. READ: the setup.

Counter to that? VR can be done for very little money, and because VR training can be replicated a billion times while a 1:1 teacher cannot be replicated, VR wins the day on this claim. READ: it CAN save time, but only after development is done.

I still don't like comparing "classroom" to VR; it's such an unfair comparison.

 “What took two hours to learn in the classroom could possibly be learned in only 30 minutes using VR. When you account for extra time needed for first-time learners to review, be fitted for and be taught to use the VR headset, V-learners still complete training three times faster than classroom learners. And that figure only accounts for the time actually spent in the classroom, not the additional time required to travel to the classroom itself.”  

In the report, they shared numbers in minutes. 

Classroom: 2 hours (Watch that crossfire, boys! They didn't express this as 120 minutes. Heather gives PwC a strong look for that. Don't be mean. Or get a better editor.)

E-learn: 45 minutes

V-learn: 29 minutes

29 * 4 = 116

116 is close to 120. Therefore, saying V-learn is 4 times faster is accurate.

275% more confident to apply skills learned after training

This is a poor item to measure when we are focusing on learning outcomes. The Dunning-Kruger effect tells us that those who are worst at something are also the least able to accurately self-assess it.

They make an argument that, particularly with soft skills, confidence AFTER training would help learners implement those skills in the workplace. I liked one design choice inside the v-learn module: learners had to say their lines out loud in the simulation. Nice touch!

But alas, this is all future prognostication and not actual data. We can hope for something but that doesn’t mean that our hope leads to actual results.

Screen capture of a graph from the report showing all numbers higher than 100% for confidence, but we can't tell exactly what the percentages are higher than.


166% and 275% of what?  Where is the 100% in this diagram? Said another way, what is the baseline? Zero?

I still can’t find this in the report.
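Here's why the missing baseline matters, as a quick sketch (the 20-point baseline score is HYPOTHETICAL; the report never states one). "275% more confident" supports at least two readings, and they are wildly different claims:

```python
# Why a missing baseline matters (HYPOTHETICAL baseline of 20 on some
# unstated confidence scale; the report never gives one).
classroom_confidence = 20  # made-up baseline score

reading_1 = classroom_confidence * 2.75        # "275% OF classroom"
reading_2 = classroom_confidence * (1 + 2.75)  # "275% MORE THAN classroom"

print(reading_1, reading_2)  # 55.0 vs. 75.0 -- very different claims
```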

3.75x more emotionally connected to the content than classroom learners

I'm not going to pick this one apart much, because the errors should be apparent. VR often presents training in first person, meaning the user looking through the headset is the protagonist of the adventure. Therefore, a story happening personally to the user creates more emotional connection than the same story told in a classroom. This comes from the bucket labelled "obvious" and is frivolous data. It also blatantly shows what's wrong with using non-comparable instructional designs. This is an apples-to-oranges comparison.

FURTHER, newer data is showing that the type of empathy counts... not just general empathy or emotional connection.

4x more focused than their e-learning peers

There are some great accessibility and autism-spectrum studies coming out (really, it's "watch this space" stuff) showing that VR can be more cognitively overwhelming for some learners. Hence, every measurement that says learners have more brain engagement could actually be measuring learners who are overwhelmed (yeah, obvious bucket again).

But let’s look at what was actually said here:


Screen capture of a table from the report, focused on the questions: "How many times were you multitasking or distracted during this experience?" and "How many minutes do you estimate it took to get back on task?"

"With VR learning, users are significantly less distracted. In a VR headset, simulations and immersive experiences command the individual’s vision and attention. There are no interruptions and no options to multitask. In our study, VR-trained employees were up to four times more focused during training than their e-learning peers and 1.5 times more focused than their classroom colleagues. When learners are immersed in a VR experience, they tend to get more out of the training and have better outcomes."

It doesn't say how "more focused" was measured. I wonder why? Remember, the clue to look closer is when the presentation does a 'hand wave' on something. It's the equivalent of misdirection by a magician: if you look over here, you are not looking over there. The text says "There are no interruptions and no options to multitask." Yes, the Oculus Quest headset design doesn't allow a user to look around the room or to (reasonably) be interrupted by messages from outside the experience. But does that mean interruptions didn't happen? What about a "this is pulling my hair" message? Or "it's fuzzy," or even "I'm gonna puke"? Those are interruptions caused from the inside out. Were those counted?

In the report, it turns out this was self-reported:

Screen capture of the Improved Attention section of the report (p. 45): "As identified in our key findings, VR-trained learners were up to four times less distracted during training than their e-learning peers and 1.5 times less distracted than their classroom peers. This was self-reported, and the team did not use any passive technology to observe this attribute. Based on experience and months of observation, the team actually felt the self-reported statistic was lower than what we observed. However, the statistic was significant and should result in higher learner comprehension and retention."

At this point in the paper (p. 45), my hackles are going up, because you cannot collect data (self-reported), acknowledge its weakness (we thought the real number was higher), and then lean on it anyway ("the statistic was significant and should result"??). What? PwC? My doubt meter is red-lining at this stage. Insert Mr. Potato Head angry eyes.

5. VR learning can be more cost-effective at scale

Interestingly, this one didn't make an infographic!! What?? What a shame, because right here I AGREE with this paper!!! ARGH!

YES, YES, YES. VR is more expensive to make once (but that cost is dropping), it can be replicated (which is where you win), and it is also showing impressive results in FLEXIBILITY (meaning you can change up the conditions quickly). VR can be made cheaply, which also means made basically, but that might not be a bad thing. Get over the hump of the novelty effect, design a basic experience that is accessible to many learners, and you are in an effective horse race with other forms of learning. What I'm trying to say is that basic and/or cheap isn't necessarily bad in VR. It should not be thrown out, because this is where VR is going to eventually win.

They are using their own costs, and admittedly, the fact that it's a large study hurts them here, because that means more money was outlaid up front to get the training started.

Revisiting: how many actual participants??? They said 1,600 were eligible, but they never said how many took part, AND then they said that they "offered" v-learn to the classroom and e-learn participants as an option (and those learners went on to answer a smile sheet about how much they liked it).

One Result Buried

OMG, look at what they tucked into the back of the paper; I didn't even know it was there!!! Page 44.

Screen capture from the Higher Information Retention section of the paper: "We quickly discovered retention scores were inconclusive, as the delta between pre- and post-assessments in each modality was not significant. Indeed, the assessment team underestimated the previous knowledge experience our test population had on the diversity and inclusion topic. In hindsight, we should have selected a topic that was not already in our curriculum or selected a different test group that had not already been immersed in similar training."


No SIGNIFICANT DIFFERENCE in RETENTION!!!

I'm currently on the floor laughing, because that is what research data has led us to predict all along. No difference. They're blaming the content in this case. OK, picking myself up off the floor and dusting myself off. Insert droll look here. It's not the content, yo.
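For the curious, "the delta between pre- and post-assessments was not significant" boils down to a test like the one below (the scores are HYPOTHETICAL; PwC did not publish theirs). Learners who already know the topic score high both times, so the delta is tiny:

```python
# A paired-test sketch of "pre/post delta was not significant"
# (HYPOTHETICAL scores; PwC did not publish their data).
from scipy import stats

pre  = [82, 88, 79, 91, 85, 90, 84, 87]  # hypothetical pre-test scores
post = [84, 87, 81, 90, 86, 91, 83, 88]  # hypothetical post-test scores

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p ≈ 0.27, well above 0.05
```

When a test group starts near the ceiling, as PwC admits theirs did, no modality has room to show a retention advantage.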

What PwC is missing right here is this: "no significant difference" results eventually mean that the cheaper option to learn the same material will be market dominant. Translation: make VR easier to MAKE, and VR will be bought. Cha-ching.

Final verdict: Mixed Bag

• 2 metrics (confidence, focus) = garbage

• 1 metric (faster) = just okay

• 1 metric (cost-effective) = good

• 1 result (no significant difference) = buried

Conclusions

• Enough doubt to worry about other PwC infographics

• Overall, not bad for a “report”

Not research.

Can VR teach soft skills? It appears so, yes, at least as well as classroom and e-learn options. (No comment on the quality of the instruction or the assessment.) Right now, v-learn is an expensive choice. But the price of development IS dropping. I have high hopes.

Title image for article: Analysis of PwC Virtual Reality (VR) Soft Skills Training Study 2020. Verdict: Mixed Bag. Image of shopping cart with various brown boxes.


This was my 3rd planned article on analyzing research.

1st article: Study Does NOT Show That Instructional Designers Drive Better Student Outcomes

2nd article: "What Happened When Student Brains -- On VR -- Were Scanned" Is Analyzed

This is my 2nd article of three specifically about VR research.

1st article: "What Happened When Student Brains -- On VR -- Were Scanned" Is Analyzed

More to come in the next few days because the next article is already written. That will end, hopefully (!), my series on poor VR & learning research.

#Research #VRResearch #VRReport #PwC #SoftSkills #Faster #Confidence #Emotion #Focus #NoSignificantDifference #OculusQuest #elearn #vlearn #Talespin #NotResearch


(This is a copy of the same article that I posted to LinkedIn on August 25, 2021)

https://www.linkedin.com/pulse/virtual-reality-soft-skillsmaybe-heather-dodds/