A History of XR (Cross Reality), Part 6 of 6
Almost there! Only five years to go and then into the future, she writes and points like Doc Brown.
2014 - Future
2015 Google Cardboard
Google (again? They were just in Part 5) sends out Google Cardboard in the New York Times (and other media), expanding the new idea: a cheap VR viewer where you use your smartphone and some special apps. Pluses: mass-market availability, cheap, created buzz. Negatives: absolutely no hands. People want to do more than look; they want to touch.
2016 Emotiv
OK, I will go on record and say I do not like this product. It records the user's brain waves when a certain action is requested (like opening email) and then translates future instances of those brain waves into commands to repeat the learned action. Why don't I like this? I cannot find many instances of telekinesis that have worked so far for humanity. Maybe in a future I can't see, but for now, nope. Just remember, there is no spoon.
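To make the idea concrete, here is a toy sketch (in Python, not Emotiv's SDK or algorithm) of the "train once, recognize later" loop described above: record feature vectors while the user thinks a command, average them into a template, and later match new brain-wave features against the stored templates. Every name, feature, and threshold here is hypothetical and only illustrates the concept.

```python
# Toy illustration of the "train, then recognize" idea behind EEG mental commands.
# NOT Emotiv's SDK; features, names, and thresholds are made up.
import numpy as np

class MentalCommandRecognizer:
    def __init__(self, match_threshold=2.0):
        self.templates = {}              # command name -> mean feature vector
        self.match_threshold = match_threshold

    def train(self, command, feature_samples):
        """Average the feature vectors recorded while the user repeatedly
        'thinks' the command (e.g. 'open_email') and store the template."""
        self.templates[command] = np.mean(feature_samples, axis=0)

    def recognize(self, features):
        """Return the closest trained command, or None if nothing is close."""
        best, best_dist = None, float("inf")
        for command, template in self.templates.items():
            dist = np.linalg.norm(features - template)
            if dist < best_dist:
                best, best_dist = command, dist
        return best if best_dist < self.match_threshold else None

# Fake 8-channel band-power features standing in for real EEG data.
rng = np.random.default_rng(0)
open_email_samples = rng.normal(loc=1.0, scale=0.2, size=(20, 8))

recognizer = MentalCommandRecognizer()
recognizer.train("open_email", open_email_samples)

new_sample = rng.normal(loc=1.0, scale=0.2, size=8)
print(recognizer.recognize(new_sample))  # likely "open_email"
```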
2016 Mobile World Congress - Facebook gets into the VR action.
It is interesting to note that when very large corporations step into a technology, all of our heads should turn.
Facebook's virtual world Horizon (once called Spaces) just opened in October 2019. So far, Horizon doesn't seem to be getting a warm reception. This invention represents what I call an Inflection Point in technology, where we double back on ourselves and create a conundrum. When asked whether they would use a virtual world to 'be' (embodiment) with their friends, most regular Facebook users I see shy away from the concept. These are the same users who lived totally without Facebook...uh...before Facebook. As such, they spent time synchronously and physically with people before. Being with people should not, by itself, weird out this population. So why the hesitation about this concept now? Do we 'friend' people and do things within Facebook that we would not do in person with them? #Research needed here.
2019 January - Apple patenting new gesture capture devices?
Maybe? As I've said, though, there's no need for gloves or devices attached to people. Ditch that idea. New gestures to mean new things, though? Yes, I'm totally down for that, and that's a partial goal of the Immersive Learning Research Network: to determine, via research, the standards for immersive environments (standardized buttons, meanings, customs), much the same way websites are standardized for user understanding today (home pages, hamburger menus, clickable buttons, underlined links, etc.).
2019 October - AWE conference in Munich #AWE2019
This isn't just an image of a headset with someone pointing. This headset is detecting his hands. See? Already onto the next step. By the way, thank you to this person, who joined my LinkedIn network while at the conference and invited me to follow it on Twitter. So cool.
2019 - Magic Leap
I don’t love this product. Why? See my Future of XR Headsets article. Magic Leap can be redeemed in my opinion though. I will wait this out.
2019 - Microsoft HoloLens 2
Excellent product. Why? Because the person looking at the user can see their eyes. Besides what I wrote about in my Future of XR Headsets article, Microsoft has taken a page from their own (MS Office) playbook for this launch. They are going directly to the business market. As such, there will be back-pressure into education to prepare learners for the workplace. This product has many of the characteristics of future success.
2019 - example of room-based VR
In this case, the balls on the headset help the room-scale tracking system know where the user is looking. Plus here? Accessibility. Also, you could get more people into the same experience. So this is VR for more than one person.
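For the curious, here is a minimal sketch of the idea behind those tracking balls: if the room's cameras know where two markers on the headset are, the line between them tells you roughly which way the user is facing. Real motion-capture systems track many markers and solve a full 6-DoF pose; the marker names and positions below are made up for illustration.

```python
# Minimal sketch: estimate where the user is looking from two tracked markers.
# Positions are hypothetical room coordinates (meters) a tracking camera might report.
import numpy as np

def gaze_direction(front_marker, back_marker):
    """Unit vector from the back marker toward the front marker,
    i.e. roughly the direction the headset is facing."""
    v = np.asarray(front_marker, dtype=float) - np.asarray(back_marker, dtype=float)
    return v / np.linalg.norm(v)

back = (1.00, 1.70, 2.00)    # marker near the back of the headset
front = (1.00, 1.68, 1.80)   # marker near the visor
print(gaze_direction(front, back))  # points mostly along -z: the user faces that wall
```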
2019 - Microsoft demos a hologram that gives part of a keynote speech in Japanese, even though the speaker does not speak Japanese at all.
Very much #thefutureistoday.
2019 - still in development: Omega Ophthalmics' in-eye AR.
Anatomically, it seems totally possible. I am not struggling with batteries or display resolution; those challenges can be overcome. I am struggling with how it would work when the eyes are shut. How do we get information *now* when our eyes are shut? Answer: sound or vibration. Also, daylight wakes us up. It will be interesting to see how they incorporate those features.
2019 - Disney’s The Void - example of location-based VR.
Notice how this depiction shows users both wearing and not wearing headsets in the same moment. The user already wearing the headset is happy. The users not wearing headsets appear to be invited to join the battle/be happy. I give this a solid "B" for imagery!
This seems to be a conundrum in VR right now: how to *show* the VR experience without making it look like a bunch of tethered people wearing bulky headsets. We'll overcome this challenge as we get more comfortable, as a culture, with "yes, you have to put the headset on to experience this." How? In the future, they won't look like bulky headsets.
Negatives: you must go to a location to experience this, and the experience is not cheap. Positives: an increase in quality content (Star Wars!) and you can experience this in a group (more than one person at a time).
2019 - Large expansion in AR - examples here are in art and language.
2019 Google Pixel 4 phone (sorry for the low-quality image; I captured this from Google's video).
Short-range radar to detect hand and body movements. Example? Your phone wakes up *as you reach for it*. Yes.
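A hedged sketch of that "wake as you reach" behavior: if the radar's nearest-object readings show something steadily approaching and now close, wake the screen. This is not Google's Soli/Motion Sense API; the readings, thresholds, and function name are hypothetical.

```python
# Hypothetical "wake on approach" logic over short-range radar distance readings.
def should_wake(distances_cm, wake_within_cm=15.0, approach_cm=5.0):
    """distances_cm: recent nearest-object distances in cm, oldest first.
    Wake only if something has moved meaningfully closer AND is now nearby."""
    if len(distances_cm) < 2:
        return False
    approaching = distances_cm[-1] < distances_cm[0] - approach_cm
    close_enough = distances_cm[-1] <= wake_within_cm
    return approaching and close_enough

print(should_wake([40.0, 30.0, 22.0, 12.0]))  # True: a hand moving in, now close
print(should_wake([40.0, 41.0, 39.0, 40.0]))  # False: nothing approaching
```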
Future (any images are mock-ups or rumors):
Samsung flexible phones, wearable phones.
Cicret bracelet - waterproof! But possibly not real, even as a future device.
Flexible screens, or screen-keyboard combos, or AR-displayed screens and keyboards. Yes!
Apple AR glasses - what a Dickens-like adventure 2019 has been for these rumors. First we’ll have Apple AR glasses, then we won’t. Then we will. Then we won’t. Where is my seatbelt?
Again, these images are not from Apple. They are mock-ups you can find on the Internet. I just picked these two depictions because, to me, they are very Apple-y. Also, take note of how these AR headsets look like...well...glasses. Bingo. That's your entry into market success.
Spectacles - a nice foray into design by going retro. Unless you knew there were two cameras in those glasses, would you have noticed them within a few seconds? Also, notice that now no one is getting mad at cameras looking at you…
Smell-o-vision!
OK, I am down for smell-o-vision (hey, #Disney's Soarin'!), but I meet so many people who seem disgusted by this idea. Two things to note:
1) Smell will be in addition to vision, I am sure. So it will be a combo idea (remember the vacuum robot).
2) Smell-o-vision, as I understand it, is being used in nutrition research. Said another way: can we get people to eat healthier *if* they believe (see and smell) that they are eating one food (desired, but unhealthy) when they are really eating another food (undesired, but healthy)?
Here is an Inflection Point: if the body believes that what it sees (and smells) is real, how will the body react? Will the body react as if unhealthy food was eaten? 'Send out the cholesterol-absorbing body chemicals! Get ready to digest this for hours!' Or will the body react as if healthy food was eaten? 'Insulin levels remain even, body chemistry is fine.' I don't know! Will someone study that? #Research!
Last example: I really admire this depiction of Microsoft Hololens 2 from February 2019.
I'm not entirely sure that the phrase "instinctual interaction" is a winner, but I'm willing to keep betting on this product. P.S. Dear Microsoft, I could help you come up with a better phrase. Thanks, K. Bye.
So my predictions for the future of XR *based* on studying the past:
- Text will be a continuing necessity. Put it everywhere and in everything you can. Text has over a 2,000-year history in human interfaces; it is a winner. We need 3D fonts that can work "floating" over the top of a variety of light and dark backgrounds, and we need them yesterday (see the sketch after this list). #3DFonts
- User-customized ways of interacting with large amounts of information.
- Tech that works without touching you.
- No need for gloves.
- First forms of what works will likely be two technologies squished together.
- Seeing your user as a human will be more important than the tech itself.
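Here is the sketch promised in the #3DFonts item above: one simple way floating 3D text could stay readable over both light and dark backgrounds is to sample the backdrop color behind the label and pick black or white text using the WCAG relative-luminance formula. The rendering-engine integration is omitted; only the color choice is shown, and the function names are my own.

```python
# Choose black or white label color based on the luminance of the backdrop
# behind a floating 3D label (WCAG relative-luminance formula).
def relative_luminance(rgb):
    """rgb: (r, g, b) in 0..1, sRGB."""
    def channel(c):
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def text_color_for(background_rgb):
    """Black text over light backdrops, white text over dark ones."""
    return (0.0, 0.0, 0.0) if relative_luminance(background_rgb) > 0.5 else (1.0, 1.0, 1.0)

print(text_color_for((0.9, 0.9, 0.95)))  # bright wall  -> black text
print(text_color_for((0.1, 0.1, 0.2)))   # dark room    -> white text
```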
Interesting that phones and sound are almost non-existent in my XR research. I didn't avoid them, but they also didn't really show up as necessary future players. I'll keep cogitating on that.
Keep an eye on science fiction. If you noticed, my sci-fi sources dried up about 5 years ago, and everything I shared after that was a technological innovation. Which sci-fi sources are predicting the near and far future? Battlestar Galactica? The Handmaid's Tale? The Marvel Universe?
Amara's Law: We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.
Going forward, I am very interested to trace the justice concept (remember Plato?) through this technology. It is significant that we can criticize much of the current XR environment as being made by young, white males in Silicon Valley. We need diversity and inclusion in the workforce making XR.
What comes next? I can’t wait to find out.
Did you miss any of the prior history of XR and are curious? Have at it!
Have any time points that you think I should include? Tell us as a comment!
#Reality #CrossReality #MixedReality #VirtualReality #AugmentedReality #VirtualWorlds #Design #Transmedia #XR #VR #AR #ARVRinEdu #EdTech #Innovation #GoogleCardboard #Emotiv #FacebookVR #FacebookVW #FacebookHorizon #AppleGestures #iLRN #AWE #MagicLeap #MicrosoftHoloLens2 #MicrosoftHologram #OmegaOphthalmics #DisneyTheVoid #Samsung2020 #GooglePixel4 #AppleAR #Spectacles #SmellOVision #FutureTrends #3DText #YourUserIsHuman
This article originally appeared on LinkedIn on December 2, 2019.
https://www.linkedin.com/pulse/history-xr-cross-reality-part-6-heather-dodds-ph-d-