The arrival of a pair of BT-300 Smart Glasses gave me an opportunity to take another daft photo of myself wearing a wearable. My eyes don’t really look like that – that’s just where the (presumably) semi-silvered mirror surface is for each eye. Projectors at the sides generate images that are combined with the real light coming in to form a composite AR image.
Following on from an earlier post on Enhanced Reality, it occurred to me that separating the stereo cameras (and microphones) from the ER headset creates a new way of achieving telepresent remote participation – Telepresent Enhanced Reality or TER. I was actually trying out a simpler version a while back when I had a camera on a pan/tilt platform slaved to an Oculus DK2 VR headset. A real TER setup would require stereo cameras and multiple microphones on a pan/tilt/roll mount. The user would have a VR headset and the pose of the pan/tilt/roll mount would mirror movements of the user’s head.
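To make the head-mirroring step concrete, here is a minimal sketch of mapping the headset's orientation onto pan/tilt/roll servo angles. It assumes the headset reports its pose as a unit quaternion (w, x, y, z) and that the mount accepts Euler angles; the function name and axis conventions are my own illustration, not from any particular SDK:

```python
import math

def quaternion_to_pan_tilt_roll(w, x, y, z):
    """Convert a headset orientation quaternion to pan/tilt/roll angles
    (radians) for the camera mount, using a ZYX Euler decomposition."""
    pan = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    # Clamp the asin argument to guard against floating-point noise
    # when the tilt approaches +/-90 degrees.
    tilt = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    return pan, tilt, roll
```

In a real TER rig this would run in a loop: read the headset pose, convert it, and stream the three angles to the mount's servo controller.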
An interesting use would be for conferences where some of the participants are in a conventional conference room but wearing AR/MR/ER headsets (e.g. HoloLens). Positions in the room for the remote participants would each have a stereo camera/microphone remote. The local participants would obviously be able to see each other but, instead of the camera/microphone hardware, they would see avatars representing the remote users. These avatars could be as sophisticated or as simple as desired. Remote participants would see (via the stereo cameras) the conference room and local participants, and would also see the remote participant avatars which replace the physical camera/microphone hardware at those locations. Alternatively, these could be suitably equipped telepresence robots (or even cameras mounted on small drones), which would also allow movement around the room. Really, anything that has the essential hardware (stereo cameras, microphones, pan/tilt/roll capability) could be used.
Given that everyone has AR/MR capability in this setup, something like a conventional projected presentation could still be done except that the whole thing would be virtual – a virtual screen would be placed on a suitable wall and everyone could look at it. Interaction could be with simulated laser pointers and the like. Equally, every position could have its own simulated monitor that displays the presentation. Virtual objects visible to everyone could be placed on the table (or somewhere in the room) for discussion, annotation or modification.
Obviously everyone could be remote and use a VR headset and everything could then be virtual with no need for hardware. However, the scheme described preserves some of the advantages of real meetings while at the same time allowing remote participants to feel like they are really there too.
I’ve been working through some of the HoloLens tutorials and thought that the Holograms 230 tutorial was pretty amusing. The screen capture shows a solar system being projected in space. The spatial mapping mesh can be seen conforming to objects in view. The poster just to the left of the sun isn’t real – it’s one of the things that you can place on a wall to demonstrate this capability.
Collaboration using AR is a fascinating area with many potential applications. The HoloToolkit is a very handy resource in general and includes the HoloToolkit.Sharing library to assist with collaboration. The HoloToolkit-Unity actually contains built versions of the Server, SessionManager and Profiler but it seemed like a good idea to build from scratch.
There are a few pre-requisites:
- Windows SDK 10.0.10240
- Windows SDK 10.0.10586
- Common Tools for Visual C++
- Windows 8.1 SDK and Universal CRT SDK
- Java 8 SDK
An easy way to check for the Windows SDKs is to run the BuildAll.bat script, which will exit with an error if something is missing. Opening the solution file for the component that failed in VS2015 will then prompt VS2015 to install the missing pieces. The Java SDK needs to be installed manually and requires two environment variables: JAVA_BIN, pointing to the JDK bin directory, and JAVA_INCLUDE, pointing to the JDK include directory. With those in place, the BuildAll.bat script should complete successfully.
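For reference, setting the two Java variables from a command prompt might look like this (the JDK install path shown is a placeholder – adjust it to match your machine):

```shell
:: Hypothetical JDK install path -- change to wherever your JDK 8 lives
setx JAVA_BIN "C:\Program Files\Java\jdk1.8.0_112\bin"
setx JAVA_INCLUDE "C:\Program Files\Java\jdk1.8.0_112\include"

:: setx only affects new processes, so open a fresh command prompt, then:
BuildAll.bat
```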
The Server is run using SharingService.exe, and installing it as a Windows service requires administrator permission – for example, by opening a command window in administrator mode and running the install command from there. It's actually useful to run the server with the -local flag (as a command line program), as it's then easy to see status and error messages. The SessionManager displays the current server state, including connected clients.
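The two ways of running the server might look like this from a command prompt (the -local flag is from the post above; the install flag name is an assumption – check the HoloToolkit.Sharing docs or the executable's help output to confirm):

```shell
:: From a command prompt opened as administrator (flag name assumed --
:: verify against the HoloToolkit.Sharing documentation):
SharingService.exe -install

:: Or, for debugging, run it as a plain console program so status and
:: error messages appear directly in the window:
SharingService.exe -local
```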
I was intrigued by the Vrvana VR + stereo camera pass-through headset. This is a practical version of something that I have been thinking about for a while. HoloLens does a fantastic job of AR/MR but it does have a limited field of view, something that may be inevitable with the waveguide-type design. The other limitation is that it can only overlay on reality, not selectively replace it (at least not in all lighting conditions).
Enhanced Reality (ER), by using stereo cameras whose feeds are displayed as in a conventional VR headset, can solve the field of view problem and allow any part of the field to be enhanced, replaced or overlaid as required. Take the case of using the headset for driving a car (although people will probably not be doing that much longer). For a start, the car would no longer need a dashboard or any instrumentation – everything would be virtual, including big touchscreen displays. Looking out of the windscreen, the image seen could be augmented with data from radar, IR cameras or anything else that enhances the experience. Objects of interest could be enhanced in brightness, perhaps. It could incorporate Google Translate style street sign translation and replacement. Obviously any other useful heads-up data could be displayed, such as navigation, speed, temperature etc.
Complex airplane cockpits could potentially be a thing of the past also. Most of the cockpit consists of devices that give information to the pilots or are simple switches and levers. All of these could be virtual. Maybe you keep a joystick and a couple of rudder pedals but that would be it – it’d just be two chairs in a room :-). Meanwhile, the view out of the windscreen could be enhanced in the same way as described earlier.
I am sure there are many applications where the ability to enhance, modify and replace any part of the field of view would be of value. An important aspect of a true ER headset is that even very bright features in the field can be replaced, something that is difficult to do with AR headsets.
I really don’t know how this guy got into my office. Spent the last couple of hours trying to grab the HoloLens back from people in the house playing Fragments. Of course now everyone wants me to buy a HoloLens. The quality is actually much better than the screen shot suggests – the rats running around on the office floor were especially amusing. And who are these people standing around? I used the HoloLens companion app to get these screen captures – it allows others to see what the wearer is seeing.
The nice thing is that nobody has complained of any motion sickness effects. I am usually the most sensitive and can only last about five minutes with an Oculus DK2, but even I survived just fine. Being untethered, with no wires or other boxes to worry about, is a major plus. Just put it on and play.