SHAPE AssetTags: a different way to create virtual augmentations for XR spaces

The earlier work with UWB tags generated an idea for something I am calling a SHAPE AssetTag. Essentially, this is a tag that is associated with a virtual SHAPE augmentation. The augmentation follows the position and orientation of the tag, making for a very simple way to implement augmented spaces. If engineered properly, it could be an extremely simple piece of hardware: essentially just the UWB radio, a MEMS IMU, and a battery. Instead of WiFi, as in this prototype, pose updates could be sent over the UWB infrastructure itself to keep things really simple. Ideally, these tags would be extremely cheap and could be placed anywhere in a space as an easy way of adding augmentations. The augmentations can be proxy objects (every aspect of a proxy object augmentation can be modified by remote servers) and really can be as simple or complex as desired.

Note that the SHAPE AssetTag doesn’t need to contain the actual asset data (although it could if desired). All it needs to do is provide the URL of a repository where the asset (either a Unity asset bundle or a glTF blob) can be found. The asset is then streamed dynamically when it needs to be instantiated. The tag also provides information about where to find function servers in the case of a proxy object. The SHAPE device app (in this case an iOS app running on an iPad Pro) doesn’t need to know anything about SHAPE AssetTags – they just inject normal-looking (but transient) augmentation updates into the SHAPE system so that augmentations magically appear. Obviously, this kind of flexibility could easily be abused and, in real life, a proper security strategy would need to be implemented in most cases. For development, though, it’s nice for things to just work!
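To make that concrete, here is a minimal sketch of the kind of transient augmentation update an AssetTag might inject. The field names, IDs, and URLs are illustrative assumptions rather than the actual SHAPE message format:

```python
import json
import time

# Hypothetical example of a transient augmentation update injected by a
# SHAPE AssetTag. All field names and URLs here are illustrative
# assumptions, not the real SHAPE wire format.
update = {
    "id": "assettag-0017",                         # unique tag identifier
    "assetURL": "https://assets.example.com/cesiumman.gltf",
    "functionServers": ["https://functions.example.com/cesiumman"],
    "pose": {
        "position": [1.25, 0.0, 3.40],             # meters, space origin
        "orientation": [0.0, 0.0, 0.0, 1.0],       # quaternion (x, y, z, w)
    },
    "transient": True,                             # vanishes if updates stop
    "timestamp": time.time(),
}

print(json.dumps(update, indent=2))
```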

One application that I like is a shared space where people bring along their virtual creations in the form of SHAPE AssetTags and just place them in the SHAPE-enhanced space, so that any user in the space with an XR device can see them.

Another idea is that items in stores could have SHAPE AssetTags attached to them (like security tags today) so that looking at an item with an XR device would perhaps demonstrate some kind of feature. Manufacturers could supply the asset and function servers, freeing the retail store from having to implement something for every stocked item. This could of course be done with QR codes, but then the augmentations would not be physically locked to the item. Physical locking enables some very interesting augmentations: the item can be picked up and moved while the augmentation retains the correct physical pose with respect to it.

For now, the hardware is a complete hack with multiple components, but it does prove that the concept is viable. In the photo above, the UWB tag (the white box on the floor under the figure’s right foot) controls the location of the augmentation in the physical space. A Raspberry Pi fitted with an IMU provides orientation information and sends the resulting pose via WiFi to the SHAPE servers. The augmentation is the glTF sample CesiumMan and includes animation data. Here are a couple of videos showing the augmentation tracking the UWB tag around the space and the IMU controlling the augmentation’s orientation.
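For the curious, the Raspberry Pi side of the prototype boils down to a loop like the following sketch. The broker address, topic, and the two read_* helpers are hypothetical stand-ins for the UWB and IMU drivers, not my actual prototype code:

```python
import json
import time

import paho.mqtt.client as mqtt

# Sketch of the Raspberry Pi side: combine the UWB tag's position with the
# IMU's orientation and publish the pose over WiFi. The broker hostname,
# topic, and the two read_* helpers below are hypothetical stand-ins for
# hardware-specific drivers.

def read_uwb_position():
    """Return the tag's (x, y, z) in meters from the UWB system."""
    raise NotImplementedError  # hardware-specific

def read_imu_quaternion():
    """Return the IMU's orientation as an (x, y, z, w) quaternion."""
    raise NotImplementedError  # hardware-specific

client = mqtt.Client()
client.connect("shape-server.local", 1883)

while True:
    pose = {
        "position": read_uwb_position(),
        "orientation": read_imu_quaternion(),
        "timestamp": time.time(),
    }
    client.publish("shape/assettag/0017/pose", json.dumps(pose))
    time.sleep(0.1)  # 10 Hz pose updates
```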

By the way, the software didn’t quite work first time…

So, is there any point to this? I am not sure. There are obviously many ways of doing the same thing without any physical hardware. However, the use of UWB makes it easy to achieve consistent results across multiple platforms with different spatial mapping as it provides absolute physical coordinates. Plus, there’s something fun about throwing a small tag on a surface and watching the augmentation appear!

Indoor position measurement for XR applications using UWB

Obtaining reasonably accurate absolute indoor position measurements for mobile devices has always been tricky, to say the least. Things like ARKit can determine a relative position within a mapped space reasonably well in ideal circumstances, as can structured IR. What would be much more useful for creating complex XR experiences spread over a large physical space in which lots of people and objects might be moving around and occluding things is something that allows the XR application to determine its absolute position in the space. Ultra-wideband (UWB) provides a mechanism for obtaining an absolute position (with respect to a fixed point in the space) over a large area and in challenging environments and could be an ideal partner for XR applications. This is a useful backgrounder on how the technology works. Interestingly, Apple have added some form of UWB support to the iPhone 11. Hopefully future manufacturers of XR headsets, or phones that pair with XR headsets, will include UWB capability so that they can act as tags in an RTLS system.

UWB RTLS (Real Time Location System) technology seems like an ideal fit for the SHAPE project. An important requirement for SHAPE is that the system can locate a user within one of the subspaces that together cover the entire physical space. The use of subspaces allows the system to grow to cover very large physical spaces just by scaling servers. One idea that works to some extent is to use the AR headset or tablet camera to recognize physical features within a subspace, as in this work. However, once again, this only works in ideal situations where the physical environment has not changed since the reference images were taken. And, in a large space that has no features, such as a warehouse, this just won’t work at all.

Using UWB RTLS with a number of anchors spanning the physical space, it should be possible to determine an absolute location in the space and map this to a specific subspace, regardless of other objects moving through the space or how featureless the space is. To try this out, I am using the Decawave MDEK1001 development kit.

The kit includes 12 devices that can be used as anchors, tags, or gateways. Anchors are placed at fixed positions in the space – I have mounted four high up in the corners of my office, for example. Tags represent users moving around in the space. Putting a device into gateway mode allows it to be connected to a Raspberry Pi, which then provides web and MQTT access to the tag position data.
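Reading the position data from the gateway is then straightforward. Here is a minimal sketch using paho-mqtt; the topic and payload layout are as I understand them from the Decawave DRTLS gateway documentation, so check your own gateway’s docs as details may vary between firmware versions:

```python
import json

import paho.mqtt.client as mqtt

# Minimal sketch of reading tag positions from the gateway's MQTT feed.
# Topic and payload field names follow the Decawave DRTLS gateway docs as
# I understand them; firmware versions may differ.

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)
    pos = data.get("position", {})
    print(f"{msg.topic}: x={pos.get('x')} y={pos.get('y')} "
          f"z={pos.get('z')} quality={pos.get('quality')}")

client = mqtt.Client()
client.on_message = on_message
client.connect("raspberrypi.local", 1883)       # the gateway Pi
client.subscribe("dwm/node/+/uplink/location")  # all tags' location uplinks
client.loop_forever()
```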

Setting up is pretty easy using an Android app that auto-discovers devices. The app does have an automatic measurement system that tries to determine the relative positions of the anchors with respect to an origin, but that didn’t seem to work too well for me, so I resorted to physical measurement. Not a big problem and, in any case, the software cannot determine the system’s z offset automatically. Another thing that confused me is that, by default, tags drop to a very slow update rate when they are not moving much, which makes it seem as though the system isn’t working. Changing this to a much higher rate probably increases power usage considerably but certainly helps with testing! Speaking of power, the devices can be powered from USB power supplies or internal rechargeable batteries (which, incidentally, were not included).

Anyway, once everything was set up correctly, the system worked very well, with the Android app displaying the tag locations with respect to the anchors. The next step is to integrate this system into SHAPE so that subspaces can be defined in terms of RTLS coordinates.
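As a first cut, that mapping could be as simple as a set of axis-aligned boxes in the anchor coordinate frame. A hypothetical sketch (the real SHAPE subspace model may well end up richer than this):

```python
# Hypothetical sketch of subspaces defined in RTLS coordinates: each
# subspace is an axis-aligned box in the anchor coordinate frame, and a
# tag position maps to the first box that contains it.

SUBSPACES = {
    "office-east": ((0.0, 0.0, 0.0), (5.0, 4.0, 3.0)),    # (min, max) meters
    "office-west": ((5.0, 0.0, 0.0), (10.0, 4.0, 3.0)),
}

def subspace_for(x, y, z):
    """Return the name of the subspace containing (x, y, z), or None."""
    for name, (lo, hi) in SUBSPACES.items():
        if all(l <= v <= h for v, l, h in zip((x, y, z), lo, hi)):
            return name
    return None

print(subspace_for(6.2, 1.5, 1.2))  # -> "office-west"
```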

An interesting additional feature of UWB is that it also supports data transfers between devices. This could lead to some interesting new concepts for XR experiences…