Visible Light Communication is an ideal technology to enable Augmented Reality applications for indoor environments.
Today there are many examples of augmented reality (AR), many of them running as smart phone apps. In most practical implementations, augmented reality is simply the overlay of useful information onto real-world data, often visualised as a smart phone camera image with added notes or images.
Typically an augmented reality smart phone application might use the GPS location and the digital compass for positioning and orientation. Unfortunately, these sensors suffer severe errors indoors or may simply not work at all in this environment. VLC, however, can function reliably indoors and provide the necessary inputs, such as location and orientation data, for indoor AR apps.
Before looking into this too deeply, let me present three different examples of VLC implementations that could be used for augmented reality applications.
Casio PicapiCamera
Casio’s PicapiCamera iPhone app claims to be the world’s first app to use visible light communication technology. I first mentioned this in a blog post in January, and a little more information has emerged since then.
Casio are using flashing dots (red, green, blue) from a display, or even flashing coloured lights (e.g. on a Christmas tree), to convey small amounts of data (8 bits) which are received via the camera and then translated into codes relating to specific information content. Because the information rate is so low, they use a look-up table that translates the code into a longer pre-stored message, image or URL. In order to reuse the small number of unique codes, Casio identify the general location of the smart phone first, enabling them to reuse the codes with different messages in different locations. The claim is that this is a great alternative to QR codes, and clearly this is a form of augmented reality where information can be added to the image seen by the camera. While more work is required, I like the fact that Casio have just gone out and released the app before the use case is fully developed – maybe the consumer will figure out what it can be used for!
MIT Media Labs NewsFlash
The second example, NewsFlash from MIT Media Labs, uses display technology, for example an iPad display.
In this example the display is used to convey digital information encoded as a colour sequence within an area of the image, which they state is imperceptible to the human eye. This coded sequence is received via the smart phone camera and decoded back to the original digital data, which might carry a tiny URL (a compact web page address) that links to rich media content.
This is suggested to be the equivalent of an invisible QR code.
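The general principle behind such imperceptible colour coding can be sketched as follows. This is not MIT's actual scheme, just a minimal illustration of the idea: nudge a region's colour channel up or down per display frame by a delta too small for the eye to notice, and recover the bits by comparing the camera's reading against the unmodulated baseline.

```python
# Illustrative sketch (not NewsFlash's actual encoding): one bit per frame,
# hidden as a tiny offset on an 8-bit colour channel.
DELTA = 2  # small enough to be visually imperceptible

def encode_frame(base_value, bit):
    """Return the modulated channel value for one display frame."""
    return base_value + (DELTA if bit else -DELTA)

def decode_frame(received, base_value):
    """Recover the bit by comparing against the unmodulated baseline."""
    return 1 if received > base_value else 0

bits = [1, 0, 1, 1, 0]
frames = [encode_frame(128, b) for b in bits]
recovered = [decode_frame(f, 128) for f in frames]
assert recovered == bits
```

A real system would of course need synchronisation, error correction and robustness to camera noise and lighting, which is where the engineering effort lies.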
PureVLC
PureVLC have claimed the world’s first application capable of sending a text message directly from a light bulb to a standard, unmodified smart phone.
The equivalent of a text message or tweet can be transmitted from a light bulb within a second. While this is slow compared with other VLC systems (about 50,000x slower than what PureVLC have demonstrated in the lab), it is both useful and considerably faster than any other application using a standard smart phone camera.
The MIT Media Labs application is in some ways similar to the Casio application in that display technology is used. However, the MIT application is more subtle in that the display does not contain a visible flashing colour blob. In Casio’s favour, their application also works with flashing LED light sources in addition to display technology.
Unlike Casio and MIT Media Labs, PureVLC have not used colour to convey the information, which leaves colour as an additional dimension that could be exploited in future to send more data. Another significant difference is that PureVLC use the illuminated area, i.e. reflected light, as the source of the data rather than direct line of sight, eliminating the need to point the phone directly at the light source or display.
PureVLC used an LED light source, the MIT Media Labs application requires a display, whereas the Casio application can use a display or an RGB LED source. The Casio data rates are of the order of 1 byte per second (i.e. extremely low). On the other hand the MIT system is (by my guesstimates – and I will happily correct any inaccuracy reliably reported to me) at least 10x higher than this. The PureVLC solution using just a phone app can achieve 3kbps, which is almost 400x faster than the Casio app!
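The arithmetic behind these comparisons is worth spelling out. The figures below are the post's own estimates, not measured values, and the 140-character message length is an assumption for illustration.

```python
# Back-of-the-envelope check of the quoted data rates.
casio_bps = 8        # ~1 byte per second
purevlc_bps = 3000   # 3 kbps via a standard phone camera

ratio = purevlc_bps / casio_bps   # 375, i.e. "almost 400x"

# A 140-character (tweet-length) message at 3 kbps:
seconds = 140 * 8 / purevlc_bps   # ~0.37 s, within "a second"

print(ratio, round(seconds, 2))
```

This also squares with the claim that a text message fits within a second: at 3 kbps, a tweet-length payload takes well under half a second before any protocol overhead.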
Uses in Augmented Reality
All of these apps use a smart phone camera as the VLC receiver. So they all receive an image, and they can all use hidden encoded information within the image to augment data onto the scene. If anything, Casio’s app is closest to a traditional AR implementation, despite being the slowest and least subtle. PureVLC’s app has not been implemented in an AR sense but provides the highest data rate. The MIT Media Labs offering sits between these.
Hopefully, what I have illustrated by these examples is that AR can easily be implemented by VLC. More work is definitely required, but these examples should at least demonstrate the value of VLC in AR applications.