LAYERd

Assignment, Final Project, Hardware, Software — John Mars @ 10:45 pm

LAYERd is a multi-layer display made from off-the-shelf computer monitors. It allows for glasses-free 3D as well as a novel way to envision user interfaces.

Every LCD screen on the planet is made of two main parts: a transparent assembly (made of laminated glass, liquid crystal, and polarizing filters) and a backlight. Without the backlight, an LCD monitor is actually completely transparent wherever a white pixel is drawn.

LAYERd uses three of these LCD assemblies illuminated by a single backlight to create a screen with real depth: objects drawn on the front display are physically in front of objects drawn on the rear ones.

My work is mainly focused on the potential UI uses for such a display: what can one do with discrete physical layers in space?

PROCESS

The process began with disassembling the three monitors. Having destroyed two cheaper ones with static electricity before the project began in earnest, I was very careful to keep the delicate electronics grounded at all times: I worked on top of an anti-static mat and used an anti-static wristband whenever possible.



With some careful prying, the whole thing came undone.



Here, you can see how the glass panel is transparent, and how the backlight illuminates it.

After disassembly came the design, laser cutting, and assembly of the frames and display.

Finally, the finished product.

The finished display uses three networked Raspberry Pis to keep everything in sync, along with the power supplies and driver boards from the disassembled monitors.
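
The post doesn't spell out how the Pis stay in sync, so here is a minimal sketch of one way it could be done, assuming a simple leader/follower scheme over UDP broadcast (the port number and function names are illustrative, not from the project):

```python
# Hypothetical sync sketch: one "leader" Pi broadcasts a frame counter over UDP,
# and the follower Pis redraw their layer whenever a new frame number arrives.
import socket
import struct
import time

SYNC_PORT = 5005                     # assumed port, not from the original post
BROADCAST_ADDR = "255.255.255.255"

def run_leader(fps=30):
    """Broadcast an incrementing frame counter so all three layers step together."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    frame = 0
    while True:
        sock.sendto(struct.pack("!I", frame), (BROADCAST_ADDR, SYNC_PORT))
        frame += 1
        time.sleep(1.0 / fps)

def run_follower(draw_layer):
    """Wait for frame numbers and hand each one to a per-layer draw callback."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", SYNC_PORT))
    while True:
        data, _ = sock.recvfrom(4)
        (frame,) = struct.unpack("!I", data)
        draw_layer(frame)            # e.g. update the front, middle, or rear panel
```

One Pi would run run_leader() and the other two run_follower() with their own layer's draw routine; the actual project may well have done something different.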

LESSONS LEARNED

I learned a lot about polarization; mainly that about half of the polarizing films had to be removed for light to make it through the stacked assembly. Plus, this cool little trick with polarized light.
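
For context (my gloss, not from the original post), the usual back-of-the-envelope explanation is Malus's law, which gives the intensity passed by two linear polarizers whose transmission axes differ by an angle theta:

```latex
% Malus's law: intensity transmitted through two linear polarizers
% whose transmission axes differ by an angle \theta
I = I_0 \cos^2\theta
```

If the analyzer on the back of one panel sits roughly crossed (theta near 90°) against the polarizer on the front of the next, almost no light gets through no matter what the pixels are doing, which is why the redundant films between panels have to come off.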

I also learned some lessons about safety and about how monitors work. Alas, disaster struck: I accidentally cut one of the ribbon cables while disassembling a monitor, which left a single vertical stripe of dead pixels. And the front display got smashed a little on the way to the gallery show, leaving a black splotch.

RELATED WORK

One main body of research highly influenced my design and concept: the work being done at MIT Media Lab’s Camera Culture Group, notably their research in Compressive Light Field Photography, Polarization Fields, and Tensor Displays.

Their work uses an assembly of displays similar to mine, but it focuses mainly on producing glasses-free 3D imagery using light fields and directional backlighting.

A few other groups have also done work in this field, notably Apple Inc., which holds a few related patents: one for a Multilayer Display Device and one for a Multi-Dimensional Desktop environment.

On the industry side of things is PureDepth®, a company that produces MLDs® (Multi Layer Displays™). There isn't much information in the popular media about the company or its products, but it appears to hold a large number of patents and trademarks in the area (over 90) and mainly produces its two-panel displays for slot and pachinko machines.

Another project, this one from CMU, is the Multi-Layered Display with Water Drops, which uses precisely synced water droplets and a projector to illuminate the “screens”.

REFERENCES

Chaudhri, I. A., J. O. Louch, C. Hynes, T. W. Bumgarner, and E. S. Peyton. 2014. “Multi-Dimensional Desktop.” Google Patents. www.google.com/patents/US8745535.

Lanman, Douglas, Gordon Wetzstein, Matthew Hirsch, Wolfgang Heidrich, and Ramesh Raskar. 2011. “Polarization Fields.” Proceedings of the 2011 SIGGRAPH Asia Conference (SA ’11) 30 (6). New York: ACM Press. doi:10.1145/2024156.2024220.

Prema, Vijay, Gary Roberts, and B. C. Wuensche. 2006. “3D Visualisation Techniques for Multi-Layer Display™ Technology.” IVCNZ, 1–6. www.cs.auckland.ac.nz/~burkhard/Publications/IVCNZ06_PremaRobertsWuensche.pdf.

Wetzstein, Gordon, Douglas Lanman, Wolfgang Heidrich, and Ramesh Raskar. 2011. “Layered 3D.” ACM SIGGRAPH 2011 Papers (SIGGRAPH ’11). New York: ACM Press. doi:10.1145/1964921.1964990.

Wetzstein, Gordon, Douglas Lanman, Matthew Hirsch, and Ramesh Raskar. 2012. “Tensor Displays: Compressive Light Field Synthesis Using Multilayer Displays with Directional Backlighting.” ACM Transactions on …. alumni.media.mit.edu/~dlanman/research/compressivedisplays/papers/Tensor_Displays.pdf.

Barnum, Peter C, and Srinivasa G Narasimhan. 2007. “A Multi-Layered Display with Water Drops.” www.cs.cmu.edu/~ILIM/projects/IL/waterDisplay2/papers/barnum10multi.pdf.

Lanman, Douglas, Matthew Hirsch, Yunhee Kim, and Ramesh Raskar. 2010. “Content-Adaptive Parallax Barriers.” ACM SIGGRAPH Asia 2010 Papers (SIGGRAPH Asia ’10) 29 (6). New York: ACM Press. doi:10.1145/1866158.1866164.

Mahowald, P. H. 2011. “Multilayer Display Device.” Google Patents. www.google.com/patents/US20110175902.

Marwah, Kshitij, Gordon Wetzstein, Yosuke Bando, and Ramesh Raskar. 2013. “Compressive Light Field Photography Using Overcomplete Dictionaries and Optimized Projections.” ACM Transactions on Graphics 32 (4): 1. doi:10.1145/2461912.2461914.

This project was supported in part by funding from the Carnegie Mellon University Frank-Ratchye Fund for Art @ the Frontier.

Augmented Windows

Precedent Analysis — John Mars @ 5:25 pm

INDUSTRY: SAMSUNG TRANSPARENT LCD

In 2012, Samsung debuted a transparent LCD. Where typical LCDs use an artificial backlight to shine through their pixels, the transparent display uses the sun (and artificial edge-lighting at night). The display also includes a touchscreen component.

Samsung previewed their technology as a 1:1 desktop monitor replacement, but I see value in using the technology more inventively: as an augmented reality window, where the display builds upon what’s visible in the outside world.

ART: SONY PLAYSTATION VIDEO STORE

Memo Akten (who I just realized authored the ofxARDrone library I used in the last project) created a series of videos for the launch of the Sony PlayStation Video Store. The videos use a live projection-mapping technique where the content is mapped according to the perspective created by the camera’s position and angle in space.

If you look through a window at a nearby object, draw a circle on the glass around it, and then move your head, the circle no longer lines up. With head- or eye-tracking, though, the perspective projection could be updated as your viewpoint changes, so that the circle always stays around the object; a rough sketch of that projection is below.
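
Just to make the geometry concrete (this sketch is mine, not from Akten's work, and the coordinate frame and numbers are illustrative assumptions): treat the window as the plane z = 0, with the tracked eye at z > 0 and the outside object at z < 0, and intersect the eye-to-object ray with the glass.

```python
# Hypothetical sketch: keep an annotation on the glass lined up with an outside
# object by re-projecting it whenever the tracked eye position changes.

def project_to_window(eye, obj):
    """Intersect the eye->object ray with the window plane z = 0.

    eye, obj: (x, y, z) tuples in the window's coordinate frame, with the viewer
    at z > 0 and the object beyond the glass at z < 0. Returns the (x, y) point
    on the window where the annotation should be drawn for this viewpoint.
    """
    ex, ey, ez = eye
    ox, oy, oz = obj
    t = ez / (ez - oz)               # parameter where the ray crosses z = 0
    return (ex + t * (ox - ex), ey + t * (oy - ey))

# Example: an eye half a metre from the glass, looking at an object 10 m beyond it.
print(project_to_window((0.1, 1.6, 0.5), (2.0, 3.0, -10.0)))
```

Re-running the projection every frame with the latest head-tracked eye position is what keeps the circle "attached" to the object.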

ACADEMIA: MULTI-VIEW AUTOSTEREOSCOPIC 3D DISPLAYS

What happens if two (or more) people are looking out the window? A conventional display can’t show a different image to each person. Alternatives to make that happen would be active- (or maybe even passive-, if you’re really good) shuttered glasses, as used with 3D TVs, or a lenticular display. A lenticular display is one that uses lenticular film — you know, the ridged plastic sheets that make a printed image flip or shift as you tilt it.

This 1999 paper [1], by N. A. Dodgson et al., along with his 2011 workshop, discusses a way to create a glasses-free 3D display using lenticular arrays. In addition to its 3D uses, such a display could also be used for “two-view, head-tracked displays; and multi-view displays”. They aren’t talking about displaying a unique perspective per viewer, but rather about delivering correct stereoscopic images to each eye of each viewer; my use case should be much simpler.

OUTLET

I definitely want to try to get my project picked up by a media outlet — a technology/design blog, for example. A little boost of fame would be most appreciated. And, heck, if it turns out to be really cool, unique technology (which it might be — I’m not finding all that much in academia), maybe even a paper submission would be in order.


  1. N. A. Dodgson, J. R. Moore, S. R. Lang. 1999. “Multi-View Autostereoscopic 3D Display.” International Broadcasting Convention 99. www.loreti.it/Download/PDF/3D/IBC99-Dodgson.pdf.

OpenFrameworks Awesomeness: TRON Legacy

Software, Technique — John Mars @ 9:47 pm

I remembered this and am not using it for my precedents, but I thought I’d share it anyway (read his post; don’t just watch the video):

Tron Legacy by Josh Nimoy

Streaming live video with ffmpeg/ffplay

Software, Technique — John Mars @ 9:33 am