TEST SHOOT: Lytro Illum

I recently had a chance to play with the Lytro Illum, version 2 of Lytro’s light field technology implemented in a DSLR form factor.

I’ve been wanting to play with an Illum since they were first announced and my friend Dallas Swindle graciously offered to let me take his Illum out for a week of testing.

This post covers the results of my tests, an evaluation of the camera itself, and my thoughts on its usefulness and how it might evolve.

The Illum proved to be a fun and fairly easy camera to shoot with. Its unique ability to produce images that contain depth data is interesting, but it’s still not clear what the real-world application for this would be. (More on this in my summary at the end.)

But first, for all my camera geeks out there, a little background…

(If you want to skip the blabby text and jump to my gallery of test shots CLICK HERE)



Light field photography departs from our traditional concept of photography in that it breaks away from the idea of a single image, taken by a single lens, resulting in a static photograph.

Without getting too technical, the basic concept of light field technology is that you capture a scene using a collection of lenses that record additional information about both the intensity and the direction of the light in the scene you are shooting. This is done using a high-resolution imaging chip overlaid with a 2D array of microlenses, much like the compound eye of a bug.



Properly interpreted, this additional information can be used for a variety of results, the most notable being the ability to produce an image where you can change the focus after you have captured the shot.
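If you’re curious how refocusing after the fact is even possible, the classic approach is “shift-and-add”: each microlens viewpoint yields a slightly different sub-aperture image, and shifting those images in proportion to their viewpoint offset before averaging brings a chosen depth plane into focus. Here’s a minimal NumPy sketch of the idea (the function name, array layout, and integer-pixel shifts are my own simplification, not Lytro’s actual processing pipeline):

```python
import numpy as np

def refocus(light_field, slope):
    """Shift-and-add refocusing over a 4D light field L[u, v, y, x],
    where (u, v) indexes the sub-aperture viewpoint and (y, x) the pixels.
    `slope` selects which depth plane is brought into focus."""
    n_u, n_v, height, width = light_field.shape
    cu, cv = (n_u - 1) / 2, (n_v - 1) / 2  # center viewpoint
    out = np.zeros((height, width))
    for u in range(n_u):
        for v in range(n_v):
            # Shift each view in proportion to its offset from center;
            # np.roll gives a simple integer-pixel shift.
            dy = int(round(slope * (u - cu)))
            dx = int(round(slope * (v - cv)))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (n_u * n_v)
```

Objects at the depth matching `slope` line up across views and add sharply, while everything else smears into blur, which is exactly the refocus effect you see in the Lytro viewer.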

(Example: the same shot of Liza refocused near, mid, and far.)

Light field imaging can also produce 3D models of a scene (more on this later).



The concept of light field photography has been around since 1908, but it wasn’t until recent developments in digital imaging and processing that the technique emerged as a practical tool.



In 2007 Adobe demoed a camera that had a 19-lens array and used Photoshop for the post-processing.



Adobe later showed a system that used 20 “sub-lenses” mounted on the front of a standard lens.




In 2008 Stanford University developed a prototype light field camera that used 12,616 microlenses, aimed primarily at 3D imaging for robotics.



But these cameras were all one-off lab prototypes, not commercial products for the general public.


The first Lytro light field camera was announced in 2011. Researcher Ren Ng left Stanford to start the first commercial light field camera company, Lytro (originally “Refocus Imaging”).


It was kind of an odd device both in form and function.


It was small, boxy, and without the conventional design elements of a traditional camera. It had a tiny 1.8” viewing screen, no zoom lens, and almost no user controls. It looked more like a pocket flashlight than a high-tech camera.

At $400-500 this camera seemed a little overpriced for most folks. Sure, it was a cool idea that you could refocus your photos, but the image quality was pretty terrible and you needed to do a lot of heavy post-processing to get any images out of the thing.


This first iteration was really more of a technology demo than a viable commercial product. Great for early adopters and uber geeks, but very few were actually sold.



While limited in functionality and commercial appeal, the Lytro v1 was still a game changer. It represented the beginning of a new age of photography, an evolution perhaps more dramatic than the shift from analog to digital photography.

The Lytro v1 was one of the first steps into the realm of “Computational Photography”, the intersection of high-end imaging hardware with powerful software processing. We can now combine imaging data from a wide range of data collection devices to form and explore new representations of our world.

As a lover of light and storytelling, this represents an exciting new world of possibilities for visualization and experience design.

Computational Photography blows open our concept of a “camera” and allows us to explore capturing light in ways never before possible.

LYTRO v2 – “Illum”

In 2014 Lytro announced their v2 light field camera, the Illum. This new model not only improved on the technology, but also came with gorgeous industrial design courtesy of Artefact.


The form factor of the Illum was far more recognizable as a camera than the boxy Lytro v1 with the familiar zoom lens cylinder, hand grip, and rear display. While clearly inspired by a DSLR, the clean lines, minimal controls, articulated screen and forward angled body make it feel like it (and you) are leaping into the future.

Using the Illum is a lot like using a typical DSLR: you rotate the lens to zoom in and out… you preview the image on the rear screen… you depress the shutter button on the hand grip to take a photo.

Where the Illum diverges from a normal DSLR is in the ability to capture depth data for each image. The UI has several modes to assist in adjusting the depth capture settings.

The depth assist views are accessed through a button right next to the shutter button. Beautifully designed by Nicole Parente-Lopez, the UI provides a variety of “heat map” style gauges and overlays that show you exactly what depth range will be captured. It took only a few minutes to get used to the feedback and settings before I could quickly adjust my depth settings.


“Histogram” mode even overlays a simple heat map on top of your image to help you determine your “far” and “near” ranges…


Using the depth assist modes was not only helpful in shooting, but was playful and fun, and really supported the overall design of the camera as a futuristic device.


After getting familiar with the basic operations of the Illum I met up with fellow media artist Nate Pagel to see what kind of creative results we could get from the camera.

Knowing that we were going to be playing with depth we decided to set up a still life scene with a variety of reflective surfaces to see how the camera would react.


We got some fun shots that played with our reflections and depth of scene…


After playing around with our still scene I wanted to test the camera out in a larger environment, so my girlfriend and artistic collaborator, Liza Bender, and I took the camera out to Golden Gate Park. We brought a few small mirrors since they seemed to offer some interesting opportunities for exploring focus and depth.

We discovered that shots freezing a moment of movement, like water in a fountain, were some of the most interesting. Being able to rack focus through a moment frozen in time, with water suspended in mid-flow, made the ability to refocus genuinely compelling.

This shot, of water droplets on various layers of plastic and glass, was probably the most artistic use of the adjustable focus I shot that day:

Check out the rest of my test shots HERE:




As a camera geek I always have fun taking new and strange cameras out for a spin. The Lytro Illum was a fun toy to play with for sure, but in the end I think it’s really just that, a ($1,599) toy.

It did deliver on its promise to produce refocusable images, but beyond the novelty factor I wasn’t able to figure out what practical applications it would be useful for. For any technical application where you want a wide depth of field, you’d just shoot with everything in focus. And for artistic projects the camera’s image quality and limited viewing options don’t make it a great choice.

During this project I discovered that one of the major drawbacks of the whole system is that the “live focus” images can only be viewed in Lytro’s own web player. You can embed them, but it’s not a standardized image or video format, so the situations in which you can use and share Lytro photos are very limited.

You can’t even view them on your own computer except through Lytro’s desktop app.

The other features Lytro photos offer are also nearly useless. The parallax is so limited that you barely notice it, and the ability to change the f-stop setting after the fact is pretty much useless too.

Even the ability to change what is in and out of focus in post wasn’t that useful to me as an artist, since the depth mapping wasn’t very smooth and you get strange rough edges between the various areas of focus. Even the out-of-focus areas had a terrible blur aesthetic to them.

(Detail crops of the Liza and Nate shots showing the rough depth edges.)

Working with the Lytro I kept thinking, “Why would I want to pick my focus after I’ve shot the photo?” That’s a decision I want to make artistically when I take the photo.

Depth features aside, I also found the basic image quality of the photos to be fairly low. It’s not a camera I would use for fine art photography.

So what is this camera good for?



While the Lytro Illum was a major update on the original Lytro, I believe both were really just public beta releases, steps on the road to something much bigger.

It seems pretty clear to me that these first two still camera products are merely stepping stones to the actual killer application for light field photography: VR cinematography. Capturing depth data is a fun parlor trick for still images, but it’s a critical ability for shooting VR movies.

Lytro has been able to develop their technology, prove they can take products to market, and even raise a bunch of money. I imagine they are hard at work on a video camera version of the Lytro as we speak. In fact, they recently raised $50M to do just that.

It’s not going to be easy to develop a realtime VR capture system, but I expect we will see an announcement from Lytro in the coming months of their new light field camera.

UPDATE: Yup, it happened…


