Sunday, May 21, 2017

Practical light field rendering tutorial with Cycles

This week, Google announced "Google Seurat", a novel rendering technology developed in collaboration with ILMxLab which they claim will enable "real-time cinema-quality, photorealistic graphics" on mobile VR devices:

The technology captures all the light rays in the scene by pre-rendering a surface light field, resulting in photoreal reflections and lighting in real-time.

Disney recently released a paper called "Real-time rendering with compressed animated light fields", demonstrating the feasibility of rendering a Pixar-quality 3D movie in real time, where the viewer can actually be part of the scene and walk in between scene elements or characters (along a predetermined camera path):

Light field rendering is actually quite an old technique and has been around for more than 20 years. The first paper was presented at Siggraph 1996 ("Light field rendering" by Marc Levoy and Pat Hanrahan) and the method has been incrementally improved by others over the last two decades. Stanford compiled an entire archive of light fields at the end of the last century:

A more up-to-date archive of photographic light fields with higher quality can be found at

One of the first movies that showed a practical use for light fields is The Matrix from 1999, where an array of cameras that all fired at the same time or in rapid succession made it possible to pan around an actor to create a super slow motion effect ("bullet time"):

Bullet time in The Matrix (1999)

Rendering the light field

In this post, I'll show how to render a synthetic light field using Blender Cycles and some open-source plug-ins, without going deep into the theory. If you're interested, Joan Charmant made a great video tutorial, which explains the basics of implementing a light field renderer:

Rendering light fields is actually surprisingly easy with Blender's Cycles and doesn't require much technical expertise. This video demonstrates a few light fields rendered with Cycles:

For this tutorial, we'll use a couple of open source tools:

1) the light field camera add-on for Blender, made by Katrin Honauer and Ole Johanssen from Heidelberg University in Germany:

This plug-in sets up a camera grid in Blender and renders the scene from each camera using the Cycles engine. Good results are obtained with a 17-by-17 camera grid and a distance of 10 cm between neighbouring cameras. For high quality, a 33-by-33 camera grid with an inter-camera distance of 5 cm is recommended:

3-by-3 camera grid with their overlapping frustums
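The add-on drives all of this through Blender's API, but the geometry of the rig itself is easy to sketch on its own. The helper below (an illustration, not the add-on's actual API) generates the per-camera offsets for an N-by-N grid centred on the rig's origin:

```python
def camera_grid(n, spacing):
    """(x, y) offsets for an n-by-n camera grid centred on the rig's origin.

    n: cameras per side (e.g. 17), spacing: distance between neighbours in metres.
    """
    half = (n - 1) / 2.0
    return [((i - half) * spacing, (j - half) * spacing)
            for j in range(n) for i in range(n)]

# The recommended 17-by-17 grid with 10 cm spacing spans 1.6 m per side:
grid = camera_grid(17, 0.10)
print(len(grid))   # 289 viewpoints to render
```

With the high-quality 33-by-33 grid at 5 cm spacing, the same helper yields 1,089 viewpoints.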

2) an open source light field encoder + WebGL based light field viewer, created by Michal Polko: (build instructions are included in the readme file).

The encoder compresses the images produced by Cycles by keeping a number of keyframes intact and encoding only the deltas for the remaining intermediate frames.
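A minimal sketch of that keyframe-plus-delta scheme (illustrative only; the real encoder applies further compression on top of the deltas):

```python
def encode(frames, keyframe_interval=8):
    """Keep every k-th frame in full ("key"); store the rest as per-pixel
    deltas against the previous frame."""
    stream = []
    for i, frame in enumerate(frames):
        if i % keyframe_interval == 0:
            stream.append(("key", list(frame)))
        else:
            stream.append(("delta", [a - b for a, b in zip(frame, frames[i - 1])]))
    return stream

def decode(stream):
    frames = []
    for kind, data in stream:
        if kind == "key":
            frames.append(list(data))
        else:  # apply the delta to the previously decoded frame
            frames.append([p + d for p, d in zip(frames[-1], data)])
    return frames

# Round-trip a stack of 20 tiny "images" (flat lists of pixel values):
frames = [[(i * 7 + p) % 256 for p in range(16)] for i in range(20)]
assert decode(encode(frames)) == frames
```

Deltas between neighbouring viewpoints are mostly near zero, which is why the 289 renders compress so well.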

Results and Live Demo

A live online demo of the light field with the dragon can be seen here: 

You can slightly change the viewpoint and refocus the image in real-time by clicking on the image.  

I rendered the Stanford dragon using a 17-by-17 camera grid with a distance of 5 cm between adjacent cameras. The light field was created by rendering the scene from 289 (17x17) different camera viewpoints, which took about 6 minutes in total (about 1 to 2 seconds of rendertime per 512x512 image on a good GPU). The 289 renders are then heavily compressed (for this scene, the 107 MB batch of 289 images was compressed down to only 3 MB!).

A depth map is also created at the same time and enables on-the-fly refocusing of the image by interpolating information from several images.
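Refocusing a light field boils down to shift-and-add: re-align the camera views for the chosen focal depth, then average them. A toy 1-D sketch of the principle (the actual viewer uses the depth map to pick per-pixel shifts):

```python
def refocus(views, disparity):
    """Shift-and-add refocusing for a 1-D light field.

    views[k] is the image seen by camera k. A scene point at the chosen
    focal depth shifts by `disparity` pixels between neighbouring cameras,
    so re-aligning the views by k * disparity before averaging renders
    that depth sharp while other depths blur out.
    """
    n, width = len(views), len(views[0])
    centre = (n - 1) // 2
    out = []
    for x in range(width):
        total = 0
        for k, view in enumerate(views):
            total += view[(x + (k - centre) * disparity) % width]
        out.append(total / n)
    return out

# A single point at disparity 2, seen by 5 cameras (16-pixel images):
views = [[0] * 16 for _ in range(5)]
for k in range(5):
    views[k][(5 + (k - 2) * 2) % 16] = 1

assert refocus(views, 2)[5] == 1.0   # focused: all views align on the point
assert refocus(views, 0)[5] == 0.2   # defocused: energy spread across views
```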

A later tutorial will add a bit more freedom to the camera, allowing for rotation and zooming.

Friday, April 28, 2017

Cycles, the fastest GPU renderer thanks to new denoising algorithms

Cycles is Blender's native CPU/GPU renderer, originally created in early 2011 by Brecht van Lommel (who left the Blender Institute in 2014 to work on Solid Angle's Arnold, which was acquired last year by the innovation-crushing Autodesk Corp.). In the past six years, it has slowly but steadily become a fully featured, production-ready renderer, including motion blur, hair/fur rendering, OpenVDB volume rendering, Disney's OpenSubdiv and Principled PBR shader, GGX microfacet distribution, AOVs (arbitrary output variables, i.e. render passes), filmic tonemapping and support for Alembic scene importing.

A video showing the stunning realism that can be achieved with Cycles:

Even though Cycles has been open source since the beginning, the Blender Institute decided in August 2013 to change the license for the Cycles source code from a restrictive GPL license to a permissive Apache 2.0 license, which allows Cycles to be integrated into commercial projects.

Although Cycles started out as an unbiased renderer, it quickly adopted many biased tricks to drastically cut down rendertimes, such as capping the number of bounces for different types of rays, blurry filters for glossy surfaces, and switching over to shooting ambient occlusion rays once a certain number of bounces is reached.
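These trade-offs are exposed as ordinary scene settings, so they can also be dialled in from Blender's Python API. A sketch for a 2.7x-era build (values are illustrative, and `ao_bounces_render` only takes effect with Simplify enabled):

```python
import bpy  # only available inside Blender

cycles = bpy.context.scene.cycles

# Clamp the brightness of (indirect) samples to tame fireflies (0 = off).
cycles.sample_clamp_direct = 0.0
cycles.sample_clamp_indirect = 10.0

# Blur sharp glossy reflections after blurry bounces ("Filter Glossy").
cycles.blur_glossy = 1.0

# Cap path length, and fall back to fast ambient occlusion
# after a set number of bounces (Simplify > AO Bounces).
cycles.max_bounces = 8
bpy.context.scene.render.use_simplify = True
cycles.ao_bounces_render = 3
```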

In recent months, Lukas Stockner, one of Cycles' developers (who was also responsible for adding light portals and IES light profile support), implemented a few remarkable noise reduction algorithms based on very recent research, which will certainly turn many rendering heads. Two features in particular have been added that reduce rendertimes by 8 times on average: scramble distance (which takes the randomness out of sampling and traces rays in a fully coherent way) and a noise filtering algorithm based on weighted local regression. The noise filter has been in development for over a year and has been available in experimental Cycles builds for beta testing. It's currently under final review and is ready to be merged into the Cycles master branch any day. The Blender community is going wild, and for good reason: the new denoiser delivers exceptional results, preserving texture detail at very low sample rates and rendertimes:

Full HD render (1920x1080 resolution). Rendertime: 1m 24s
Fully denoised at 50 samples on a single GTX 1070.
Image from the Blender Artists forum
Final denoised and colour corrected render, 1m25s (from BlenderArtists forum)
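The filtering idea behind weighted local regression can be shown in miniature: fit a small weighted linear model around each pixel and keep the fitted value, so smooth gradients survive where a plain blur would flatten them. This toy 1-D version uses a simple Gaussian window, whereas the actual Cycles filter is guided by feature passes (normals, albedo, depth); it is a sketch of the principle only:

```python
import math
import random

def wlr_denoise(signal, radius=6, sigma=2.0):
    """Weighted local linear regression: fit y = a + b*x over each pixel's
    neighbourhood with Gaussian weights and keep the fitted centre value."""
    out = []
    for i in range(len(signal)):
        sw = swx = swy = swxx = swxy = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = math.exp(-((j - i) ** 2) / (2.0 * sigma ** 2))
            x = j - i
            sw += w; swx += w * x; swy += w * signal[j]
            swxx += w * x * x; swxy += w * x * signal[j]
        b = (sw * swxy - swx * swy) / (sw * swxx - swx * swx)
        out.append((swy - b * swx) / sw)  # fitted value at the centre (x = 0)
    return out

# A noisy linear ramp: the fit recovers the trend instead of blurring it away.
random.seed(1)
clean = [0.1 * i for i in range(64)]
noisy = [c + random.uniform(-0.3, 0.3) for c in clean]
denoised = wlr_denoise(noisy)
total_error = lambda a: sum(abs(u - v) for u, v in zip(a, clean))
assert total_error(denoised) < total_error(noisy)
```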
Some of my own tests using one GPU:

20 samples, no denoising, render time 3m28s

20 samples, denoised, render time 4m09s

200 samples, no denoising, render time 31m58s

The new version of Cycles with built-in denoising will run on both CPU and GPUs from Nvidia and AMD. Experimental builds for CUDA and OpenCL are available here.

Experimental OpenCL/CUDA build release notes:
  • OpenCL & CUDA GPU denoising system (this is Lukas' latest denoising code)
  • CUDA & OpenCL supported
  • Multi-GPU denoising support (even in the viewport; confirmed working for CUDA, but not tested with multiple OpenCL GPUs)
  • Scramble distance added for Sobol and multi-jitter sampling (works on CPU & GPU); also added to the supported features render tab
  • Blue-noise dithered Sobol with scramble distance
  • Thread divergence sort reduction patch (gives a 30% speedup in the Classroom scene and 8% in the Barcelona scene)
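The "scramble distance" idea from the notes above can be caricatured in a few lines: all pixels share one low-discrepancy sequence, and a per-pixel random offset is scaled down so that neighbouring pixels trace nearly identical, coherent rays. This is a toy model for illustration; the function names and the 0-to-1 parameterisation are assumptions, not the actual Cycles code:

```python
def van_der_corput(i):
    """Base-2 radical inverse -- the first dimension of the Sobol sequence."""
    x, f = 0.0, 0.5
    while i:
        if i & 1:
            x += f
        f *= 0.5
        i >>= 1
    return x

def pixel_sample(index, pixel_hash, scramble_distance):
    """Offset the shared low-discrepancy sample by a per-pixel shift whose
    strength is scaled by scramble_distance in [0, 1]. At 0 every pixel
    traces the identical (fully coherent) pattern; at 1 the patterns are
    fully decorrelated."""
    offset = (pixel_hash % 1024) / 1024.0
    return (van_der_corput(index) + scramble_distance * offset) % 1.0

# Two different pixels, same sample index:
assert pixel_sample(5, 123, 0.0) == pixel_sample(5, 999, 0.0)  # coherent
assert pixel_sample(5, 123, 1.0) != pixel_sample(5, 999, 1.0)  # scrambled
```

Coherent rays hit the same BVH nodes and materials, which is exactly what keeps GPU warps busy and makes the fully coherent setting so fast.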
More information on the denoising algorithm can be found in this thread on the Blender Artists forum and Lukas Stockner's Wiki page:

Experimental Cycles denoising build thread

With this groundbreaking denoiser, Cycles leapfrogs all other GPU renderers, and will soon be making the dream of ultrafast photoreal rendering happen for anyone.  

Sunday, March 5, 2017

Web developer wanted

Our project is making great strides and we're currently looking for a top notch web developer to join our team.

Candidates for this role should have:

- a Bachelor's degree in Computer Science
- a minimum of 4 years of working experience with front-end and back-end web development (e.g. Node.js/npm, Rails, Go, Django, Ember.js, Angular.js, React.js, Bootstrap, jQuery)
- UI design skills (a plus)
- an unbounded passion for and hands-on experience with real-time and offline 3D graphics
- creative and original problem-solving skills
- a relentless hunger to learn more and become an expert in your field
- the ability to work independently
- high efficiency, motivation, perfectionism and drive, with heaps of initiative
- New Zealand residency, or a keenness to move to NZ (we will consider remote contract work if you are one of a kind)

Send your cover letter and CV with a link to your portfolio or Github page to
Applications will close on 21 March.

Wednesday, January 11, 2017

OpenCL path tracing tutorial 3: OpenGL viewport, interactive camera and defocus blur

Just a link to the source code on Github for now, I'll update this post with a more detailed description when I find a bit more time:

Part 1: Setting up an OpenGL window

Part 2: Adding an interactive camera, depth of field and progressive rendering

Thanks to Erich Loftis and Brandon Miles for useful tips on improving the generation of random numbers in OpenCL to avoid the distracting artefacts (showing up as a sawtooth pattern) when using defocus blur (still not perfect but much better than before).
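A common cure for such structured patterns is to decorrelate each pixel's random sequence by running its seed through an integer hash before drawing any numbers. The well-known Wang hash is sketched below in Python for illustration (the tutorial's actual kernels are OpenCL, and this is not necessarily the exact scheme used there):

```python
def wang_hash(seed):
    """Wang hash: a cheap integer mix commonly used to decorrelate per-pixel
    RNG seeds in GPU path tracers. Seeding a simple RNG directly with the
    pixel index gives neighbouring pixels correlated sequences, which shows
    up as structured (e.g. sawtooth) noise in effects like defocus blur."""
    M = 0xFFFFFFFF                      # emulate 32-bit unsigned arithmetic
    seed = ((seed ^ 61) ^ (seed >> 16)) & M
    seed = (seed * 9) & M
    seed = (seed ^ (seed >> 4)) & M
    seed = (seed * 0x27D4EB2D) & M
    seed = (seed ^ (seed >> 15)) & M
    return seed

# Consecutive pixel indices map to well-spread 32-bit seeds:
seeds = [wang_hash(i) for i in range(8)]
assert len(set(seeds)) == 8
assert all(0 <= s <= 0xFFFFFFFF for s in seeds)
```

Because every step of the hash is invertible, distinct pixel indices are guaranteed to produce distinct seeds.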

The next tutorial will cover rendering of triangles and triangle meshes.