A Reconstruction Filter for Plausible Motion Blur

in Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games 2012 (I3D'12)

Morgan McGuire, NVIDIA and Williams College
Padraic Hennessy, Vicarious Visions
Michael Bukowski, Vicarious Visions
Brian Osman, Vicarious Visions

Paper (PDF preprint)
Download Result Video (H.264 MP4)
Presentation (PPT)
Visualization code
Online Images and Video

ACM Copyright Notice Copyright by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Publications Dept, ACM Inc., fax +1 (212) 869-0481, or permissions@acm.org. The definitive version of this paper can be found at ACM's Digital Library http://www.acm.org/dl/.


This paper describes a novel filter for simulating motion blur phenomena in real time by applying ideas from offline stochastic reconstruction. The filter operates as a 2D post-process on a conventional framebuffer augmented with a screen-space velocity buffer. We demonstrate results on video game scenes rendered and reconstructed in real time on NVIDIA GeForce 480 and Xbox 360 platforms, and show that the same filter can be applied to cinematic post-processing of offline-rendered images and real photographs. The technique is fast and robust enough that we deployed it in a production game engine used at Vicarious Visions.


The complete shader code appears in the paper, so no separate code sample is needed to implement the technique. A newer version of this effect is implemented in the G3D Innovation Engine and is available as open source.
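As a rough illustration of the filter's structure, the sketch below gathers jittered samples along a dominant neighborhood velocity and weights them with a "soft" depth comparison. This is a simplified CPU sketch in Python, not the paper's shader: the function names, the `SOFT_Z_EXTENT` constant's value, and the reduced weighting (the full algorithm also tests velocity overlap in both directions) are our assumptions here.

```python
import random

SOFT_Z_EXTENT = 0.1  # assumed depth-falloff constant; tune per scene

def soft_depth_compare(z_a, z_b):
    # Approaches 1 when z_a is at or in front of z_b, and falls off
    # smoothly to 0 as z_a moves behind z_b by SOFT_Z_EXTENT.
    return min(max(1.0 - (z_a - z_b) / SOFT_Z_EXTENT, 0.0), 1.0)

def reconstruct_pixel(color, depth, x, y, neighbor_max, n_samples=15):
    """Average jittered samples taken along the dominant neighborhood
    velocity (neighbor_max), weighting each by a soft depth test
    against the center pixel."""
    vx, vy = neighbor_max
    z_p = depth[y][x]
    r, g, b = color[y][x]
    result = [r, g, b]          # center sample, weight 1
    total_weight = 1.0
    for i in range(n_samples):
        # Jittered parametric position along the velocity vector.
        t = (i + random.random()) / n_samples - 0.5
        sx = int(round(x + vx * t))
        sy = int(round(y + vy * t))
        if not (0 <= sy < len(color) and 0 <= sx < len(color[0])):
            continue
        # Simplified weight: does the sample plausibly blur over this
        # pixel? (The full filter also classifies the reverse case.)
        w = soft_depth_compare(depth[sy][sx], z_p)
        total_weight += w
        for c in range(3):
            result[c] += w * color[sy][sx][c]
    return [ch / total_weight for ch in result]
```

On a GPU the same gather runs per pixel in a fragment shader, with `neighbor_max` fetched from the NeighborMax tile buffer described in the paper.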

This effect has been used in many production systems, often with modification. For industry discussion, see Crytek's 2013 SIGGRAPH presentation and Vicarious Visions' 2012 SIGGRAPH presentation. See Simon Green's GDC 2012 talk for examples of our motion blur algorithm in the Unity DX11 demo. The Fabric RTR engine uses this effect as well. See our follow-up NVIDIA technical report for information about further refinements, some of which are in those shipping systems.

Eric Lengyel's article on Motion Blur and the Velocity-Depth-Gradient Buffer in Game Engine Gems describes two ideas for any post-processed motion blur that are compatible with our algorithm. The first is a method for choosing the depth-test discrimination threshold based on the local depth gradient and distance from the camera. Our "soft" depth comparison is an alternative that saves two texture channels at the expense of accuracy on slanted blurry pixels; the results are indistinguishable, and since the depth gradient could be computed on the fly, the two methods are probably equally good. His second idea is to divide the screen into tiles and flag tiles near dynamic objects. Unflagged tiles receive a less expensive motion blur that only considers camera motion and performs no depth test; flagged tiles receive the full algorithm. This improves performance at a small loss in quality for the unflagged tiles. We recommend considering this as an extension of our algorithm. So long as the tile boundaries are not revealed by the loss of depth testing (which depends on your scene and tile size), this can nicely speed up the blur pass, and the NeighborMax buffer is an obvious place to store the flags.
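The tile-flagging step could look something like the following sketch. It is our own hypothetical formulation (Lengyel's article and our paper do not prescribe this exact code): it marks every tile containing a dynamic-object pixel, then dilates the flags by one tile so that blur spilling across a tile boundary is still handled by the full filter.

```python
def flag_tiles(object_id_buffer, tile_size):
    """Return a per-tile boolean grid: True if any pixel in the tile or
    one of its 8 neighboring tiles belongs to a dynamic object (id > 0).
    Flagged tiles get the full depth-tested blur; the rest get the
    cheaper camera-only blur."""
    h, w = len(object_id_buffer), len(object_id_buffer[0])
    th = (h + tile_size - 1) // tile_size
    tw = (w + tile_size - 1) // tile_size

    # Pass 1: mark tiles that directly contain dynamic pixels.
    dynamic = [[False] * tw for _ in range(th)]
    for y in range(h):
        for x in range(w):
            if object_id_buffer[y][x] > 0:
                dynamic[y // tile_size][x // tile_size] = True

    # Pass 2: dilate by one tile, mirroring the NeighborMax gather,
    # so blur that crosses a tile edge is still depth tested.
    flags = [[False] * tw for _ in range(th)]
    for ty in range(th):
        for tx in range(tw):
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = ty + dy, tx + dx
                    if 0 <= ny < th and 0 <= nx < tw and dynamic[ny][nx]:
                        flags[ty][tx] = True
    return flags
```

In a real renderer both passes map naturally onto the existing TileMax/NeighborMax reductions, and the flag can share a channel of the NeighborMax buffer as suggested above.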

At the suggestion of the reviewers, we added the vector fields over the velocity images at the last moment before publishing the paper. We've since noticed that when we exported the velocity fields from our programs to PNG images, some of the values were clamped in the process. This didn't matter for viewing the RGB images but produced a slightly incorrect visualization of the vector lines for large velocities, e.g., the dragon's wings are primarily moving up and down in the teaser, but clamping the Y velocity left the X velocity relatively too large.
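The export bug described above is easy to reproduce. The sketch below (our illustration, not the original export code) encodes a signed velocity into 8-bit channels with per-component clamping: a mostly vertical velocity whose y component exceeds the representable range decodes to a vector tilted too far toward x, exactly the distortion visible on the dragon's wings.

```python
def encode_clamped(v, max_v=1.0):
    """Map each component of v from [-max_v, max_v] to an 8-bit value,
    clamping out-of-range components (the buggy export path)."""
    out = []
    for c in v:
        t = (c / max_v) * 0.5 + 0.5          # [-max_v, max_v] -> [0, 1]
        out.append(round(min(max(t, 0.0), 1.0) * 255))
    return out

def decode(e, max_v=1.0):
    # Inverse mapping from 8-bit back to a signed velocity.
    return [((c / 255) - 0.5) * 2 * max_v for c in e]

# A mostly vertical velocity (like a wing beat) with |vy| > max_v:
v = (0.3, 2.0)
v_decoded = decode(encode_clamped(v))
# vy is clamped to max_v = 1.0 while vx survives intact, so the decoded
# direction is relatively too horizontal.
```

Rescaling the whole vector (or raising `max_v`) before quantizing preserves the direction at the cost of magnitude resolution.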

BibTeX


@inproceedings{McGuire12Blur,
  author    = {Morgan Mc{G}uire and Padraic Hennessy and Michael Bukowski and Brian Osman},
  title     = {A Reconstruction Filter for Plausible Motion Blur},
  booktitle = {Proceedings of the ACM Symposium on Interactive 3D Graphics and Games},
  month     = {February},
  year      = {2012},
  url       = {http://graphics.cs.williams.edu/papers/MotionBlurI3D12/}
}