A Fast and Stable Feature-Aware Motion Blur Filter


Jean-Philippe Guertin, University of Montreal
Morgan McGuire, NVIDIA and Williams College
Derek Nowrouzezahrai, University of Montreal

NVIDIA technical report (58 MB PDF)
NVIDIA technical report (low res) (1.5 MB PDF)
Video results (225 MB MP4)
Video results (YouTube)

Abstract

High-quality motion blur is an increasingly important and pervasive effect in interactive graphics that, even in the context of offline rendering, is often approximated as a post-process. Recent motion blur post-process filters (e.g., [MHBO12, Sou13]) efficiently generate plausible results suitable for modern interactive rendering pipelines. However, these approaches may produce distracting artifacts, for instance, when different motions overlap in depth or when both large- and fine-scale features undergo motion. We address these artifacts with a more robust sampling and filtering scheme that incurs only a small additional runtime cost. We render plausible, temporally-coherent motion blur on several complex animation sequences, all in just 3 ms at a resolution of 1280×720. Moreover, our filter is designed to integrate seamlessly with post-process anti-aliasing and depth of field.
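To make the idea of a motion blur post-process concrete, the following is a minimal sketch of a naive per-pixel "gather" filter: each output pixel averages color samples taken along that pixel's screen-space velocity over the exposure interval. This is not the paper's algorithm (which adds tile-based dominant-velocity spreading and depth- and velocity-aware sample weighting to avoid the artifacts discussed above); the function name, the uniform velocity field, and the grayscale image representation are all simplifying assumptions for illustration.

```python
def motion_blur(image, vx, vy, num_samples=8):
    """Naive gather-style motion blur sketch (illustrative, not [MHBO12]).

    image: 2D list of grayscale floats, image[y][x].
    (vx, vy): a single screen-space velocity (pixels per exposure),
              assumed uniform here; a real filter reads a per-pixel
              velocity buffer rendered by the engine.
    """
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for i in range(num_samples):
                # Stratified sample times spanning the exposure [-0.5, 0.5].
                t = (i + 0.5) / num_samples - 0.5
                # Sample along the velocity vector, clamped to the image.
                sx = min(max(int(round(x + t * vx)), 0), w - 1)
                sy = min(max(int(round(y + t * vy)), 0), h - 1)
                acc += image[sy][sx]
            out[y][x] = acc / num_samples
    return out
```

Even this toy version shows why a 3 ms budget is demanding: the cost is proportional to pixels × samples, which is why production filters run on the GPU and reduce per-pixel work with tile-level velocity information.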

BibTeX


@techreport{Guertin2013MotionBlurReport,
  author = {Jean-Philippe Guertin and Morgan McGuire and Derek Nowrouzezahrai},
  title = {A Fast and Stable Feature-Aware Motion Blur Filter},
  month = {November},
  day = {26},
  year = {2013},
  pages = {10},
  institution = {NVIDIA Corporation},
  number = {NVR-2013-003},
  url = {http://graphics.cs.williams.edu/papers/MotionBlur13}
}