Pushing the Boundaries of Novel View Synthesis (Talk)
2020 was a turbulent year, but for 3D learning it was a fruitful one, with many exciting new tools and ideas. In particular, there have been many exciting developments in coordinate-based neural networks and novel view synthesis. In this talk I will discuss our recent work on single-image view synthesis with pixelNeRF, which aims to predict a Neural Radiance Field (NeRF) from a single image. I will discuss how the NeRF representation allows models like pixel-aligned implicit functions (PIFu) to be trained without explicit 3D supervision, and the importance of other key design factors such as predicting in the view coordinate frame and handling multi-view inputs. I will also touch upon our recent work that allows real-time rendering of NeRFs. Then, I will discuss Infinite Nature, a project in collaboration with teams at Google NYC, where we explore how to push the boundaries of novel view synthesis and generate views far beyond the edges of the initial input image, resulting in controllable video generation of a natural scene.
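To make the core idea concrete, here is a minimal sketch (not the authors' code) of the pixelNeRF-style setup described above: a NeRF-like MLP conditioned on pixel-aligned image features, queried with 3D points expressed in the input view's coordinate frame, and trained only through rendered colors rather than explicit 3D supervision. All module names, shapes, and hyperparameters here are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PixelConditionedNeRF(nn.Module):
    """Toy pixel-aligned NeRF conditioning sketch (assumed architecture)."""
    def __init__(self, feat_dim=64, hidden=128):
        super().__init__()
        # Tiny CNN encoder producing a pixel-aligned feature map from the input image.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1),
        )
        # MLP maps (3D point in the input view's frame, view direction,
        # sampled image feature) -> (RGB, density).
        self.mlp = nn.Sequential(
            nn.Linear(3 + 3 + feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),
        )

    def forward(self, image, pts, dirs, uv):
        # image: (1, 3, H, W); pts, dirs: (N, 3) in the input view's camera frame;
        # uv: (N, 2) projections of pts into the image plane, normalized to [-1, 1].
        feats = self.encoder(image)                              # (1, C, H, W)
        sampled = F.grid_sample(feats, uv.view(1, -1, 1, 2),
                                align_corners=True)              # (1, C, N, 1)
        sampled = sampled.squeeze(0).squeeze(-1).t()             # (N, C)
        out = self.mlp(torch.cat([pts, dirs, sampled], dim=-1))  # (N, 4)
        rgb, sigma = torch.sigmoid(out[:, :3]), F.relu(out[:, 3])
        return rgb, sigma
```

In training, the predicted (rgb, sigma) values would be composited along camera rays with standard NeRF volume rendering and compared against observed pixel colors from other views with a photometric loss; that differentiable rendering step is what lets the geometry be learned without any 3D ground truth.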
Biography: Angjoo Kanazawa is an Assistant Professor in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. Previously, she was a BAIR postdoc at UC Berkeley advised by Jitendra Malik, Alexei A. Efros, and Trevor Darrell. She completed her PhD in Computer Science at the University of Maryland, College Park, advised by David Jacobs. Prior to UMD, she obtained her BA in Mathematics and Computer Science at New York University. She has also spent time at the Max Planck Institute for Intelligent Systems with Michael Black and at Google NYC with Noah Snavely. Her research is at the intersection of computer vision, graphics, and machine learning, focusing on 4D reconstruction of the dynamic world behind everyday photographs and video. She has been named a Rising Star in EECS and is a recipient of the Anita Borg Memorial Scholarship, the best paper award at Eurographics 2016, and the 2021 Google Research Scholar Award.