Motion capture, or mocap, is the process of recording the movement of objects or people. It is used in military, entertainment, sports, and medical applications, and for validating computer vision and robotics systems.
Please join the New York Technology Council and the CUNY Computer Science Doctoral Program for a joint discussion on this exciting topic. Professor Christopher Bregler will explain new technological advances that allow capture in public spaces, in football stadiums, underwater in Olympic pools, atop 30-foot diving platforms, across YouTube for the US Election, on conductors at Lincoln Center, and on smartphones. In addition, Christopher will explain new systems that track pixels in Hollywood movies, enabling new visual effects.
Chris Bregler is a professor of computer science at NYU's Courant Institute, director of the NYU Movement Lab, and CEO of ManhattanMocap, LLC. Prior to NYU, he was on the faculty at Stanford University and worked for several companies, including Hewlett-Packard, Interval, Disney Feature Animation, and Lucasfilm's ILM. His motion capture research and commercial projects in science and entertainment have resulted in numerous publications, patents, and awards from the National Science Foundation, Sloan Foundation, Packard Foundation, Electronic Arts, Microsoft, Google, the U.S. Navy, the U.S. Air Force, and other sources. He has been named a Stanford Joyce Faculty Fellow, Terman Fellow, and Sloan Research Fellow. He received the Olympus Prize for achievements in computer vision and pattern recognition and was awarded the IEEE Longuet-Higgins Prize for "Fundamental Contributions in Computer Vision that have withstood the test of time."
His work has also been featured in mainstream media such as the New York Times, the Los Angeles Times, Scientific American, National Geographic, WIRED, BusinessWeek, Variety, the Hollywood Reporter, ABC, CBS, NBC, CNN, the Discovery/Science Channel, and many other outlets.