Project Electric Sheep combines virtual reality (VR) and machine learning (ML) with robots. It's about as cutting-edge as Autodesk gets. Something that cutting-edge requires a leader who's just as cutting-edge.
Dr. Li has made a big impact in her short time at Autodesk.
Dr. Hui Li is a principal research engineer in the Applied Research Lab in the Office of the CTO. Her role is to explore the future landscape of making things and look for blind spots for Autodesk by applying the latest technologies in machine learning and robotics to design and manufacturing projects. Prior to Autodesk, Hui led the R&D division at Airware, where her team built computer vision and machine learning capabilities for commercial drones, performing autonomous inspections of construction sites, mines, and residential areas. Before that, she worked at Boeing as a key figure in the research effort to build an agile robotic platform for airplane manufacturing and factory automation. While at the Massachusetts Institute of Technology (MIT), she collaborated closely with NASA on underwater robots as well as space exploration robots. She has presented at the most prestigious Artificial Intelligence (AI) conferences, such as those of the Association for the Advancement of Artificial Intelligence. Hui has a Ph.D. in AI and an M.S. in Aerospace Engineering, both from MIT.
With a background and qualifications like that, it's no surprise that her current project is Project Electric Sheep. Tightly coupled with VR and robotics, Project Electric Sheep explores the new train-in-VR, test-in-the-real-world paradigm and goes a step further. We collect data in VR to train robots to manipulate 3D objects. Testing on real robots teaches us how to improve the simulation's physics and rendering. It's one big feedback loop! The learned robot capabilities can be fed into our applied robotics projects related to product design/manufacture and automated building construction. More importantly, we will have an ever-improving virtual ML platform that all of Autodesk can use and build upon.
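To picture that loop at a glance, here's a minimal sketch in Python. Every class, method, and number in it is a hypothetical stand-in (there's no public Electric Sheep API); it only shows the shape of the cycle: collect in VR, train, measure the sim-vs-real gap, recalibrate, repeat.

```python
# A minimal sketch of the Electric Sheep feedback loop. Every class and
# method below is a hypothetical stand-in for illustration only.
import random


class VRSimulation:
    """Stand-in for the VR environment with physics and rendering."""
    def __init__(self):
        self.realism = 0.5  # how closely the sim matches the real world

    def run_episode(self, policy):
        # A real episode would step a physics engine and render frames.
        return {"success": random.random() < policy.skill * self.realism}

    def calibrate(self, gap):
        # Use the sim-vs-real gap to improve physics and rendering.
        self.realism = min(1.0, self.realism + 0.1 * gap)


class GraspPolicy:
    """Stand-in for the deep neural network being trained."""
    def __init__(self):
        self.skill = 0.1

    def update(self, episodes):
        successes = sum(e["success"] for e in episodes)
        self.skill = min(1.0, self.skill + 0.1 * successes / len(episodes))


def real_world_gap(policy):
    # Stand-in for testing on a physical robot and measuring how much
    # real performance differs from simulated performance.
    return abs(policy.skill - random.random() * policy.skill)


sim, policy = VRSimulation(), GraspPolicy()
for _ in range(5):
    episodes = [sim.run_episode(policy) for _ in range(1000)]  # collect in VR
    policy.update(episodes)                                    # train the network
    sim.calibrate(real_world_gap(policy))                      # improve the sim
```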
As the leader of this project, Hui takes an approach that entails:
- Build a simulation environment to collect training data, one that includes (see the first sketch after this list):
- High-resolution recognition of position, orientation, volume, and appearance of physical objects
- Robots with accurate physics
- Virtual cameras (images) and virtual depth sensing (point cloud data)
- Data collection and processing pipeline
- Extensibility to add other robots/objects and to interface with other software
- Train the AI system (deep neural networks) to manipulate 3D objects in two steps (see the second sketch after this list):
- Robot grasps a 3D object
- Robot places the 3D object into, or removes it from, a larger, dynamically determined context to assemble or disassemble it
- Obtain insights from real-world testing on how ML can improve the simulation environment, so that the synthetic data has:
- More realistic rendering
- More accurate physics engine
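To make the simulation-environment bullet concrete, here's a minimal data-collection sketch using PyBullet, an open-source physics engine chosen purely for illustration; the article doesn't say which engine the project actually uses. It loads an object, steps the physics, and records the ground-truth pose plus a camera image and depth buffer, the same kinds of signals listed above.

```python
import pybullet as p
import pybullet_data

# Connect headlessly and load a simple scene. The URDF assets below ship
# with the pybullet_data package; camera parameters are arbitrary choices.
p.connect(p.DIRECT)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.8)
p.loadURDF("plane.urdf")
cube = p.loadURDF("cube_small.urdf", basePosition=[0, 0, 0.1])

# Step the physics forward, then record ground truth.
for _ in range(240):
    p.stepSimulation()

# Exact position and orientation of the object (no noisy sensors needed).
position, orientation = p.getBasePositionAndOrientation(cube)

# A virtual camera gives RGB images; the depth buffer stands in for
# depth sensing (it can be converted to a point cloud).
view = p.computeViewMatrix(cameraEyePosition=[0.5, 0.5, 0.5],
                           cameraTargetPosition=[0, 0, 0.1],
                           cameraUpVector=[0, 0, 1])
projection = p.computeProjectionMatrixFOV(fov=60, aspect=1.0,
                                          nearVal=0.01, farVal=2.0)
width, height, rgb, depth, segmentation = p.getCameraImage(
    224, 224, viewMatrix=view, projectionMatrix=projection)
```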
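For the training bullets, one plausible formulation (an assumption on our part, not the project's published architecture) is supervised grasp-success prediction: a network looks at a depth image and a candidate grasp pose and predicts whether the grasp will succeed. Here's a toy PyTorch version with made-up tensor shapes and random stand-in data.

```python
import torch
import torch.nn as nn

# A toy stand-in for the grasping step. Architecture and shapes are
# illustrative assumptions only.
class GraspNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Flatten())
        self.head = nn.Sequential(
            nn.Linear(32 * 53 * 53 + 7, 128), nn.ReLU(),
            nn.Linear(128, 1))  # logit: will this grasp succeed?

    def forward(self, depth_image, grasp_pose):
        features = self.encoder(depth_image)
        return self.head(torch.cat([features, grasp_pose], dim=1))

model = GraspNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

# One training step on a synthetic batch: depth images from the simulator,
# 7-D grasp poses (position + quaternion), and success labels. Random
# tensors stand in for real simulator output here.
depth = torch.randn(64, 1, 224, 224)
pose = torch.randn(64, 7)
success = torch.randint(0, 2, (64, 1)).float()

loss = loss_fn(model(depth, pose), success)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```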
For readers without a doctorate in AI or a master's degree in Aerospace Engineering, here's the project in layman's terms:
- Artificial intelligence relies on machine learning.
- A robot is a machine, and a machine normally learns by performing a task in countless ways, keeping track of what works and what doesn't.
- Rather than have a robot learn by physically performing a task, the robot can learn via a simulation of the task. This reduces the time required for the robot to learn and allows learning to happen in parallel. For example, instead of having a robot physically perform a task 10,000 different ways, we can simulate the robot performing the task. Since it's a simulation, we can run 10,000 simulations simultaneously and combine the results when done, which in the ideal case cuts the elapsed learning time by a factor of 10,000 (see the sketch after this list).
- Using synthetic data from simulations as robot training material is only possible if the simulations and renderings are virtually identical to real life. Machine learning can be applied to make the simulations behave more like real life and to make the renderings look more like the images captured by real cameras.
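Here's what that fan-out-and-combine pattern looks like in a few lines of Python. The run_trial function is a hypothetical stand-in that flips a weighted coin; a real version would run one full physics episode per trial.

```python
from multiprocessing import Pool
import random

# Hypothetical stand-in for one simulated grasp attempt; a real version
# would step a physics engine instead of sampling a random number.
def run_trial(seed: int) -> dict:
    rng = random.Random(seed)
    return {"seed": seed, "success": rng.random() < 0.3}

if __name__ == "__main__":
    # Run 10,000 independent simulated attempts across CPU cores and
    # combine the results, instead of 10,000 sequential physical trials.
    with Pool() as pool:
        results = pool.map(run_trial, range(10_000))
    successes = sum(r["success"] for r in results)
    print(f"{successes} successful grasps out of {len(results)} trials")
```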
All of these simulations can run with the robot powered off; in effect, the robot is learning while it sleeps. It's as if the robot is getting smarter while counting electric sheep. In fact, the project's name is a reference to Philip K. Dick's 1968 novel, Do Androids Dream of Electric Sheep?, which was adapted into the movie Blade Runner.
Learning by dreaming is alive in the lab.