This article is very similar to the one I posted on Thursday of last week.
The Autodesk Applied Research Lab pursues a broad scope of inquiry — from Advanced Robotics, Internet of Things, and Machine Learning, to Climate Change and the Future of Work. The team builds real-world prototypes to understand how cutting-edge technology will develop and how those developments will affect the future of Autodesk and the world at large.
Virtual reality is an artificial world of computer-generated images and sounds that responds to the actions of the person experiencing it. Most first-person video games are examples of virtual reality. Augmented reality is an enhanced version of reality in which technology overlays digital information on a view of the real world, such as the camera feed of a smartphone. What separates augmented reality from virtual reality is the inclusion of the real, physical world in the environment being experienced.
Team member and Senior Research Engineer Evan Atherton filed this report based on his recent work with augmented reality.
One of the main goals of Autodesk's Applied Research Lab is to explore new ways of interacting with robots — from the way we plan a robot's motion, to the way we visualize and adapt that motion. Augmented reality is a powerful tool that can fundamentally alter the way we go about the motion-planning process. By using a device like an iPad or a Microsoft HoloLens, we can project digital information about the robot into a physical space.
We are currently exploring augmented reality for three main interactions: visualization, augmentation, and adaptation.
-
VISUALIZATION
Projecting a digital simulation of a robot's motion into the physical space the robot occupies. Taking the robot off the computer screen allows the designer to walk around it, get a better sense of its motion in real space, and check for collisions with other objects in that space.
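To make the idea concrete, here is a minimal sketch of how a simulated joint-space trajectory could be turned into world-space poses for an AR client to render as a ghost robot in the room. The three-link planar arm, its link lengths, and the interpolated trajectory are hypothetical stand-ins, not the lab's actual pipeline.

```python
# Minimal sketch (not the lab's actual software): sample a simulated
# joint-space trajectory and convert it to world-space link positions
# that an AR client could render as a virtual robot in the room.
import numpy as np

LINK_LENGTHS = [0.4, 0.3, 0.15]  # meters; hypothetical 3-link planar arm

def forward_kinematics(joint_angles):
    """Return the (x, y) position of each joint of a planar arm."""
    points, angle, pos = [(0.0, 0.0)], 0.0, np.zeros(2)
    for length, theta in zip(LINK_LENGTHS, joint_angles):
        angle += theta
        pos = pos + length * np.array([np.cos(angle), np.sin(angle)])
        points.append(tuple(pos))
    return points

# A toy planned motion: interpolate between two joint configurations
# and report where the end effector would be at each sampled instant.
start = np.array([0.0, 0.0, 0.0])
goal = np.array([np.pi / 2, -np.pi / 4, np.pi / 6])
for t in np.linspace(0.0, 1.0, 5):
    pose = forward_kinematics((1 - t) * start + t * goal)
    print(f"t={t:.2f} end effector at {pose[-1]}")
```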
-
AUGMENTATION
Adding additional information to the physical robot to get a better understanding of the robot's behavior. For instance, we can place a meter on each robot joint to visualize how close it is to its joint limits as it moves, or we can overlay a spline to show the intended trajectory for the end effector.
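A sketch of the joint-limit meter idea: map each joint's current angle to the fraction of its travel toward the nearest limit, a value the AR overlay could render as a gauge on that joint. The limits and angles below are made up for illustration.

```python
# Sketch of a joint-limit "meter": 0.0 means the joint sits at the
# center of its range, 1.0 means it has reached a limit. Values are
# hypothetical, not taken from any particular robot.
JOINT_LIMITS = [(-2.9, 2.9), (-1.9, 1.9), (-2.7, 2.7)]  # radians

def limit_fraction(angle, low, high):
    """Fraction of available travel used toward the nearest limit."""
    center = (low + high) / 2.0
    half_range = (high - low) / 2.0
    return min(abs(angle - center) / half_range, 1.0)

current_angles = [2.5, -0.3, 1.1]
for i, (angle, (low, high)) in enumerate(zip(current_angles, JOINT_LIMITS)):
    frac = limit_fraction(angle, low, high)
    warn = "  <-- near limit" if frac > 0.8 else ""
    print(f"joint {i}: {frac:.0%} of travel used{warn}")
```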
-
ADAPTATION
Adjusting a parameter in our augmentation to modify the physical robot's path. Dragging the end-effector path spline, for example, could allow the designer to adjust the relative position of the robot's motion without having to return to the original design tool.
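As a rough sketch of that interaction, a drag gesture on the overlaid spline could resolve to a translation vector applied to every waypoint before the path is handed back to the planner. The function and path below are purely illustrative.

```python
# Illustrative adaptation step: when the designer drags the overlaid
# path spline by some vector, shift every waypoint by that vector and
# return the adjusted path for re-planning.
import numpy as np

def drag_path(waypoints, drag_vector):
    """Translate an end-effector path by the drag gesture's vector."""
    return [tuple(np.asarray(p) + drag_vector) for p in waypoints]

path = [(0.5, 0.0, 0.30), (0.5, 0.1, 0.35), (0.5, 0.2, 0.30)]
shifted = drag_path(path, np.array([0.0, 0.0, 0.05]))  # lift path 5 cm
print(shifted)
```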
Here is a visualization of a virtual robot moving through the physical space of the Applied Research Lab. In this case, the colorful printout on the table is a marker recognized by the software.
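The report doesn't name the marker-tracking software, but the workflow can be sketched with the ArUco module from opencv-contrib (an assumption on my part): detecting the printed marker in a camera frame yields the camera-to-marker pose that anchors the virtual robot in the room. The calibration values here are placeholders; real ones come from calibrating the device's camera.

```python
# Sketch of marker-based anchoring using OpenCV's ArUco module
# (an assumption -- the lab's actual marker software isn't named).
import cv2
import numpy as np

# Hypothetical calibration; real values come from camera calibration.
camera_matrix = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
dist_coeffs = np.zeros(5)
MARKER_SIZE = 0.10  # printed marker edge length in meters

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_SIZE, camera_matrix, dist_coeffs)
        # tvecs[0] is where the marker sits relative to the camera;
        # the virtual robot would be drawn relative to that anchor.
        print("marker pose (translation):", tvecs[0].ravel())
cap.release()
```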
Future versions will work without the marker: the software can be taught to recognize the shapes of objects such as the robot itself.
Here is that same visualization (which predicted how the robot would move) running alongside the robot as it actually moves.
Technology like this allows people working with robots to see how a robot will behave before it performs its actions. That is useful even in the planning stages of a project, since the motion can be previewed before the project gets its robot.
By integrating these capabilities with other design tools like Autodesk Dynamo and Fusion 360, we can dramatically alter the current workflow for robotic path planning, allowing designers, artists, and engineers more control in the creation process.
Thanks, Evan.
Autodesk creates software that allows places, things, and media to be designed, made, and used. Robots are currently integral to the way many things are made, but right now they act in isolation. Perhaps with research like this into controlling robots via augmented reality, robots will work right alongside humans. It won't be a case of "robots took our jobs," but "robots helped us with our jobs" instead.
Virtualization is alive in the lab.