
Unity AI 2021 interns: Navigating challenges with robotics

September 16, 2021 in News | 9 min. read
Blue robotics arm


AI@Unity is working on amazing research and products in robotics, computer vision, and machine learning. Our summer interns worked on AI projects with real product impact.

As robots get more sophisticated and robot tasks more complex, the need for simulation is increasing. Simulation allows developers to scale, since they don’t need a physical robot for every scenario they want to test. It also makes it possible to develop and test tasks early, especially tasks that cannot be carried out until the robot is fully deployed. Our Unity Robotics team is focused on enabling robotics simulation by harnessing the power, assets, and integrability of the Unity engine while building robotics-specific tools and packages that expand simulation capability. The Unity Robotics Hub features demos, tutorials, and packages to get you started simulating your robot today.

During the summer of 2021, our interns worked diligently to create valuable contributions to our work at Unity. Read about their projects and experiences in the following sections.

Inverse kinematics and control in Unity

Jacob Platin, Robotics, University of Pennsylvania (Penn)

Inverse kinematics is essential for customers like READY Robotics, whose scene is pictured here

This summer, I had the amazing opportunity to work on integrating inverse kinematics and robot controllers into Unity as part of the robotics team. When users need to simulate robots, particularly robotic arms, they need to control the robot using the same or similar APIs that they would use to control the real robot. These APIs are known as robot controllers, and they provide a variety of functionalities, including moving the robot from one position to another, moving a single joint (in joint space), or even moving the robot in a circle. Robot controllers work primarily in joint space: commands are given as target angles for each joint. Humans, however, care about the position and orientation of the end effector in Cartesian space (i.e., X, Y, and Z coordinates in our 3D world). Thus, the goal of inverse kinematics is to determine which joint angles correspond to a given position and orientation in Cartesian space. Inverse kinematics is a crucial part of a roboticist’s toolkit, so this package makes Unity even more capable and easier to use as a robotics simulation platform.
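
To make the joint-space versus Cartesian-space distinction concrete, here is a minimal sketch of analytic inverse kinematics for a planar two-link arm, written as standalone Unity C#. It is an illustration only, not the API of the package described above; the class and method names are hypothetical.

```csharp
using UnityEngine;

// Hypothetical illustration (not the actual package API): analytic inverse
// kinematics for a planar two-link arm. Given a target (x, y) in the arm's
// plane, solve for the shoulder and elbow angles using the law of cosines.
public static class TwoLinkIK
{
    // l1, l2: link lengths. Returns (shoulder, elbow) angles in radians,
    // or null if the target is out of reach.
    public static Vector2? Solve(float x, float y, float l1, float l2)
    {
        float distSq = x * x + y * y;
        float dist = Mathf.Sqrt(distSq);

        // The target must lie within the annulus the arm can reach.
        if (dist > l1 + l2 || dist < Mathf.Abs(l1 - l2))
            return null;

        // Law of cosines gives the elbow angle.
        float cosElbow = (distSq - l1 * l1 - l2 * l2) / (2f * l1 * l2);
        float elbow = Mathf.Acos(Mathf.Clamp(cosElbow, -1f, 1f));

        // Shoulder angle: direction to the target, minus the offset
        // introduced by the bent elbow.
        float shoulder = Mathf.Atan2(y, x)
            - Mathf.Atan2(l2 * Mathf.Sin(elbow), l1 + l2 * Mathf.Cos(elbow));

        return new Vector2(shoulder, elbow);
    }
}
```

Real arms with six or more joints generally need numerical solvers rather than this closed-form approach, but the underlying question is the same: which joint angles put the end effector at the desired pose?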

Integrating these features in Unity proved to be an immense challenge that required me to brush up on my linear algebra, physics, calculus, computer science, and even pre-calculus skills, while concurrently designing the software in the most user-friendly way. I also learned about simulating industrial robots in VR by creating a demo where users move a cube in VR that the robotic arm follows. With challenge comes great opportunity, however, and effectively single-handedly designing, building, and shipping such a fundamental piece of code for enabling roboticists in Unity has truly been an honor. It is unbelievably rare for employees to find themselves looking forward to, and consistently challenged by, their work on a daily basis, and I am lucky to say that I found that experience at Unity!

Multi-agent robotics simulation

Tiffany Yau, B.Eng. Robotics Engineering, University of Toronto

Simon Chamorro, B.Eng. Robotics Engineering, Université de Sherbrooke

3D simulation of a green robot moving around a wood floor

In industrial applications, multiple robots with different specialized capabilities must work in concert to carry out complex tasks. This project showcases how coordination between multiple robots can be achieved via the Unity Editor and robotics simulation packages, along with ROS 2, to carry out a find-and-ferry task in a warehouse. This demonstration also highlights the advantage of using Unity over other robotics simulation tools where multi-agent simulations like this are challenging to accomplish. Our simulation consists of two types of robots, which we call Findbot and Ferrybot. Multiple Findbots are responsible for finding target cubes in a warehouse environment using machine learning, and a single Ferrybot navigates to, picks up, and drops off these cubes at a designated location. To accomplish this, each Findbot is equipped with a camera for detecting the cube, while the Ferrybot has a robotic arm for picking it up. This example project will be useful for robotics developers and researchers looking to use Unity’s robotics tools in their own simulations. 
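
As a rough illustration of the Unity-to-ROS 2 side of such a setup, the sketch below shows how a Findbot-style script might publish a detected cube pose using Unity’s ROS-TCP-Connector package. The topic name and component wiring are assumptions, and exact type or method names can vary between package versions.

```csharp
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Geometry;

// Hypothetical sketch: publish a detected cube's pose to ROS 2 via the
// ROS-TCP-Connector. The topic name is an assumption for illustration.
public class CubePosePublisher : MonoBehaviour
{
    const string Topic = "/findbot/cube_pose"; // assumed topic name

    ROSConnection ros;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<PoseMsg>(Topic);
    }

    // Call this when the perception model reports a cube at cubeTransform.
    // Note: a real implementation would also convert Unity's left-handed
    // coordinates to ROS's right-handed convention (the package provides
    // helpers for this); that step is omitted here for brevity.
    public void PublishCubePose(Transform cubeTransform)
    {
        var msg = new PoseMsg
        {
            position = new PointMsg(
                cubeTransform.position.x,
                cubeTransform.position.y,
                cubeTransform.position.z),
            orientation = new QuaternionMsg(
                cubeTransform.rotation.x,
                cubeTransform.rotation.y,
                cubeTransform.rotation.z,
                cubeTransform.rotation.w)
        };
        ros.Publish(Topic, msg);
    }
}
```

On the ROS 2 side, a planner subscribed to this topic could then dispatch the Ferrybot to the reported pose.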

Overall, this was a great experience because we were able to use and integrate a wide array of Unity packages into our project. For instance, we used the Computer Vision Perception Package to collect the data for training our pose estimation model. We also used the inverse kinematics package (mentioned in Jacob’s project above) on Ferrybot for picking up the cubes. Taking a dependency on a project being developed in parallel with ours was also a major challenge, but it was a great opportunity to practice collaboration and communication. It is also very rewarding to know that our project will be used to prepare a ROSCon 2021 workshop.

Join our team

If you are interested in building real-world experience by working with Unity on challenging artificial intelligence projects, check out our university careers page. You can start building your experience at home by going through our demos and tutorials on the Unity Robotics Hub.
