It began with KAUST marine scientist Christian Voolstra’s frustration with his “clunky” equipment. His colleague, electrical engineer Khaled Salama, knew how to build things and had contacts at Stanford University’s robotics laboratory who could help. They, in turn, knew the people at California-based Meka Robotics who made robot arms.
Scientists talking to scientists led to a visionary international collaboration that produced the world’s first underwater robotic avatar.
Ocean One is not your run-of-the-mill robot. It acts as an extension of a human operator, who uses a haptic-visual interface to feel and see what the robot encounters.
Similar in size to a human diver, Ocean One has the mobility and dexterity required to maneuver and gently grasp objects. Theoretically, it can function at unlimited depths because its sensitive electronics are immersed in oil to protect them from the pressure of the deep sea.
It’s also semi-autonomous, able to navigate and control its own buoyancy. Most robots have every movement programmed, explained Professor Voolstra of the University’s Red Sea Research Center. “However, this thing is intelligent. It knows how to swim; you just need to tell it where to swim.” This frees Ocean One’s operator to focus on the specific tasks the robot is sent to do.
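To picture that division of labor, here is a deliberately simplified, hypothetical sketch in which the operator hands the robot a single waypoint and the robot works out its own path and buoyancy trim. The names and numbers are invented for illustration and are not Ocean One’s actual software.

```python
# Illustrative sketch only: a toy "shared autonomy" loop in which the operator
# supplies a waypoint and the robot decides how to get there. All names
# (Waypoint, Avatar, swim_to) are assumptions for this example, not the real
# Ocean One software.
import math
from dataclasses import dataclass


@dataclass
class Waypoint:
    x: float      # meters east of the start point
    y: float      # meters north of the start point
    depth: float  # meters below the surface


class Avatar:
    def __init__(self):
        self.x = self.y = self.depth = 0.0

    def swim_to(self, wp: Waypoint, step: float = 1.0):
        """Move toward the waypoint; the robot, not the operator, decides how."""
        while True:
            dx, dy, dz = wp.x - self.x, wp.y - self.y, wp.depth - self.depth
            dist = math.sqrt(dx * dx + dy * dy + dz * dz)
            if dist < step:  # close enough: hold station and wait for a task
                self.x, self.y, self.depth = wp.x, wp.y, wp.depth
                return
            # Take one "step" along the remaining vector. How thrust and
            # buoyancy produce that motion is exactly the detail the operator
            # never has to think about.
            self.x += step * dx / dist
            self.y += step * dy / dist
            self.depth += step * dz / dist


robot = Avatar()
robot.swim_to(Waypoint(x=12.0, y=-4.0, depth=30.0))  # "tell it where to swim"
print(f"Holding station at ({robot.x:.1f}, {robot.y:.1f}) m, {robot.depth:.1f} m deep")
```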
The interface consists of two joystick-like devices with grasp-and-pinch controls, a 3D display that shows what the robot’s cameras see, and a graphical command center that displays the data coming from the robot’s many sensors.
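The following sketch imagines, in a much-simplified and purely hypothetical form, one cycle of such an interface: hand-controller commands flow down to the robot, while camera frames and measured contact forces flow back up and are rendered as resistance the operator can feel. All classes, fields and values are assumptions made for this example.

```python
# Illustrative sketch only (not the real interface software): one cycle of a
# haptic-visual teleoperation loop. Commands go from the operator's hand
# controllers down to the robot; camera frames and force readings come back
# up, so the operator can see and "feel" what the robot touches.
from dataclasses import dataclass


@dataclass
class HandCommand:
    dx: float    # requested hand movement, meters
    dy: float
    dz: float
    grip: float  # 0.0 = open, 1.0 = fully pinched


@dataclass
class RobotFeedback:
    camera_frame: bytes   # stereo image pair for the 3D display
    contact_force: float  # newtons measured at the fingertips
    sensor_data: dict     # depth, battery, temperature, etc. for the command center


def haptic_cycle(command: HandCommand, feedback: RobotFeedback, stiffness: float = 50.0):
    """Send one command down and turn the returned force into joystick resistance."""
    # Downlink: the robot arm mirrors the operator's hand motion and grip.
    send_to_robot = {"move": (command.dx, command.dy, command.dz), "grip": command.grip}

    # Uplink: contact force is replayed as resistance on the hand controller,
    # so a gentle touch on a fragile object feels gentle to the operator.
    resistance = min(stiffness * feedback.contact_force, 100.0)
    return send_to_robot, resistance


cmd = HandCommand(dx=0.02, dy=0.0, dz=-0.01, grip=0.3)
fb = RobotFeedback(camera_frame=b"", contact_force=0.4, sensor_data={"depth_m": 91})
downlink, felt = haptic_cycle(cmd, fb)
print(downlink, f"operator feels {felt:.0f}% resistance")
```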
It was no small feat to design an underwater robot that can do more than human divers or remotely operated vehicles. Professors Voolstra and Salama wanted it to be smart, dexterous and able to travel deep underwater while staying in contact with its operator at the surface.
This multidisciplinary international team began by creating a virtual interactive environment, similar to a computer game, in which they could test the basic functions they wanted their future robot to have. Using joysticks, they could test what it feels like when the virtual robot holds something, or how it might move objects from place to place.
“We didn’t want to end up with something that worked in fresh water but not salt water, so we needed to simulate every single thing,” explained Salama.
“The virtual environment we created gave us so much information that it helped us with the design,” continued Salama. “You’re not just building the hardware; you’re building the whole environment that goes with it.”
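As a rough, purely hypothetical illustration of that kind of virtual dry run, the snippet below tests a pick-and-move task on a simulated gripper long before any hardware exists. The classes, lift limit and weights are invented for the example and do not come from the team’s simulator.

```python
# Illustrative sketch only: a toy "virtual dry run" that checks a pick-and-move
# task entirely in software. The physics is deliberately trivial; every name
# and number here is an assumption for the example.
from dataclasses import dataclass


@dataclass
class VirtualObject:
    name: str
    x: float
    y: float
    weight_in_water: float  # apparent weight (N) once buoyancy is subtracted


class VirtualArm:
    MAX_LIFT = 20.0  # newtons the simulated gripper is allowed to lift

    def __init__(self):
        self.holding = None

    def grasp(self, obj: VirtualObject) -> bool:
        """Report whether the virtual gripper can pick the object up at all."""
        if obj.weight_in_water <= self.MAX_LIFT:
            self.holding = obj
            return True
        return False

    def move_to(self, x: float, y: float):
        """Carry whatever is held to a new spot in the virtual scene."""
        if self.holding:
            self.holding.x, self.holding.y = x, y


# A quick what-if run: can this gripper design relocate an amphora-sized object?
arm = VirtualArm()
vase = VirtualObject("amphora", x=2.0, y=1.0, weight_in_water=12.5)
if arm.grasp(vase):
    arm.move_to(0.0, 0.0)
    print(f"{vase.name} moved to ({vase.x}, {vase.y}) in the virtual scene")
else:
    print(f"{vase.name} is too heavy for this gripper design")
```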
Collaboration was key as the two teams learned about each other’s needs. Salama admits that he knew little about marine science. “I can design something that moves, but I had no clue about what is needed in the ocean,” he said.