
RE2 secures funds to advance underwater robotic arms


Robotic arms developer RE2 Robotics has received $2.5 million in funding from the Office of Naval Research to continue the development and commercialization of its technology under the Dexterous Maritime Manipulation System (DM2S) program.


RE2’s DM2S technology will give the Navy the ability to perform mine countermeasure (MCM) missions autonomously.

In this next phase of the program, RE2 will upgrade its dual-arm prototype, known as the Maritime Dexterous Manipulation System (MDMS), for deep ocean use; apply computer vision and machine-learning algorithms to enable autonomous manipulation capabilities; and integrate with underwater vehicles that can autonomously navigate.


The first phase of the project saw the delivery of a dexterous underwater robotic system that was capable of teleoperation in an ocean environment.

“This additional funding enables our team to further expand and upgrade the capabilities of our underwater robotic arms to perform MCM tasks in deeper water through the use of autonomy.

“In addition, this advanced technology will allow us to pursue commercial opportunities, such as underwater inspection and maintenance in the oil and gas industry,” said Jorgen Pedersen, president and CEO of RE2 Robotics.

Unlike other underwater robotic systems, which are hydraulically driven, MDMS uses an energy-saving electromechanical system.

This allows the system to perform longer-duration subsea inspection and intervention tasks while reducing system maintenance and downtime.

“With the development of our first MDMS prototype, we created a compact, lightweight system with a sealed, neutrally buoyant design that was successfully tested in the Pacific Ocean,” said Jack Reinhart, vice president of project management.

“We’re now looking forward to improving upon that proven design by adding even greater functionality in deep water, including integration with new underwater vehicles and computer-vision-based autonomy.”