Mayflower AI sea drone readies maiden transatlantic voyage
Over its roughly three-week trip from England to the United States, the Mayflower will be guided by an artificial intelligence-powered ‘captain’ and make the journey without humans on board.
Another ship called the Mayflower is set to make its way across the Atlantic Ocean this week, but it won’t be carrying English pilgrims — or any people — at all.
When the Mayflower Autonomous Ship leaves its home port in Plymouth, England, to attempt the world’s first fully autonomous transatlantic voyage, it will have a highly trained “captain” and a “navigator” versed in the rules of avoiding collisions at sea on board, both controlled by artificial intelligence (AI).
The ship’s AI captain was developed by Marine AI and is guided by an expert system based on IBM technologies, including automation software widely used by the financial sector. The technology could someday help crewed vessels navigate difficult situations and facilitate low-cost exploration of the oceans that cover 70 percent of the Earth’s surface.
Over its roughly three-week trip, the Mayflower sea drone will sail past the Isles of Scilly and over the wreck site of the Titanic to land in Plymouth, Massachusetts, as the colonists on the first Mayflower did more than 400 years ago.
This sleek new vessel, however, will carry experiments instead of people, with extra room for them because it has been designed without sleeping quarters, a galley or a bathroom.
Up to 700kg of experiments can be housed in modular compartments inspired by the design of the payload bay of a space shuttle.
“Right now, it’s full to the brim,” Brett Phaneuf, a managing director at MSubs, which built the Mayflower for the non-profit Promare and its partner IBM, told attendees at the May Xponential conference held by the Association for Unmanned Vehicle Systems International (AUVSI).
Science on board
There are a host of additional companies, individuals and universities that have contributed experimental technology and data-gathering equipment, Phaneuf said.
As a result, the Mayflower will be able to study sea levels, measure wave height and gather water samples for testing at regular intervals throughout its voyage.
The Mayflower will also do pollution sampling and document water chemistry. On board will be a holographic microscope for scanning water samples for microplastics — bits of plastic 5mm or less that are harmful to ocean life.
To determine water chemistry, one of the experiments will literally “taste” the water with a test originally devised to uncover counterfeit wine and whisky.
“You can dip the ‘tongue’ into the liquid and it will give you the exact chemical profile of the liquid that you’re looking at,” Lenny Bromberg, IBM’s programme director for automation, intelligence and decision management, told Al Jazeera.
The Mayflower will also be using hydrophones to listen for whales.
IBM, working with the Jupiter Research Foundation and Plymouth University, has created models of the various types of whales and other cetaceans found in the North Atlantic. With those models, said Phaneuf, “we’ll be able to determine species and number of animals” plus their location and general environment.
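Details of the project’s acoustic models have not been published here, but the basic idea of matching a hydrophone recording against per-species acoustic signatures can be sketched. In this illustrative Python example, the feature extractor, species list and template values are all invented for demonstration, not the project’s actual models.

```python
# Hypothetical sketch: classify a hydrophone clip by comparing its crude
# spectral "signature" against per-species templates. The templates and
# species values below are illustrative, not real acoustic data.

def spectral_signature(samples, bands=4):
    """Stand-in feature extractor: mean absolute amplitude in
    equal-length segments of the clip."""
    n = len(samples) // bands
    return [sum(abs(s) for s in samples[i * n:(i + 1) * n]) / n
            for i in range(bands)]

# Made-up per-species templates (illustrative numbers only)
TEMPLATES = {
    "humpback": [0.8, 0.6, 0.3, 0.1],
    "fin":      [0.2, 0.3, 0.7, 0.9],
    "minke":    [0.5, 0.5, 0.5, 0.5],
}

def classify(samples):
    """Return the species whose template is closest (squared Euclidean
    distance) to the clip's signature."""
    sig = spectral_signature(samples)
    return min(TEMPLATES, key=lambda sp: sum(
        (a - b) ** 2 for a, b in zip(sig, TEMPLATES[sp])))
```

A real system would use far richer features (spectrograms, trained classifiers) and would also estimate the number and location of animals, as Phaneuf describes.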
Experimentation at the wheel
Though the Mayflower won’t be taking any side trips on this voyage, the ship’s AI systems enable it to change course on its own if, for example, a science experiment finds something that merits further investigation.
Don Scott, who is one of the lead engineers on the Mayflower project, said the AI captain will have the ability to direct operations as needed, “which is a really key distinction that separates this from other types of platforms”.
“The science experiments aren’t just passengers on the Mayflower,” Scott, chief technology officer of Marine AI, told Al Jazeera.
To get to that point, IBM has been using its visual inspection technology and images of what the Mayflower might find out in the ocean to train the AI captain, Andy Stanford-Clark, IBM UK and Ireland’s chief technology officer, told the Xponential audience.
Using a digital simulation — a sort of twin of the ocean — researchers threw the images in front of the AI captain’s virtual cameras to teach it what to do in different situations.
“That’s really where our training ground [is] for giving us the confidence the AI captain will do the right thing when it comes across something that it hasn’t seen before,” Stanford-Clark said during the conference.
The Mayflower’s AI captain can now take in information from the ship’s cameras and radar, IBM’s weather service, coastal maps, and the telemetry broadcast by other ships through their automatic identification systems.
The AI captain puts these parameters, plus other factors such as the vessel’s battery power, the wind speed and direction “into a big optimiser,” Stanford-Clark explained.
The system then generates a response, he said, and given the constraints, decides “What’s the next best thing you can do? Where should you go, at what speed and in what direction?”
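The “big optimiser” idea Stanford-Clark describes can be sketched as scoring candidate manoeuvres against the current constraints and picking the best one. The function name, candidate set, battery threshold and scoring rule in this Python sketch are all simplifying assumptions, not the actual system.

```python
# Hypothetical sketch of constrained action selection: enumerate candidate
# (heading, speed) pairs, discard those that violate a constraint (here,
# a made-up low-battery rule), and score the rest by progress made toward
# the goal bearing. All numbers and rules are illustrative assumptions.
import math

def next_action(bearing_to_goal, battery_pct, max_speed=8.0):
    """Return the (heading_degrees, speed_knots) pair with the best
    projected progress toward the goal bearing."""
    best, best_score = None, -math.inf
    for heading in range(0, 360, 10):           # candidate headings
        for speed in (2.0, 4.0, 6.0, max_speed):  # candidate speeds
            if battery_pct < 20 and speed > 4.0:
                continue                         # conserve power when low
            # Progress = speed projected onto the goal bearing
            off = math.radians(heading - bearing_to_goal)
            score = speed * math.cos(off)
            if score > best_score:
                best, best_score = (heading, speed), score
    return best
```

A production optimiser would weigh many more factors (wind, waves, traffic, collision rules) in one objective, but the shape of the decision is the same: constraints in, best next heading and speed out.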
But an AI system’s decision-making process can be unclear, and marine travel is a regulated activity where understanding how a decision is made is essential — so the team took it one step further.
It added an IBM Operational Decision Manager (ODM) — a rules-based expert system with an extensive history in the financial industry that Stanford-Clark said was “very, very good at parsing rules”.
The team fed the ODM the rules from the Convention on the International Regulations for Preventing Collisions at Sea (COLREGS) so it would know the rules of the sea.
And because the ODM is a rules-based system, “it has full explainability”, Stanford-Clark said, and is able to provide “an audit trail of what it decided”.
“If there is something to be debugged or, heaven forbid, an accident, we can say, ‘Why did you make that decision?’ And it’ll explain exactly why it made that decision,” he said.
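The explainability Stanford-Clark describes comes from the fact that a rules engine records exactly which rule fired. A toy version of that audit trail can be sketched in Python; the rule names below are loose paraphrases of well-known COLREGS situations, and the structure is an illustrative assumption, not IBM’s ODM implementation.

```python
# Hypothetical sketch of a rules-based collision-avoidance decision with
# an audit trail. Rules are simplified paraphrases of COLREGS situations,
# not the convention's actual text or any real system's rule set.

RULES = [
    # (rule name, condition on the situation dict, recommended action)
    ("head-on (Rule 14)",
     lambda s: s["aspect"] == "head-on",
     "alter course to starboard"),
    ("crossing, give-way (Rule 15)",
     lambda s: s["aspect"] == "crossing" and s["other_to_starboard"],
     "give way: slow down or turn to pass astern"),
    ("overtaking (Rule 13)",
     lambda s: s["aspect"] == "overtaking",
     "keep clear of the vessel being overtaken"),
]

def decide(situation, audit_log):
    """Return an action and log which rule produced it, so the decision
    can be explained after the fact."""
    for name, condition, action in RULES:
        if condition(situation):
            audit_log.append({"rule": name,
                              "situation": dict(situation),
                              "action": action})
            return action
    audit_log.append({"rule": "none matched",
                      "situation": dict(situation),
                      "action": "stand on, monitor"})
    return "stand on, monitor"
```

Because every decision appends a record naming the rule that fired, asking “Why did you make that decision?” reduces to reading the log entry back.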
It’s that technology that intrigues Larry Mayer, a professor and the director of the Center for Coastal and Ocean Mapping at the University of New Hampshire and one of the programme leads for the Seabed 2030 project.
“This is the big and difficult stuff,” Mayer told Al Jazeera. “I think we’ll all be very, very interested to see how well that works, and I’m hoping very much that it does work well.”