ESA “had never tested a chip of this complexity for radiation,” says Furano. “We were doubtful we could test it properly … we had to write the handbook on how to perform a comprehensive test and characterization for this chip from scratch.”
The first test, 36 straight hours of radiation-beam blasting at CERN in late 2018, “was a very high-pressure situation,” Dunne says. But that test and two follow-ups “luckily turned out well for us.” The Myriad 2 passed in off-the-shelf form, no modifications needed.
This low-power, high-performance computer vision chip was ready to venture beyond Earth’s atmosphere. But then came another challenge.
Typically, AI algorithms are built, or “trained,” using large quantities of data to “learn” — in this case, what is and isn’t a cloud. But because the camera was so new, “we didn’t have any data,” says Furano. “We had to train our application on synthetic data extracted from existing missions.”
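The idea of training on synthetic data when no real sensor data exists can be illustrated with a toy sketch. Everything below is hypothetical — the patch generator, the features, and the tiny logistic-regression detector are stand-ins for illustration, not PhiSat-1’s actual network or data pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_patch(cloudy: bool, size: int = 8) -> np.ndarray:
    """Hypothetical stand-in for patches synthesized from prior missions:
    clouds rendered as bright, low-contrast regions."""
    if cloudy:
        return rng.normal(0.8, 0.05, (size, size)).clip(0, 1)
    return rng.normal(0.3, 0.2, (size, size)).clip(0, 1)

def features(patch: np.ndarray) -> np.ndarray:
    # Mean brightness, local contrast, and a bias term as toy features.
    return np.array([patch.mean(), patch.std(), 1.0])

# Build a labelled training set entirely from synthetic imagery.
X = np.array([features(synthetic_patch(c)) for c in [True, False] * 200])
y = np.array([1.0, 0.0] * 200)

# Fit a tiny logistic-regression cloud detector by gradient descent.
w = np.zeros(3)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / len(y)

def is_cloudy(patch: np.ndarray) -> bool:
    """Classify a patch using the synthetically trained weights."""
    return float(features(patch) @ w) > 0.0
```

The same principle scales up: a deep network deployed on the Myriad 2 would be trained on simulated or adapted imagery before the first real frames ever arrive.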
All this system and software integration and testing, with the involvement of a half-dozen different organizations across Europe, took four months to complete. “We were very proud to be able to be so quick and so efficiently flexible, to put everything on board in such a short time,” says Max Pastena, PhiSat officer at ESA. As far as spacecraft development goes, the timeline “is a miracle,” adds Furano.
“Intel has given us background support on the Myriad device when we’ve needed it, to enable PhiSat-1’s AI using our CVAI Technology,” says Dunne. “That’s very much appreciated.”
Unfortunately, a series of unrelated events — delays with the rocket, the coronavirus pandemic and unfriendly summer winds — meant the teams had to wait more than a year to find out if PhiSat-1 would function in orbit as planned.
The Sept. 2 launch from French Guiana — a first-of-its-kind satellite ride-share run by Arianespace — went quickly and flawlessly. For the initial verification, the satellite saved all images and recorded its AI cloud detection decision for each, so the team on the ground could verify that its implanted brain was behaving as expected.
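That commissioning step — keep everything, but log what the on-board AI would have done — can be sketched as follows. The function names, record format, and detector interface are illustrative assumptions, not the mission’s actual telemetry format:

```python
import json

def commissioning_downlink(frames, detector):
    """Verification mode (hypothetical sketch): every frame is downlinked,
    and the on-board decision is recorded alongside it so ground teams can
    audit the model against their own labels.

    `frames` is a list of (frame_id, cloud_fraction) pairs; `detector`
    returns True when a frame should be discarded as cloudy.
    """
    records = []
    for frame_id, cloud_fraction in frames:
        decision = "discard" if detector(cloud_fraction) else "keep"
        records.append({"frame": frame_id, "onboard_decision": decision})
    return json.dumps(records)

# Example: a trivial threshold detector standing in for the neural network.
log = commissioning_downlink(
    [("f001", 0.9), ("f002", 0.1)],
    detector=lambda cf: cf > 0.7,
)
```

Once the logged decisions match the ground team’s own labels, the satellite can be switched to operational mode and start discarding frames autonomously.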
After a three-week deep breath, Pastena was able to proclaim: “We have just entered the history of space.”
ESA announced the joint team was “happy to reveal the first-ever hardware-accelerated AI inference of Earth observation images on an in-orbit satellite.”
By only sending useful pixels, the satellite will now “improve bandwidth utilisation and significantly reduce aggregated downlink costs” — not to mention saving scientists’ time on the ground.
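The bandwidth arithmetic behind that claim is simple to sketch. In this hypothetical example, frames above an illustrative cloud-fraction threshold are dropped before downlink, and the savings fall directly out of the ratio:

```python
def select_for_downlink(frames, threshold=0.7):
    """Keep only frames whose estimated cloud fraction is below `threshold`.
    `frames` is a list of (frame_id, cloud_fraction) pairs; the 0.7 cutoff
    is illustrative, not PhiSat-1's actual setting."""
    return [fid for fid, cf in frames if cf < threshold]

def downlink_savings(frames, threshold=0.7):
    """Fraction of downlink volume saved by discarding cloudy frames,
    assuming all frames are the same size."""
    kept = select_for_downlink(frames, threshold)
    return 1.0 - len(kept) / len(frames)

# Example: half the frames are mostly cloud, so half the bandwidth is saved.
frames = [("a", 0.9), ("b", 0.1), ("c", 0.95), ("d", 0.3)]
```

Since roughly two-thirds of Earth is cloud-covered at any moment, discarding cloudy frames on board translates into substantial savings at this scale.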
Looking forward, the uses for low-cost, AI-enhanced teensy satellites are innumerable — particularly when you add the ability to run multiple applications.
“Rather than having dedicated hardware in a satellite that does one thing, it’s possible to switch networks in and out,” says Jonathan Byrne, head of the Intel Movidius technology office. Dunne calls this “satellite-as-a-service.”
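What “switching networks in and out” might look like in software can be sketched with a minimal app registry. The class and its interface are hypothetical — a toy model of the satellite-as-a-service idea, not Ubotica’s or Intel’s actual deployment API:

```python
class AppRegistry:
    """Toy sketch of 'satellite-as-a-service': named inference apps share
    one piece of hardware and can be installed, swapped, and run on demand."""

    def __init__(self):
        self._apps = {}

    def install(self, name, app):
        """Register a callable (standing in for a neural network)."""
        self._apps[name] = app

    def uninstall(self, name):
        """Swap an app out to free the hardware for another workload."""
        self._apps.pop(name)

    def run(self, name, frame):
        """Run one installed app on a frame."""
        if name not in self._apps:
            raise KeyError(f"no app installed under {name!r}")
        return self._apps[name](frame)

# Example: swap a cloud detector out for a ship detector on the same hardware.
registry = AppRegistry()
registry.install("cloud", lambda frame: frame > 0.7)
registry.install("ship", lambda frame: frame < 0.2)
```

The design choice mirrors Byrne’s point: the hardware stays fixed while the workloads change, so one satellite can serve many customers over its lifetime.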
Consider: When flying over areas prone to wildfire, a satellite can spot fires and notify local responders in minutes rather than hours. Over oceans, which are typically ignored, a satellite can spot rogue ships or environmental accidents. Over forests and farms, a satellite can track soil moisture and the growth of crops. Over ice, it can track thickness and melt ponds to help monitor climate change.
Many of these possibilities will soon be tested: ESA and Ubotica are working together on PhiSat-2, which will carry another Myriad 2 into orbit. PhiSat-2 will be “capable of running AI apps that can be developed, easily installed, validated and operated on the spacecraft during their flight using a simple user interface.”
For Intel, the potential impact is unquestionable. As Pastena puts it, we can eventually understand “the pulse of our planet.”