Not so advanced as very correctly pointed out by https://news.ycombinator.com/item?id=27554949
The motor-attitude control loop cannot, and should not, be "fused" with the navigation layer, whether that's an INS or an optical flow sensor.
The bog-standard deterministic controls algorithm here is "bog standard superior." Computer vision/optical flow sensors have much lower precision than gyroscopes and accelerometers, let alone aerospace-grade ones.
Not my monkey; not my circus. But… is critique of terms "gatekeeping"? I understand this term in the context of ownership (i.e., gatekeepers for books are publishers/printers who own the means of production.)
I greatly appreciate some level of debate of terms here on HN.
The first time I got my hands on a working mounted arm, I was cautioned again and again to run any new program in low-speed mode first, because the arm had no limiting logic and would cheerfully power-bomb its own base with all the force and torque its motors could muster if I told it to.
Preventing that is still considered an "advanced technique."
Dropped frames are one of the easiest things to handle. Yes, the visual feature tracking depends on frames of video but even cheap phone IMUs these days are good enough to dead-reckon for a second or two, especially when embedded into a sensor fusion framework, so the prediction errors resulting from a single lost frame should be very minimal and not enough to throw off the tracking.
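A minimal sketch of that dead-reckoning fallback (function names, the 500 Hz rate, and the assumption that accelerations are already world-frame and gravity-compensated are all mine, not Ingenuity's):

```python
import numpy as np

def dead_reckon(pos, vel, accel_samples, dt):
    """Propagate position/velocity forward using raw IMU accelerations.

    pos, vel: current 3-vector estimates (m, m/s)
    accel_samples: accelerations already rotated into the world frame
                   and gravity-compensated (m/s^2)
    dt: IMU sample period (s)
    """
    pos = np.asarray(pos, dtype=float)
    vel = np.asarray(vel, dtype=float)
    for a in accel_samples:
        vel = vel + np.asarray(a, dtype=float) * dt  # integrate acceleration
        pos = pos + vel * dt                         # integrate velocity
    return pos, vel

# One second of hover (zero net acceleration) at 500 Hz, drifting at 0.5 m/s:
pos, vel = dead_reckon([0, 0, 10.0], [0.5, 0, 0], [[0, 0, 0]] * 500, 1 / 500)
```

The double integration is exactly why this only works for a second or two: any accelerometer bias grows quadratically in position.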
That's why I find it hard to believe that the visual-inertial navigation (VIN) system in use by the Mars Helicopter (part of a multi-billion-dollar program) wouldn't be able to deal with a dropped frame. It just doesn't add up. I suspect that the situation is much more complex than what the article suggests and that more things went wrong than just a dropped camera frame.
iPhone was not a first.
"Agent V" (2006), a mobile AR game preinstalled on the Nokia 3230, already had such technology.[0]
2) Was this something you even really needed to "Well, ackshully..." anyway?
My interpretation is that the copter, as an inessential system component, was seen as an opportunity for junior people to get some end-to-end experience. I hope they are learning the right things.
So maybe the issue is not as simple as a news article portrays it
or are you relying on a third party article's interpretation and then extrapolating that even further?
> This paper provides an overview of the Mars Helicopter navigation system, architecture, sensors, vision processing and state estimation algorithms.
The first failure, which delayed the initial spin test, was described as a "watchdog timeout", which for anyone not familiar with embedded development basically means the code crashed. We all write code that crashes, but I am having trouble thinking of an excuse to justify the fact that their code crashed before takeoff, on Mars, and they didn't see it coming. There is nothing about sitting on the ground on Mars that shouldn't have been tested repeatedly on Earth, and testing in production is _really_ not the right way to do aerospace development (although Boeing Starliner would beg to differ).
Similarly, there are a huge number of things that can and will result in dropped frames when running Linux on a Qualcomm mobile chip, and having a software stack that infers frame timing purely from the sequence number is brittle, and would definitely not have passed code review and testing where I work (I actually checked, we do have a robust solution). If I had to guess, I suspect the root cause of the dropped frame wasn't actually anything exciting like a cosmic ray, but instead was some run-of-the-mill event that would have been caught by a couple hours of flight testing on Earth. Either way, it shouldn't have made it to Mars.
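For illustration, one more robust alternative is to infer gaps from the sensor's own hardware timestamps instead of sequence numbers. This hypothetical helper sketches the idea (names and the tolerance are my assumptions, not anyone's actual stack):

```python
def frames_dropped(prev_ts_ns, ts_ns, nominal_period_ns, tolerance=0.5):
    """Estimate how many frames were lost between two hardware timestamps.

    Relies on the sensor's timestamp, not the sequence number, so a gap in
    sequence numbers can't silently skew every later frame's capture time.
    """
    gap = ts_ns - prev_ts_ns
    # Round the gap to a whole number of nominal frame periods.
    n_periods = round(gap / nominal_period_ns)
    if abs(gap - n_periods * nominal_period_ns) > tolerance * nominal_period_ns:
        raise ValueError("timestamp jitter exceeds tolerance")
    return max(n_periods - 1, 0)

# 30 fps stream (~33.3 ms period); one lost frame shows up as a 2-period gap:
dropped = frames_dropped(0, 66_666_667, 33_333_333)
```

The key point is that downstream code learns both that a frame was lost and what the correct timestamps of the surviving frames are.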
I'm sure that there are a lot of great engineers working on the Ingenuity project that _don't_ write these sorts of bugs, and am glad that these amateur fuckups (barely) haven't crashed the drone before it has been able to do some incredible technology demonstration work.
In my opinion, assuming they were "testing in production" (production being Mars!), or writing code that would "definitely not have passed code review", or that they did not do a couple of hours of flight testing on Earth, is an unnecessarily unkind assessment of this project.
[1] Helicopter Models and Test Facilities: https://rotorcraft.arc.nasa.gov/Publications/files/Balaram_A...
I am very skeptical of the claim that this code was not tested before launch.
As an amateur RC pilot familiar with some of the excellent RC flight control systems, it would have been a huge missed opportunity if JPL didn't invite some experienced engineers from the commercial and consumer drone community to provide input (QA folks too!). It's hard to imagine they wouldn't have gotten ample volunteers to spend a few days helping out.
I understand JPL is already designing a larger and more capable iteration. It would be cool if experienced drone flight control devs such as yourself dropped the team a note.
Presumably they use some kind of Kalman filter, but those are easy to program to account for missing frames, or frames arriving at irregular timepoints, perhaps even for screwy camera images if the programmer had a reasonable prior for the likelihood of it happening. Kalman filters by design account for measurement error.
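A toy illustration of both points (this is a generic 1D constant-velocity filter, not the helicopter's actual estimator): the predict step uses the true elapsed time, and a dropped frame simply skips the update.

```python
import numpy as np

def kf_step(x, P, dt, z=None, q=1e-2, r=1e-1):
    """One predict(+update) step of a 1D constant-velocity Kalman filter.

    x: state [position, velocity]; P: 2x2 covariance.
    dt: actual elapsed time since the last step (handles irregular frames).
    z: position measurement, or None for a dropped frame (predict only).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    x = F @ x                      # propagate state by the real dt
    P = F @ P @ F.T + Q            # uncertainty grows with time
    if z is not None:              # a missing frame just skips the update
        H = np.array([[1.0, 0.0]])
        S = H @ P @ H.T + r
        K = P @ H.T / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 1.0]), np.eye(2)
x, P = kf_step(x, P, 0.033, z=0.03)   # normal frame
x, P = kf_step(x, P, 0.066, z=None)   # dropped frame: coast on the model
```

Note how the dropped frame only inflates the covariance; the next good measurement then gets extra weight automatically.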
I didn’t read into it too much so I may not have all the details right, but I think this is the gist of it.
That makes me curious about how the timestamp error occurred: software or hardware? Camera or navigation code? I assume they have very high standards, so what was the process failure point?
Usually with a Kalman filter, you’re taking into account the spatial measurement error (gyro-measured roll rate error, accelerometer-measured acceleration error, etc) but I don’t think I’ve ever encountered a system that explicitly modelled sensor latency variation relative to timestamps. Based on the description of the problem they encountered here, I suspect what happened is that it lost a frame but didn’t adjust the “photo timestamps” appropriately; every frame that came along afterwards would have had an incorrect timestamp? Even if the Kalman filter was set up to handle “this photo was taken 20ms ago” when doing its forward integration, if they didn’t model “this photo was taken 50ms ago but is reporting that it was taken 20ms ago” then you’d pretty readily get the kinds of oscillation they were getting.
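A back-of-the-envelope sketch of that failure mode (numbers and names invented for illustration): if the filter propagates the state back to the claimed capture time rather than the actual one, a moving vehicle sees a phantom position error equal to velocity times the timestamp error.

```python
def innovation(pos_est, vel_est, t_now, z, t_meas):
    """Compare a measurement against the state propagated back to the
    time the measurement was (reportedly) taken."""
    pos_at_meas = pos_est - vel_est * (t_now - t_meas)
    return z - pos_at_meas

# Drone at 2 m/s; the frame was really taken 50 ms ago but stamped 20 ms ago.
# With the true timestamp the innovation is zero; with the bad stamp, the
# filter sees a phantom 60 mm error and "corrects" a state that was fine.
true_err = innovation(1.0, 2.0, t_now=0.100, z=0.90, t_meas=0.050)
stamped_err = innovation(1.0, 2.0, t_now=0.100, z=0.90, t_meas=0.080)
```

Feed a stream of such corrupted innovations into a tight attitude/position loop and an oscillation like the one described is a plausible outcome.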
Edit: yeah, just like the sibling comment said :)
This bathtub-style curve for perception of NASA design by HN commenters makes me question whether the perception correlates with reality.
The GPS constellation has 32 satellites. A GPS satellite weighs ~2000kg. Starship should get 100-150 tons to Mars - the math checks out! (Disclosure: I got all my rocket science from sending Kerbals to their doom.)
So, MPS for Mars, LPS for Luna? Just imagine the amount and quality of rover footage we could get if each of them had (constant!) 1Gbps uplink to Earth...
atomic clocks
That would be really cool. I wonder if it would be faster and more precise, because there would be negligible human-produced radio noise to interfere and the Martian ionosphere is much thinner, or worse, because the thinner atmosphere and weaker magnetic field don't protect the receivers from solar noise.
I wonder if a rover could drop a few beacons for time-of-flight 2D triangulation, it's not like they're moving hundreds of miles away over the horizon.
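A toy version of that idea (generic least-squares trilateration, nothing Mars-specific): with three beacons and time-of-flight ranges, the rover's 2D position falls out of a small linear system.

```python
import numpy as np

def trilaterate_2d(beacons, ranges):
    """Least-squares 2D position from >=3 beacon positions and ToF ranges.

    Subtracting the first range equation from the others cancels the
    quadratic terms, leaving a linear system A @ pos = d.
    """
    b = np.asarray(beacons, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2 * (b[1:] - b[0])
    d = (r[0]**2 - r[1:]**2) + np.sum(b[1:]**2, axis=1) - np.sum(b[0]**2)
    pos, *_ = np.linalg.lstsq(A, d, rcond=None)
    return pos

# Rover 300 m east and 400 m north of beacon 0:
p = trilaterate_2d([(0, 0), (1000, 0), (0, 1000)], [500.0, 806.2258, 670.8204])
```

As the sibling comment notes, all-coplanar beacons give a poorly conditioned vertical solution, which is why this sketch stays in 2D.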
I imagine it’s possible to set up a ground-based network, but you would need a high density to cover large surfaces (you want to see at least four stations from every position). I also imagine that it would be difficult to get accurate vertical positions if the stations are all in the same horizontal plane.
[1]: https://en.wikipedia.org/wiki/Real-time_kinematic_positionin...
I understand the IMU is not an ideal input and integration over time leads to positional errors. But gyros are much better, and the orientation of the drone in flight is paramount. What I wonder is what 'advanced' control law allowed the drone to become unstable w.r.t. orientation when there was noisy positional input.
It's quite hard to detect when positions and velocities are "obviously incorrect", especially when they come from VIO, where the optimization result can jump around in non-ideal conditions, so I'm not surprised there wasn't more graceful anomaly detection.
When it's on land, they can make the gyro reliably point 'down'. Then at least during flight they know which way 'down' is.
Would this be too fragile for Mars?
MEMS gyros measure angular rate, not absolute angle; to compute an actual angle, you take the integral of the rate from t=0 to now. Any small errors in the measurements add up quickly and give you completely nonsensical results. For drones on Earth, we use a variation of the Kalman filter to combine short-term and long-term measurements. As an example, an accelerometer requires a double integral to turn into position, so errors accumulate very quickly, but we can correct those errors using GPS. The accelerometer and its integrals give us very fast acceleration, velocity, and position updates (at, say, 500 Hz), and then the GPS is used to correct the long-term position and velocity (at, say, 5 Hz).
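As a toy illustration of the same short-term/long-term blend, here is a single-axis complementary filter, a simpler cousin of the Kalman variants used in practice (all numbers invented): the gyro integral dominates moment to moment, while the noisy but drift-free accelerometer angle slowly pulls the estimate back.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro's short-term integral with the accelerometer's
    long-term (but noisy) absolute angle to bound gyro drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# A biased gyro reporting a constant 0.5 deg/s while the drone sits still:
# the raw integral would drift 5 degrees in 10 s, but the filter stays
# pinned near the accelerometer's 0-degree reference.
angle = 0.0
for _ in range(1000):                 # 10 s at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=0.0, dt=0.01)
```

The steady-state error here settles around alpha/(1-alpha) times the per-step bias, i.e. bounded, instead of growing without limit.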
I am sure the one on the Mars helicopter had more precision though :)
There are certainly people involved in the project who could have explained this to them. I hope they are learning fast.
This is how nearly all modern high-end quadcopter drones fly - navigation using optical flow camera sensors short-circuited directly into the attitude/motor control loops.
I guess there's a non-trivial chance they entrusted the helicopter's autonomous operations programming to people with a quadcopter background.
I would not be at all surprised to learn that commercial quads share the design mistake. I was surprised to learn that NASA professionals copied it into a Mars probe. But, notably, not into the vehicle that delivered the lander.
It turns out sideways drift accumulates quickly - so quickly that unless you are a very practiced drone operator, it's very hard to compensate for by hand.
GPS compensates somewhat, but obviously that isn't available on Mars (or indoors).