Meta Segment Anything Model 3 - https://news.ycombinator.com/item?id=45982073 - Nov 2025 (133 comments)
p.s. This was lobbed onto the frontpage by the second-chance pool (https://news.ycombinator.com/item?id=26998308) and I need to make sure we don't end up with duplicate threads that way.
So the human results should have a clean mesh. But that’s separate from whatever pipeline they use for non-human objects.
I would recommend bounding boxes.
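For anyone who wants to try box prompts: SAM-style predictors typically take boxes in xyxy pixel coordinates, and a common chore is deriving that box from an existing binary mask. A minimal, dependency-light sketch (plain NumPy, no SAM install needed; `mask_to_box` is just an illustrative helper name, not part of any Meta API):

```python
import numpy as np

def mask_to_box(mask):
    """Convert a binary segmentation mask (H, W) to an xyxy bounding box.

    Returns [x_min, y_min, x_max, y_max] in pixel coordinates,
    or None if the mask is empty.
    """
    ys, xs = np.nonzero(mask)          # row/col indices of foreground pixels
    if ys.size == 0:
        return None                    # empty mask, no box to prompt with
    return [int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())]

# toy mask: a filled rectangle spanning rows 2..5 and cols 3..7
mask = np.zeros((10, 10), dtype=bool)
mask[2:6, 3:8] = True
print(mask_to_box(mask))  # [3, 2, 7, 5]
```

The resulting list can then be passed wherever a box prompt is expected (e.g. the `box` argument in the original segment-anything predictor API).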
Check out https://github.com/MiscellaneousStuff/meta-sam-demo
It's a rip of the previous SAM playground. I use it for a bunch of things.
Sam 3 is incredible. I'm surprised it's not getting more attention.
Remember, it's not the idea, it's the marketing!
Segment Anything, however, was able to segment all five dog legs when prompted to, which suggests Meta is doing something else under the hood here, and that may lend itself to a very powerful future LLM.
Right now, some of the biggest complaints people have about LLMs stem from their incompetence at processing visual data. Maybe Meta is onto something here.