If you're interested in the process - or in exploring your specific approach - then why stop?
The additional interest might actually be helpful.
The time component is super interesting here though!
The paper I think you're referring to made the interesting leap that a 3D radiance field could be re-rendered as a field of Gaussian splats, and that this would probably run faster in modern GPU pipelines for real-time performance. It looks like they also have the nice property of being able to be shifted around in memory quickly, hence the animation property seen here.
I'm curious, would you classify particle effects drawn with quads as 4D gaussian splatting too?
I realize that a lot has happened since, but this is likely where it all started :)
Essentially they're doing what you do when you train a neural network, only that instead of adjusting weights connecting "neurons", you adjust the shape and position of gaussians, and the coefficients of spherical harmonics for the colors.
This requires the rendering step to be differentiable, so that you can back-propagate the error between the rendering and the ground-truth image.
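To make the "adjust Gaussians by back-propagation" idea concrete, here's a toy sketch (my own illustration, not the paper's method): fit a single 1D Gaussian "splat" to a target signal by gradient descent on its amplitude, position, and width. The real system optimizes millions of 3D Gaussians plus spherical-harmonic color coefficients through a custom differentiable rasterizer; here the gradients of the L2 loss are written out by hand, and all constants are made up.

```python
import math

# Toy "image": 201 samples of a Gaussian bump at mu=1.0, sigma=0.5, amplitude 0.8.
xs = [i * 0.03 - 3.0 for i in range(201)]
target = [0.8 * math.exp(-0.5 * ((x - 1.0) / 0.5) ** 2) for x in xs]
n = len(xs)

# Learnable parameters of our single splat: amplitude, position, width.
a, mu, sigma = 0.5, 0.0, 1.0
lr = 0.05  # arbitrary learning rate for this toy

def loss(a, mu, sigma):
    # Mean-squared error between the "render" and the ground truth.
    return sum((a * math.exp(-0.5 * ((x - mu) / sigma) ** 2) - t) ** 2
               for x, t in zip(xs, target)) / n

init_loss = loss(a, mu, sigma)

for _ in range(5000):
    da = dmu = dsigma = 0.0
    for x, t in zip(xs, target):
        g = math.exp(-0.5 * ((x - mu) / sigma) ** 2)
        err = a * g - t  # per-pixel rendering error
        # Hand-derived gradients of the mean-squared loss w.r.t. each parameter.
        da += 2 * err * g / n
        dmu += 2 * err * a * g * (x - mu) / sigma ** 2 / n
        dsigma += 2 * err * a * g * (x - mu) ** 2 / sigma ** 3 / n
    # Plain gradient descent step (the paper uses Adam).
    a, mu, sigma = a - lr * da, mu - lr * dmu, sigma - lr * dsigma

final_loss = loss(a, mu, sigma)
```

The splat drifts toward the bump and reshapes itself to match it, which is the whole trick: as long as the render is differentiable in the splat parameters, ordinary optimizers do the rest.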
The next key step is to adjust the number of Gaussians every N iterations: either fill in detail by cloning a Gaussian in an area which is under-covered, or split a Gaussian in an area which is over-covered.
They use the gradient of the view-space position to determine if more detail is needed, i.e. Gaussians which the optimizer wants to move significantly across the screen seem to be in a region with not enough detail.
They then use the covariance of the Gaussians to determine whether to split or to clone. Gaussians with large variance get split, the others get cloned.
They also remove gaussians which are almost entirely transparent, no point in keeping those around.
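The densify-and-prune logic above can be sketched as a few lines of pseudocode made runnable. The thresholds and field names here are invented for illustration; the actual implementation works on tensors of millions of Gaussians and samples split positions from the original Gaussian's distribution.

```python
# Hypothetical sketch of the densify-and-prune step. All thresholds are
# made-up placeholders, not values from the paper.
GRAD_THRESHOLD = 0.0002   # view-space position-gradient trigger
SCALE_THRESHOLD = 0.01    # "large" Gaussians get split, small ones cloned
MIN_OPACITY = 0.005       # nearly transparent Gaussians are pruned

def densify_and_prune(gaussians):
    out = []
    for g in gaussians:
        if g["opacity"] < MIN_OPACITY:
            continue                          # prune: almost invisible
        if g["pos_grad"] > GRAD_THRESHOLD:    # optimizer "wants" to move it
            if g["scale"] > SCALE_THRESHOLD:
                # Over-covered region: split into two smaller Gaussians,
                # dropping the original.
                out.append({**g, "scale": g["scale"] / 1.6})
                out.append({**g, "scale": g["scale"] / 1.6})
                continue
            # Under-covered region: clone, keeping the original too.
            out.append(dict(g))
        out.append(g)
    return out
```

So each maintenance pass both grows the representation where the screen-space gradients say detail is missing and shrinks it where Gaussians have faded out.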
That's my understanding at least, after a first time gloss-through.
> Essentially they're doing what you do when you train a neural network, only that instead of adjusting weights connecting "neurons", you adjust the shape and position of gaussians, and the coefficients of spherical harmonics for the colors.
My brain:
> They're providing inverse reactive current to generate unilateral phase detractors, automatically synchronizing cardinal gram meters.
Also has anyone been working on solving the "blurry" look these splats have up close?
This seems to be a rendering efficiency innovation, not particular to scanning.
That means it applies to artificially generated environments, whether photo realistic or stylized, and whether based on a real environment or a completely fictional one.
But of course, any photorealistic rendering of a real place, faithful down to the smallest detail, is going to involve a lot of scanning. That is true for any kind of rendering.
That said, games don't have to be super realistic to be fun. E.g. I could imagine a game based on GS at "Minecraft resolution".
His editing is hilarious too.
https://lumalabs.ai/capture/ed9d985b-9cc1-49e0-a39c-88afa203...
https://lumalabs.ai/capture/83e9aae8-7023-448e-83a6-53ccb377...
https://lumalabs.ai/capture/7f8df9c9-c548-4a47-9892-e945637c...
https://lumalabs.ai/capture/076fcfdc-ea80-4fdc-8159-c9fed831...
The key drawback that isn't highlighted is that you need a physical space that is a close approximation of what you want to render. So if you want to make a few Counter-Strike maps based on your workplace (not recommended), then this would be a good technology, but if you want to make an open world on an alien planet you're likely better off with traditional rendering.
Although actually, and on a slightly more innocent (but just as edgy!) note, the thing that immediately popped into my head upon reading "4D Gaussian Splatting" was the music from the 1992 Future Crew demo Unreal, and the image of its inter-scene title screens. ["IYKYK", but basically, that famous old PC demo consists of several short sections, each showcasing a particular coding/graphical technique - each section prefaced by a title screen which named the effect being showcased.]
YT of Unreal demo, as citation for this highly-important observation : https://www.youtube.com/watch?v=InrGJ7C9B3s