1. Leveraging the multi-axis image-stabilization movement of a DSLR's in-body sensor, combined with GPS, to track the sky during a longish exposure and reduce star trails. Ricoh-Pentax Astrotracer. http://www.ricoh-imaging.co.jp/english/photo-life/astro/
2. Removing the camera's IR filter, allowing it to capture hydrogen-alpha light (656 nm). This captures light (and thus image detail and color) that normal camera sensors filter out. Canon EOS Ra. https://www.usa.canon.com/internet/portal/us/home/products/d...
I know this only because I've been researching a DSLR/mirrorless camera upgrade, but I keep delaying it whenever I'm reminded how excellent phone cameras have become, at least unless you're a pro, a pixel-peeper, an artist, or someone who simply enjoys the machinery and the process.
> The image has not been retouched or post-processed in any way.
Then the whole article describes how they've automated a full astroprocessing pipeline inside the phone, which amounts to heavy post-processing.
* Aggressive HDR, sharpening, automatic tint 'fixing', etc. Am I talking about filters, or about the automatic photo processing in flagship phones?
I do get the sentiment in that statement, in that if you take a photo with that phone, you can get the same output without doing much more than pressing the capture button and staying still. Very still.
What Google is doing with their Pixel phones is magic; they turned a normal camera into a very, very good one. I'm still impressed every time I use Night Sight. The drawback, as always, is that as soon as you zoom in or view the pictures on a large screen (or in print), it's painfully obvious that they're heavily post-processed. It almost looks like an abstract painting or an AI-generated image (which it almost is).
Even with this, getting those dark dust clouds and stark colors requires heavy post-processing (which I mostly don't do, because I consider it too much of an alteration of the original image, even though it makes for a more interesting one). Don't think for a second that those superb images you see everywhere aren't literally over-painted in Photoshop (look up online tutorials on how it's done if you don't believe me).
I guess to make things impressive, the Google folks went to some proper remote desert far from any artificial light. And unless I missed something, they still used a tripod. In the European Alps, this kind of result is practically impossible: there is always some tiny village in every valley, and even if not, light pollution seeps in from afar. One night panorama I have shows quite a strong glow from the village of Chamonix some 15 km away, on the other side of the massive Mont Blanc range [1]. Anything can be achieved if you start playing with Photoshop brushes, layers, etc., but for me that's one step too far.
Imagine what results can be had when such algorithms are paired to a full frame (or bigger) sensor!
[1] https://www.flickr.com/photos/99251154@N04/22790364795/in/al...
Tbh it's not that hard to get sharper results; this was shot on a 40-year-old camera with a 50+-year-old lens on a cheapo carbon-fiber tripod (in 25+ mph wind that probably rocked the camera quite a bit):
https://i.imgur.com/sdxyyEw.jpg
> And unless I missed something, they still used some tripod.
They did: "Clearly, this cannot work with a handheld camera; the phone would have to be placed on a tripod, a rock, or whatever else might be available to hold the camera steady."
Sensor size is inversely correlated with noise level: smaller sensors have more noise, with the least noise found on full-frame (i.e., 35mm) or larger sensors. Phones have small sensors.
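To put a rough number on that claim: at equal exposure and field of view, the light collected scales with sensor area, and photon shot noise means SNR scales with the square root of the light collected. A back-of-the-envelope sketch (the 6.17 x 4.55 mm phone-sensor size is a typical 1/2.3" figure, used here only as an illustrative baseline):

```python
import math

def snr_gain_vs_phone(sensor_w_mm, sensor_h_mm,
                      phone_w_mm=6.17, phone_h_mm=4.55):
    """Rough shot-noise SNR advantage of a larger sensor over a
    typical 1/2.3-inch phone sensor, at equal exposure and framing.

    Photon shot noise scales as sqrt(N), so SNR ~ sqrt(light collected),
    and light collected scales with sensor area.
    """
    area_ratio = (sensor_w_mm * sensor_h_mm) / (phone_w_mm * phone_h_mm)
    snr_ratio = math.sqrt(area_ratio)
    stops = math.log2(area_ratio)  # light-gathering advantage in stops
    return area_ratio, snr_ratio, stops

# Full frame (36 x 24 mm) vs the phone sensor:
area, snr, stops = snr_gain_vs_phone(36, 24)
print(f"area ratio ~{area:.0f}x, SNR ~{snr:.1f}x, ~{stops:.1f} stops")
```

That roughly 5-stop gap is what the Pixel's frame stacking has to claw back in software.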
Of course regular noise cancellation and other very lossy processing still kicks in after that (which may explain the blurry result). It would be interesting to look at the raw image produced by this.
I use open camera on my cheap Nokia 7 plus (which uses two cameras) and have been getting OK-ish results in Darktable. The dng file you get combines information from both sensors. One of them is black and white so these look really flat until you fix it in post processing. The raw photos have lots of noise (as you would expect) but noise filtering is pretty effective.
I imagine for this it would produce a dng with information from the different stills combined but none of the other post processing (except maybe hot pixel removal).
UPD: oops, the X-T1 is just 5 years old, actually. I guess I have to take my words back, sort of.
Still, the point is the same: as long as you're shooting for your Instagram account, your smartphone is alright. For anything bigger you still need a camera with a decent lens.
Per-pixel noise does indeed depend on pixel size, but image-scale noise is almost completely independent of it.
That's because while a large pixel has less noise, it appears larger in the output and so the noise is correspondingly more visible. A small pixel is noisier, but smaller, so the net noise is the same for the same sensor size.
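A quick simulation of that argument (the photon counts and sample sizes here are arbitrary, and Poisson shot noise is approximated as Gaussian): compare one large pixel against four small pixels covering the same area, summed as if viewed at the same output size.

```python
import random, statistics

random.seed(0)

# Simulate a uniform patch of sky with photon (shot) noise.
# Case A: large pixels, each collecting N photons on average.
# Case B: 2x2 smaller pixels covering the same area, N/4 each,
#         then summed ("viewed at the same output size").
N = 10_000

def poissonish(mean):
    # Gaussian approximation to Poisson shot noise (fine for large means)
    return random.gauss(mean, mean ** 0.5)

large  = [poissonish(N) for _ in range(5000)]
binned = [sum(poissonish(N / 4) for _ in range(4)) for _ in range(5000)]

# Relative (image-scale) noise: stdev / mean
rel_large  = statistics.stdev(large)  / statistics.mean(large)
rel_binned = statistics.stdev(binned) / statistics.mean(binned)
print(f"large pixels: {rel_large:.4f}")
print(f"binned small: {rel_binned:.4f}")  # ~the same
```

Both come out around 1%: the small pixels are individually noisier, but summing them averages the noise back down, so sensor area, not pixel size, dominates (at least for shot noise; per-pixel read noise complicates this slightly).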
On the other hand, some people who like Google like it because it still sometimes works on geeky, cool, fun stuff instead of being super product focused.
If you compare a paper star chart from the Game Boy era¹ to Stellarium and a Pixel, then it can only lead you to wonder where we will be in another 20 years.
1. http://pietrow.net/astrogb.html
Well, now the hardware is fine and the "updates" are starting to feel less and less valuable. They no longer bring faster, cleaner interfaces; they just bring new widgets and gizmos.
Now, I think the Pixel line is... OK. My wife and I have a Pixel 2 XL (used, eBay) and a Pixel 3 (spring sale for $400), and the 3a line is close to "everyday" pricing, especially when it goes on sale. But I'm starting to question whether it's worth sticking to Google's stock phones or if it's time to start cross-shopping competitors once again. For so many of us, being able to snap photos and have them look pretty good is a nice comfort after the ugly early years of camera phones, which required a lot of patience and persistence to use.
So in my opinion, there's value in putting resources into ensuring good photography, even if that's not the priority for every phone buyer. What are the best alternative phones with "good enough" cameras and "everyday" pricing?
Astrophotography always kind of rubs me the wrong way, though, because that's not how it looks. Even if you go out to somewhere that's really dark, like a large National Park, and wait for a clear night, it's never going to look to your eye the way it does on Instagram. Don't get me wrong, what you do see is absolutely magnificent; it's just not what's in those pictures.
Seeing the galaxy with your own eyes is one of the most majestic things you'll ever witness. It's something that has inspired spontaneous prayer throughout history. It doesn't really need a filter.
I don't want my camera to have a touchscreen/social media/wifi but it'd be cool if Adobe Camera Raw/some alternative could do this stuff!
Perhaps they're trying not to cannibalize their lower market segments or think that professionals would never use those things (on which they might be correct). But I can definitely see that computational photography beyond raw->JPEG conversion with a color profile could have its place in a DSLR.
Additionally, you are always going to move your pictures out of the camera for proper viewing, so it doesn't make a lot of sense to provide the best possible picture already in the camera.
I don't know if the software they bundle with the camera is any good at this computational photography (I've never checked to see if there are Linux versions), but it had better be if it indeed helps image quality.
This is more of a "Now anyone can capture decent astrophotography with just their phone!" than some revolutionary new thing, if that makes sense. Basically they have to push their phone's sensor as far as it'll go and use software to remove the noise, instead of using a better lens/sensor setup (which is space- and cost-prohibitive for a phone).
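The core software trick being described is frame stacking: averaging many short, aligned exposures, which cuts random noise by roughly the square root of the frame count. A minimal sketch with a made-up one-pixel sensor model (the signal and noise values are arbitrary illustrative units):

```python
import random, statistics

random.seed(42)
TRUE_SIGNAL = 100.0   # "brightness" of one sky pixel (arbitrary units)
NOISE       = 20.0    # per-frame random noise (hypothetical)

def capture_frame():
    # One short exposure: true signal plus random noise
    return TRUE_SIGNAL + random.gauss(0, NOISE)

def stack(n_frames):
    # Averaging n aligned frames cuts random noise by ~sqrt(n)
    return statistics.mean(capture_frame() for _ in range(n_frames))

single  = [capture_frame() for _ in range(2000)]
stacked = [stack(16)       for _ in range(2000)]
print(f"single-frame noise: {statistics.stdev(single):.1f}")
print(f"16-frame stack:     {statistics.stdev(stacked):.1f}")  # ~4x lower
```

The hard part on a phone is everything this sketch skips: aligning frames against star motion and hand shake before averaging.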
You're still welcome to do extensive post-processing on your computer (I don't want my camera doing any processing), and indeed that's what any astrophotographer will do if they want to.
That said I've captured some amazing night/moon photos with my A6000 that I have never been able to achieve before, and that's without any extra processing.
Most of them are historical companies and are probably very old-school and slow to adapt. Google probably has access to better software engineers than Nikon or Canon, which seem barely able to develop working Bluetooth/Wi-Fi sync.
iPhone 11 Pro here.
Here is why I ask:

* It has a gyro, so it knows whether we are pointing at the sky or not.

* AI checks whether it's a clear sky. If yes, post a fake image; if no, don't risk getting caught.

* Time + geolocation gives the angle and position of the camera relative to the sky above us.
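The "time + geolocation" part really is enough to know which patch of sky is overhead: local sidereal time plus the gyro/compass orientation fixes the camera's pointing on the celestial sphere. A rough sketch using the standard GMST approximation (it ignores the small UT1/TT distinction, which is fine for framing a shot; the Chamonix longitude is just an illustrative input):

```python
from datetime import datetime, timezone

def local_sidereal_time(when_utc, lon_east_deg):
    """Approximate local sidereal time in degrees.

    Sidereal time tells you how far the celestial sphere has rotated
    past your meridian; combined with device orientation, that pins
    down which stars the camera is pointed at.
    """
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    d = (when_utc - j2000).total_seconds() / 86400.0  # days since J2000
    gmst = 280.46061837 + 360.98564736629 * d         # Greenwich sidereal, deg
    return (gmst + lon_east_deg) % 360.0

# Example: near Chamonix (longitude ~6.87 E) at an arbitrary instant
lst = local_sidereal_time(
    datetime(2019, 11, 27, 22, 0, tzinfo=timezone.utc), 6.87)
print(f"LST ~ {lst:.1f} deg")
```

Planetarium apps like Stellarium do essentially this, just with far more precise models.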
And another solution is to simply store a bitmap of the sphere around the Earth.
In the end photography like this is art, though, so if the person taking the shot is happy with it, then it's fine, probably. Just don't enter it in a competition with rules against retouching...
https://www.msn.com/en-ph/news/technology/was-the-moon-landi...
* Compensating for moving stars
* "Live viewfinder" during exposure
* Selectively darkening the sky
* Dark current compensation (though that is probably needed for all long-exposure photography...still, not a simple "more exposure" feature)
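Dark-current compensation is usually done by subtracting a "master dark": the average of several frames taken with no light hitting the sensor, which captures each pixel's fixed offset (including hot pixels). A toy sketch with a hypothetical 64-pixel one-row sensor and made-up noise figures:

```python
import random, statistics

random.seed(1)

# Hypothetical sensor: each pixel has a fixed dark-current offset
# (hot pixels included) plus random read noise, on top of the signal.
W = 64
dark_current = [random.uniform(0, 5) for _ in range(W)]
dark_current[10] = 200.0   # one "hot" pixel

def expose(signal):
    # One long exposure: signal + per-pixel dark current + read noise
    return [signal + dc + random.gauss(0, 2) for dc in dark_current]

# Master dark: average of several exposures taken with the lens capped
master_dark = [statistics.mean(col)
               for col in zip(*(expose(0) for _ in range(16)))]

light = expose(100.0)                                  # the actual shot
calibrated = [p - d for p, d in zip(light, master_dark)]
print(f"hot pixel before: {light[10]:.0f}, after: {calibrated[10]:.0f}")
```

The random read noise survives (that's what frame stacking is for), but the fixed pattern, hot pixel included, cancels almost exactly.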
Aside: randomly recognised Ryan Geiss in the credits, he did the Milkdrop plugin for Winamp back in the day, and also some cool tech demos for Nvidia...