Samsung’s Moon shot explained: Scene Optimizer, Super Resolution, and AI magic
For the past few days, Samsung has been under fire over claims that it fakes the Moon images on the Galaxy S23 Ultra. It all started when Redditor u/ibreakphotos displayed a blurry image of the moon on a monitor and photographed it with the Galaxy S23 Ultra, which then produced a detailed moon shot. Outlets picked up the story, and Samsung felt it needed to explain its process to the world.
Samsung released a detailed, technical explanation of how its moon shot works. The article has actually been online for a while, but only in Korean; the latest controversy brought us the English version. Samsung combines Scene Optimizer, AI deep learning, and Super Resolution. The moon shot mode kicks in when Scene Optimizer is enabled and you zoom past 25x: the AI deep learning engine, preloaded with a variety of moon shapes and details, recognizes the moon, and Super Resolution multi-frame processing is then applied to enhance it.
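Samsung has not published code, but the pipeline it describes (recognize the moon at high zoom, merge several frames to cut noise, then sharpen) can be sketched in a toy form. Everything below is illustrative and assumed: the function names, the brightness-based "detector" standing in for the deep learning model, and the unsharp-mask sharpening are not Samsung's actual implementation.

```python
import numpy as np


def box_blur(img, k=3):
    """Naive k x k box blur, used here by the unsharp-mask step."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    h, w = img.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)


def looks_like_moon(frame, zoom, min_zoom=25, brightness_thresh=0.6):
    """Crude stand-in for the AI recognition step: only fires at high
    zoom, and only when a bright blob sits on a mostly dark sky.
    (Samsung's real detector is a trained deep learning model.)"""
    if zoom < min_zoom:
        return False
    bright_fraction = (frame > brightness_thresh).mean()
    return 0.01 < bright_fraction < 0.5


def multi_frame_enhance(frames, sharpen_amount=0.8):
    """Toy multi-frame processing: averaging aligned frames suppresses
    random sensor noise, then an unsharp mask boosts edge contrast."""
    merged = np.mean(frames, axis=0)
    detail = merged - box_blur(merged, k=3)
    return np.clip(merged + sharpen_amount * detail, 0.0, 1.0)


def scene_optimizer(frames, zoom):
    """If the detector fires, run the enhancement; otherwise return
    the first frame untouched."""
    if looks_like_moon(frames[0], zoom):
        return multi_frame_enhance(frames)
    return frames[0]
```

The key property this sketch demonstrates is the multi-frame part: averaging N noisy exposures of the same scene reduces random noise by roughly a factor of sqrt(N), which is why the enhanced result can look cleaner than anything a single frame from a small phone sensor could deliver.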
This is nothing new. Samsung has been doing the same since the Galaxy S20 Ultra introduced ‘100x Space Zoom’, and it is certainly not the only manufacturer to apply this kind of treatment.
So Samsung’s moon photos aren’t technically fake; they’re enhanced using artificial intelligence. In reality, the final image comes not so much from your phone’s lens and sensor as from its processor. But what did you really expect? A decent picture of the moon normally takes a huge lens, a tripod, and an expensive dedicated camera.
Anyway, we’ll probably be having this conversation again in another two or three years, once people have forgotten about it and someone brings it up once more.