A user of the American Reddit forum, posting under the nickname ibreakphotos, has published a revealing investigation.
With a simple illustrative experiment, they examined the Space Zoom technology implemented on high-end Samsung phones and used to capture the moon. The test shows that the feature does not rely primarily on the capabilities of the smartphone's optics and sensors, but instead on artificial intelligence algorithms.
The experiment itself is quite simple. The user downloaded a high-quality image of the moon from the Internet, downscaled it to 170 x 170 pixels, and blurred it, literally "killing" all the detail. He then enlarged the degraded image four times, turned off the lights in the room to simulate night, displayed the image on his monitor, and photographed this artificial moon twice: once with normal shooting and once with Space Zoom enabled. Finally, he compared the two photos with each other.
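The image-preparation steps described above can be sketched in Python with the Pillow library. This is a minimal illustration, not the user's actual script: the 170 x 170 size and the 4x enlargement come from the post, while the blur radius and the synthetic placeholder image are assumptions (the real test used a moon photo downloaded from the Internet).

```python
from PIL import Image, ImageFilter

# Synthetic grayscale stand-in for a high-resolution moon photo
# (assumption: the actual test used a real image from the Internet).
moon = Image.new("L", (2048, 2048), color=40)

# Downscale to 170 x 170 pixels, destroying fine surface detail.
small = moon.resize((170, 170), Image.LANCZOS)

# Apply a Gaussian blur to "kill" any remaining detail
# (the radius is an assumption; the post does not state a value).
blurred = small.filter(ImageFilter.GaussianBlur(radius=3))

# Enlarge the degraded image four times (680 x 680) so it can be
# displayed on a monitor in a darkened room and photographed.
display = blurred.resize((170 * 4, 170 * 4), Image.NEAREST)
print(display.size)
```

Because the source image contains no fine detail by construction, any craters appearing in the resulting phone photo cannot have come from the scene itself.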
ibreakphotos believes that when photographing the moon, the phone processes the image with an artificial intelligence algorithm. Simply put, Samsung trained a neural network on hundreds (if not thousands) of moon images, and the output of that AI is now blended into every moon photo the phone takes. As a result, details that appear in the photo, such as craters the phone's optics could not possibly resolve, are in fact AI-generated.
Naturally, the test results sparked a flurry of criticism against Samsung. The company may now have to tone down the hype around both Space Zoom and its moon photography. It is worth remembering that this was one of the standout camera features of the Galaxy S23, promoted even in the pre-release trailers.