Taken with my iPhone 16 Pro, with some edits in Lightroom. For the moon shots, I used a Bresser Pirsch 25–75×100 spotting scope.

https://www.reddit.com/gallery/1pt2e8u

4 Comments

  1. The last two generations of iPhones take really good night photos, don’t they?

    Pulled from a quick AI chat:

    Samsung’s Galaxy S23 Ultra and later models use AI to enhance moon photos by adding surface details not present in the raw sensor data, effectively reconstructing the image based on learned patterns from thousands of lunar images. This process, while not a literal image overlay, involves synthetic texture generation that can make the moon appear more detailed than what the camera’s optics could capture alone. Samsung has stated that no image overlaying is applied, but the AI-based scene optimization enhances details and colors after recognizing the moon as the main subject.

    In contrast, Apple’s iPhone does not use such object-specific AI overlays for the moon or other celestial bodies. Instead, iPhone photography relies on longer exposures and computational photography to capture more light, which can reveal more stars and detail than the human eye can see, but without inserting synthetic features.

    While both brands use AI processing, Samsung’s approach for the moon is more aggressive in reconstructing detail, whereas Apple prioritizes naturalism and fidelity to the actual scene.
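    For anyone curious what “computational photography to capture more light” boils down to, here’s a minimal sketch of multi-frame stacking (a generic illustration only, not Apple’s or Samsung’s actual pipeline): averaging several short, aligned exposures cuts random noise roughly by the square root of the frame count, which is how night modes pull usable signal out of a dark scene without one long, blur-prone exposure.

    ```python
    # Illustrative sketch of multi-frame "night mode" stacking.
    # Not the actual pipeline of any phone vendor; just the core idea:
    # averaging N aligned noisy exposures improves SNR by ~sqrt(N).
    import numpy as np

    def stack_frames(frames: list[np.ndarray]) -> np.ndarray:
        """Average a burst of aligned exposures (H x W x 3, floats in [0, 1])."""
        burst = np.stack(frames, axis=0).astype(np.float64)
        return burst.mean(axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        scene = np.clip(rng.random((4, 4, 3)) * 0.1, 0, 1)  # dim "true" scene
        # Simulate 16 noisy short exposures of the same scene.
        burst = [np.clip(scene + rng.normal(0, 0.05, scene.shape), 0, 1)
                 for _ in range(16)]
        single = burst[0]
        stacked = stack_frames(burst)
        print("mean error, single frame:  ", np.abs(single - scene).mean())
        print("mean error, 16-frame stack:", np.abs(stacked - scene).mean())
    ```

    Real implementations also align frames (the phone moves between shots) and merge per-tile rather than per-pixel, but the noise-averaging step above is the part that “captures more light.”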