12 Sep. 2019
On Tuesday, it was Apple’s turn to unveil a “night mode” for the iPhone, a few months after nearly all of its competitors. Last year, Google’s latest Pixel introduced the world to the power of computational photography; its own version of night mode, the aptly named Night Sight, looked like a superpower. A few months later, Huawei showed the industry that it wants to be considered the leader in mobile photography. The P30 Pro demos made a good argument for the company’s ambitions, especially with, you guessed it, an impressive night mode.
The tech scene was understandably very impressed, and so was I. What would once have been an obscure dark shape now looked like a brightly lit scene. Shots of cities at night were so bright they almost looked Photoshopped.
And yet, even knowing this feature is just a mode, meaning you can choose not to use it and take a regular night picture, I never really liked its results. Technically, the pictures are impressive. Sure. But something felt a bit off about them.
This morning, I had a “yes, exactly how I feel” moment reading John Gruber’s take on the Apple keynote, specifically the part where he compares Apple’s approach to night mode (where it is not really a “mode”) with Google’s, where you have to select Night Sight for it to kick in:
My guess has been that Google made Night Sight its own mode because Night Sight images, though often amazing, are also often quite unnatural. It’s so effective that it often makes nighttime scenes look like they were shot in daylight — like an old Hitchcock movie where they shot day-for-night.
Speaking of movies, the iPhone 11 Pro video samples played at the event featured a lot of night scenes and the results were absolutely stunning.