The iOS 13.2 beta just dropped, and it includes Deep Fusion, Apple's new computational photography system for the iPhone.
Here are a few examples, so you can judge the results for yourself:
In my first tests, Deep Fusion offers fairly modest gains in sharpness (and much larger files — my HEICs came out ~2x bigger). pic.twitter.com/ISclMKT1hK
— Sebastiaan de With (@sdw) October 2, 2019
Click each picture to get a more detailed look, and keep in mind that these images are Twitter-compressed. In that first image, focus on the upper right of the yellow speaker material.
Here’s another:
This is another photo I just took. Deep Fusion left, non Deep Fusion right. I know Deep Fusion activated because it had the preview photo before the full loaded. Turned it off using the outside the frame setting for second photo. pic.twitter.com/4XR5p09okb
— Juli Clover (@julipuli) October 2, 2019
This one shows off the overall increase in sharpness Deep Fusion brings to the table.
Next up, take a look at this blog post from JF Martin, which lays out which camera modes kick in with which iPhone 11 Pro lenses, along with specifics on each of the three lenses.
And for the pièce de résistance, this video lays out both examples of and details on Deep Fusion. It's interesting that the decision to use Deep Fusion is made for you automatically. Also worth noting: at this early point in the beta cycle, Deep Fusion photos appear to consume about twice as much storage as regular photos.