WWDC Session Shows Lens Correction and Noise Reduction for OS X, Mentions iPhoto and Aperture
WWDC 2014, Apple’s Worldwide Developers Conference, has been in full swing this week, and the sessions are available for free for anyone to watch. That means anyone with the patience to sit through them, and the technical understanding to know what they mean, can learn a LOT about what’s coming in the future.
A reader posted a link to this video, Session 514—”Advances in Core Image”, with instructions to start watching around the 33-minute mark. You can see his post here, and jump in on that conversation if you like.
Faces, smiles and blinks, oh my!
The RAW discussion starts at 32:50, although shortly before that is something about Faces with smile and eyes-open detection that was wicked cool. It was running in real time on video, but consider it for Faces in iPhoto and Aperture, where it could theoretically find the photos in a series of shots where everyone (or the highest number of people) is smiling with eyes open. A real boon for any group photo situation!
Anyway, in the first slide build about Adjusting RAW Images, the presenter points out that RAW support is provided for the entire operating system (which of course we knew), and nicely mentions Aperture, iPhoto and Photos. That last one is curious, since the only “Photos” app is on iOS, which to date doesn’t support RAW files; I didn’t watch the earlier part of this video, though, so perhaps that’s changed?
They also mention that OS X Yosemite supports the latest version of the DNG specification, which is obviously a Good Thing™.
The next tidbit that caught my attention was the introduction of custom CIFilter support, explained as the ability for third-party filters to be added to the RAW process before the image is drawn. We all know that any render done as part of the RAW process is the highest quality, so I’d like to think this means filters like the Google Nik Collection could feasibly be added earlier in the chain, non-destructively, which would be huge.
Maybe I’m over-interpreting this (I’m not an engineer, after all), but imagine an Aperture where every plug-in sits as a module in the adjustment tab, fully editable at any time, all drawing back to the RAW source for the ultimate in image quality. Yummy.
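For the developers in the audience, Core Image on OS X exposes RAW decoding itself as a CIFilter. A minimal sketch (the file path is a placeholder, and I’m going by the keys declared in the CIRAWFilter.h header, not anything shown on screen) looks something like:

```swift
import CoreImage

// Placeholder path for illustration; any supported RAW file would do.
let rawURL = URL(fileURLWithPath: "/tmp/photo.cr2")

// Core Image wraps the whole RAW decode in a single CIFilter.
guard let rawFilter = CIFilter(imageURL: rawURL, options: nil),
      let decoded = rawFilter.outputImage else {
    fatalError("Could not decode RAW file")
}
// 'decoded' is the fully processed image. Third-party adjustments
// supplied via the kCIInputLinearSpaceFilter option run *before*
// this point, inside the linear-space stage of the RAW pipeline.
```

That last option is, presumably, the hook the session is describing: a filter you hand to the RAW decoder rather than apply to its output.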
Is this the new Aperture icon?
Shortly after that we get to see a demo. Before the demo itself got going, though, I spotted something in the dock that caught my attention. You tell me: what does this icon look like to you?
I screen-captured that to snag the frame where the mouse rolls over the icon, and it doesn’t read Aperture; instead it’s called RawExpose (the app the developer demos from, although that’s the generic icon running to its right). There’s no question that icon is the same as the Aperture we currently know and love, just with a lot less color in it.
Highlights & Shadows
As part of the demo on inserting filters into the RAW pipeline, they show off Highlights & Shadows being run in an earlier part of the pipeline. Specifically, they say that you can “insert this filter into the middle of our RAW processing pipeline and take advantage of the linear input space that we are operating in. That means that you will be able to better keep the color fidelity, you’ll operate on a linear 16-bit pipeline, and at the end get better results”.
(Remember we’re not looking at an iPhoto or Aperture or even Photos app here; this is just a simple developer app written to show off this technology).
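As a hedged sketch of what that mid-pipeline insertion might look like in code (the path and parameter values are illustrative; the option key is from the CIRAWFilter.h header, and I’m assuming the built-in CIHighlightShadowAdjust filter stands in for whatever the demo app used):

```swift
import CoreImage

// A Highlight/Shadow adjustment to run inside the RAW pipeline.
let highlightShadow = CIFilter(name: "CIHighlightShadowAdjust")!
highlightShadow.setValue(0.7, forKey: "inputHighlightAmount")
highlightShadow.setValue(1.2, forKey: "inputShadowAmount")

// Placeholder path for illustration.
let url = URL(fileURLWithPath: "/tmp/photo.nef")

// kCIInputLinearSpaceFilter asks the RAW decoder to apply our filter
// in its linear 16-bit working space, before tone mapping, rather
// than on the finished output image.
let rawFilter = CIFilter(imageURL: url, options: [
    kCIInputLinearSpaceFilter: highlightShadow
])
let result = rawFilter?.outputImage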
The new noise reduction algorithm in OS X Yosemite appears to be a huge improvement over what we’ve had in the past, and it’s also dramatically faster. In the demo, they operate on a RAW image and make noise reduction adjustments in real time, drawing to screen at 60 frames per second.
Jump to about 43:00 if you want to see this in action. What probably impressed me most was the CNR (Color Noise Reduction) slider. Reducing just the color noise makes a huge visual difference, and it seems to work very, very well.
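Those sliders appear to map onto RAW decode options a developer can set directly. A rough sketch, assuming the noise-reduction keys declared in CIRAWFilter.h (the path and amounts are placeholders, not values from the demo):

```swift
import CoreImage

// Placeholder path for illustration.
let url = URL(fileURLWithPath: "/tmp/photo.dng")

// Drive Yosemite's RAW noise reduction at decode time.
// Amounts are illustrative, on a 0.0–1.0 scale.
let rawFilter = CIFilter(imageURL: url, options: [
    kCIInputLuminanceNoiseReductionAmountKey: 0.3, // luminance grain
    kCIInputColorNoiseReductionAmountKey: 0.8      // the "CNR" slider
])
let denoised = rawFilter?.outputImage
```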
Finally, they discussed lens correction. The demo was specifically to show what happens when you turn it off, by toggling “vendor lens correction”, which, if I understood correctly, enables or disables the OS X Yosemite RAW lens correction. If there are any finer controls to be exposed, we didn’t see them here, but it was interesting to see this mentioned, and to see what’s now in OS X demonstrated, even if all we saw was a before and after.
If nothing else, at least we are seeing that some level of lens correction is being built into the RAW decode on OS X Yosemite. Since improved noise reduction and lens correction are by far the two most desired and overdue features in Aperture, this is great to see.
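From the developer side, that on/off toggle seems to correspond to a single RAW decode option. A sketch under that assumption (the key is from CIRAWFilter.h; the path is a placeholder):

```swift
import CoreImage

// Placeholder path for illustration.
let url = URL(fileURLWithPath: "/tmp/photo.raf")

// Render the same RAW file with lens correction on and off;
// comparing the two reproduces the session's before/after.
let corrected = CIFilter(imageURL: url, options: [
    kCIInputEnableVendorLensCorrectionKey: true
])?.outputImage

let uncorrected = CIFilter(imageURL: url, options: [
    kCIInputEnableVendorLensCorrectionKey: false
])?.outputImage
```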
Second GPU on the new Mac Pro
There was also an interesting demo and discussion of the second GPU on the new Mac Pro. While it’s possible today, with Yosemite it’ll be much easier to access that second GPU, which means developers can send background tasks to it while performing foreground tasks on the primary. It makes for an impressive demo.
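The usual way to reach a GPU that isn’t driving a display is to ask CGL for a pixel format that allows “offline” renderers, then build a Core Image context on it. A hedged sketch of that idea (I’m not certain this is the exact mechanism the session demoed):

```swift
import CoreImage
import OpenGL.GL

// Request a pixel format that may resolve to an offline GPU,
// e.g. the second GPU in a Mac Pro not attached to a display.
var attrs: [CGLPixelFormatAttribute] = [
    kCGLPFAAllowOfflineRenderers,  // include non-display GPUs
    kCGLPFAAccelerated,
    CGLPixelFormatAttribute(0)     // attribute list terminator
]
var pixelFormat: CGLPixelFormatObj?
var matching: GLint = 0
CGLChoosePixelFormat(&attrs, &pixelFormat, &matching)

var glContext: CGLContextObj?
CGLCreateContext(pixelFormat!, nil, &glContext)

// Renders through this CIContext can run on the offline GPU,
// leaving the primary GPU free for foreground/UI work.
let ciContext = CIContext(cglContext: glContext!,
                          pixelFormat: pixelFormat!,
                          colorSpace: nil,
                          options: nil)
```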
What else can we learn from these sessions?
There are many, many hours of developer sessions free for the watching. Head over to developer.apple.com/videos/wwdc/2014/ if you’re so inclined, see what else you can spot, and be sure to let us know in the comments!