Photos in Moments of Time and Place
There has been a lot of speculation about what Photos.app will be like when it's out. That's quite natural, given the general interest in Aperture combined with the many things Apple has left open until now.
Joseph already took a detailed look at the screenshots available for Photos on Mac and what can be drawn from the tools and icons shown in them. There is another resource, though, that we've left out so far: the available developer documentation.
Aren’t those under NDA?
Apple got much more open at this year's WWDC. Most things are indeed available publicly (Photos Framework Reference (iOS)), and it's OK for developers to discuss them in public. One should be careful, though: these are preliminary documents for APIs and technologies currently in development. They will very likely change until their official release. So if something is not documented yet, this doesn't mean it won't come; it may even be there already, just not documented.
“But I’m no developer!”, you say
No problem—the purpose of my article is to provide a view on these resources, to help you understand what's going on and give you the right information to better estimate how serious Apple really is with its new strategy for its photography-related products. If I leave questions open, feel free to ask me in the comments section.
The Big Picture
If we look back, there first was iPhoto. Introduced more than 12 years ago, it quickly became the default solution for many Mac users to store, organize and share their photos. Three years after iPhoto, Apple introduced Aperture as the first "all-in-one" post-production tool for photographers. Not much later (November 2006), the RAW support formerly exclusive to Aperture was transferred to Mac OS X (Tiger). Since then, RAW support has been available as an API in OS X, used by Aperture, iPhoto, Preview, the Finder and many third-party apps built on ImageIO (the API).
Over time Aperture got cheaper, and it gained more and more features. At the same time, it made things easier without losing any of its power. Some people decided that this had to mean Aperture lost its professional direction. I never understood this. What I think actually happened is that Aperture profited from developments in the broader scheme of Apple's media frameworks. It would have been ridiculous to leave out features like face recognition or iOS Photo Stream just because some professional photographers think they do not need them. It actually is a good thing that Apple consolidates its APIs in a way that lets most applications profit from further developments on every level.
The Hidden Progress
I personally think that under the hype about some of those so-called "consumer" features, many people missed significant progress in areas like Camera RAW and performance. I remember being stunned when I reprocessed an old photo of a shark. In the following picture you can see the shark's eye and how it comes out without raising denoise in RAW fine tuning (Original), how it looked with the slider bumped up to the max under the old RAW processing (Max. Denoise "Old"), and how it looked with the new RAW processing. As you can see, there is a huge difference. The last picture in this row shows a variant where I didn't max out the slider and additionally raised the "details" slider a bit, so that Aperture leaves some grain in it for a more pleasing look.
Just to debunk one idea: this progress is not just a matter of allowing the slider to go further. It clearly shows that completely different denoising algorithms are used. The old one was very likely a simple median filter, while the new one looks like a wavelet-based one. Yet to this day, everyone and his cat keeps telling people that Apple made no progress over the last several years. In fact, the progress was substantial, and Aperture took advantage of it as a side effect of depending on the OS feature "Apple Digital Camera RAW".
Since then, I really haven't had much need for NeatImage or any other noise reduction plugin that I previously had to use to get usable results from noisy pictures.
The beginnings of Photos?
Not long ago, Apple announced the new unified library format for iPhoto and Aperture. We do not know if this actually was the first step toward what we now know as Photos, but it certainly shows further development in bringing "pro features" to a wider base than one application. What does this mean? If you only have to develop for a single application, you do not have to put much thought into good abstractions in your code. As soon as you have to use parts of your code in multiple programs, it gets more complicated and you have to plan much more carefully. What Apple actually seems to have done with the unified library is abstract out the library code of Aperture in a way that iPhoto can use, too. The next development step from this point is creating a public API that can be used for many different purposes, even by external developers. That is what's happening now with what we know as "PhotoKit": the base framework on which Photos.app is built. The application is no longer responsible for managing the guts of our photo libraries. This is now done by an OS framework which any application can use.
What’s in it?
Many people got frightened because the demo mostly showed things like the "Moments"-based interface, which shows all your photos automatically grouped by time and location. The demo video at least showed three other sections named "Shared", "Albums" and "Projects"—so there has to be more. There is still much criticism of Apple's idea of "all your photos in the cloud"—many users think they will have to store their photos in iCloud from now on. (Editor's note: This is absolutely untrue, as has been stated many times before. iCloud storage will be an option, not a requirement. –Joseph)
So if we look into what PhotoKit is now, we should be able to answer some of those questions and to better understand what will be possible with apps like Photos. There are indeed some interesting insights one can get from the documentation.
PHAsset is the class that represents photos. That is not remarkable in itself; everybody could have guessed that there has to be something representing a photo. What we really want to know is where our folders, projects and albums will be, if they are there at all.
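To give a feel for the API, here is a minimal Swift sketch of fetching PHAssets, assuming an iOS 8 or later app that has already been granted photo library access. Sorting by creation date is just one common choice, and Swift method names may differ slightly between beta releases.

```swift
import Photos

// Fetch all image assets in the library, newest first.
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let photos = PHAsset.fetchAssets(with: .image, options: options)

photos.enumerateObjects { asset, _, _ in
    // Each PHAsset is a lightweight record; actual pixel data is
    // requested separately (via PHImageManager) when needed.
    print(asset.localIdentifier, asset.creationDate ?? "no date")
}
```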
They are. What we know as an Album is actually a PHAssetCollection, a subclass of PHCollection. Other types of this class are "SmartAlbum" and "Moment". So a "Moment" is not that different from what we know as an Album today, except that it gets created and managed automatically based on the time and location of our photos. Where do albums come from? The documentation says they can be created within Photos.app or appear through "iTunes sync". What? iTunes? What this actually means is that the good old "my Mac is my digital hub" sync method we have now will still be there.
There are also more than a dozen "subtypes" of PHAssetCollection. Among them are "AlbumSyncedEvent", "AlbumSyncedFaces" and "AlbumSyncedAlbum", which should represent Events (Projects in Aperture), Faces albums and ordinary Albums, all coming from a former iTunes sync.
There is "AlbumMyPhotoStream", which represents the iCloud Photo Stream, and there is "AlbumCloudShared", which represents what we now know as "Shared Photostreams" and what will become "Shared Albums" in the future.
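In code, those collection types and subtypes are simply fetch parameters. A hedged sketch using the documented subtype constants:

```swift
import Photos

// "Moments": collections created automatically by time and place.
let moments = PHAssetCollection.fetchAssetCollections(
    with: .moment, subtype: .any, options: nil)

// Ordinary albums that arrived via an iTunes sync.
let synced = PHAssetCollection.fetchAssetCollections(
    with: .album, subtype: .albumSyncedAlbum, options: nil)

// Shared iCloud albums (today's "Shared Photostreams").
let shared = PHAssetCollection.fetchAssetCollections(
    with: .album, subtype: .albumCloudShared, options: nil)

shared.enumerateObjects { collection, _, _ in
    print(collection.localizedTitle ?? "untitled",
          collection.estimatedAssetCount)
}
```

Notice that, as argued above, the fetch code does not care where a collection originated; an iTunes-synced album and a cloud-shared one are both just PHAssetCollections.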
What we know now is iOS 8
The current documentation doesn't show the whole picture, though. It is part of the iOS documentation set and clearly describes only what is currently known about PhotoKit for iOS. Some parts already make clear that there has to be more than that once the Mac steps in: generic Smart Albums are described as originating from an iTunes sync, but there has to be a place where they get created, and that seems to be only on the Mac. Also, there is no way on iOS to create an "Event", yet there has to be some way to create one, because otherwise the type "AlbumSyncedEvent" (an iTunes-synced Event) wouldn't make any sense.
We can conclude from this that, contrary to the arguments used by some people, not all features will actually be available everywhere. There will be certain kinds of AssetCollections which cannot be created on an iOS device but perhaps can in Photos for Mac. Still, one will be able to access such AssetCollections either through an iTunes sync or as an iCloud shared Album. This is actually good news, because it shows that there is absolutely NO reason to conjecture from the current iOS 8 betas that Photos for Mac may be missing important things. (There is also no reason to assume that everything will be there, though!) What finally counts for applications is that all these things are AssetCollections, not where they come from.
So there are clear signs of Albums and Projects and also that storage will absolutely not be “all in iCloud”. But will there be Folders?
Where are the folders?
If we look at current iOS apps, there don't seem to be any folders for albums. The good news: folders are PHCollectionLists, and the documentation talks about "folders containing user-created albums or smart albums". The API also allows folders of folders.
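The folder API can be sketched like this. Note that a folder's children may be albums or further folders, which is how nesting works:

```swift
import Photos

// Fetch top-level folders; each is a PHCollectionList.
let folders = PHCollectionList.fetchCollectionLists(
    with: .folder, subtype: .any, options: nil)

folders.enumerateObjects { folder, _, _ in
    print("Folder:", folder.localizedTitle ?? "untitled")
    // Children may be albums (PHAssetCollection) or further
    // folders (PHCollectionList), allowing arbitrary nesting.
    let children = PHCollection.fetchCollections(in: folder, options: nil)
    children.enumerateObjects { child, _, _ in
        print("  contains:", child.localizedTitle ?? "untitled")
    }
}
```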
Will there be more metadata than “favorite”?
A look into the documentation shows that there is currently no direct access to metadata about photos. Things like EXIF data, ratings or tags are simply not there. In practice, this doesn't mean there is no way to get at that information. The older APIs are still available, but there is no clear information from Apple yet on how this is supposed to work in the future. It is likely that these are simply unfinished parts of the API. One thing to keep in mind is that advanced search is an important part of OS X Yosemite.
What about Adjustments and other Features?
An important lesson to learn from all this is that new common frameworks shared between iOS and OS X do not mean that everything will be possible everywhere. If we think about photo extensions, it is unlikely that every photo extension will be available on both iOS and OS X. The system has to keep working even in the absence of a particular extension. The non-destructive part of photo extensions is just an option; the programming protocols already account for the case where a particular adjustment is not understood. The result will then be similar to edit plugins in Aperture today.
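The mechanism behind this is PHAdjustmentData: an editor stores its settings as an opaque blob tagged with a format identifier, and a later reader either recognizes that identifier and resumes the edit, or falls back to the rendered image. A sketch, with a hypothetical identifier and hypothetical settings:

```swift
import Photos

// An editing extension saves its (hypothetical) settings as opaque data.
let settings = try! JSONSerialization.data(
    withJSONObject: ["filter": "sepia", "intensity": 0.8])

let adjustment = PHAdjustmentData(
    formatIdentifier: "com.example.myeditor",  // hypothetical identifier
    formatVersion: "1.0",
    data: settings)

// A later editor inspects the identifier to decide whether it can
// resume the edit non-destructively; if it cannot, it must start
// from the last rendered output instead.
let canResume = (adjustment.formatIdentifier == "com.example.myeditor")
```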
The competition, and an outlook
Many Aperture users may think that Apple has withdrawn from any effort to create serious photography solutions. I think nothing could be further from the truth; Apple's new direction looks like a reaction to what Google is aiming for. At least since Google acquired Nik Software, they have put a lot of energy into bringing advanced photography technology into their products. They will bring RAW support to Android, and they already support RAW in Google+. Apple had to advance its own ecosystem to be able to compete with that. So the end of Aperture may not have much to do with Lightroom or the "pro market" at all. Lightroom is an island in the sea of photo management solutions. Adobe knows that and is trying to address it by introducing its own mobile apps. For Apple, it actually is not important whether Lightroom is successful or not. There will be all kinds of professional-level RAW converters integrating well with the ecosystem that is there. That could be Photos, or some other program—we will see. I personally would recommend that Adobe embrace this change and try its best to create the best possible integration.
I think all of this will turn out to be very good news for all of us users, indeed.