Perhaps we spoke a tad too soon. Adobe has extended the AI-enhanced capabilities of its Photoshop image editing software to include instant sky replacement, some super-easy selection tools and some early Neural Filters, including emotion editing.
It all rolled out today in an update for Creative Cloud subscribers. Probably the most useful to experienced image editors will be the new selection tools, which use Adobe's Sensei AI algorithms to improve the software's ability to quickly draw selection lines around complex objects, particularly where tricky hair, busy backgrounds or subjects that partially blend into their surroundings are involved.
But that's not the fun stuff. Sky replacement is a very neat feature that does a pretty amazing job of intelligently selecting the sky in your image and dropping in one of a couple dozen standard alternatives – or letting you upload your own sky images to get exactly the look you want. It also subtly changes the colors of the rest of the image to match.
I ran it on this image of Armenia's Temple of Garni and produced three believable alternative looks in a matter of minutes.
In quick testing, I noticed it fell to bits somewhat on wide-aperture shots with blurred backgrounds; the sky came out looking too sharp. But the tool places the new sky on its own layers, which you can edit afterwards to add your own blur. And if you'd rather do the operation manually, the Select Sky feature does an awesome job of giving you a one-click selection. Unlike Skylum's LuminarAI, it can't handle reflective surfaces at this point, but it's early days yet.
And then things get a bit weird. The new Photoshop gives you access to a gallery of cloud-processed "Neural Filters" that feel like they're in relatively early stages of development. The only two to have emerged from beta are a rudimentary skin smoothing tool and a "style transfer" filter that attempts to re-render your photo in a style similar to a source image.
But there are others available in beta form, including one that tries to colorize black and white shots, a superzoom smart resolution enhancer, a JPG artifact remover, a makeup transfer tool, a depth-aware volumetric haze effect, and the extremely weird Smart Portrait tool.
Smart Portrait takes headshots into the Snapchat zone, giving you a bunch of sliders for things like happiness, anger, surprise, age, hair thickness, direction of gaze, angle of head and the direction of light on the face.
These take a minute to process on my 2013-era MacBook, and the results vary quite a bit, but used with a deft touch they can more or less do what it says on the tin. Push things too far, though, or layer on too many filters, and you can definitely lose the sense of who the person is, as the face dissolves into a mishmash of features, teeth and haircuts presumably drawn from Adobe's AI training dataset.
Is this a serious tool to use with your portraits? Maybe in some circumstances. It's certainly fun to play with. And it's just a beta; this functionality will clearly continue to develop as the Sensei AI is refined and extended. We're a little surprised the toolset doesn't tackle what we'd think is some lower-hanging fruit for portraits: automatic eye enhancement, eye-bag removal and tooth-whitening sliders to go along with a more advanced skin smoothing and wrinkle removal tool.
You can watch the Smart Portrait feature at work, along with the other new Neural Filters, in the video below.
Source: Adobe