Photoshop's AI neural filters can tweak age and expression with a few clicks – The Verge

Artificial intelligence is changing the world of image editing and manipulation, and Adobe doesn't want to be left behind. Today, the company is releasing version 22.0 of Photoshop, an update that comes with a host of AI-powered features, some new, some already shared with the public. These include a sky replacement tool, improved AI edge selection, and the star of the show: a suite of image-editing tools that Adobe calls neural filters.

These filters include a number of simple overlays and effects but also tools that allow for deeper edits, particularly to portraits. With neural filters, Photoshop can adjust a subject's age and facial expression, amplifying or reducing feelings like joy, surprise, or anger with simple sliders. You can remove someone's glasses or smooth out their blemishes. One of the weirder filters even lets you transfer makeup from one person to another. And it's all done in just a few clicks, with the output easily tweaked or reversed entirely.

"This is where I feel we can now say that Photoshop is the world's most advanced AI application," Maria Yap, Adobe's vice president of digital imaging, told The Verge. "We're creating things in images that weren't there before."

To achieve these effects, Adobe is harnessing the power of generative adversarial networks, or GANs, a type of machine learning technique that's proved particularly adept at generating visual imagery. Some of the processing is done locally and some in the cloud, depending on the computational demands of each individual tool, but each filter takes just seconds to apply. (The demo we saw was done on an old MacBook Pro and was perfectly fast enough.)

Many of these filters will be familiar to those who follow AI image editing. They're the sort of tools that have been turning up in papers and demos for years. But it's always significant when techniques like these go from bleeding-edge experiments, shared on Twitter among those in the know, to headline features in consumer juggernauts like Photoshop.

As always with these sorts of features, the proof will be in the editing, and the actual utility of neural filters will depend on how Photoshop's many users react to them. But in a virtual demo The Verge saw, the new tools delivered fast, good-quality results (though we didn't see the facial expression adjustment tool). These AI-powered edits weren't flawless, and most professional retouchers would want to step in and make some adjustments of their own afterwards, but they seemed like they would speed up many editing tasks.

AI tools like this work by learning from past examples. So, to create the neural filter that's used to smooth away skin blemishes, for example, Adobe collected thousands of before-and-after shots of edits made by professional photographers, feeding this data into its algorithms. A GAN operates like a paired student and teacher: one part, the generator, tries to copy these examples, while the other, the discriminator, tries to distinguish between this output and the training data. Eventually, when the discriminator can no longer reliably tell the two apart, the training process is complete.
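The adversarial loop described above can be sketched on toy one-dimensional data, where the "professional edits" are stand-in numbers drawn from a target distribution. Everything here, from the linear models to the learning rate, is an illustrative assumption, not Adobe's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy stand-in for "professional edits": real samples from N(4, 1).
def sample_real(n):
    return rng.normal(4.0, 1.0, size=n)

# Generator (the "student"): a linear map from noise, fake = a*z + b.
# Discriminator (the "teacher"): logistic classifier D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0
w, c = 0.1, 0.0
lr = 0.01

for step in range(2000):
    z = rng.normal(size=64)
    fake = a * z + b
    real = sample_real(64)

    # Discriminator step: gradient ascent on log D(real) + log(1 - D(fake)),
    # i.e. push D toward 1 on real edits and toward 0 on generated ones.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step (non-saturating): gradient ascent on log D(fake),
    # i.e. nudge the generator so its output fools the discriminator.
    d_fake = sigmoid(w * fake + c)
    dx = (1 - d_fake) * w          # d/dfake of log D(fake)
    a += lr * np.mean(dx * z)      # chain rule: dfake/da = z
    b += lr * np.mean(dx)          # chain rule: dfake/db = 1

fake_mean = float(np.mean(a * rng.normal(size=1000) + b))
print(f"mean of generated samples after training: {fake_mean:.2f}")
```

The two updates alternate, exactly as in the student-and-teacher analogy: the discriminator sharpens its test, and the generator drifts toward the real data until the test stops working.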

"Basically, we're training the GAN to make the same corrections a professional retoucher would do," Alexandru Costin, Adobe's vice president of engineering for Creative Cloud, told The Verge.

It sounds straightforward, but there are lots of ways this training can go wrong. A big one is biased data. The algorithms only know the world you show them, so if you only show them images of, say, white faces, they won't be able to make edits for anyone whose complexion doesn't fit within this narrow range. This sort of bias is why facial recognition systems often perform worse on women and people of color: those faces just aren't well represented in the training data.

Costin says Adobe is acutely aware of this problem. If it trained its algorithms on too many white faces, he says, its neural filters might end up pushing AI-edited portraits toward whiter complexions (a problem we've seen in the past with other ML applications).

"One of the biggest challenges we have is preserving the skin tone," says Costin. "This is a very sensitive area." To help root out this bias, Adobe has set up review teams and an AI ethics committee that test the algorithms every time a major update is made. "We do a very thorough review of every ML feature, to look at this criteria and try and raise the bar."

But one key advantage Adobe has over other teams building AI image-editing tools is its catalog of stock photography: a huge array of images that span different ages, races, and genders. This, says Costin, made it easy for Adobe's researchers to balance their datasets to try to minimize bias. "We complemented our training data with Adobe stock photos," says Costin, "and that allowed us to have as good as possible a distributed training set."
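A dataset can be balanced in the way Costin describes by resampling under-represented groups up to a common size. The sketch below is illustrative only: the group labels, counts, and oversampling-with-replacement strategy are assumptions, not details Adobe has published.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical training records: (image_id, demographic_group).
# group_b is badly under-represented, as in a biased dataset.
records = [(f"img{i:03d}", "group_a") for i in range(90)] + \
          [(f"img{i:03d}", "group_b") for i in range(90, 100)]

def rebalance(records, target_per_group):
    """Oversample each group (with replacement) to a common size."""
    by_group = {}
    for rec in records:
        by_group.setdefault(rec[1], []).append(rec)
    balanced = []
    for group in sorted(by_group):
        balanced.extend(random.choices(by_group[group], k=target_per_group))
    return balanced

balanced = rebalance(records, 100)
counts = Counter(group for _, group in balanced)
print(counts)   # both groups now appear 100 times
```

Oversampling is the simplest fix; the article suggests Adobe instead complemented its data with additional stock photos, which avoids repeating the same under-represented examples.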

Of course, all this is no guarantee that biased results won't appear somewhere, especially when the neural filters get out of beta testing and into the hands of the general public. For that reason, each time a filter is applied, Photoshop will ask users whether they're happy with the results and, if they're not, give them the option of reporting inappropriate content. If users choose, they can also send their before and after images anonymously to Adobe for further study. In that way, the company hopes to not only remove bias but also expand its training data even further, pushing its neural filters to greater levels of fidelity.

This sort of speedy update based on real-world usage is common in the fast-moving world of AI research. Often, when a new machine learning technique is published (usually on arXiv, an open-access collection of scientific papers that haven't yet been published in a journal), other researchers will read it, adopt it, and adapt it within days, sharing results and tips with one another on social media.

Some AI-focused competitors to Photoshop distinguish themselves by embracing this sort of culture. A program like Runway ML, for example, not only allows users to train machine learning filters using their own data (something that Photoshop does not), but it operates a user-generated marketplace that makes it easy for people to share and experiment with the latest tools. If a designer or illustrator sees something cool floating around on Twitter, they want to start playing with it immediately rather than wait for it to trickle into Photoshop.

As a widely used product with customers who value stability, Adobe can't truly compete with this sort of speed, but with neural filters, the company is dipping a toe into these fast-moving waters. Two of the filters are presented as finished features, six are labeled as beta tools, and eight more are only listed as names, with users having to request access. You can see a full list of the different filters and their respective tiers below:

Featured Neural Filters: Skin Smoothing, Style Transfer
Beta Neural Filters: Smart Portrait, Makeup Transfer, Depth-Aware Haze, Colorize, Super Zoom, JPEG Artifacts Removal
Future Neural Filters: Photo Restoration, Dust and Scratches, Noise Reduction, Face Cleanup, Photo to Sketch, Sketch to Portrait, Pencil Artwork, Face to Caricature

Yap says this sort of approach is new to Photoshop but will hopefully let Adobe temper users' expectations about AI tools while giving itself license to update those tools more quickly. "We've built this framework that allows us to bring models [to users] faster, from research to Photoshop," says Yap. "Traditionally when we do features, like sky replacement, they're really deeply integrated into the product and so take a longer time to mature." With neural filters, that update cycle will ideally be much faster.

"It's this pace that we're trying to bring into Photoshop," says Costin. "And it will come at the cost of the feature not being perfect when we launch, but we're counting on our community of users to tell us how good it is [...] and then we will take in that data and refine it and improve it."

In other words: the flywheel of AI progress, wherein more users create more data that creates better tools, is coming to Photoshop. Tweaking someone's age is just the start.
