Adobe updated its Firefly generative AI platform multiple times last year, most recently in September. Over that time, the Lightroom and Photoshop tools that rely on the technology have gotten steadily worse, and the system's choice to add a Bitcoin logo to a photo of a seagull is a perfect distillation of the problem.
Last week, I struggled to get any of Adobe's generative or content-aware tools to extend a background and cover an area for a thumbnail I was working on for our YouTube channel. Before last year's updates, Photoshop handled the tasks I asked of it quickly and without issue. Since then, it's been a rocky road.
All I was trying to do was make a little bit more room on that side of the frame so I could reposition the camera and lens Chris was using.
Eventually, I had to resort to the old-fashioned way of doing this and manually cloned out the area to produce the thumbnail we eventually published:
Adobe is apparently aware of the issue behind my request. When I reached out to the company for comment, a representative pointed me to an article on Lightroom Queen, which explains that asking Generative Remove or Generative Fill to work in a space requires selecting the entire subject and anything related to it; otherwise, the tool will try to replace the selection with something new.
“Select the entire object/person, including its shadow, reflection, and any disconnected parts (such as a hand on someone else’s shoulder). Otherwise, the AI tries to rebuild the object based on what’s left behind. For example, if you select a person and miss their feet, Lightroom tries to rebuild a new person to fit the feet,” the article reads.
While this kind of makes sense if you don't think about it too hard, it is completely counterintuitive given the tool's name and the result an editor expects.
If I select a body part and ask a tool to fill or remove that space, at no point would I want it to replace my selection with its eldritch nightmare version of that exact same thing. What I, and any editor doing this, want is for the selection to be removed as seamlessly as possible.
Also, this method does not always work, as I demonstrated last year:
This is a repeat of the problem I showcased last fall when I pitted Apple’s Clean Up tool against Adobe Generative tools. Multiple times, Adobe’s tool wanted to add things into a shot and did so even if an entire subject was selected — which runs counter to the instructions Adobe pointed me to in the Lightroom Queen article.
Adobe also pointed me to an Adobe Community post with tips for getting better results out of its generative tools, and while I can confirm these do help, we're still seeing weird results even when we follow the instructions to the letter.
This loops us back to the Bitcoin situation. Yesterday, photographer Matthew Raifman shared a bizarre result Adobe’s Generative AI produced in Lightroom. The Generative Remove tool saw a selection of a reflection and decided to replace it with a Bitcoin logo.
“Adobe has officially jumped the shark. Their AI remove feature in lightroom just added a bitcoin to my gull bird in flight photo,” he shared on Bluesky. “A bitcoin!?!”
Raifman shared a screen recording with PetaPixel that verifies the logo wasn't added on purpose and was the first suggestion from Adobe's AI.
To its credit, two of the three options Generative Remove suggested did provide usable alternatives. Unfortunately, the Bitcoin option was the first one, which (whether Adobe intends this or not) tells an editor that it is what the platform feels is the best result.
It's not so much that Adobe's tools don't work well; it's the manner in which they fail. If we weren't trying to get work done, some of these results would be genuinely funny. In the Bitcoin case, the tool seems to be replacing the painted pixels with something similar in shape to the detected "object" the user is trying to remove. But that makes no sense given how editors expect the tool to perform.
Editors don't want a selection replaced with an object resembling the one they're removing; they want it replaced with what surrounds it. But, somehow, Adobe's AI just isn't built to understand this, and it repeatedly generates the weirdest stuff because of it.
Generative Remove and Generative Fill have become so unreliable that some members of the PetaPixel staff have stopped using them entirely. As I pointed out, I had to go back to the manual clone stamp method to complete the task I wanted.
“Overall, Adobe is aware and actively working to resolve,” Adobe tells PetaPixel.
When Adobe is pushing AI as the biggest value proposition in its updates, these tools need to work reliably. Right now, they don't.