Adobe just made a very “we heard you” move for anyone who’s tired of bouncing between tools, panels, and half-finished AI experiments. The company rolled out a unified generative editing workspace in Firefly and introduced a conversational Photoshop AI Assistant in public beta for web and mobile, plus a scribble-and-prompt feature called AI Markup.
None of this is about replacing Photoshop craft. It’s about collapsing the “do the boring parts” loop (remove, extend, clean up, upscale, isolate) into a single flow that’s easier to repeat, easier to review, and way easier to hand off to someone who doesn’t speak fluent layer mask.
The headline isn’t “AI makes images.” It’s “AI makes revisions less annoying.”
## What Adobe actually shipped
This update lands across surfaces creators already live in:
- Firefly: a new unified image editing workspace in public beta that brings common generative actions into one place.
- Photoshop: a natural-language AI Assistant (web plus mobile public beta) that can execute multi-step edits from a typed request, with voice input supported where your device and browser allow.
- Photoshop web: AI Markup, letting you draw on the canvas to target where the AI should act.
Adobe’s framing is pretty consistent: the goal is fewer “tool scavenger hunts” and more continuous iteration. If you’ve ever done five versions of a product shot because the background changed, the aspect ratio changed, and then the client changed their mind again, yeah, this is aimed at you.
## Firefly’s unified workspace
Firefly’s overhauled editor bundles the most-used generative actions into a single workspace: Generative Fill, Generative Remove, Generative Expand, Generative Upscale, and Remove Background. Adobe is clearly optimizing for high-frequency production tasks: resizing for new placements, cleaning up shots, and generating variations fast without turning your browser into a tab graveyard.
Two details matter for real workflows:
- Consistency of UI: when everything lives in one workspace, you’re not mentally context-switching between “editing mode” and “generation mode.” That seems small until you’re doing it 80 times a day.
- Non-destructive behavior: Adobe positions these generative edits as iterative and reversible within the editing flow, supporting fast review and repeatable revisions.
### Why this consolidation matters
Creators didn’t need more AI features. They needed fewer places to use them. Firefly’s old vibe (like most AI tools) was: each feature is cool, but stitched together with vibes and hope. The new workspace makes Firefly feel less like a demo playground and more like a production station.
| Task | Old friction | What’s improved |
|---|---|---|
| Resize + reframe | Jump tools, re-check composition | Expand + preview in one flow |
| Cleanup + isolate | Multiple panels, repetitive steps | Remove + background tools side-by-side |
| Output polish | Upscale as a separate “last step” | Upscale integrated into the same session |
## Photoshop’s AI Assistant
Photoshop’s AI Assistant is the bigger signal, not because “chat in Photoshop” is new (every app wants a chat bubble now), but because Adobe is tying natural language directly to multi-step operations inside a real production editor.
In the web and mobile betas, you can type requests like:
- “Remove the person in the background.”
- “Change the sky to sunset.”
- “Make the lighting warmer and bring down highlights.”
The Assistant then translates that into actual Photoshop operations (selection, masking, fills, tonal adjustments) without you manually driving every step.
### What it changes day-to-day
This is less “Photoshop for beginners” and more “Photoshop for teams.” The person who knows what they want but not how to do it can finally get usable drafts without waiting for the one designer on the team who’s already juggling three urgent requests and one “quick thing” that is not quick.
That’s the quiet power here: creative intent becomes a command interface. When it works well, it pulls Photoshop closer to the way real creative feedback is given:
> “Can we try this cleaner, brighter, and less busy?”
Historically, that feedback turns into 12 manual micro-steps. Adobe’s betting it can compress that into a prompt, keep it editable, and let you iterate faster.
## AI Markup targets edits
AI Markup (currently in Photoshop web beta) is Adobe acknowledging a basic truth: words alone can be ambiguous, and selections are still a pain. With Markup, you draw on the canvas (circle, brush, highlight), then attach an instruction like “replace this background” or “change shirt color to red.”
For production teams, this is one of the most practical parts of the update because it reduces two common failure modes:
- Accidental edits: the AI changes the wrong region because the prompt wasn’t specific enough.
- Mask fatigue: you spend more time defining the target than doing the actual creative change.
### A small feature with big leverage
If Adobe nails the precision here, AI Markup becomes the bridge between “prompt-based editing” and “pro editing.” It’s basically a fast, human-readable way to say: this part, not that part.
## Who benefits immediately
Adobe’s pitch is broad, but the winners are pretty specific:
- Marketing teams: fast, repeatable variations for ads, thumbnails, seasonal swaps, and A/B testing.
- E-commerce production: background removal, cleanup, and upscaling in a tighter loop.
- Social teams on mobile: faster edits during approvals, shoots, or “we need this posted in 15 minutes” moments (with voice input supported where available).
- Agencies: faster first passes that still land in the Adobe ecosystem for finishing and delivery.
If your output is “one perfect image,” you’ll still care, but you’ll care less than the teams shipping 50 versions of that image across placements, formats, and deadlines.
## What to watch next
These are betas, so the interesting part isn’t the launch. It’s what Adobe learns from creators living inside them.
### Reliability over novelty
The AI Assistant will live or die on whether it produces results that are:
- predictable (same prompt does not equal random chaos),
- editable (clean layers and history, not baked-in weirdness),
- review-friendly (easy to compare and revert).
Creators don’t need a chatbot that can do 1,000 things. They need one that can do the 12 things that happen every single day, without breaking the file or the vibe.
### Workflow gravity
Adobe’s bigger strategy is showing: keep generation and edits inside the same ecosystem where finishing happens. Firefly’s unified editor plus Photoshop’s Assistant makes it more likely teams stay inside Adobe instead of generating elsewhere and coming back to Photoshop only to repair the damage.
If you’ve been tracking Adobe’s push to make Firefly a true hub (including partner models and more integrated surfaces), this update fits right into that direction. For related context, COEY previously covered the broader “Firefly as workflow infrastructure” shift in Adobe Firefly Unlimited: New Multi-Model Creative Hub.
## Bottom line
Adobe’s Firefly and Photoshop updates don’t feel like “new AI tricks.” They feel like workflow pressure relief: fewer tool hops, faster revisions, and more ways for non-specialists to contribute without turning every request into a design bottleneck.
The unified Firefly workspace should speed up the high-volume basics. The Photoshop AI Assistant is the more ambitious bet, turning intent into action inside a pro editor. And AI Markup is the practical glue that could make prompt-based edits feel less like guessing and more like directing.
It’s not hype. It’s Adobe making generative editing behave more like production work: iterative, targeted, and built for the reality that feedback will always arrive five minutes before the deadline.