Adobe launches Photoshop AI assistant, letting users request edits directly.
Photo by Rubaitul Azad (unsplash.com/@rubaitulazad) on Unsplash
The Verge reports Adobe has rolled out a public‑beta AI assistant for Photoshop on web and mobile, letting users simply describe edits in natural language and have the software apply them instantly.
Key Facts
- Key company: Adobe
Adobe’s AI Assistant arrives in Photoshop’s web and mobile clients as a public‑beta feature that lets users issue natural‑language commands to edit images, according to a release from Adobe cited by The Verge. The chatbot‑style interface can remove distractions, swap backgrounds, tweak lighting, and adjust color palettes without the user having to manipulate layers or masks manually. Adobe says the assistant can operate in two modes: an “auto‑apply” mode that executes the requested edit immediately, and a “guided” mode that walks the user through each step, allowing them to learn Photoshop’s underlying tools while the AI handles the heavy lifting.
The functionality builds on the generative‑fill capabilities first introduced in Photoshop’s desktop version and now extended to the Firefly suite, which Adobe unveiled at its MAX conference in October. TechCrunch notes that the assistant can also accept prompts such as “add a soft glow,” “crop to 16:9,” or “enhance shadows,” and will translate those instructions into the appropriate series of adjustments—often a combination of layer effects, blending modes, and mask operations. For paid Photoshop subscribers, Adobe promises “unlimited generations” of AI‑driven edits through April 9, while free users receive a quota of 20 generations to experiment with, mirroring the tiered access model used for Firefly’s generative tools.
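The idea of translating a free-form prompt into a concrete series of adjustments can be sketched in miniature. The rule table, `Operation` class, and `plan_edits` function below are purely illustrative assumptions, not Adobe's actual pipeline, which presumably uses a language model rather than keyword matching:

```python
# Hypothetical sketch: mapping a natural-language edit prompt to a list of
# Photoshop-style adjustment operations. All names here are assumptions
# for illustration, not Adobe's API.
from dataclasses import dataclass


@dataclass
class Operation:
    tool: str    # e.g. "layer_effect", "crop", "curves"
    params: dict


# Keyword rules standing in for the assistant's intent parser.
PROMPT_RULES = {
    "soft glow": [Operation("layer_effect", {"effect": "outer_glow", "opacity": 0.4})],
    "crop to 16:9": [Operation("crop", {"aspect": (16, 9)})],
    "enhance shadows": [Operation("curves", {"region": "shadows", "lift": 0.15})],
}


def plan_edits(prompt: str) -> list[Operation]:
    """Return the operations whose trigger phrase appears in the prompt."""
    prompt = prompt.lower()
    ops: list[Operation] = []
    for phrase, operations in PROMPT_RULES.items():
        if phrase in prompt:
            ops.extend(operations)
    return ops


# A combined request expands into multiple adjustments:
plan = plan_edits("Add a soft glow and crop to 16:9")
```

The point of the sketch is the shape of the translation step: one sentence can fan out into several layer-level operations, which matches the article's description of prompts becoming "a combination of layer effects, blending modes, and mask operations."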
A new “AI markup” capability, also in public beta, lets users draw markers on the canvas to designate objects for removal or transformation. As TechCrunch explains, drawing a simple outline around a flower, for example, triggers the assistant to replace or delete that element and automatically re‑populate the background using generative fill. This markup workflow reduces the need for precise selection tools such as the Pen or Lasso, and it leverages Adobe’s underlying diffusion models to synthesize plausible content that matches surrounding textures and lighting.
Adobe is positioning the Photoshop assistant as part of a broader rollout of AI agents across its Creative Cloud portfolio. The Verge reports that similar assistants have already launched in Adobe Express and Acrobat, and that integration with Microsoft’s Copilot service is slated for those apps in the near term. While the desktop Photoshop client does not yet host the chatbot interface, Adobe hinted at a forthcoming desktop release, recalling a teaser from April 2025 that promised AI agents for Photoshop and Premiere Pro. The staggered deployment suggests Adobe is using the web and mobile platforms as a proving ground for the conversational workflow before committing resources to a full desktop integration.
From a technical perspective, the assistant’s prompt‑to‑action pipeline relies on Firefly’s generative models to produce pixel‑level edits, then maps the output onto Photoshop’s native layer stack. This hybrid approach preserves editability—users can still adjust the generated layer’s opacity, blending mode, or mask after the AI completes its pass. By exposing the underlying layer structure, Adobe mitigates a common criticism of generative tools that they produce “black‑box” results, allowing professionals to retain control over the final composition. The public‑beta rollout therefore serves both as a usability test for conversational editing and as a validation of Adobe’s strategy to embed generative AI directly into its flagship creative applications.
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.