Case Study

Designed an AI image editing experience
that transforms user intent into controllable outputs.

Change the logo for me

Remove the background

Team & Role

2 UX Designers (I was 1 of 2)

1 Product Manager

3 Software Developers

Duration

6 months

Ownership

AI generation workflow

Prompt optimization

Iterative editing experience

Impact

+12%

Increase in task accuracy

Increase in successful editing actions during image refinement.

-30%

Navigation path

Decrease in average steps required to reach a final visual

88%

User satisfaction

Improved merchants' confidence when creating marketing visuals

Solution

A new way to generate images.

Transforms image editing from tool-based controls into an intent-driven, guided workflow.

Enable reference-based editing so users refine AI outputs with visuals instead of restarting.

01 / Problem

Abstract Labels Create Prompt Friction

Many restaurant owners lack formal design training but still need marketing visuals. Although they know what they want visually, expressing that intent through prompts is difficult.

Abstract Labels Failed to Convey Visual Intent

Merchants, especially those without design training, struggled to interpret abstract style terms like Minimalist or Luxury when represented only by an emoji.

02 / Research Insight

Translating 150+ Signals into 3 Key Insights.

By synthesizing over 150 merchant requirement dimensions, we discovered that "visual certainty" matters far more than "infinite possibility": merchants don't want more options, they want the right option, shown clearly. Three major pain points emerged.

01

Prompt Gap

“I know what I want, but not how to prompt it.”

Many merchants have a clear visual idea but struggle to translate it into prompt language.

02

Lack of Control

“I can’t control what the AI generates.”

AI outputs often feel unpredictable. Users lack clear ways to guide or adjust the generation process.

03

Hard to Iterate

“Editing means starting over.”

Small changes often require restarting the generation process. This makes iteration frustrating.

03 / Design Goal

How might we help merchants express visual intent clearly and guide the generation process?

Design Goals

  1. Reduce prompt complexity

  2. Enable iterative editing

  3. Improve AI transparency

04 / Design Solution

Turning image editing into an easier experience

1

Reduce prompt complexity

Enhance information flow and make better use of space.

2

Enable iterative editing

Create intuitive and engaging user experiences.

3

Improve AI transparency

Test designs to ensure they achieve business goals.

Translate abstract concepts into visual examples
Users can recognize and select their visual intent quickly.

Format

Ratio

Provide visual cues to help users understand different formats

Style

Model

Provide visual cues so users can anticipate the generated outcome

Background

Leverage icons to communicate simple ideas and help users make quicker selections.

Streamline the editing workflow
Turning fragmented controls into unified editing

Before

While visuals improved clarity, the added length increased friction and caused users to drop off.

After

Ratio

Integrated 3 selections into Edit Instructions, allowing users to edit images without navigating multiple controls.

Introduce Reference-Based Editing
instead of restarting generation.

Object Editing

Enable reference-based editing so users can refine AI outputs using visual references instead of restarting the generation process.

Edit Instructions

Ratio

Simple object editing guide

Describe changes

Model

Describe what you want to change

Object Instructions

Model

Choose different objects

Design for Predictable Output
Guide users with structured controls instead of open-ended prompts.

Refine with Reference Images

Reference

Reference images keep visual elements like logos, scenes, and styles consistent.

Constrain the Model

Model

Making model choice explicit helps users match generation behavior to task intent.

Design for Platform Outputs

Ratio

Built-in presets let merchants generate images ready for different social media platforms.
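The structured controls above (reference image, explicit model choice, ratio presets) can be thought of as fields of a single request, so users never compose a free-form prompt from scratch. A minimal sketch, assuming illustrative field names rather than the product's actual API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EditInstruction:
    """One structured edit: what to change, plus explicit constraints.
    Field names are hypothetical, for illustration only."""
    description: str                 # what the user wants changed
    ratio: str = "1:1"               # platform preset, e.g. "1:1", "9:16"
    model: str = "standard"          # explicit model choice
    reference: Optional[str] = None  # optional reference image (logo, scene, style)

def build_request(edit: EditInstruction) -> dict:
    """Compose the structured selections into one generation request."""
    request = {
        "prompt": edit.description,
        "aspect_ratio": edit.ratio,
        "model": edit.model,
    }
    if edit.reference:
        # A reference image anchors logos, scenes, and styles,
        # so refinement doesn't restart generation from scratch.
        request["reference_image"] = edit.reference
    return request

print(build_request(EditInstruction(
    description="Change the logo to the attached one",
    ratio="9:16",
    reference="logo.png",
)))
```

The design point the sketch captures: every choice the user makes through the UI maps to an explicit, inspectable parameter, which is what makes the output predictable.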

06 / Impact

The Numbers Speak

“I used to spend so much time rewriting prompts just to get close to what I had in mind. Now I can just tweak things directly—it feels way more intuitive.”

Sarah

Small Business Owner

+12%

Increase in task accuracy

Increase in successful editing actions during image refinement.

-30%

Navigation path

Decrease in average steps required to reach a usable final visual

88%

User satisfaction

Improved merchants' confidence when creating marketing visuals

Reflection

AI-native Workflow & Design Thinking

Beyond simply using AI as a tool, I began to rethink my workflow by integrating AI as a core part of the design process.

Traditionally, design workflows follow a linear path: ideation → task division → execution → delivery. However, with AI, this process becomes more iterative and collaborative: ideation → AI generation → human judgment → AI iteration → outcome.

This shift changes my role from a sole creator to a curator and decision-maker, where my value lies in framing the right problems, evaluating AI outputs, and guiding iterations.

Looking Forward

I aim to further develop as an AI-native designer by integrating AI not only as a tool, but as a core part of my workflow and product thinking.