How AI Painting Transforms Design: Midjourney vs Stable Diffusion and 4 Real‑World Cases

This article explores the rapid rise of AI painting tools such as Midjourney and Stable Diffusion, compares their features, outlines new design workflows, and presents four detailed case studies—from 2D‑to‑3D character creation to poster and H5 header generation—showing how designers can boost productivity and creativity.

Zhixing ZXD Design Center

Introduction

In the past six months, AIGC technology has achieved major breakthroughs across many fields, especially in design. AI painting tools are accelerating the industry's transformation, offering designers new opportunities and new challenges.

Midjourney vs Stable Diffusion

Midjourney and Stable Diffusion are the two most popular AI painting tools. They differ in model architecture, output style, and user interaction, but both provide simple operation and fast generation, making them ideal for high‑frequency, fast‑paced marketing needs.

Workflow Changes

Traditional design workflow (without AI) follows a linear process of concept → sketch → manual rendering. With AI tools, two new workflows emerge:

1. Reference → keyword generation with Midjourney/Stable Diffusion → select and refine the best image.

2. Reference → sketch → use Stable Diffusion to generate images from the sketch → refine.

Case 1: Student Character 2D→3D

Goal: Create a youthful student avatar for the ZXD brand.

Step 1: Draw a color draft (hand holding a book, twin‑tails, hat, overalls).

Step 2: Choose a large model and adjust LoRA for clothing, facial features, and hair.

Step 3: Craft prompts (e.g., "best quality, soft smooth lighting, 1girl, overalls, twin‑tails, ...").

Step 4: Set parameters (sampling steps 20, sampler DPM++ 2M Karras, size 1200×1800, prompt relevance (CFG scale) 7, redraw strength (denoising strength) 0.7).

Step 5: Apply ControlNet (pre‑processor invert, model control_v11p_sd15_lineart, weight 1).

Step 6: Generate images, iteratively adjusting prompts and parameters.

Step 7: Use Midjourney to enhance 3D realism, using the Describe feature to extract keywords and adjusting the image weight (--iw 2).
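Steps 3 through 5 can be collected into a single request body. A hedged sketch of what that payload might look like, assuming the AUTOMATIC1111 webui API with its ControlNet extension (field names follow those projects; the values are the ones listed in the steps above):

```python
# Case 1: 2D color draft -> refined character, lineart-guided img2img.
case1_payload = {
    "prompt": "best quality, soft smooth lighting, 1girl, overalls, twin-tails",
    "steps": 20,
    "sampler_name": "DPM++ 2M Karras",
    "width": 1200,
    "height": 1800,
    "cfg_scale": 7,             # "prompt relevance" in the steps above
    "denoising_strength": 0.7,  # "redraw strength"
    "alwayson_scripts": {
        "controlnet": {
            "args": [{
                "module": "invert",  # pre-processor: invert to white-on-black line art
                "model": "control_v11p_sd15_lineart",
                "weight": 1.0,
            }]
        }
    },
}
```

Iterating (Step 6) then amounts to resubmitting this payload with adjusted prompts or weights until the character reads correctly.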

Case 2: Train Star Poster Line‑Art Coloring

Goal: Design a poster honoring outstanding employees (Train Star).

Sketch a symmetrical layout with a speeding train and a celebratory employee.

Select an illustration model (ID 9527).

Generate keywords with Stable Diffusion's tag interrogation (reverse-prompting from the sketch), then refine them.

Set parameters: sampling steps 20, sampler DPM++ 2M Karras, size 1280×720, prompt relevance (CFG scale) 14.

Use ControlNet (pre‑processor invert, model control_v11p_sd15_lineart, weight 1.3).
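Compared with Case 1, the poster uses a landscape canvas, a much higher prompt relevance (so the coloring follows the prompt tightly), and a stronger ControlNet weight (so the line art is pinned down rather than merely suggested). A hedged sketch of those deltas, again assuming AUTOMATIC1111-style field names:

```python
# Case 2: line-art coloring — only the settings that differ from Case 1.
case2_overrides = {
    "width": 1280,
    "height": 720,
    "cfg_scale": 14,  # high prompt relevance keeps colors on-message
    "alwayson_scripts": {
        "controlnet": {
            "args": [{
                "module": "invert",
                "model": "control_v11p_sd15_lineart",
                "weight": 1.3,  # stronger weight locks the symmetrical layout
            }]
        }
    },
}
```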

Case 3: H5 Activity Header 2D→3D

Goal: Create a dynamic header for a “Special Forces Travel Challenge” H5 activity.

Draw a 2D concept featuring a backpack‑carrying special‑forces figure saluting amid outdoor scenery.

Choose separate models for background and main character to ensure precision.

Reuse prompt and parameter settings described in previous cases.

Apply ControlNet with LeReS depth estimation to handle the complex background.
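Because the background and the main character use different checkpoints, the header is effectively produced as two jobs and composited. For the outdoor background, depth guidance replaces line art. A hedged sketch of that ControlNet unit, assuming the AUTOMATIC1111 ControlNet extension's module naming (the depth model name here is an assumption, not from the source):

```python
# Case 3: depth-guided background generation for the H5 header.
background_controlnet = {
    "module": "depth_leres",               # LeReS monocular depth pre-processor
    "model": "control_v11f1p_sd15_depth",  # assumed ControlNet depth model name
    "weight": 1.0,
}
```

Depth guidance preserves the spatial layering of the scenery (foreground figure, mid-ground terrain, sky) without constraining its exact contours the way line art would.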

Case 4: Summer Solstice Poster

Goal: Quickly generate a seasonal poster using Midjourney.

Collect summer‑solstice reference images.

Generate the scene with Midjourney from a base image (垫图, an image prompt) plus descriptive keywords.

Post‑process in Photoshop: adjust tone, add the XiaoZhi IP, finalize.
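In Midjourney, a base-image prompt places the image URL first, then the descriptive text, then flags such as --iw (image weight, which controls how strongly the base image influences the result). A minimal, hypothetical prompt builder:

```python
def build_mj_prompt(image_url: str, keywords: list[str], image_weight: float = 2.0) -> str:
    """Assemble a Midjourney prompt: image URL first, then text, then flags."""
    return f"{image_url} {', '.join(keywords)} --iw {image_weight:g}"
```

For example, `build_mj_prompt("https://example.com/ref.png", ["summer solstice", "poster"])` yields `"https://example.com/ref.png summer solstice, poster --iw 2"`; the URL here is a placeholder, not a reference from the source.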

Conclusion

These cases demonstrate that AI painting is increasingly integral to design, whether converting 2D assets to 3D, creating scene posters, or producing H5 headers. By mastering tools like Midjourney, Stable Diffusion, and ControlNet, designers can streamline workflows, maintain creative control, and stay competitive in the emerging human‑AI collaborative era.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Stable Diffusion, AIGC, Midjourney, AI painting, 3D conversion
Written by

Zhixing ZXD Design Center

The Zhixing Experience Design team (ZXD) leads innovative UX design and research for Zhixing Train Ticket, aiming to deliver smarter, more caring, and warmer product experiences.
