From Build to App Store in One Prompt: AI Skills That Ship
I only had to think hard once. Then I ported it to Android and it just worked.
Publishing an app used to mean a full day of export busywork. Generate icons at twelve different sizes. Open Sketch or Figma to composite screenshots. Write copy for every localization. Repeat for every platform.
I automated all of it with Claude Code skills. No additional software. No Fastlane templates. No design tools. Just prompts that became reusable pipelines.
Skill 1: App Store Icons and Image Sets
The first skill I built generates every icon size Apple and Google require — from a single source image and a description.
Here’s how it works:
- I describe the icon concept (style, colors, elements).
- The skill runs a script that generates all required sizes for iOS (App Store, Spotlight, Settings, etc.) and Android (mdpi through xxxhdpi).
- It outputs properly named asset catalogs and resource directories, ready to drop into the project.
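The core of a skill like this is just a size matrix and a naming scheme. Here's a minimal sketch in Python (stdlib only) of how such a script might enumerate its output jobs — the size table below is an illustrative subset (the well-known launcher and marketing sizes), not the full matrix either store requires, and every path and name is hypothetical:

```python
from pathlib import Path

# Illustrative subset of required icon sizes. A real pipeline would carry
# the full, current matrix from Apple's and Google's requirements.
IOS_POINTS = {
    "app-store": [(1024, 1)],          # marketing icon, 1024x1024 px
    "app":       [(60, 2), (60, 3)],   # home screen, points x scale
    "spotlight": [(40, 2), (40, 3)],
    "settings":  [(29, 2), (29, 3)],
}
ANDROID_DPI = {"mdpi": 48, "hdpi": 72, "xhdpi": 96, "xxhdpi": 144, "xxxhdpi": 192}

def icon_manifest(out_dir: str = "AppIcons"):
    """Return (output_path, pixel_size) pairs for every icon to generate."""
    jobs = []
    for role, variants in IOS_POINTS.items():
        for points, scale in variants:
            px = points * scale
            jobs.append((Path(out_dir, "ios", f"icon-{role}-{points}@{scale}x.png"), px))
    for density, px in ANDROID_DPI.items():
        jobs.append((Path(out_dir, "android", f"mipmap-{density}", "ic_launcher.png"), px))
    return jobs
```

Each (path, size) pair then just needs a resize of the source image — the structure, not the resizing, is where the busywork used to live.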
No Photoshop. No icon generator website. One skill, all sizes, both platforms.
Skill 2: App Store Screenshots
This is the one that changed everything. Getting polished App Store screenshots used to involve third-party tools, manual layouts, and hours of iteration. Now it’s a skill with a few steps:
Step 1: Create snapshot tests
The skill generates snapshot tests for the screens I want to showcase. These run in the simulator and capture pixel-perfect renders of the actual app — no mocked UI, no stale designs.
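The post doesn't show the skill's internals, but the simplest manual equivalent of the capture step is Apple's own `xcrun simctl io <device> screenshot` command. A hypothetical Python wrapper might build that invocation like this (it returns the command rather than running it, so you can inspect or log it first):

```python
import subprocess

def screenshot_command(device: str = "booted", out_path: str = "shots/home.png"):
    """Build the `xcrun simctl` invocation that captures the current
    simulator screen. Run it with subprocess.run(cmd, check=True)."""
    return ["xcrun", "simctl", "io", device, "screenshot", out_path]

def capture(device: str = "booted", out_path: str = "shots/home.png"):
    """Actually capture; requires Xcode tools and a booted simulator."""
    subprocess.run(screenshot_command(device, out_path), check=True)
```

Snapshot tests go further than this one-liner — they navigate to each screen first — but the capture itself is this cheap.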
Step 2: Write the copy
I define the tone and style for the screenshot marketing copy. The skill generates headlines and descriptions for each screenshot, matching the voice I want.
Step 3: Generate the artwork
The captured screenshots and copy get composited into final marketing images — device frames, backgrounds, text overlays. All generated programmatically.
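"Generated programmatically" mostly means computing a layout and pasting layers. A sketch of the layout math, with hypothetical pixel dimensions (roughly a 6.7-inch portrait canvas) and stdlib only — a real compositor would then draw the headline band and paste the framed screenshot into these rects:

```python
def marketing_layout(canvas=(1290, 2796), shot=(1179, 2556),
                     headline_h=300, margin=60):
    """Place a headline band on top and center the scaled screenshot
    in the remaining area. Returns pixel rects as (x, y, w, h)."""
    cw, ch = canvas
    sw, sh = shot
    avail_w = cw - 2 * margin
    avail_h = ch - headline_h - 2 * margin
    scale = min(avail_w / sw, avail_h / sh)   # fit without cropping
    w, h = int(sw * scale), int(sh * scale)
    x = (cw - w) // 2
    y = headline_h + margin + (avail_h - h) // 2
    return {"headline": (0, 0, cw, headline_h), "screenshot": (x, y, w, h)}
```

The point of keeping layout as plain arithmetic is that the same function re-runs unchanged for every locale and every device size.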
Step 4: Localize everything
This is where it really pays off. The skill takes the base screenshots and copy, then translates and adapts them for every locale I support. Different text, same layout, same quality. What used to take an entire day per language now takes seconds.
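Structurally, the localization step is a substitution over the same layout: swap the copy, keep everything else. A minimal sketch — the strings and the `de` translation below are invented examples, and in the real workflow the translations come from the skill's translation step, not a hand-maintained table:

```python
BASE_COPY = {"home": "Track every workout", "stats": "See your progress"}

# Hypothetical output of the translation step, keyed by locale.
TRANSLATIONS = {
    "de": {"home": "Jedes Workout erfassen", "stats": "Sieh deinen Fortschritt"},
}

def localized_copy(locale: str) -> dict:
    """Return the copy for a locale, falling back to English per key."""
    overrides = TRANSLATIONS.get(locale, {})
    return {key: overrides.get(key, text) for key, text in BASE_COPY.items()}
```

Because unsupported or partially translated locales fall back to the base copy per key, adding a language never breaks the pipeline — it just improves the output.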
The Best Part: Think Once, Port Everywhere
Here’s what made this worth writing about. I built these skills for iOS. When I started shipping Android apps, I didn’t rebuild anything. I asked Claude Code to port the skills to work with Android projects.
It read the existing iOS skills, understood the patterns, adapted the output formats for Google Play (different screenshot sizes, different icon requirements, different metadata structure), and produced working Android skills.
I thought hard once — about the workflow, the steps, the quality bar. I didn’t write a single line of code for any of it. Not for the iOS version. Not for the Android port. I described what I wanted, reviewed what I got, and shipped.
Why Skills Beat Scripts
I could have written shell scripts or Fastlane lanes to do some of this. But skills are different:
- They compose. I can chain the icon skill into a broader “prepare for release” workflow.
- They adapt. When Apple changes screenshot requirements (and they always do), I describe the change and the skill updates itself.
- They’re portable. The same mental model works across platforms because the skill understands intent, not just commands.
The Takeaway
The old App Store workflow was: build the app, then spend a day on marketing assets. The new workflow is: build the app, run the skills, submit.
The shift isn’t about saving time (though it saves plenty). It’s about removing the friction that makes you skip localizations, reuse old screenshots, or ship with placeholder icons.
When the cost of doing it right drops to zero, you just do it right every time.