Making My Apps Siri-Native Before Apple Tells Everyone To
Apps that don’t expose intents will feel invisible in an AI-first OS.
The Bet
Apple is building something big for WWDC 2026. Every signal points the same direction: Siri is becoming an agent that reaches into your apps to act on behalf of users.
The “Ask Siri” toggle that appeared across all apps in iOS 18. The “Use Model” Shortcuts action that returns typed values. The assistant schema protocols showing up in developer documentation. Third-party AI Extensions rumored for iOS 27.
The message is clear: expose your app’s capabilities as structured intents, or become invisible.
I have two iOS apps — CapyCast (weather with a capybara companion) and Packybara (Brazilian package tracking). Both are high-frequency, glanceable, action-light apps. The exact category where users will ask Siri instead of opening the app.
So I shipped App Intents for both. Before WWDC. Here’s what happened.
What I Built
CapyCast: Weather on Demand
“Hey Siri, what’s the weather in CapyCast?”
Now returns spoken weather with capy personality, without launching the app. The response includes temperature, condition, and a capybara mood message. Behind the scenes, it’s checking cached weather data first (sub-second response), only hitting Apple WeatherKit if the cache is stale.
The typed output includes temperature (Double), rain probability (Double), UV index (Int), moon phase — all as individual fields that Shortcuts can branch on. “If rain > 60%, remind me to grab an umbrella” actually works now.
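A typed result looks roughly like this. This is a minimal sketch, not CapyCast's actual code — the intent name and the hardcoded value are illustrative; the real intent would pull from cached weather data and return several fields:

```swift
import AppIntents

// Hypothetical sketch: an intent that returns a typed value Shortcuts can
// branch on. GetRainChanceIntent is an illustrative name, not CapyCast's type.
struct GetRainChanceIntent: AppIntent {
    static var title: LocalizedStringResource = "Get Rain Chance"

    func perform() async throws -> some IntentResult & ReturnsValue<Double> & ProvidesDialog {
        let rainProbability = 0.65 // would come from the cached forecast
        return .result(
            value: rainProbability,
            dialog: "Capy says there's a \(Int(rainProbability * 100))% chance of rain!"
        )
    }
}
```

Because the result is a `Double` rather than a string, Shortcuts can run it through an “If” action and compare it numerically.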
Packybara: Package Status via Voice
“Cadê meu pacote no Packybara?” (“Where’s my package in Packybara?”)
Returns the latest tracking status with capybara personality. “A caminho… viajando!” (“On the way… traveling!”) if it’s in transit. “CHEGOUUU! Bora abrir!” (“IT ARRIVED! Let’s open it!”) if it’s delivered. Users can also add tracking codes by voice — the carrier (Correios, Shopee, AliExpress, Anjun, J&T) is auto-detected from the tracking code format.
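Format-based detection can be sketched like this. The patterns below are assumptions for illustration (the Correios format is public; the others are guesses), not Packybara's actual rules:

```swift
import Foundation

// Illustrative sketch of carrier detection from tracking-code shape.
// Only a few carriers shown; the real app covers more.
enum Carrier: String {
    case correios, shopee, aliexpress, unknown
}

func detectCarrier(from code: String) -> Carrier {
    let trimmed = code.uppercased().trimmingCharacters(in: .whitespaces)
    // Correios: two letters, nine digits, "BR" suffix (e.g. AA123456789BR)
    if trimmed.range(of: #"^[A-Z]{2}\d{9}BR$"#, options: .regularExpression) != nil {
        return .correios
    }
    // Shopee (assumed): "BR" prefix followed by digits only
    if trimmed.hasPrefix("BR"), trimmed.count > 2,
       trimmed.dropFirst(2).allSatisfy(\.isNumber) {
        return .shopee
    }
    // AliExpress / Cainiao (assumed): "LP" or "CNBR" prefixes
    if trimmed.hasPrefix("LP") || trimmed.hasPrefix("CNBR") {
        return .aliexpress
    }
    return .unknown
}
```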
The real magic: delivery event triggers. When a package transitions to “delivered,” the app donates an intent via IntentDonationManager. Shortcuts can fire automations: “When delivered, send a text message.” This is the kind of integration that turns a utility into a system-level building block.
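A donation on status change looks roughly like this. The intent and property names are placeholders, not Packybara's real types; the `IntentDonationManager.shared.donate(intent:)` call is the App Intents API:

```swift
import AppIntents

// Placeholder intent representing "a package was delivered".
struct PackageDeliveredIntent: AppIntent {
    static var title: LocalizedStringResource = "Package Delivered"

    @Parameter(title: "Tracking Code")
    var trackingCode: String

    func perform() async throws -> some IntentResult {
        .result()
    }
}

// Called from wherever the app observes tracking updates (assumed hook).
func handleStatusChange(trackingCode: String, newStatus: String) {
    guard newStatus == "delivered" else { return }
    let intent = PackageDeliveredIntent()
    intent.trackingCode = trackingCode
    // Tell the system this action just happened, so Siri and Shortcuts
    // can predict and trigger automations on it.
    Task {
        _ = try? await IntentDonationManager.shared.donate(intent: intent)
    }
}
```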
Architecture Decisions That Mattered
Wrapper Entities, Not Model Conformance
My first instinct was to add AppEntity conformance directly to my domain models — SavedCity, TrackedPackage. Bad idea. AppEntity requires static properties (defaultQuery, typeDisplayRepresentation) that pollute your data layer with framework concerns.
Instead, I created lightweight wrapper entities: CityAppEntity, PackageAppEntity, CarrierAppEntity. Each has a simple from(_ model:) factory method. Clean separation. The pattern copied identically across both apps.
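The pattern, sketched with assumed fields on SavedCity (the domain model stays framework-free; the wrapper carries all AppEntity requirements):

```swift
import AppIntents

// Domain model: no AppIntents import, no framework concerns.
struct SavedCity {
    let id: UUID
    let name: String
}

// Lightweight wrapper that owns the AppEntity conformance.
struct CityAppEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "City")
    static var defaultQuery = CityQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    static func from(_ model: SavedCity) -> CityAppEntity {
        CityAppEntity(id: model.id, name: model.name)
    }
}

struct CityQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [CityAppEntity] {
        // In the real app: look up saved cities by id and wrap them.
        []
    }
}
```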
Cache-First is Non-Negotiable
Siri expects fast responses. If your intent makes a network call, the user is staring at a spinner — and they won’t try again.
CapyCast’s weather intent checks CacheManager first (data < 1 hour old), and only hits WeatherKit if stale. Packybara reads CoreData directly — the packages are already local. Sub-second responses for both.
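The cache-first rule reduces to a few lines. `CachedWeather` and the fetch closure are stand-ins for CapyCast's actual CacheManager and WeatherKit service:

```swift
import Foundation

// Assumed shape of a cached reading.
struct CachedWeather {
    let temperature: Double
    let fetchedAt: Date
}

// Serve anything under an hour old; only hit the network when stale.
func currentWeather(
    cached: CachedWeather?,
    maxAge: TimeInterval = 3600,
    fetchFromWeatherKit: () async throws -> CachedWeather
) async throws -> CachedWeather {
    if let cached, Date().timeIntervalSince(cached.fetchedAt) < maxAge {
        return cached                           // sub-second path: no network
    }
    return try await fetchFromWeatherKit()      // stale or missing: refresh
}
```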
Direct Data Access from Intents
App Intents run on an arbitrary executor, but my services are @MainActor singletons. Rather than fighting the concurrency model, I read data directly:
- CapyCast: Reads UserDefaults (same key as CityManager) for saved cities. No singleton needed.
- Packybara: Creates a fresh PersistenceController and runs an NSFetchRequest for read-only queries. Only uses the full PackageStore for writes (AddPackageIntent).
This keeps intent execution lightweight and avoids MainActor bottlenecks.
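A read-only fetch inside an intent can look like this. The container, entity, and attribute names ("Packybara", "TrackedPackage", "isActive") are assumed placeholders, and the real app wraps this in its PersistenceController:

```swift
import AppIntents
import CoreData

// Sketch: a fresh CoreData stack per intent call, read on a background
// context so nothing touches @MainActor singletons.
struct ListPackagesIntent: AppIntent {
    static var title: LocalizedStringResource = "List Active Packages"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        let container = NSPersistentContainer(name: "Packybara")
        container.loadPersistentStores { _, error in
            if let error { print("Store load failed: \(error)") }
        }

        let request = NSFetchRequest<NSManagedObject>(entityName: "TrackedPackage")
        request.predicate = NSPredicate(format: "isActive == YES")

        let context = container.newBackgroundContext()
        let count = try context.performAndWait {
            try context.fetch(request).count
        }
        return .result(dialog: "You have \(count) active packages.")
    }
}
```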
On-Screen Awareness
I added .userActivity() modifiers to the main weather view and package detail screen. Now Siri knows what you’re looking at. “Is this good weather for a walk?” while viewing São Paulo’s weather — Siri has the context.
The city entity also conforms to Transferable, so the data is available for copy/paste and AI reasoning. Small addition, big future-proofing.
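Both pieces together, sketched with an assumed activity type and a pared-down City model:

```swift
import SwiftUI
import CoreTransferable
import UniformTypeIdentifiers

// Transferable makes the city data available for copy/paste and AI reasoning.
struct City: Codable, Transferable {
    let name: String

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .json)
    }
}

struct WeatherView: View {
    let city: City

    var body: some View {
        Text("Weather for \(city.name)")
            // Advertise what's on screen so Siri has the context.
            // The activity type string is an illustrative placeholder.
            .userActivity("com.example.capycast.viewingCity") { activity in
                activity.title = "Viewing \(city.name)"
                activity.isEligibleForPrediction = true
            }
    }
}
```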
What This Has to Do with Apple’s AI Direction
Here’s my read on where Apple is going, and why I’m betting on it:
The agentic runtime. Apple Intelligence isn’t just about on-device LLMs generating summaries. It’s about Siri being an agent that orchestrates across apps. The foundation for that is App Intents — structured, typed actions that the system can discover, compose, and execute.
Third-party AI Extensions. WWDC is expected to announce that Claude, Gemini, and other models can slot into Siri’s routing. When a user asks a complex question, Siri will delegate to the best available model — but it still needs your app’s intents to know what’s possible.
Natural-language Shortcuts. Apple is rumored to let users describe automations in plain English. “Every morning, check the weather and if it’s going to rain, remind me at 8am.” This only works if weather apps expose typed, structured data. String outputs won’t cut it.
Spotlight as discovery. I added CSSearchableItem indexing for all saved cities and active packages. Both apps now surface in system-wide search without any marketing spend. When Apple makes Spotlight smarter with AI, indexed content gets prioritized.
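Indexing a saved city takes one attribute set and one item. The identifiers and domain string below are illustrative:

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Sketch of Spotlight indexing for a single saved city.
func indexCity(id: String, name: String) {
    let attributes = CSSearchableItemAttributeSet(contentType: .text)
    attributes.title = name
    attributes.contentDescription = "Weather for \(name) in CapyCast"

    let item = CSSearchableItem(
        uniqueIdentifier: id,
        domainIdentifier: "com.example.capycast.cities", // assumed domain
        attributeSet: attributes
    )
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error { print("Indexing failed: \(error)") }
    }
}
```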
Every feature I shipped maps to an announced or rumored Apple capability. The framework is ready now. The apps that adopt early get placement advantages in Spotlight and Siri discovery rankings.
The Honest Parts
What I didn’t build yet: Widget interactivity (tapping a widget to trigger an intent), Control Center integration, Live Activities for Packybara. These are Phase 2/3 — I’m shipping the core intents first and reacting to WWDC.
What worried me: Creating a new PersistenceController per intent call in Packybara felt heavy. CoreData opens the SQLite file each time. In practice, the overhead is negligible for the query patterns I’m running. But I’ll watch for issues at scale.
What surprised me: The AppShortcutsProvider is incredibly powerful. Shortcuts appear in Spotlight and Siri immediately after install — no user setup required. This is free distribution. I wish I’d added it months ago.
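A provider is a handful of lines. The intent and phrases here are placeholders, not CapyCast's shipped ones; the key constraint is that every phrase must contain the `\(.applicationName)` token:

```swift
import AppIntents

// Minimal placeholder intent so the provider sketch is self-contained.
struct CheckWeatherIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Weather"
    func perform() async throws -> some IntentResult { .result() }
}

// Shortcuts declared here appear in Siri and Spotlight right after
// install, with no user setup.
struct CapyCastShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: CheckWeatherIntent(),
            phrases: [
                "What's the weather in \(.applicationName)",
                "Check rain in \(.applicationName)"
            ],
            shortTitle: "Check Weather",
            systemImageName: "cloud.sun"
        )
    }
}
```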
The Bigger Picture
I’ve been thinking about this in terms of distribution channels:
- App Store search — you optimize keywords and hope
- Social media — you post and pray
- The OS itself — Siri, Spotlight, Shortcuts, Control Center, Action Button
Channel 3 is emerging. And unlike 1 and 2, it’s merit-based. If your app exposes the right intents with good data, the OS surfaces you. No ad spend. No algorithm gaming.
For weather and package tracking — high-frequency, voice-friendly categories — this is existential. If “Hey Siri, what’s the weather?” routes to your app, that’s distribution you couldn’t buy.
WWDC is June 8. Apps that ship before then get a head start. I’m not waiting to find out what Apple announces. I’m building the foundation now and adapting later.
What’s Next
- React to WWDC announcements and adopt any new APIs
- Add forecast and moon phase intents for CapyCast
- Add “list active packages” and carrier filtering for Packybara
- Widget interactivity (tap to refresh via App Intents)
- Live Activity for package tracking on the lock screen
The capybara is learning to talk. And the OS is learning to listen.