
How I Replaced a $20/Month Server with $1.50 in Serverless

Tags: Supabase, iOS, Serverless, Swift

You’ll walk away from this post knowing how to replace a traditional Node.js server with Supabase Edge Functions, wire up scheduled tasks with pg_cron, and keep your iOS app talking to it all with minimal changes. You’ll also see what it looks like to delegate an entire migration to Claude Code and just… direct.

Assumes some familiarity with Supabase, Swift, and the general idea that servers cost money you’d rather not spend.

Alright!


The Problem

MarketCapy is a sarcastic capybara that roasts the financial markets. It’s a fun side project. It makes zero dollars. And it was running on a $20/month DigitalOcean droplet.

The droplet ran a Node.js backend that had accumulated features like barnacles on a ship: gRPC streaming, stock endpoints, portfolio simulators, DCA calculators. Most of it was dead code. The app had evolved into a digest-first experience. All it actually needed was three things:

  1. Generate a sarcastic market digest (via AI) on a schedule
  2. Send push notifications when a new digest drops
  3. Serve coin data when someone taps a $TICKER mention

That’s it. Three things. Running on a server that was provisioned for twenty.


The New Architecture

Here’s what replaced the droplet:

pg_cron (5 UTC times x 2 languages)
    |
    v
generate-digest (Edge Function)
    |-- Perplexity API (sonar model) --> sarcastic digest
    |-- Supabase DB (upsert digest)
    |-- APNs (push to matching devices)

iOS App
    |-- GET /get-digest    --> latest digest from DB
    |-- GET /get-coin      --> CoinGecko data (cached in DB)
    |-- POST /register-push --> save device token

Four Edge Functions. One PostgreSQL database. Zero servers to maintain. Zero lines written by hand — I’ll get to that.


Edge Functions Walkthrough

Full disclosure: I didn’t write any of this TypeScript. I described what each function needed to do — inputs, outputs, behavior — and Claude Code wrote them. I reviewed the output, asked for changes when something was off, and deployed. That was the workflow for the entire migration.

generate-digest: The Main Event

This is the only function that does real work. It calls Perplexity’s sonar model with a system prompt that makes the AI behave like a sarcastic capybara, stores the result, and fires push notifications.

The Perplexity call:

const response = await fetch("https://api.perplexity.ai/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${perplexityApiKey}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "sonar",
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userPrompt },
    ],
    max_tokens: 1024,
    temperature: 0.7,
    return_citations: true,
  }),
});
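Pulling the digest text back out of that response is the usual chat-completions shape. A minimal sketch, assuming Perplexity's OpenAI-compatible response format (`choices[0].message.content`, plus a top-level `citations` array):

```typescript
// Pull the digest text and citations out of a chat-completions response body.
// Field names assume Perplexity's OpenAI-compatible format.
interface ChatResponse {
  choices?: { message?: { content?: string } }[];
  citations?: string[];
}

function extractSummary(data: ChatResponse): { summary: string; citations: string[] } {
  const summary = data.choices?.[0]?.message?.content;
  if (!summary) {
    // Failing loudly means a broken slot shows up in the function logs
    // instead of silently upserting an empty digest.
    throw new Error("Perplexity response contained no message content");
  }
  return { summary: summary.trim(), citations: data.citations ?? [] };
}
```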

The digest gets upserted on a composite key of slot + language + date, so re-running the same time slot simply overwrites the existing row:

const { error } = await supabase.from("digests").upsert(
  { slot, language, summary, date: today, raw_news: rawNews },
  { onConflict: "slot,language,date" }
);

Push notifications go out via APNs, using a JWT signed with a .p8 key stored as a base64-encoded env var. Only English digests trigger pushes, to avoid sending bilingual users duplicate notifications.
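Loading that key inside the function is just a base64 decode back to PEM text before signing. A sketch; the validation check is illustrative, not the deployed code:

```typescript
// Decode the base64-encoded .p8 key back into PEM text before signing the
// APNs JWT. The sanity check is illustrative, not the project's actual code.
function decodeP8(base64Key: string): string {
  const pem = atob(base64Key); // atob exists in both Deno and modern Node
  if (!pem.includes("BEGIN PRIVATE KEY")) {
    throw new Error("decoded env var does not look like a PEM private key");
  }
  return pem;
}
```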

get-digest: The Simple One

Fifteen lines of actual logic. Query the digests table, return the latest row for the requested language. That’s the whole function:

let query = supabase
  .from("digests")
  .select("*")
  .eq("language", lang)
  .order("created_at", { ascending: false })
  .limit(1);

if (slot) {
  query = query.eq("slot", slot);
}
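The only part not shown is shaping the result into a response: the newest matching row, or a 404 if nothing exists yet. Roughly, as a sketch of the handler's tail:

```typescript
// Shape the query result into an HTTP response: the newest digest row when
// one exists, otherwise a 404. `rows` is what the limited query returns.
function digestResponse(rows: { summary: string }[] | null): Response {
  const latest = rows?.[0];
  if (!latest) {
    return new Response(JSON.stringify({ error: "no digest found" }), {
      status: 404,
      headers: { "Content-Type": "application/json" },
    });
  }
  return new Response(JSON.stringify(latest), {
    headers: { "Content-Type": "application/json" },
  });
}
```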

get-coin: Cached CoinGecko Proxy

When a user taps $BTC in the digest, the app calls this function. It checks a coin_cache table first (5-minute TTL), hits CoinGecko if stale, and upserts the fresh data back. If CoinGecko is down but we have stale cache, we return it anyway. Graceful degradation for free:

const { data: cached } = await supabase
  .from("coin_cache")
  .select("*")
  .eq("coin_id", coinId)
  .single();

if (cached) {
  const age = Date.now() - new Date(cached.fetched_at).getTime();
  if (age < CACHE_TTL_MS) {
    return new Response(JSON.stringify(cached.data), {
      headers: { "Content-Type": "application/json" },
    });
  }
}
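The stale-fallback branch lives further down the function; the decision logic can be pictured as a small pure helper (names and structure are illustrative, not the deployed code):

```typescript
type CacheDecision = "serve-fresh" | "refetch" | "serve-stale";

// Decide what to do with a cached row. `fetchedAt` is the row's ISO timestamp
// (null when nothing is cached); `upstreamHealthy` is whether CoinGecko is
// reachable. Illustrative helper, not the deployed function.
function decideCache(
  fetchedAt: string | null,
  ttlMs: number,
  upstreamHealthy: boolean,
  now: number = Date.now(),
): CacheDecision {
  if (fetchedAt === null) return "refetch"; // cache miss: must hit CoinGecko
  const age = now - new Date(fetchedAt).getTime();
  if (age < ttlMs) return "serve-fresh";   // within the 5-minute TTL
  return upstreamHealthy ? "refetch" : "serve-stale"; // graceful degradation
}
```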

register-push: Token Storage

POST a device token, upsert it. Done:

const { error } = await supabase.from("push_devices").upsert(
  { token, platform, mode, enabled: true },
  { onConflict: "token" }
);
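The only real work around that upsert is validating the request body. A hypothetical helper, with field names taken from the upsert above and the platform default mirroring the iOS client:

```typescript
// Validate the JSON body of a register-push request before the upsert.
// Field names match the upsert above; the helper itself is illustrative.
interface PushRegistration {
  token: string;
  platform: string;
  mode?: string;
}

function parseRegistration(body: unknown): PushRegistration | string {
  if (typeof body !== "object" || body === null) return "missing JSON body";
  const { token, platform, mode } = body as Record<string, unknown>;
  if (typeof token !== "string" || token.length === 0) return "token is required";
  return {
    token,
    platform: typeof platform === "string" ? platform : "ios", // iOS client default
    mode: typeof mode === "string" ? mode : undefined,
  };
}
```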

Scheduling with pg_cron

Supabase gives you pg_cron and pg_net extensions. Together they let you schedule HTTP calls from inside PostgreSQL. No external scheduler needed.

Five time slots, two languages each, plus cleanup jobs:

-- Morning digest (English at :00, Portuguese at :02)
SELECT cron.schedule('digest-morning-en', '0 8 * * *',
  $$SELECT net.http_post(
    url := 'https://your-project.supabase.co/functions/v1/generate-digest',
    body := '{"slot":"morning","language":"en"}'::jsonb,
    headers := '{"Content-Type": "application/json", "Authorization": "Bearer <service_role_key>"}'::jsonb
  )$$
);

-- Cleanup old digests daily at 3 AM
SELECT cron.schedule('cleanup-old-digests', '0 3 * * *',
  $$DELETE FROM digests WHERE date < CURRENT_DATE - INTERVAL '7 days'$$
);

-- Purge stale coin cache every 30 minutes
SELECT cron.schedule('cleanup-coin-cache', '*/30 * * * *',
  $$DELETE FROM coin_cache WHERE fetched_at < NOW() - INTERVAL '1 hour'$$
);
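The ten generation jobs are a mechanical slot × language cross (Portuguese offset by two minutes), so the schedule pairs can be generated instead of hand-written. A sketch; the slot names and UTC hours here are illustrative, not the app's exact schedule:

```typescript
// Generate the name + cron expression for every digest slot × language pair.
// Slot names and UTC hours are illustrative; the real schedule may differ.
const SLOT_HOURS: Record<string, number> = {
  morning: 8,
  midday: 12,
  afternoon: 15,
  evening: 18,
  night: 22,
};

function digestJobs(): { name: string; schedule: string }[] {
  const jobs: { name: string; schedule: string }[] = [];
  for (const [slot, hour] of Object.entries(SLOT_HOURS)) {
    // English fires on the hour, Portuguese two minutes later.
    for (const [lang, minute] of [["en", 0], ["pt", 2]] as const) {
      jobs.push({
        name: `digest-${slot}-${lang}`,
        schedule: `${minute} ${hour} * * *`,
      });
    }
  }
  return jobs;
}
```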

Ten generation jobs + two cleanup jobs = twelve cron entries. All defined in a single migration file. Push it with supabase db push and you’re done.


What Changed on iOS

Surprisingly little. The app already talked to a REST API. The URL changed and the headers gained a Supabase anon key. That’s basically it.

The entire API surface fits in one file:

public struct MarketCapyAPI: APIRequest {
    public var baseURL: URL {
        URL(string: "https://your-project.supabase.co/functions/v1")!
    }

    public static func fetchNewsSummary(lang: String? = nil) -> MarketCapyAPI {
        var params: [String: Any]? = nil
        if let lang = lang { params = ["lang": lang] }
        return MarketCapyAPI(urlString: "/get-digest", parameters: params, method: .get)
    }

    public static func fetchSingleMarketCoin(coinId: String) -> MarketCapyAPI {
        MarketCapyAPI(urlString: "/get-coin", parameters: ["id": coinId], method: .get)
    }

    static func postPushToken(token: String, platform: String = "ios", mode: String? = nil) -> MarketCapyAPI {
        var params: [String: Any] = ["token": token, "platform": platform]
        if let mode = mode { params["mode"] = mode }
        return MarketCapyAPI(urlString: "/register-push", parameters: params, method: .post)
    }
}

Three endpoints. Three static factory methods. The networking layer adds the Supabase apikey header automatically.

What got deleted? The gRPC streaming layer, stock-specific endpoints, portfolio simulator APIs, DCA calculator endpoints, coin-specific news summaries. Hundreds of lines of dead code and an entire GRPC/ directory. Claude Code ripped all of it out, updated the imports, and I just confirmed the diffs. Gone.


The Prompt Engineering Catch

Here’s the part where I actually had to think. The iOS app parses the digest by splitting on ". " (period + space) to separate intro, body, and closing:

let sentences = text.components(separatedBy: ". ").filter { !$0.isEmpty }

if sentences.count >= 3 {
    intro = sentences[0] + "."
    let middleEnd = max(1, sentences.count - 1)
    body = sentences[1..<middleEnd].joined(separator: ". ") + "."
    closing = sentences.last.map { $0.hasSuffix(".") ? $0 : $0 + "." } ?? ""
}

First sentence becomes the intro card. Middle sentences become the body. Last sentence becomes the closing. Simple and fragile.

The Perplexity prompt has to match this contract exactly. If the AI uses line breaks, bullet points, or forgets trailing periods, the parser chokes. So the system prompt is aggressive about format:

FORMAT (strict -- the app parses by splitting on ". " so each sentence matters):
1. First sentence: A single sarcastic one-liner about today's market mood. End with a period.
2. Middle sentences (3-5): The actual digest. Each sentence ends with a period.
3. Last sentence: A deadpan sardonic closing one-liner. End with a period.

CRITICAL FORMAT RULES:
- Write EXACTLY as continuous plain text -- one paragraph, no line breaks
- Every sentence must end with a period followed by a space

Post-processing strips any markdown or citation markers that slip through:

summary = summary
  .replace(/\*\*/g, "")
  .replace(/##\s*/g, "")
  .replace(/^[-\u2022]\s/gm, "")
  .replace(/\[\d+\]/g, "")
  .replace(/\s{2,}/g, " ")
  .trim();

The lesson: when your AI output feeds a deterministic parser, the prompt isn’t a suggestion. It’s a contract.
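Since the prompt is a contract, it can also be enforced server-side before the upsert. A sketch of a validator mirroring the iOS parser's assumptions (not part of the original function):

```typescript
// Enforce the digest format contract the iOS parser depends on: one plain
// paragraph, no line breaks, at least intro + body + closing, and a
// period-terminated final sentence. Mirrors the app's split on ". ".
function validDigest(text: string): boolean {
  if (/[\r\n]/.test(text)) return false;        // must be a single paragraph
  if (!text.trim().endsWith(".")) return false; // last sentence needs a period
  const sentences = text.split(". ").filter((s) => s.length > 0);
  return sentences.length >= 3;                 // intro + body + closing
}
```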


Cost Breakdown

Component        DigitalOcean (before)   Supabase (after)
Server/hosting   $20/month (droplet)     $0 (free tier)
Database         included                $0 (free tier, <500MB)
Edge Functions   N/A                     $0 (free tier, <500K invocations)
Perplexity API   ~$1.50/month            ~$1.50/month
Total            $20/month               ~$1.50/month

That’s a 92% reduction. The Perplexity calls are the only real cost, and those would exist regardless of infrastructure.

The free tier covers everything else comfortably. Ten digest generations per day, a few hundred coin lookups, and push notification delivery. MarketCapy isn’t exactly processing Super Bowl traffic.


What I’d Do Differently

Not much, honestly. The whole migration took about 40 minutes. I described the old architecture, pointed Claude Code at the existing codebase, told it what needed to stay and what needed to go, and reviewed the output. The Edge Functions, the pg_cron migration, the iOS endpoint swap, the dead code cleanup — all delegated, all reviewed, all shipped in one sitting.

My actual contribution was the decision-making: what to migrate to, which features to kill, how the prompt needed to work with the parser. The implementation was Claude Code’s job. I’m not going to pretend otherwise.

If I were starting from scratch, I’d skip the droplet entirely. Supabase Edge Functions + pg_cron is a surprisingly complete backend for apps that are API-thin and schedule-heavy. You don’t need a server for this.

The capybara remains unimpressed.