How to set up Supabase Edge Functions for AI workloads

Quick Answer: Create the function with `supabase functions new ai-handler`, write a Deno handler that reads the user JWT, calls a model provider, and writes results back via the Supabase client with row-level security. Deploy with `supabase functions deploy ai-handler` and call from the frontend using `supabase.functions.invoke()` with the user's session token.

Why Edge Functions Fit AI Workloads

Supabase Edge Functions run on Deno Deploy, co-located with the database, and inherit row-level security via a forwarded user JWT. For AI workloads — embedding ingestion, RAG retrieval, model orchestration — that proximity to Postgres matters because most AI calls bracket a database read or write.

Step-by-Step Setup

  1. Install the Supabase CLI and authenticate: `supabase login`
  2. Initialise the project locally if it is not already: `supabase init`
  3. Create the function: `supabase functions new ai-handler`
  4. Write the handler in `supabase/functions/ai-handler/index.ts`:
```ts
import { createClient } from "jsr:@supabase/supabase-js@2";

Deno.serve(async (req) => {
  // Forward the caller's JWT so Postgres enforces row-level security.
  const authHeader = req.headers.get("Authorization");
  if (!authHeader) return new Response("unauthorized", { status: 401 });

  const supabase = createClient(
    Deno.env.get("SUPABASE_URL")!,
    Deno.env.get("SUPABASE_ANON_KEY")!,
    { global: { headers: { Authorization: authHeader } } },
  );

  const { prompt } = await req.json();
  if (typeof prompt !== "string" || prompt.length === 0) {
    return new Response("missing prompt", { status: 400 });
  }

  // Call the model provider with the server-side secret.
  const r = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${Deno.env.get("OPENAI_API_KEY")}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!r.ok) {
    return new Response(`upstream error: ${r.status}`, { status: 502 });
  }
  const data = await r.json();

  // Log under the user's identity; RLS decides whether the insert succeeds.
  const { error } = await supabase.from("ai_logs").insert({ prompt, response: data });
  if (error) console.error("ai_logs insert failed:", error.message);

  return new Response(JSON.stringify(data), {
    headers: { "Content-Type": "application/json" },
  });
});
```
  5. Set environment secrets: `supabase secrets set OPENAI_API_KEY=sk-...`
  6. Deploy: `supabase functions deploy ai-handler`
  7. Call from the frontend: `await supabase.functions.invoke("ai-handler", { body: { prompt } })`
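On the frontend, it helps to validate input before invoking the function. A minimal sketch, where `validatePrompt` is a hypothetical helper and the commented call assumes an initialized supabase-js client with an active session:

```typescript
// Hypothetical client-side guard: reject empty prompts before a network round trip.
export function validatePrompt(prompt: unknown): string {
  if (typeof prompt !== "string" || prompt.trim().length === 0) {
    throw new Error("prompt must be a non-empty string");
  }
  return prompt.trim();
}

// Usage (assumes an initialized supabase-js client with a signed-in user):
// const { data, error } = await supabase.functions.invoke("ai-handler", {
//   body: { prompt: validatePrompt(userInput) },
// });
// if (error) console.error("edge function failed:", error.message);
```

Because `functions.invoke` automatically forwards the current session's access token, the handler above receives the JWT it needs for RLS without extra wiring.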

Operational Notes

Edge Functions on Supabase Pro have a 150-second wall-clock limit per invocation as of May 2026. Long-running AI tasks that exceed this limit should be queued with Postgres `pg_cron` plus a dedicated worker, or moved to a Vercel Edge Function or a self-hosted runtime. For streaming responses, return a `ReadableStream` from the handler and parse the SSE events on the client.
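On the client, a streamed body arrives as raw SSE text. A minimal sketch of extracting the `data:` payloads from a chunk, assuming OpenAI-style events; `parseSseChunk` is a hypothetical helper, and a production parser would also buffer chunks that split mid-line:

```typescript
// Extract JSON payloads from an OpenAI-style SSE chunk.
// Event lines look like `data: {...}`; the stream terminates with `data: [DONE]`.
export function parseSseChunk(chunk: string): string[] {
  const payloads: string[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blanks and comments
    const payload = trimmed.slice("data:".length).trim();
    if (payload === "[DONE]") continue; // end-of-stream sentinel, not JSON
    payloads.push(payload);
  }
  return payloads;
}
```

Feed each decoded chunk from `response.body.getReader()` through this before `JSON.parse`, so the `[DONE]` sentinel never reaches the parser.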

Common Mistakes

Two recurring issues: forgetting to set the `OPENAI_API_KEY` secret (causing 500 errors on the first call), and using the service role key in the function (which bypasses RLS and silently expands the attack surface). Always pass the anon key and forward the user JWT for AI handlers that read or write user-scoped data.
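The missing-secret failure can be caught at boot rather than on the first request. A sketch of a fail-fast guard; `requireEnv` is a hypothetical helper, not a Supabase API:

```typescript
// Hypothetical startup guard: throw immediately if required secrets are absent,
// instead of surfacing opaque 500s on the first invocation.
export function requireEnv(
  env: Record<string, string | undefined>,
  names: string[],
): Record<string, string> {
  const found: Record<string, string> = {};
  const missing: string[] = [];
  for (const name of names) {
    const value = env[name];
    if (value) found[name] = value;
    else missing.push(name);
  }
  if (missing.length > 0) {
    throw new Error(`missing secrets: ${missing.join(", ")}`);
  }
  return found;
}
```

In the handler, call it once at module load, e.g. `requireEnv(Deno.env.toObject(), ["SUPABASE_URL", "SUPABASE_ANON_KEY", "OPENAI_API_KEY"])`, so a misconfigured deploy fails loudly in the function logs.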

By Rafal Fila
