Running Multiple Projects on a Single Supabase Instance

Stop paying for separate DBs! Learn to isolate multiple projects on one Supabase instance using PostgreSQL schemas and Drizzle ORM—all within the free tier.

As an indie developer, choosing a database provider often feels like picking your poison. Every option has trade-offs:

  • Supabase: The free tier limits you to 2 projects. Upgrade to a paid plan, and each additional project costs $10/month. If you're juggling multiple side projects, the costs add up fast.
  • Neon: They advertise 100 free projects, but compute time is the real cost driver. Once your apps get real traffic, those CU-hours become the bulk of your bill.
  • Self-hosted PostgreSQL: Sure, you save money upfront, but now you're on the hook for maintenance, performance tuning, and security. Not exactly passive income.

Here's a better approach: use PostgreSQL schemas to run multiple projects on a single Supabase database.

This strategy works best when you have several low-traffic projects that, combined, still fit comfortably within Supabase's free tier (5GB egress/month—plenty for most side projects).

What Are PostgreSQL Schemas?

Schemas are a native PostgreSQL feature, not something unique to Supabase. Think of them like folders in a file system.

Within a single database instance, you can create multiple schemas. Each schema can contain its own tables, and they won't interfere with each other. By default, Supabase uses the public schema, but there's nothing stopping you from creating your own.

This is how you can isolate multiple projects in one database without them stepping on each other's toes.
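
To make that concrete, here's a minimal sketch using the postgres.js client (the schema names, table name, and `DATABASE_URL` are just placeholders for illustration): two schemas can each own a table with the same name without ever colliding.

```typescript
// Two projects, one database: each schema owns its own "posts" table.
import postgres from 'postgres';

const sql = postgres(process.env.DATABASE_URL!);

async function demo() {
  await sql`CREATE SCHEMA IF NOT EXISTS project_a`;
  await sql`CREATE SCHEMA IF NOT EXISTS project_b`;

  // Same table name in both schemas; they never interfere with each other.
  await sql`CREATE TABLE IF NOT EXISTS project_a.posts (id serial PRIMARY KEY, title text)`;
  await sql`CREATE TABLE IF NOT EXISTS project_b.posts (id serial PRIMARY KEY, title text)`;

  await sql.end();
}

demo();
```

You won't run DDL like this by hand, though. As you'll see in the migration guide below, Drizzle generates the equivalent statements for you.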

Why Use Supabase Schemas?

Many people think Supabase's free tier only gives you 2 databases. In reality, those 2 databases come with 5GB of egress and 500MB of storage, limits most products never hit. If multiple projects can share a database, why not cut costs dramatically?

Using Supabase schemas lets you share database resources while keeping each project's data isolated. You maximize the free tier, maintain security, and minimize costs—making it the ideal choice for indie developers.

Migration Guide: From Neon to Supabase with Schema Isolation

Note: The file paths below are based on the NEXTY.DEV boilerplate.

Let's walk through migrating a Next.js + Drizzle ORM project from Neon to Supabase using a dedicated schema.

1. Configure Drizzle to Use a Custom Schema

First, update your drizzle.config.ts to add a schemaFilter. This tells Drizzle to only manage your custom schema, so it won't accidentally drop tables in the public schema.

```typescript
// drizzle.config.ts
import { defineConfig } from 'drizzle-kit';

export default defineConfig({
  // ... other config
  schemaFilter: ['my_project_schema'], // <--- Key line: specify your schema name
});
```
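
If you run several projects against the same Supabase database, give each project its own schema name here (for example `project_a`, `project_b`) so their tables stay fully separated.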

2. Update Your Schema Definitions

Next, modify lib/db/schema.ts to define all tables within your custom schema instead of the default public schema.

Pro tip: Define a schema object first, then use it to create all your tables.

```typescript
// lib/db/schema.ts
import { pgSchema, uuid } from 'drizzle-orm/pg-core';

// 1. Define your schema
export const myProjectSchema = pgSchema('my_project_schema'); // <--- Match the name in drizzle.config.ts

// 2. Replace pgTable with myProjectSchema.table
// Old: export const user = pgTable('user', { ... })
// New:
export const user = myProjectSchema.table('user', {
  id: uuid('id').primaryKey().defaultRandom(),
  // ... other columns
});

// 3. Enums also need to be scoped to your schema
// Old: export const roleEnum = pgEnum('role', ['admin', 'user'])
// New:
export const roleEnum = myProjectSchema.enum('role', ['admin', 'user']);
```
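
Once the tables live in your custom schema, your application code doesn't change. As a sketch, assuming a `lib/db/index.ts` that exports the Drizzle client and a `DATABASE_URL` environment variable (both assumptions, adjust to your project), the setup stays the usual one:

```typescript
// lib/db/index.ts (sketch; file path and env var name are assumptions)
import { drizzle } from 'drizzle-orm/postgres-js';
import postgres from 'postgres';

import * as schema from './schema';

const client = postgres(process.env.DATABASE_URL!);
export const db = drizzle(client, { schema });

// Queries look exactly the same as before; Drizzle qualifies every
// table with my_project_schema in the SQL it generates.
// e.g. await db.select().from(schema.user);
```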

3. Reset Your Migration History

For a full migration like this, don't try to generate incremental migrations on top of your old ones. You'll run into all sorts of headaches.

Instead, reset your migration history and let Drizzle generate everything from scratch:

  1. Delete old migrations: Remove everything in lib/db/migrations/.
  2. Generate fresh migrations:
    ```bash
    pnpm db:generate
    # or
    pnpm drizzle-kit generate
    ```
  3. Apply migrations to create the new schema and tables in Supabase:
    ```bash
    pnpm db:migrate
    # or
    pnpm drizzle-kit migrate
    ```

Drizzle will now create my_project_schema and all your tables automatically. No need to write raw SQL in the Supabase editor.

4. Migrate Your Data with a Script

Now you need to copy data from your old database (Neon) to your new one (Supabase + custom schema).

Add your old and new connection strings to .env, then use this script as a starting point. You can ask an AI assistant to tailor it to your specific schema.

```typescript
import { drizzle } from 'drizzle-orm/postgres-js';
import postgres from 'postgres';
import * as schema from '../lib/db/schema';
import * as dotenv from 'dotenv';

dotenv.config();

const { OLD_DB_URL, NEW_DB_URL } = process.env;

// Define migration order (respect foreign key dependencies: base tables first)
const tables = [
  { name: 'user', table: schema.user }, // Base table
  { name: 'pricing_plans', table: schema.pricingPlans },
  { name: 'orders', table: schema.orders }, // Depends on user & pricing_plans
  // ... other tables
];

async function main() {
  const sourceClient = postgres(OLD_DB_URL!, { max: 1 });
  // If the new URL points at Supabase's transaction pooler, also pass `prepare: false` to this client.
  const destClient = postgres(NEW_DB_URL!, { max: 1 });
  const destDb = drizzle(destClient, { schema });

  for (const t of tables) {
    console.log(`Migrating table: ${t.name}...`);

    // 1. Build column mapping (db_column -> schema_property)
    const columnMapping: Record<string, string> = {};
    const tableColumns = t.table;
    for (const key in tableColumns) {
      // @ts-ignore
      const col = tableColumns[key];
      // Schema property keys are typically camelCase (e.g., cardTitle)
      // while col.name is the actual DB column (e.g., card_title)
      if (col && typeof col === 'object' && 'name' in col) {
        columnMapping[col.name] = key;
      }
    }

    // 2. Fetch data from old database (assuming public schema)
    const rows = await sourceClient`SELECT * FROM public.${sourceClient(t.name)}`;

    if (rows.length === 0) continue;

    // 3. Transform data (snake_case -> camelCase)
    const transformRow = (row: any) => {
      const newRow: any = {};
      for (const dbColName in row) {
        const schemaKey = columnMapping[dbColName];
        // Use the schema key if defined, otherwise keep the original column name
        newRow[schemaKey || dbColName] = row[dbColName];
      }
      return newRow;
    };

    // 4. Batch insert into new database
    const batchSize = 100;
    for (let i = 0; i < rows.length; i += batchSize) {
      const batch = rows.slice(i, i + batchSize).map(transformRow);
      await destDb.insert(t.table).values(batch).onConflictDoNothing();
    }
  }

  await sourceClient.end();
  await destClient.end();
  console.log('Migration complete!');
}

main().catch((err) => {
  console.error('Migration failed:', err);
  process.exit(1);
});
```
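
Run the script once with a TypeScript runner such as tsx, for example `pnpm tsx path/to/your-migration-script.ts` (adjust the path to wherever you save it), then spot-check a few tables in the Supabase dashboard before switching traffic over.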

5. Update Production Environment Variables

Finally, update your production .env to point to the new database:

```env
DATABASE_URL=your-new-supabase-connection-string
```

Redeploy your application, and you're done.

Wrapping Up

With this approach, you can comfortably run 10–20 low-traffic side projects on a single Supabase database. Each project gets its own isolated schema, keeping data clean and secure—while you stay firmly in the free tier.

If you're using the NEXTY.DEV boilerplate, there's a built-in Agent SKILL to help automate this migration. For everyone else, just share this guide with your AI assistant, and it'll handle most of the heavy lifting.