Supabase
TalkingSchema connects directly to your Supabase project to import your live PostgreSQL schema and render a complete entity relationship diagram. From there, you can use the AI copilot to design new tables and relationships, generate migration SQL, and export your schema in a variety of formats — all without leaving the browser.
Supabase uses PostgreSQL under the hood, which means TalkingSchema has full support for Supabase-specific features: UUID primary keys, TIMESTAMPTZ columns, PostgreSQL enum types, Row Level Security (RLS) policy awareness, and Supabase's auth.users foreign key conventions.
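For example, a table using these conventions might be declared as follows (an illustrative sketch with made-up names, not tied to any particular project):

```sql
-- Illustrates the Supabase-flavored PostgreSQL features listed above:
-- a UUID primary key, a TIMESTAMPTZ column, an enum type, and a
-- foreign key into Supabase's built-in auth.users table.
create type event_severity as enum ('info', 'warning', 'critical');

create table public.audit_events (
  event_id   uuid primary key default gen_random_uuid(),
  severity   event_severity not null default 'info',
  actor_id   uuid references auth.users (id),
  created_at timestamptz not null default now()
);
```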
Connecting to Supabase
Step 1 — Get your connection string
In your Supabase project:
- Go to Project Settings → Database
- Under Connection string, select URI, and use the Direct connection (port 5432) rather than the connection pooler
- Copy the connection string — it looks like:
postgresql://postgres:[YOUR-PASSWORD]@db.[PROJECT-REF].supabase.co:5432/postgres
Step 2 — Connect in TalkingSchema
- Click Import in the top toolbar
- Select Connect to database
- Choose Supabase
- Paste your connection string or enter credentials individually:
  - Host: db.[PROJECT-REF].supabase.co
  - Port: 5432
  - Database: postgres
  - User: postgres
  - Password: your database password
- Click Connect
TalkingSchema reads schema metadata only — no row data is accessed. Your credentials are used for this session only and are not stored.
Step 3 — Explore your ERD
Your Supabase schema appears on the ERD canvas. For the GSSC (Global Sustainable Supply Chain) example schema:
- All 10 tables render as nodes with their columns and type badges
- Foreign key relationships between products → suppliers, purchase_orders → suppliers, purchase_orders → warehouses, etc. are drawn as relationship lines with crow's foot cardinality notation
- Unique constraints (e.g., inventory.warehouse_id + product_id) are indicated on the table nodes
- PostgreSQL enum types are shown inline
Designing New Features on Your Supabase Schema
Once your live schema is in TalkingSchema, the AI copilot can help you design schema additions that follow Supabase and PostgreSQL best practices.
Adding a table with RLS awareness
Add a supplier_audit_log table to track all changes to supplier records.
Columns: log_id UUID PK, supplier_id FK → suppliers, changed_by UUID FK → auth.users,
change_type (enum: insert/update/delete), changed_at TIMESTAMPTZ, old_data JSONB, new_data JSONB.
Follow Supabase naming conventions.
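The SQL this prompt asks for might look roughly like the sketch below (column and enum names come from the prompt itself; the referenced suppliers primary key column is an assumption, and the copilot's actual output may differ):

```sql
-- Enum for the change_type column requested in the prompt.
create type change_type as enum ('insert', 'update', 'delete');

create table public.supplier_audit_log (
  log_id      uuid primary key default gen_random_uuid(),
  -- Assumes the suppliers table's PK column is named supplier_id.
  supplier_id uuid not null references public.suppliers (supplier_id),
  -- Supabase convention: track the acting user via auth.users.
  changed_by  uuid references auth.users (id),
  change_type change_type not null,
  changed_at  timestamptz not null default now(),
  old_data    jsonb,
  new_data    jsonb
);
```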
Adding Row Level Security guidance
I need RLS policies for the carbon_offset_credits table.
Only authenticated users whose supplier_id matches the record's supplier_id should see their own credits.
Generate: CREATE POLICY statements for SELECT, INSERT, UPDATE, and DELETE.
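The resulting policies might resemble this sketch. It assumes a public.profiles table that maps the authenticated user (auth.uid()) to a supplier_id — if your schema links users to suppliers differently, adapt the subquery accordingly:

```sql
alter table public.carbon_offset_credits enable row level security;

create policy "select own credits" on public.carbon_offset_credits
  for select to authenticated
  using (supplier_id = (select supplier_id from public.profiles where id = auth.uid()));

create policy "insert own credits" on public.carbon_offset_credits
  for insert to authenticated
  with check (supplier_id = (select supplier_id from public.profiles where id = auth.uid()));

create policy "update own credits" on public.carbon_offset_credits
  for update to authenticated
  using (supplier_id = (select supplier_id from public.profiles where id = auth.uid()))
  with check (supplier_id = (select supplier_id from public.profiles where id = auth.uid()));

create policy "delete own credits" on public.carbon_offset_credits
  for delete to authenticated
  using (supplier_id = (select supplier_id from public.profiles where id = auth.uid()));
```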
Generating a Supabase migration file
Generate a Supabase migration SQL file for these schema changes.
Format compatible with the Supabase CLI: supabase/migrations/{timestamp}_{description}.sql
Include: CREATE TABLE, CREATE INDEX, ENABLE ROW LEVEL SECURITY, and CREATE POLICY statements.
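Put together, a generated migration file might be structured like this sketch (the timestamp, index name, and column names are illustrative):

```sql
-- supabase/migrations/20250101120000_add_supplier_audit_log.sql
create table public.supplier_audit_log (
  log_id      uuid primary key default gen_random_uuid(),
  supplier_id uuid not null references public.suppliers (supplier_id),
  changed_by  uuid references auth.users (id),
  changed_at  timestamptz not null default now(),
  old_data    jsonb,
  new_data    jsonb
);

create index supplier_audit_log_supplier_id_idx
  on public.supplier_audit_log (supplier_id);

alter table public.supplier_audit_log enable row level security;

create policy "read own audit rows" on public.supplier_audit_log
  for select to authenticated
  using (changed_by = auth.uid());
```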
Exporting for Supabase Deploy
Direct SQL migration
Export your schema changes as SQL and apply them via:
# Using Supabase CLI
supabase migration new add_carbon_offset_credits
# Paste the generated SQL into the new migration file
supabase db push
Using the deploy-to-database feature
TalkingSchema can run your generated SQL directly against your connected Supabase database. Navigate to Export → Deploy to Database after reviewing your schema changes on the canvas.
What Gets Imported from Supabase
| Schema element | Imported? |
|---|---|
| Tables and columns | ✅ All tables in the public schema |
| Column data types | ✅ Including Supabase-specific: UUID, JSONB, TIMESTAMPTZ |
| Primary keys | ✅ Including composite PKs |
| Foreign key relationships | ✅ Including ON DELETE/UPDATE cascade rules |
| Unique constraints | ✅ |
| Check constraints | ✅ |
| PostgreSQL enum types | ✅ |
| Indexes | ✅ User-created indexes |
| Views | Roadmap |
| Functions / triggers | Roadmap |
| auth.* schema | Partial: the auth.users table is recognized as a referenced entity in foreign keys |
| Row Level Security policies | Roadmap |
Frequently Asked Questions
Does TalkingSchema access my Supabase data?
No. TalkingSchema reads schema metadata only — table names, column definitions, types, and constraints. No row data is accessed at any point.
What about the auth.users foreign key pattern?
TalkingSchema recognizes foreign key references to auth.users(id) and renders them as relationship lines on the canvas. The auth schema tables appear as reference nodes. The AI copilot understands this pattern and generates new tables with auth.users references when instructed.
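For instance, a table generated with this pattern might reference auth.users like so (the table and column names are illustrative):

```sql
create table public.user_preferences (
  -- One row per Supabase auth user; deleting the user removes the row.
  user_id    uuid primary key references auth.users (id) on delete cascade,
  theme      text not null default 'light',
  updated_at timestamptz not null default now()
);
```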
Can I use Supabase's connection pooler (Transaction or Session mode)?
Use the Direct connection string (port 5432) for schema introspection. The connection pooler (port 6543, in either Transaction or Session mode) is not suitable for reading schema metadata. The direct connection string is available in Project Settings → Database → Connection pooling → Direct connection.
Does this work with Supabase's branching feature?
Yes. Each Supabase branch has its own connection string. Connect TalkingSchema to your branch database to design and test schema changes before merging.