Database Migration Tools: Flyway, Liquibase, Prisma Migrate, Drizzle Kit, and golang-migrate
Database migrations are the difference between "deploying is boring" and "restoring from backups at 3 AM." Every production application needs a migration strategy, and the right tool depends on whether you want to write SQL yourself or let your ORM generate it, whether rollbacks are a hard requirement, and what language ecosystem you work in. This guide breaks down five tools across two fundamental approaches and helps you pick the right one.

Two Philosophies: SQL-First vs Schema-First
Before comparing individual tools, you need to understand the two fundamental approaches to migrations.
SQL-first tools (Flyway, Liquibase, golang-migrate) expect you to write migration files by hand. You write the SQL, you control the execution order, and you are responsible for correctness. The tool manages ordering and tracking which migrations have been applied.
Schema-first tools (Prisma Migrate, Drizzle Kit) let you define your desired schema in code. The tool diffs the current database against your schema definition and generates the migration SQL automatically. You review the generated SQL, commit it, and the tool applies it.
Neither approach is universally better. SQL-first gives you total control and works with any database feature. Schema-first is faster for common operations and catches drift between your code and your database. The trade-off is control vs convenience.
Tool Comparison
| Feature | Flyway | Liquibase | Prisma Migrate | Drizzle Kit | golang-migrate |
|---|---|---|---|---|---|
| Approach | SQL-first | Changelog-first | Schema-first | Schema-first | SQL-first |
| Language | Java (CLI) | Java (CLI) | TypeScript | TypeScript | Go (CLI) |
| Migration format | SQL files | XML/YAML/SQL/JSON | Generated SQL | Generated SQL | SQL files (up/down) |
| Auto-generation | No | Diff-based | Yes (schema diff) | Yes (schema diff) | No |
| Rollback | Paid (Teams+) | Built-in | Manual (write new migration) | Limited (no down files; `drizzle-kit drop` removes a migration) | Yes (down files) |
| Database support | 20+ | 50+ | PostgreSQL, MySQL, SQLite, SQL Server, CockroachDB | PostgreSQL, MySQL, SQLite | 20+ |
| Dry run | Yes | Yes | Yes (--create-only) | Yes (generate only) | No |
| Baseline support | Yes | Yes | Yes (baselining) | Yes (push for existing) | Yes (force version) |
| CI-friendly | Yes | Yes | Yes | Yes | Yes |
| License | Apache 2.0 (OSS) + paid tiers | Apache 2.0 + paid | Apache 2.0 | Apache 2.0 | MIT |
Flyway: The SQL Purist's Standard
Flyway is the most widely used SQL-first migration tool. You write versioned SQL files, and Flyway applies them in order. It tracks applied migrations in a flyway_schema_history table. No abstraction, no ORM, no magic -- just SQL.
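Because the bookkeeping lives in an ordinary table, you can inspect it with plain SQL. A sketch (column names are from recent Flyway versions; verify against your install):

```sql
-- Inspect Flyway's bookkeeping table (created automatically on first run)
SELECT installed_rank, version, description, checksum, success
FROM flyway_schema_history
ORDER BY installed_rank;
```

A failed migration leaves a row with success = false; `flyway repair` cleans up that metadata.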
Setup
# Install via Docker (no Java required)
docker pull flyway/flyway
# Or via Homebrew
brew install flyway
# Or download the CLI
# https://flywaydb.org/download
Migration File Structure
sql/
├── V1__create_users_table.sql
├── V2__create_orders_table.sql
├── V3__add_user_email_index.sql
├── V4__add_order_status_enum.sql
├── R__refresh_user_statistics.sql # Repeatable (runs on every change)
└── U3__drop_user_email_index.sql # Undo (paid Teams+)
Naming convention: V{version}__{description}.sql. Double underscore between version and description. Flyway applies migrations in version order and records each one.
Writing Migrations
-- V1__create_users_table.sql
CREATE TABLE users (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
email TEXT NOT NULL UNIQUE,
display_name TEXT NOT NULL,
password_hash TEXT NOT NULL,
role TEXT NOT NULL DEFAULT 'user'
CHECK (role IN ('user', 'admin', 'moderator')),
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
CREATE INDEX idx_users_email ON users (email);
CREATE INDEX idx_users_role ON users (role);
-- V2__create_orders_table.sql
CREATE TABLE orders (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
status TEXT NOT NULL DEFAULT 'pending'
CHECK (status IN ('pending', 'confirmed', 'shipped', 'delivered', 'cancelled')),
total_cents BIGINT NOT NULL CHECK (total_cents >= 0),
currency TEXT NOT NULL DEFAULT 'USD',
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
CREATE INDEX idx_orders_user_id ON orders (user_id);
CREATE INDEX idx_orders_status ON orders (status);
CREATE INDEX idx_orders_created_at ON orders (created_at DESC);
-- V3__add_user_email_index.sql
-- Concurrent index creation to avoid locking writes in production.
-- Note: CREATE INDEX CONCURRENTLY cannot run inside a transaction block,
-- so this migration must be configured to run outside Flyway's default transaction.
CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_users_email_lower
ON users (LOWER(email));
Running Flyway
# Apply all pending migrations
flyway -url=jdbc:postgresql://localhost:5432/mydb \
-user=postgres \
-password=secret \
migrate
# Show migration status
flyway -url=jdbc:postgresql://localhost:5432/mydb info
# Validate: ensure applied migrations match local files
flyway -url=jdbc:postgresql://localhost:5432/mydb validate
# Repair: fix metadata after a failed migration
flyway -url=jdbc:postgresql://localhost:5432/mydb repair
# Using Docker
docker run --rm --network=host \
-v $(pwd)/sql:/flyway/sql \
flyway/flyway \
-url=jdbc:postgresql://localhost:5432/mydb \
-user=postgres \
-password=secret \
migrate
Flyway Configuration File
# flyway.conf
flyway.url=jdbc:postgresql://localhost:5432/mydb
flyway.user=postgres
flyway.password=${DB_PASSWORD}
flyway.locations=filesystem:sql
flyway.baselineOnMigrate=true
flyway.baselineVersion=0
flyway.validateMigrationNaming=true
flyway.cleanDisabled=true
When Flyway Fits
- You want full SQL control with zero abstraction
- Your team includes DBAs who review migration SQL
- JVM ecosystem (Java, Kotlin, Scala)
- Multiple database types in the same organization
- Simple, battle-tested tooling
Liquibase: Enterprise-Grade Change Management
Liquibase is similar to Flyway but supports multiple changelog formats (XML, YAML, JSON, SQL), has built-in rollback in the free tier, and offers advanced features for compliance and audit trails.
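Like Flyway, Liquibase keeps its state in plain tables: DATABASECHANGELOG records applied changesets and DATABASECHANGELOGLOCK guards against concurrent runs. A sketch of what you can inspect directly:

```sql
-- Applied changesets, in execution order
SELECT id, author, filename, dateexecuted, exectype, md5sum
FROM databasechangelog
ORDER BY orderexecuted;

-- Lock row: locked = true means another Liquibase process is running,
-- or a crashed run left the lock behind (clear with `liquibase release-locks`)
SELECT id, locked, lockgranted, lockedby
FROM databasechangeloglock;
```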
Setup
# Install via Docker
docker pull liquibase/liquibase
# Or via Homebrew
brew install liquibase
# Or via SDKMAN
sdk install liquibase
Changelog in YAML
# changelog.yaml
databaseChangeLog:
  - changeSet:
      id: 1-create-users
      author: engineering
      changes:
        - createTable:
            tableName: users
            columns:
              - column:
                  name: id
                  type: uuid
                  defaultValueComputed: gen_random_uuid()
                  constraints:
                    primaryKey: true
                    nullable: false
              - column:
                  name: email
                  type: text
                  constraints:
                    unique: true
                    nullable: false
              - column:
                  name: display_name
                  type: text
                  constraints:
                    nullable: false
              - column:
                  name: created_at
                  type: timestamptz
                  defaultValueComputed: now()
                  constraints:
                    nullable: false
      rollback:
        - dropTable:
            tableName: users

  - changeSet:
      id: 2-add-user-role
      author: engineering
      changes:
        - addColumn:
            tableName: users
            columns:
              - column:
                  name: role
                  type: text
                  defaultValue: "user"
                  constraints:
                    nullable: false
        - addCheckConstraint:
            tableName: users
            constraintName: chk_user_role
            constraintBody: "role IN ('user', 'admin', 'moderator')"
      rollback:
        - dropColumn:
            tableName: users
            columnName: role

  - changeSet:
      id: 3-create-orders
      author: engineering
      preConditions:
        - onFail: MARK_RAN
        - tableExists:
            tableName: users
      changes:
        - createTable:
            tableName: orders
            columns:
              - column:
                  name: id
                  type: uuid
                  defaultValueComputed: gen_random_uuid()
                  constraints:
                    primaryKey: true
              - column:
                  name: user_id
                  type: uuid
                  constraints:
                    nullable: false
                    foreignKeyName: fk_orders_user
                    references: users(id)
              - column:
                  name: total_cents
                  type: bigint
                  constraints:
                    nullable: false
      rollback:
        - dropTable:
            tableName: orders
Changelog in SQL
If you prefer raw SQL, Liquibase supports that too:
-- changelog.sql
-- changeset engineering:1-create-users
CREATE TABLE users (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
email TEXT NOT NULL UNIQUE,
display_name TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- rollback DROP TABLE users;
-- changeset engineering:2-add-role
ALTER TABLE users ADD COLUMN role TEXT NOT NULL DEFAULT 'user';
-- rollback ALTER TABLE users DROP COLUMN role;
Running Liquibase
# Apply pending changes
liquibase --changelog-file=changelog.yaml \
--url=jdbc:postgresql://localhost:5432/mydb \
--username=postgres \
--password=secret \
update
# Preview SQL without applying
liquibase --changelog-file=changelog.yaml update-sql
# Roll back the last N changesets
liquibase --changelog-file=changelog.yaml rollback-count 1
# Roll back to a specific tag (tags are created with: liquibase tag release-1.0)
liquibase --changelog-file=changelog.yaml rollback release-1.0
# Diff two databases
liquibase diff --referenceUrl=jdbc:postgresql://localhost:5432/staging \
--url=jdbc:postgresql://localhost:5432/production
# Generate changelog from existing database
liquibase --changelog-file=generated.yaml generate-changelog
# Show current status
liquibase --changelog-file=changelog.yaml status
Preconditions
Liquibase's precondition system lets you write defensive migrations:
- changeSet:
    id: 4-add-index-safely
    author: engineering
    preConditions:
      - onFail: MARK_RAN
      - not:
          - indexExists:
              indexName: idx_orders_user_id
    changes:
      - createIndex:
          indexName: idx_orders_user_id
          tableName: orders
          columns:
            - column:
                name: user_id
When Liquibase Fits
- Built-in rollback is a hard requirement
- Compliance needs detailed change tracking and audit trails
- You prefer declarative changesets over raw SQL
- You need to reverse-engineer an existing database into a changelog
- Enterprise environments with formal change approval processes
Prisma Migrate: Schema-First for TypeScript
Prisma Migrate generates migrations from your Prisma schema. You define the desired state, Prisma figures out the SQL to get there. It is the most popular migration tool in the TypeScript ecosystem.
Schema Definition
// prisma/schema.prisma
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
generator client {
provider = "prisma-client-js"
}
model User {
id String @id @default(uuid())
email String @unique
displayName String @map("display_name")
role UserRole @default(USER)
orders Order[]
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@index([email])
@@index([role])
@@map("users")
}
model Order {
id String @id @default(uuid())
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
userId String @map("user_id")
status OrderStatus @default(PENDING)
totalCents BigInt @map("total_cents")
currency String @default("USD")
items OrderItem[]
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@index([userId])
@@index([status])
@@index([createdAt(sort: Desc)])
@@map("orders")
}
model OrderItem {
id String @id @default(uuid())
order Order @relation(fields: [orderId], references: [id], onDelete: Cascade)
orderId String @map("order_id")
productId String @map("product_id")
quantity Int @default(1)
priceCents BigInt @map("price_cents")
@@map("order_items")
}
enum UserRole {
USER
ADMIN
MODERATOR
}
enum OrderStatus {
PENDING
CONFIRMED
SHIPPED
DELIVERED
CANCELLED
}
Creating and Applying Migrations
# Create a migration (diffs schema against database, generates SQL, applies it)
npx prisma migrate dev --name initial_schema
# Generated file: prisma/migrations/20260215_initial_schema/migration.sql
# Contains the full CREATE TABLE/INDEX statements
# Create a migration without applying (for review before application)
npx prisma migrate dev --name add_order_items --create-only
# Edit the generated SQL if needed, then apply
npx prisma migrate dev
# Apply migrations in production (no prompts, no shadow database)
npx prisma migrate deploy
# Check migration status
npx prisma migrate status
# Reset database and reapply all migrations (development only)
npx prisma migrate reset
Handling Data Migrations
Prisma generates DDL (schema changes) but not DML (data changes). For data migrations, edit the generated SQL:
# Generate migration without applying
npx prisma migrate dev --name split_name_field --create-only
-- prisma/migrations/20260215_split_name_field/migration.sql
-- Add new columns
ALTER TABLE "users" ADD COLUMN "first_name" TEXT;
ALTER TABLE "users" ADD COLUMN "last_name" TEXT;
-- Backfill from existing data
UPDATE "users"
SET first_name = split_part(display_name, ' ', 1),
last_name = CASE
WHEN position(' ' in display_name) > 0
THEN substring(display_name from position(' ' in display_name) + 1)
ELSE ''
END;
-- Make columns required after backfill
ALTER TABLE "users" ALTER COLUMN "first_name" SET NOT NULL;
ALTER TABLE "users" ALTER COLUMN "last_name" SET NOT NULL;
-- Drop old column (do this in a later migration after app code is updated)
-- ALTER TABLE "users" DROP COLUMN "display_name";
# Apply the edited migration
npx prisma migrate dev
Prisma in CI
# GitHub Actions
- name: Apply migrations
  run: npx prisma migrate deploy
  env:
    DATABASE_URL: ${{ secrets.DATABASE_URL }}
- name: Verify migration status
  run: |
    STATUS=$(npx prisma migrate status)
    if echo "$STATUS" | grep -q "Database schema is up to date"; then
      echo "Migrations are current"
    else
      echo "Migration drift detected"
      exit 1
    fi
When Prisma Migrate Fits
- TypeScript full-stack projects
- Greenfield applications where Prisma is the ORM
- Teams that want auto-generated migrations from a schema definition
- You want type-safe database access alongside migrations
Limitations
- No built-in rollback -- you write a new migration to undo changes
- Shadow database required for migrate dev (needs CREATE/DROP DATABASE permission)
- Cannot express all PostgreSQL features in the schema (partial indexes, expression indexes, complex check constraints require raw SQL edits)
- Coupled to the Prisma ecosystem
Drizzle Kit: SQL Control with TypeScript Types
Drizzle Kit generates migrations from TypeScript schema definitions, but unlike Prisma, it stays close to SQL. If you want the convenience of schema-diffing with the control of writing SQL-like code, Drizzle is the middle ground.
Schema Definition
// src/db/schema.ts
import {
pgTable, pgEnum, uuid, text, bigint, integer,
timestamp, boolean, index, uniqueIndex, check
} from "drizzle-orm/pg-core";
import { relations, sql } from "drizzle-orm";
export const userRoleEnum = pgEnum("user_role", ["user", "admin", "moderator"]);
export const orderStatusEnum = pgEnum("order_status", [
"pending", "confirmed", "shipped", "delivered", "cancelled"
]);
export const users = pgTable("users", {
id: uuid("id").primaryKey().defaultRandom(),
email: text("email").unique().notNull(),
displayName: text("display_name").notNull(),
role: userRoleEnum("role").default("user").notNull(),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
}, (table) => ({
emailIdx: index("idx_users_email").on(table.email),
roleIdx: index("idx_users_role").on(table.role),
}));
export const orders = pgTable("orders", {
id: uuid("id").primaryKey().defaultRandom(),
userId: uuid("user_id").notNull().references(() => users.id, { onDelete: "cascade" }),
status: orderStatusEnum("status").default("pending").notNull(),
totalCents: bigint("total_cents", { mode: "number" }).notNull(),
currency: text("currency").default("USD").notNull(),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
}, (table) => ({
userIdx: index("idx_orders_user_id").on(table.userId),
statusIdx: index("idx_orders_status").on(table.status),
}));
// Relations (for query builder, not schema)
export const usersRelations = relations(users, ({ many }) => ({
orders: many(orders),
}));
export const ordersRelations = relations(orders, ({ one }) => ({
user: one(users, { fields: [orders.userId], references: [users.id] }),
}));
Configuration
// drizzle.config.ts
import { defineConfig } from "drizzle-kit";
export default defineConfig({
schema: "./src/db/schema.ts",
out: "./drizzle",
dialect: "postgresql",
dbCredentials: {
url: process.env.DATABASE_URL!,
},
verbose: true,
strict: true,
});
Generating and Applying Migrations
# Generate migration from schema diff
npx drizzle-kit generate
# Review the generated SQL in drizzle/NNNN_migration_name.sql
# Apply migrations
npx drizzle-kit migrate
# Push schema directly to database (development, skips migration files)
npx drizzle-kit push
# Pull an existing database schema into Drizzle format
npx drizzle-kit pull   # named "introspect" in older drizzle-kit versions
# Open Drizzle Studio (visual database browser)
npx drizzle-kit studio
Drizzle Query Examples
import { db } from "./db";
// orderItems is assumed to be defined in schema.ts alongside users and orders
import { users, orders, orderItems } from "./db/schema";
import { eq, desc, sql, count, gt } from "drizzle-orm";
// Simple select
const admins = await db.select()
.from(users)
.where(eq(users.role, "admin"))
.orderBy(desc(users.createdAt));
// Join with aggregation
const userOrderCounts = await db.select({
userId: users.id,
name: users.displayName,
orderCount: count(orders.id),
totalSpent: sql<number>`COALESCE(SUM(${orders.totalCents}), 0)::int`,
})
.from(users)
.leftJoin(orders, eq(users.id, orders.userId))
.groupBy(users.id, users.displayName)
.having(gt(count(orders.id), 0));
// Insert with returning
const [newUser] = await db.insert(users)
.values({
email: "[email protected]",
displayName: "Alice Johnson",
role: "admin",
})
.returning();
// Transaction
await db.transaction(async (tx) => {
const [order] = await tx.insert(orders).values({
userId: newUser.id,
totalCents: 4999,
status: "confirmed",
}).returning();
await tx.insert(orderItems).values({
orderId: order.id,
productId: "prod-123",
quantity: 1,
priceCents: 4999,
});
});
When Drizzle Kit Fits
- TypeScript projects that want SQL-level control
- You know SQL well and want type safety on top of it
- Lightweight migration tool without heavy ORM overhead
- Fast query execution with minimal abstraction
golang-migrate: The Portable CLI
golang-migrate is a standalone migration CLI. It runs SQL up/down migration pairs and supports 20+ databases. Despite the name, it is not tied to Go projects: it ships as a self-contained binary that works with any stack, and it can also be embedded as a library in Go code.
Setup
# Install (multiple options)
brew install golang-migrate # macOS
go install -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest # Go
docker pull migrate/migrate # Docker
# Create migration pair
migrate create -ext sql -dir db/migrations -seq create_users
# Creates:
# db/migrations/000001_create_users.up.sql
# db/migrations/000001_create_users.down.sql
Migration Files
-- 000001_create_users.up.sql
CREATE TABLE users (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
email TEXT NOT NULL UNIQUE,
display_name TEXT NOT NULL,
role TEXT NOT NULL DEFAULT 'user'
CHECK (role IN ('user', 'admin', 'moderator')),
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
CREATE INDEX idx_users_email ON users (email);
-- 000001_create_users.down.sql
DROP INDEX IF EXISTS idx_users_email;
DROP TABLE IF EXISTS users;
-- 000002_create_orders.up.sql
CREATE TABLE orders (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
status TEXT NOT NULL DEFAULT 'pending',
total_cents BIGINT NOT NULL CHECK (total_cents >= 0),
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
CREATE INDEX idx_orders_user_id ON orders (user_id);
-- 000002_create_orders.down.sql
DROP INDEX IF EXISTS idx_orders_user_id;
DROP TABLE IF EXISTS orders;
Running Migrations
DB_URL="postgres://user:pass@localhost:5432/mydb?sslmode=disable"
# Apply all pending migrations
migrate -database "$DB_URL" -path db/migrations up
# Apply N migrations
migrate -database "$DB_URL" -path db/migrations up 2
# Roll back one migration
migrate -database "$DB_URL" -path db/migrations down 1
# Roll back all migrations (-all skips the interactive confirmation prompt)
migrate -database "$DB_URL" -path db/migrations down -all
# Go to a specific version
migrate -database "$DB_URL" -path db/migrations goto 3
# Show current version
migrate -database "$DB_URL" -path db/migrations version
# Force a version (fix dirty state after failed migration)
migrate -database "$DB_URL" -path db/migrations force 2
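The state that force resets lives in a two-column table worth inspecting first. A sketch (schema_migrations is the default name; it is configurable):

```sql
-- golang-migrate's default state table: a single row
SELECT version, dirty FROM schema_migrations;
-- dirty = true means a migration failed partway through.
-- Inspect the database, finish or undo the partial change by hand,
-- then run `migrate force <version>` to reset the recorded state.
```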
CI Integration
# GitHub Actions
- name: Run migrations
  run: |
    migrate -database "$DATABASE_URL" -path db/migrations up
  env:
    DATABASE_URL: ${{ secrets.DATABASE_URL }}
- name: Verify migrations are reversible
  run: |
    migrate -database "$TEST_DB_URL" -path db/migrations up
    migrate -database "$TEST_DB_URL" -path db/migrations down -all
    migrate -database "$TEST_DB_URL" -path db/migrations up
When golang-migrate Fits
- Go backend projects (also works as a Go library)
- You want a simple, standalone binary with no runtime dependencies
- Every migration must have an explicit up + down pair
- Maximum portability across environments
Best Practices (All Tools)
Test Migrations Before Production
# Pattern: up, down, up -- catches irreversible migrations
# Run against a test database before deploying
migrate -database "$TEST_DB" -path db/migrations up
migrate -database "$TEST_DB" -path db/migrations down -all
migrate -database "$TEST_DB" -path db/migrations up
Never Modify Applied Migrations
Once a migration has been applied to any shared environment (staging, production), treat it as immutable. If you need to change something, write a new migration.
Use Transactions Where Possible
-- Wrap DDL in transactions where the engine allows it (PostgreSQL supports
-- transactional DDL; MySQL and Oracle auto-commit most DDL statements)
BEGIN;
ALTER TABLE users ADD COLUMN phone TEXT;
CREATE INDEX idx_users_phone ON users (phone);
COMMIT;
Separate Schema Changes from Data Changes
Keep DDL (CREATE, ALTER, DROP) and DML (INSERT, UPDATE, DELETE) in separate migrations. Schema changes are fast and transactional. Data migrations can be slow and may need batching.
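For large tables, a backfill written as one UPDATE can hold locks and bloat the transaction log for the full run. A common pattern is to update in bounded batches; a hypothetical PostgreSQL sketch (table, column, and batch size are illustrative):

```sql
-- Batched backfill: touch at most 10,000 rows per statement.
-- Re-run from a small driver script until 0 rows are updated.
UPDATE users
SET role = 'user'
WHERE id IN (
    SELECT id
    FROM users
    WHERE role IS NULL
    ORDER BY id
    LIMIT 10000
);
```

Each batch commits independently, so a failure partway through loses only the current batch, not the whole backfill.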
Always Have a Rollback Plan
Even if your tool does not enforce rollback files, document how to undo every migration:
-- Migration: add_user_preferences
-- Rollback: DROP TABLE user_preferences;
CREATE TABLE user_preferences (
user_id UUID PRIMARY KEY REFERENCES users(id),
theme TEXT DEFAULT 'light',
notifications_enabled BOOLEAN DEFAULT true
);
Decision Framework
| If your stack is... | Choose |
|---|---|
| TypeScript, want an ORM | Prisma Migrate |
| TypeScript, want SQL control | Drizzle Kit |
| JVM, SQL-first | Flyway |
| Enterprise, compliance, rollback required | Liquibase |
| Go backend | golang-migrate |
| Polyglot, want a CLI tool | Flyway or golang-migrate |
| Existing database, need to start versioning | Liquibase (generate-changelog) or Prisma (db pull) |
Summary
Every production database needs versioned, automated migrations. The choice comes down to philosophy and ecosystem. If you are in TypeScript, choose between Prisma Migrate (schema-first, auto-generated, tightly coupled to Prisma ORM) and Drizzle Kit (schema-first but SQL-native, lighter weight). If you want raw SQL with no abstraction, Flyway and golang-migrate are battle-tested and simple. If enterprise compliance demands rollback support and audit trails, Liquibase has the richest feature set. Whichever tool you choose, start from day one. Retrofitting migrations onto an existing database with months of ad-hoc changes is a project nobody wants to do.
