Configs · Apr 13, 2026 · 3 min read

GORM — The Fantastic ORM Library for Go

GORM is the most popular ORM for Go. It provides a developer-friendly API for database operations — auto-migration, associations, hooks, transactions, and query building — with support for PostgreSQL, MySQL, SQLite, and SQL Server.

TL;DR
GORM maps Go structs to database tables with a chainable API for CRUD, associations, hooks, and transactions.
§01

What it is

GORM is the most widely adopted ORM library for the Go programming language. It provides a developer-friendly, chainable API for database operations including auto-migration, associations (has-one, has-many, many-to-many, polymorphic), hooks, transactions, and query building. It supports PostgreSQL, MySQL, SQLite, and SQL Server.

GORM is best suited for Go developers who want ActiveRecord-style productivity without writing raw SQL for every operation. Teams building web APIs, microservices, or any Go application that talks to a relational database will find GORM reduces boilerplate significantly.

§02

How it saves time or tokens

GORM eliminates repetitive SQL boilerplate. Instead of writing CREATE TABLE statements, you define Go structs and call db.AutoMigrate(). Instead of hand-crafting INSERT/UPDATE/DELETE queries, you use db.Create(), db.Save(), and db.Delete(). The chainable query builder handles WHERE clauses, joins, preloading, and pagination in a type-safe manner. For AI-assisted coding workflows, GORM's declarative struct-tag approach means you describe your schema once and the ORM handles the rest, reducing token consumption when generating database logic.

§03

How to use

  1. Install GORM and a database driver: go get -u gorm.io/gorm and go get -u gorm.io/driver/sqlite (or postgres/mysql).
  2. Define your models as Go structs embedding gorm.Model for automatic ID, timestamps, and soft delete.
  3. Open a database connection with gorm.Open(), run db.AutoMigrate() to sync your schema, and use the chainable API for CRUD.
§04

Example

package main

import (
    "gorm.io/driver/sqlite"
    "gorm.io/gorm"
)

type User struct {
    gorm.Model
    Name  string
    Email string `gorm:"uniqueIndex"`
    Age   int
}

func main() {
    db, err := gorm.Open(sqlite.Open("app.db"), &gorm.Config{})
    if err != nil {
        panic("failed to connect database")
    }
    db.AutoMigrate(&User{})

    // Create
    db.Create(&User{Name: "Alice", Email: "alice@example.com", Age: 30})

    // Read
    var user User
    db.First(&user, "name = ?", "Alice")

    // Update
    db.Model(&user).Update("Age", 31)

    // Delete (soft delete by default)
    db.Delete(&user)
}
§05

§06

Common pitfalls

  • Forgetting that gorm.Model enables soft delete by default. Deleted records remain in the database with a non-null deleted_at column. Use db.Unscoped().Delete() for hard deletes.
  • Using db.Save() on a struct whose primary key is zero performs an INSERT instead of an UPDATE. Always load the record first or set the ID explicitly.
  • N+1 query problems with associations. Use db.Preload("Orders") to eager-load related records instead of accessing them in a loop.

Frequently Asked Questions

What databases does GORM support?

GORM officially supports PostgreSQL, MySQL, SQLite, and SQL Server through separate driver packages. Community drivers exist for other databases like ClickHouse and TiDB. Each driver is installed separately so you only include what you need.

Does GORM support database migrations?

Yes. GORM provides AutoMigrate which creates tables, adds missing columns, and creates missing indexes based on your struct definitions. For production use, many teams pair AutoMigrate with a dedicated migration tool like golang-migrate for versioned, reversible migrations.

How does GORM handle soft deletes?

When your model embeds gorm.Model, deleting a record sets the deleted_at timestamp instead of removing the row. Subsequent queries automatically exclude soft-deleted records. Use db.Unscoped() to query or permanently delete soft-deleted rows.

Can GORM handle complex joins and subqueries?

Yes. GORM supports SQL joins via db.Joins(), subqueries by passing a *gorm.DB value as a query argument (e.g. db.Where("age > (?)", db.Table("users").Select("AVG(age)"))), and named scopes for reusable query fragments. For very complex queries, you can always fall back to db.Raw() with raw SQL.

Is GORM suitable for high-performance applications?

GORM v2 introduced prepared statement caching, batch operations, and reduced allocations. For most web applications it performs well. For extreme throughput scenarios (millions of rows per second), some teams use GORM for writes and raw SQL or sqlx for performance-critical reads.
