Migration Guide

v1.9.x → v1.10.0 (2026-03-13)

Go version requirement changed: Go 1.23+ is now required (was 1.21+). This is due to the mark3labs/mcp-go dependency used by the new MCP server. If you only use the parsing SDK (not the MCP server), Go 1.21+ still works, but go.mod declares 1.23.
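If you depend on the MCP server, the go directive in your own module must move to 1.23. A minimal sketch (the module path is a placeholder; the require line assumes you pin this release):

```
module example.com/yourapp

go 1.23

require github.com/ajitpratap0/GoSQLX v1.10.0
```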

No breaking API changes. Drop-in upgrade for existing code.

New: MCP Server

  • pkg/mcp/ — MCP server package with 7 SQL tools
  • cmd/gosqlx-mcp/ — Standalone MCP server binary
  • See docs/MCP_GUIDE.md for usage

v1.8.0 → v1.9.0 (2026-02-28)

No breaking changes. No API changes. Drop-in upgrade.

Behavioral changes to be aware of

lint exit codes (CLI-7):

  • Previously: exited 0 unless errors were present or --fail-on-warn was set
  • Now: exits 1 whenever any violation (error, warning, or info) is found
  • Impact: CI pipelines using gosqlx lint as a gate will now correctly fail on warnings
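The new rule can be sketched with simplified stand-in types (Severity and exitCode below are illustrations, not GoSQLX APIs):

```go
package main

import "fmt"

// Severity is a stand-in for a lint finding's level.
type Severity int

const (
	Info Severity = iota
	Warning
	Error
)

// exitCode models what `gosqlx lint` exits with as of v1.9.0:
// any finding at all, regardless of severity, fails the run.
func exitCode(findings []Severity) int {
	if len(findings) > 0 {
		return 1 // pre-v1.9.0: 0 here unless an Error was present or --fail-on-warn was set
	}
	return 0
}

func main() {
	fmt.Println(exitCode(nil))                 // 0: clean input
	fmt.Println(exitCode([]Severity{Warning})) // 1: warnings now fail the gate
}
```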

E1009 for unterminated block comments (ERR-1):

  • Previously: an unterminated /* ... */ comment emitted E1002 (the generic string error code)
  • Now: emits E1009 ErrCodeUnterminatedBlockComment
  • Impact: code catching specific error codes for /* handling should update to E1009
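Code that dispatches on error codes can be updated along these lines (ParseError here is a stand-in struct for illustration; the actual GoSQLX error type and field names may differ):

```go
package main

import "fmt"

// ParseError is a stand-in for a parse error carrying a code.
type ParseError struct {
	Code    string
	Message string
}

// isUnterminatedBlockComment shows the v1.9.0 check. The pre-v1.9.0
// version matched the generic code instead:
//
//	return err.Code == "E1002"
func isUnterminatedBlockComment(err ParseError) bool {
	return err.Code == "E1009"
}

func main() {
	err := ParseError{Code: "E1009", Message: "unterminated block comment"}
	fmt.Println(isUnterminatedBlockComment(err)) // true
}
```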


v1.7.0 → v1.8.0

Last Updated: 2026-02-24

This guide covers breaking changes in GoSQLX v1.8.0 and how to update your code. The primary breaking change is the token type system overhaul (#215) completed across PRs #252, #254, #255, #257, #258, #281, #267, #282, and #283.

Who Is Affected?

| Usage Pattern | Breaking? | Action Required |
| --- | --- | --- |
| gosqlx.Parse(), gosqlx.Validate(), gosqlx.Format() | No | None |
| gosqlx.ParseWithTimeout(), gosqlx.ParseBytes() | No | None |
| CLI tool (gosqlx validate, gosqlx format, etc.) | No | None |
| pkg/sql/parser with parser.Parse() | No | None |
| Direct token.Token struct field access | Yes | See below |
| String-based token constants (token.SELECT, etc.) | Yes | See below |
| token.Token.ModelType field | Yes | Renamed to Type |
| ConvertTokensForParser() function | Yes | Removed — use ParseFromModelTokens() |

If you only use the high-level gosqlx package or the CLI tool, v1.8.0 is fully backward compatible and no changes are needed.

Breaking Change: Token Type System (#215)

Summary

The legacy string-based token.Type system has been completely removed. All token type comparisons now use models.TokenType (an integer type) for O(1) performance.
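The design can be illustrated with a self-contained sketch (simplified stand-ins for the models package, not the actual GoSQLX definitions): token types are integer constants, so a comparison is a single integer equality check rather than a byte-wise string compare.

```go
package main

import "fmt"

// TokenType is a simplified stand-in for models.TokenType (the real
// definitions live in pkg/models and cover many more types).
type TokenType int

const (
	TokenTypeSelect TokenType = iota
	TokenTypeFrom
	TokenTypeWhere
)

// String gives a human-readable name, mirroring TokenType.String().
func (t TokenType) String() string {
	switch t {
	case TokenTypeSelect:
		return "SELECT"
	case TokenTypeFrom:
		return "FROM"
	case TokenTypeWhere:
		return "WHERE"
	}
	return "UNKNOWN"
}

func main() {
	tok := TokenTypeSelect
	fmt.Println(tok == TokenTypeSelect) // O(1) integer equality, no string compare
	fmt.Println(tok)                    // fmt uses String(): prints SELECT
}
```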

What Was Removed

  1. type Type string from pkg/sql/token — the string-based type definition
  2. Type (string) field from the token.Token struct — replaced by ModelType, which was renamed to Type
  3. All string-based token constants — token.SELECT, token.FROM, token.WHERE, etc.
  4. stringTypeToModelType map — the bridge between old and new type systems
  5. normalizeTokens() function — no longer needed with unified types
  6. ConvertTokensForParser() function — replaced by ParseFromModelTokens()

Migration Steps

Step 1: Update Token Creation

// BEFORE (v1.7.0)
import "github.com/ajitpratap0/GoSQLX/pkg/sql/token"

tok := token.Token{
    Type:      token.SELECT,              // string-based
    ModelType: models.TokenTypeSelect,    // int-based (was secondary)
    Literal:   "SELECT",
}

// AFTER (v1.8.0)
import "github.com/ajitpratap0/GoSQLX/pkg/sql/token"
import "github.com/ajitpratap0/GoSQLX/pkg/models"

tok := token.Token{
    Type:    models.TokenTypeSelect,      // int-based (now primary)
    Literal: "SELECT",
}

Step 2: Update Token Type Comparisons

// BEFORE (v1.7.0) — string comparison
if tok.Type == token.SELECT {
    // handle SELECT
}

if tok.Type == "SELECT" {
    // also worked
}

// AFTER (v1.8.0) — integer comparison (faster)
if tok.Type == models.TokenTypeSelect {
    // handle SELECT
}

Step 3: Replace String Constants with models.TokenType

| Old (string) | New (int) |
| --- | --- |
| token.SELECT | models.TokenTypeSelect |
| token.FROM | models.TokenTypeFrom |
| token.WHERE | models.TokenTypeWhere |
| token.INSERT | models.TokenTypeInsert |
| token.UPDATE | models.TokenTypeUpdate |
| token.DELETE | models.TokenTypeDelete |
| token.CREATE | models.TokenTypeCreate |
| token.ALTER | models.TokenTypeAlter |
| token.DROP | models.TokenTypeDrop |
| token.JOIN | models.TokenTypeJoin |
| token.ON | models.TokenTypeOn |
| token.AND | models.TokenTypeAnd |
| token.OR | models.TokenTypeOr |
| token.NOT | models.TokenTypeNot |
| token.NULL | models.TokenTypeNull |
| token.TRUE | models.TokenTypeTrue |
| token.FALSE | models.TokenTypeFalse |
| token.AS | models.TokenTypeAs |
| token.IN | models.TokenTypeIn |
| token.LIKE | models.TokenTypeLike |
| token.BETWEEN | models.TokenTypeBetween |
| token.EXISTS | models.TokenTypeExists |
| token.CASE | models.TokenTypeCase |
| token.WHEN | models.TokenTypeWhen |
| token.THEN | models.TokenTypeThen |
| token.ELSE | models.TokenTypeElse |
| token.END | models.TokenTypeEnd |
| token.ORDER | models.TokenTypeOrder |
| token.GROUP | models.TokenTypeGroup |
| token.HAVING | models.TokenTypeHaving |
| token.LIMIT | models.TokenTypeLimit |
| token.OFFSET | models.TokenTypeOffset |
| token.UNION | models.TokenTypeUnion |
| token.EXCEPT | models.TokenTypeExcept |
| token.INTERSECT | models.TokenTypeIntersect |

For a complete mapping, see the models.TokenType constants in pkg/models/token.go.
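For a mostly mechanical rewrite, a text replacer over your sources can help. This is a rough aid, not syntax-aware; review every change. Note that longer names must precede their prefixes (e.g. token.ORDER before token.OR), since strings.NewReplacer compares old strings in argument order.

```go
package main

import (
	"fmt"
	"strings"
)

// rewrite textually maps removed string-based constants to their
// models.TokenType equivalents. Extend the pairs from the table above,
// ordering longer names (token.ORDER) before prefixes (token.OR).
func rewrite(src string) string {
	r := strings.NewReplacer(
		"token.SELECT", "models.TokenTypeSelect",
		"token.FROM", "models.TokenTypeFrom",
		"token.WHERE", "models.TokenTypeWhere",
	)
	return r.Replace(src)
}

func main() {
	fmt.Println(rewrite(`if tok.Type == token.SELECT {`))
	// if tok.Type == models.TokenTypeSelect {
}
```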

Step 4: Replace ModelType with Type

// BEFORE (v1.7.0)
tokenType := tok.ModelType  // was the int-based secondary field

// AFTER (v1.8.0)
tokenType := tok.Type       // ModelType renamed to Type

Step 5: Replace ConvertTokensForParser

// BEFORE (v1.7.0)
tokens, _ := tkz.Tokenize([]byte(sql))
converted, _ := parser.ConvertTokensForParser(tokens)
ast, _ := p.Parse(converted)

// AFTER (v1.8.0)
tokens, _ := tkz.Tokenize([]byte(sql))
ast, _ := p.ParseFromModelTokens(tokens)

Using TokenType.String() for Display

The TokenType.String() method provides human-readable names for all token types:

tok := token.Token{Type: models.TokenTypeSelect, Literal: "SELECT"}
fmt.Println(tok.Type.String()) // "SELECT"

Other Changes (Non-Breaking)

New Dialect API

v1.8.0 adds dialect-aware parsing. No migration needed — existing code defaults to PostgreSQL:

// New (optional) — parse with explicit dialect
ast, err := parser.ParseWithDialect(sql, "mysql")
err = parser.ValidateWithDialect(sql, "mysql")

New Transform API

The new pkg/transform/ package is purely additive:

import "github.com/ajitpratap0/GoSQLX/pkg/transform"

stmt, _ := transform.ParseSQL("SELECT * FROM orders")
transform.AddWhere(stmt, "tenant_id = 42")

New ParseWithRecovery API

For multi-error parsing (useful for IDE integration):

ast, errors := gosqlx.ParseWithRecovery(sql)
// ast may be partial, errors contains all parse errors

Renamed Packages

| Old | New | PR |
| --- | --- | --- |
| pkg/optimizer/ | pkg/advisor/ | #261 |

slog Replaces DebugLogger

// BEFORE (v1.7.0)
tkz.SetDebugLogger(myLogger)

// AFTER (v1.8.0)
import "log/slog"
tkz.SetLogger(slog.New(slog.NewTextHandler(os.Stderr, &slog.HandlerOptions{Level: slog.LevelDebug})))

Performance Impact

The token type overhaul delivers ~50% faster parsing with no API changes needed for high-level users:

| Benchmark | v1.7.0 | v1.8.0 | Improvement |
| --- | --- | --- | --- |
| SimpleSelect 10 cols | 1542 ns/op | 783 ns/op | 49% faster |
| SimpleSelect 100 cols | 9736 ns/op | 4843 ns/op | 50% faster |
| SimpleSelect 1000 cols | 83612 ns/op | 39487 ns/op | 53% faster |
| SingleJoin | 1425 ns/op | 621 ns/op | 56% faster |
| SimpleWhere | 736 ns/op | 373 ns/op | 49% faster |

License Change

GoSQLX was relicensed from AGPL-3.0 to Apache License 2.0 in this release cycle (PR #227). This is a more permissive license that allows commercial use without copyleft obligations.

Quick Checklist

  • Search your code for token.SELECT, token.FROM, etc. — replace with models.TokenType*
  • Search for tok.ModelType — rename to tok.Type
  • Search for tok.Type == "..." (string comparison) — replace with tok.Type == models.TokenType*
  • Search for ConvertTokensForParser — replace with ParseFromModelTokens
  • Search for SetDebugLogger — replace with SetLogger
  • Search for pkg/optimizer imports — replace with pkg/advisor
  • Run go build ./... to verify
  • Run go test -race ./... to validate

Getting Help

If you encounter issues migrating: