
Oxc: The JavaScript Oxidation Compiler Toolchain

JavaScript · 2026-02-15 · 9 min read · oxc · javascript · typescript · rust · toolchain · parser · minifier · transformer · performance


[Image: OXC JavaScript toolchain logo]

The JavaScript ecosystem has a tooling performance problem. Babel transpiles your code by parsing it into an AST, running plugins over it, and generating output -- all in single-threaded JavaScript. Terser minifies by doing the same thing again from scratch. ESLint parses your code a third time. Each tool has its own parser, its own AST format, and its own traversal logic. On a large codebase, you might parse the same file five or six times during a single build.

The Oxc project (short for Oxidation Compiler) is building a complete JavaScript toolchain in Rust that shares a single, high-performance parser across every operation. Parse once, then lint, transform, minify, and resolve -- all using the same AST, all in Rust, all multi-threaded. The individual components are already delivering 20-100x performance improvements over their JavaScript equivalents.

You may have already heard of OxLint, the project's linter. This article covers the full Oxc toolchain beyond linting: the parser that powers everything, the transformer replacing Babel, the minifier taking on Terser, the resolver, and how these pieces fit together into a unified system.

The Oxc Parser: The Foundation

Every component in the Oxc toolchain sits on top of oxc-parser, one of the fastest JavaScript and TypeScript parsers ever built. The parser handles the full ECMAScript 2025 specification, TypeScript (including .tsx), JSX, and stage-3 proposals.

Performance

The parser benchmarks speak for themselves. Parsing a large JavaScript file like checker.js from the TypeScript compiler source (2.8 MB):

Parser              Time       Relative
oxc-parser          5.6 ms     1x (baseline)
swc-parser          16.8 ms    3x slower
Babel parser        98.3 ms    17.5x slower
TypeScript (tsc)    113 ms     20x slower
Acorn               80.4 ms    14.3x slower

The parser achieves this through several Rust-specific optimizations: arena allocation (using bumpalo) instead of individual heap allocations, minimal AST node sizes, zero-copy string handling, and a hand-written recursive descent parser rather than a generated one.
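To make "hand-written recursive descent" concrete, here is a minimal sketch of the technique for a toy expression grammar (numbers, `+`, `*`). This is not Oxc's code, just an illustration of the parsing style it uses: each grammar rule becomes a function, and operator precedence falls out of which rule calls which.

```typescript
// Toy recursive-descent parser for expressions like "1+2*3".
// Precedence is encoded structurally: expr -> term -> factor,
// so "*" binds tighter than "+" without any precedence table.
type Expr =
  | { kind: "num"; value: number }
  | { kind: "binary"; op: "+" | "*"; left: Expr; right: Expr };

function parseExpr(src: string): Expr {
  let pos = 0;

  function peek(): string {
    return src[pos] ?? "";
  }

  function factor(): Expr {
    const start = pos;
    while (/[0-9]/.test(peek())) pos++;
    return { kind: "num", value: Number(src.slice(start, pos)) };
  }

  function term(): Expr {
    let left = factor();
    while (peek() === "*") {
      pos++; // consume "*"
      left = { kind: "binary", op: "*", left, right: factor() };
    }
    return left;
  }

  function expr(): Expr {
    let left = term();
    while (peek() === "+") {
      pos++; // consume "+"
      left = { kind: "binary", op: "+", left, right: term() };
    }
    return left;
  }

  return expr();
}
```

A real parser adds tokenization, error recovery, and hundreds of productions, but the control flow stays this direct, which is part of why a hand-written parser can outrun a generated one.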

Using the Parser Directly

The parser is published as both a Rust crate and a WebAssembly module:

# Rust (the example below also needs the allocator crate)
cargo add oxc_parser oxc_ast oxc_span oxc_allocator

# JavaScript (via Wasm)
npm install @oxc-parser/wasm

From Rust:

use oxc_parser::Parser;
use oxc_span::SourceType;
use oxc_allocator::Allocator;

let source = r#"
  const greeting: string = "hello";
  export function add(a: number, b: number): number {
    return a + b;
  }
"#;

let allocator = Allocator::default();
let source_type = SourceType::from_path("example.ts").unwrap();
let parsed = Parser::new(&allocator, source, source_type).parse();

if parsed.errors.is_empty() {
    println!("Parsed successfully: {} statements", parsed.program.body.len());
} else {
    for error in &parsed.errors {
        eprintln!("Parse error: {}", error);
    }
}

From JavaScript via the Wasm binding:

import { parseSync } from '@oxc-parser/wasm';

const result = parseSync('example.ts', 'const x: number = 42;');
console.log(result.program); // Full AST
console.log(result.errors);  // Parse errors, if any

AST Format

Oxc uses its own AST format optimized for performance, but it aligns closely with the ESTree specification that tools like ESLint and Babel use. This means the mental model transfers directly. A VariableDeclaration node in Oxc has the same structure you would expect from the ESTree spec -- kind, declarations, and each VariableDeclarator has id and init fields.

The key difference is memory layout. ESTree-based parsers allocate each AST node as a separate JavaScript object on the heap. Oxc allocates nodes in a contiguous arena, which dramatically improves cache locality during traversal.
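The arena idea can be illustrated in spirit (this is not Oxc's actual layout) by keeping every node in one flat array and using integer indices as "child pointers" instead of references to separately allocated objects:

```typescript
// Sketch of arena-style AST storage: all nodes live in one
// contiguous array, and children are plain integer indices.
// Traversal walks adjacent memory instead of chasing pointers
// to scattered heap objects.
type NodeKind = "Program" | "VariableDeclaration" | "Identifier" | "Literal";

interface FlatNode {
  kind: NodeKind;
  children: number[]; // indices into the arena
  text?: string;
}

class AstArena {
  private nodes: FlatNode[] = [];

  // Allocation is just an append; the returned index is the "pointer".
  alloc(kind: NodeKind, children: number[] = [], text?: string): number {
    this.nodes.push({ kind, children, text });
    return this.nodes.length - 1;
  }

  // A traversal that never leaves the flat array.
  count(kind: NodeKind): number {
    return this.nodes.filter((n) => n.kind === kind).length;
  }
}

// Build the arena for: const x = 42;
const arena = new AstArena();
const id = arena.alloc("Identifier", [], "x");
const lit = arena.alloc("Literal", [], "42");
const decl = arena.alloc("VariableDeclaration", [id, lit]);
arena.alloc("Program", [decl]);
```

In Rust the same idea additionally means freeing the whole AST is one arena drop rather than millions of individual deallocations.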

The Transformer: Replacing Babel

Oxc's transformer (oxc-transformer) handles the same job as Babel: converting modern JavaScript and TypeScript into code that targets older runtimes. It strips TypeScript types, transforms JSX, handles decorators, and applies syntax downleveling.

What It Transforms

The transformer currently supports:

  - TypeScript type stripping (including .tsx)
  - JSX transformation (classic pragma-based and automatic runtimes)
  - Decorators
  - Syntax downleveling to older engine targets

Configuration

The transformer uses a configuration structure similar to Babel's presets:

use oxc_transformer::{TransformOptions, TypeScriptOptions, EngineTargets};

let options = TransformOptions {
    typescript: TypeScriptOptions {
        jsx_pragma: Some("React.createElement".into()),
        jsx_pragma_frag: Some("React.Fragment".into()),
        only_remove_type_imports: true,
        ..Default::default()
    },
    targets: EngineTargets::from_query("chrome >= 80, firefox >= 78"),
    ..Default::default()
};

Performance vs Babel

Transforming a real-world React application with TypeScript:

Tool                Transform time (full project)    Relative
oxc-transformer     12 ms                            1x (baseline)
SWC                 38 ms                            3.2x slower
Babel               780 ms                           65x slower
TypeScript (tsc)    2,400 ms                         200x slower

The performance gap is even more dramatic on CI servers where cold-start time matters and there is no persistent caching.

The Minifier: Replacing Terser

Oxc's minifier (oxc-minifier) is a JavaScript minifier focused on producing small output quickly. It performs the standard minification passes: identifier mangling, dead code elimination, constant folding, whitespace removal, and syntax compression.

Approach

The minifier takes a different approach than Terser. Rather than running a multi-pass optimization pipeline with configurable passes, it operates in two well-defined stages:

  1. Compression: AST-level optimizations like constant folding (1 + 2 becomes 3), dead branch elimination (if (false) { ... } removed), sequence expression construction, and boolean simplification.

  2. Mangling: Identifier renaming using frequency analysis to assign the shortest names to the most-used identifiers.
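The compression stage can be sketched with a toy constant folder over a miniature expression AST. This is an illustration of the technique, not the oxc-minifier implementation:

```typescript
// Toy compression pass: fold constant arithmetic (1 + 2 -> 3).
// Folding is bottom-up: children are folded first, and if both
// sides reduce to literals, the operation happens at build time.
type CExpr =
  | { kind: "lit"; value: number }
  | { kind: "bin"; op: "+" | "*"; left: CExpr; right: CExpr };

function fold(e: CExpr): CExpr {
  if (e.kind === "bin") {
    const left = fold(e.left);
    const right = fold(e.right);
    if (left.kind === "lit" && right.kind === "lit") {
      const value =
        e.op === "+" ? left.value + right.value : left.value * right.value;
      return { kind: "lit", value };
    }
    return { ...e, left, right };
  }
  return e;
}
```

Dead-branch elimination works the same way one level up: once a condition folds to a literal `false`, the whole `if` body can be dropped.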

// Input
function calculateTotalPrice(items, taxRate) {
  const subtotal = items.reduce((sum, item) => sum + item.price, 0);
  const tax = subtotal * taxRate;
  const total = subtotal + tax;
  return total;
}

// Output (oxc-minifier)
function calculateTotalPrice(e,t){const c=e.reduce((e,t)=>e+t.price,0);return c+c*t}
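The mangling stage is likewise easy to sketch: rank identifiers by usage count and hand out generated names shortest-first. The helper names here are hypothetical, not Oxc's:

```typescript
// Toy frequency-based mangler: the most-used identifier gets
// the shortest name ("a"), the next gets "b", and so on, with
// two-character names once single letters run out.
function* shortNames(): Generator<string> {
  const alphabet = "abcdefghijklmnopqrstuvwxyz";
  for (const c of alphabet) yield c;
  for (const a of alphabet) for (const b of alphabet) yield a + b;
}

function assignNames(usage: Record<string, number>): Record<string, string> {
  const names = shortNames();
  const renamed: Record<string, string> = {};
  // Sort descending by use count so hot identifiers rename shortest.
  const ranked = Object.keys(usage).sort((a, b) => usage[b] - usage[a]);
  for (const original of ranked) {
    renamed[original] = names.next().value as string;
  }
  return renamed;
}
```

A production mangler also has to respect scoping and avoid capturing names from outer scopes, which is where most of the real complexity lives.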

Benchmarks

Minifying a 1 MB unminified JavaScript bundle:

Minifier        Time        Output size    Relative speed
oxc-minifier    28 ms       412 KB         1x (baseline)
esbuild         35 ms       408 KB         1.25x slower
SWC minify      54 ms       415 KB         1.9x slower
Terser          3,200 ms    401 KB         114x slower

Terser still produces slightly smaller output in some cases because it applies more aggressive optimization passes. But the difference is typically under 2%, while the speed difference is two orders of magnitude. For development builds and CI pipelines, the trade-off is clear.

The Resolver

The resolver (oxc-resolver) handles module resolution -- the process of turning an import specifier like import { Button } from '@/components/Button' into an actual file path on disk. This is the same job that enhanced-resolve (used by webpack) and Node's module resolution algorithm perform.

Why a Separate Resolver?

Module resolution is called thousands of times during a build. Every import and require statement triggers a resolution. The resolution algorithm involves checking multiple file extensions, reading package.json for exports maps, handling path aliases from tsconfig.json, and traversing node_modules directories. On a project with 5,000 imports, a slow resolver adds seconds to your build.

use oxc_resolver::{Resolver, ResolveOptions};

let resolver = Resolver::new(ResolveOptions {
    extensions: vec![".ts".into(), ".tsx".into(), ".js".into(), ".jsx".into()],
    alias: vec![
        ("@/*".into(), vec!["./src/*".into()]),
    ],
    tsconfig: Some("./tsconfig.json".into()),
    condition_names: vec!["import".into(), "require".into(), "default".into()],
    ..Default::default()
});

let result = resolver.resolve("/project/src/pages", "@/components/Button");
// Resolves to /project/src/components/Button.tsx
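The core of that resolution loop, alias expansion followed by extension probing, can be sketched in a few lines. The `files` set stands in for the filesystem, and the `/project/src` alias root is a hypothetical example, not Oxc's API:

```typescript
// Toy resolver: expand a tsconfig-style "@/*" alias, then probe
// candidate extensions in order until a file exists.
function resolveToy(
  specifier: string,
  files: Set<string>,
  extensions: string[] = [".ts", ".tsx", ".js", ".jsx"],
): string | null {
  // Alias expansion: "@/x" -> "/project/src/x" (hypothetical root).
  const path = specifier.startsWith("@/")
    ? "/project/src/" + specifier.slice(2)
    : specifier;

  // Exact match first, then extension probing in configured order.
  if (files.has(path)) return path;
  for (const ext of extensions) {
    if (files.has(path + ext)) return path + ext;
  }
  return null;
}
```

The real algorithm adds package.json `exports`/`imports` resolution, directory `index` files, symlink handling, and node_modules traversal, each of which multiplies the number of filesystem probes, which is exactly why resolver speed and caching matter.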

Performance

The resolver benchmarks against enhanced-resolve show a consistent 10-28x performance improvement. On a project with deep node_modules trees and complex exports maps, the difference is dramatic:

Resolver            5,000 resolutions    Relative
oxc-resolver        18 ms                1x (baseline)
enhanced-resolve    320 ms               17.8x slower

The resolver fully supports Node.js module resolution, TypeScript path mapping, package.json exports and imports fields, and the tsconfig.json paths and baseUrl configuration.

Integration with Build Tools

The Oxc components are designed to be used as libraries, not just as standalone CLIs. This makes them ideal for integration into existing build tools.

Rolldown (Vite's Future Bundler)

Rolldown is the Rust-based, next-generation bundler being built for Vite. It uses Oxc's parser, resolver, and transformer internally. When Rolldown reaches stability, Vite users will get Oxc's performance benefits transparently -- no configuration changes needed.

This is arguably the most important integration path for Oxc. Vite is one of the most popular build tools in the JavaScript ecosystem, and Rolldown will bring Oxc to millions of projects without those projects needing to know about Oxc directly.

Rspack

Rspack, the Rust-based webpack-compatible bundler, uses oxc-resolver for module resolution and is adopting other Oxc components. If your project uses webpack and you want to migrate to a faster bundler without changing your configuration, Rspack with Oxc under the hood is the path.

Direct Integration via Napi

For build tool authors, Oxc publishes N-API bindings that let you call Oxc from Node.js with near-native performance:

import { transform } from '@oxc-transform/binding';

const result = transform('app.tsx', sourceCode, {
  typescript: {
    declaration: false,
    onlyRemoveTypeImports: true,
  },
  jsx: {
    runtime: 'automatic',
  },
  target: 'es2020',
});

console.log(result.code);
console.log(result.map); // Source map

Using Oxc in a Vite Project Today

While Rolldown is still in development, you can use Oxc components in your current Vite setup through plugins:

// vite.config.ts
import { defineConfig } from 'vite';

export default defineConfig({
  // Vite already uses esbuild for dev transforms
  // When Rolldown ships, Oxc will power both dev and prod

  build: {
    // For production, you can use the oxc minifier
    // via rollup plugin when it stabilizes
    minify: 'esbuild', // or 'terser' -- oxc option coming
  },
});

The Unified Architecture

The real power of Oxc is not any individual component but how they share infrastructure. Consider what happens when you run a typical JavaScript build:

Traditional toolchain (parse 5 times):

  1. ESLint parses your code (Espree parser) to lint it
  2. Babel parses your code (Babel parser) to transform it
  3. Terser parses the output (Terser's parser) to minify it
  4. Webpack's resolver resolves every import
  5. Your bundler parses the code again for tree-shaking

Oxc toolchain (parse once):

  1. oxc-parser parses your code once
  2. oxc-linter walks the AST to lint
  3. oxc-transformer modifies the AST to transform
  4. oxc-minifier compresses the same AST
  5. oxc-resolver handles all import resolution

The AST is allocated in an arena and shared across all phases. There is no serialization between steps, no AST format conversion, and no redundant parsing. This architectural decision is what makes the 20-100x performance claims possible -- it is not just that Rust is faster than JavaScript, it is that the work is fundamentally structured to avoid redundancy.
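The shape of that parse-once data flow can be sketched with a deliberately simplified string-based "AST" (nothing like Oxc's real structures) in which one parsed value feeds every downstream stage:

```typescript
// Sketch of the parse-once pipeline: parse produces one structure,
// and lint/transform/minify all consume it. No stage re-parses
// the source text.
interface Ast {
  statements: string[];
}

const parseOnce = (source: string): Ast => ({
  statements: source.split(";").map((s) => s.trim()).filter(Boolean),
});

const lint = (ast: Ast): string[] =>
  ast.statements
    .filter((s) => s.includes("var "))
    .map(() => "prefer const/let");

const transform = (ast: Ast): Ast => ({
  statements: ast.statements.map((s) => s.replace(/\bvar\b/, "let")),
});

const minify = (ast: Ast): string => ast.statements.join(";");

// One parse, three consumers of the same structure.
const sharedAst = parseOnce("var a = 1; var b = 2;");
const warnings = lint(sharedAst);
const output = minify(transform(sharedAst));
```

In the traditional toolchain, each of those three functions would begin with its own full `parseOnce` against the raw text, using its own incompatible AST type.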

Getting Started: A Practical Migration Path

You do not need to adopt the entire Oxc toolchain at once. The components are designed for incremental adoption:

Step 1: Start with OxLint

OxLint is the most mature component and delivers immediate value:

npx oxlint@latest .

Run it alongside ESLint initially. Once you are comfortable with its output, you can drop ESLint rules that OxLint covers and keep ESLint only for plugin-specific rules (import sorting, React hooks, accessibility).

Step 2: Try the Transformer

If you are using Babel, test the Oxc transformer on your codebase:

npx oxc-transform src/

Compare the output against Babel's. For TypeScript stripping and JSX transformation, the output should be equivalent. For complex Babel plugins (styled-components, relay), you may need to keep those specific transforms in Babel while offloading the rest to Oxc.

Step 3: Adopt Through Your Bundler

The lowest-friction path is to wait for your bundler to adopt Oxc internally. If you use Vite, Rolldown will bring Oxc to you. If you use webpack, Rspack already uses Oxc's resolver. This way, you get the performance benefits without changing any of your project's configuration.

The Broader Rust-JavaScript Tooling Ecosystem

Oxc is part of a larger trend of rewriting JavaScript tooling in Rust and other systems languages:

Tool                   Replaces                          Language          Status
Oxc (full toolchain)   Babel, Terser, ESLint, resolve    Rust              Active development
SWC                    Babel                             Rust              Mature
Biome                  ESLint + Prettier                 Rust              Mature
Rspack                 webpack                           Rust              Stable
Rolldown               Rollup (for Vite)                 Rust (uses Oxc)   Active development
esbuild                Bundler + minifier                Go                Mature
Lightning CSS          PostCSS + Autoprefixer            Rust              Mature

Oxc's differentiator is scope and architecture. SWC is primarily a transpiler. Biome is a linter and formatter. esbuild is a bundler and minifier. Oxc is building the entire toolchain with a shared core, which allows optimizations that standalone tools cannot achieve.

Current Status and Roadmap

As of early 2026, the Oxc components are at varying levels of maturity: OxLint and the resolver are production-ready, while the transformer and minifier are solid enough to evaluate but still maturing.

The project is open source under the MIT license and actively developed with funding from VoidZero (the company founded by Evan You, creator of Vue.js). The connection to the Vue/Vite ecosystem means Oxc will see real-world production use at scale through Rolldown.

When to Adopt Oxc

Adopt OxLint now if you want faster linting. Adopt the resolver now if you are building tooling. For the transformer and minifier, evaluate them on your codebase but keep your existing tools as a fallback. For the full integrated experience, watch the Rolldown project -- when it ships as Vite's default bundler, Oxc will become the engine powering one of the most popular build tools in the JavaScript ecosystem, and the performance improvements will arrive without any migration effort on your part.