Stefan Haas

Senior Software Engineer at Microsoft writing deep-dive engineering articles about TypeScript, Angular, monorepos, and developer experience.

Apr 2026 · TypeScript · Angular · Nx · 5 min read

We Tried Incremental Builds in a Large Angular Monorepo. Here's Why We Stopped.

Incremental builds promise faster builds by pre-compiling libraries, but profiling a large Angular monorepo revealed that JavaScript bundlers re-process every file regardless of origin. The cold build penalty and limited speedup led us to abandon incremental builds in favor of faster bundlers and micro-frontends.

Nx · Monorepo · Angular · TypeScript · Incremental Builds

The Promise

If you maintain a large frontend monorepo, you’ve probably had this conversation:

“What if we pre-built our libraries and cached the output? Then only changed libraries would rebuild. The app build would just consume pre-compiled artifacts. It should be way faster.”

It’s a compelling idea. Nx has built-in support for it. The Angular Package Format is designed for it. The tooling exists. So we tried it — thoroughly, across hundreds of libraries in a large Angular monorepo with ~30,000 TypeScript source files and 7M lines of code.

The results were… not what we expected.

What We Did

We migrated hundreds of libraries from source-compiled (everything builds together in one pass) to individually pre-built (each library compiles to dist, the app reads from dist). We used:

  • SWC for pure TypeScript libraries (fast transpilation + tsc for declarations)
  • ng-packagr for Angular libraries (produces ESM2022 with Ivy metadata)
  • Nx task caching to skip rebuilding unchanged libraries
  • esbuild as the application bundler

We built custom tooling for tsconfig path rewriting, automated the migration with an agent-driven bottom-up process, fixed dozens of issues (lodash globals, const enums, deep imports, missing exports, CJS-to-ESM conversion), and ran extensive benchmarks.
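To give a sense of what the path rewriting involves: migrating a library from source-compiled to pre-built essentially flips its path alias from the library's source entry point to its compiled output. A sketch (the `@acme/ui` alias and directory layout are hypothetical, not our actual setup):

```jsonc
// tsconfig.base.json (illustrative; library name and paths are hypothetical)
{
  "compilerOptions": {
    "paths": {
      // Before (source-compiled): the app compiles the library's .ts directly.
      // "@acme/ui": ["libs/ui/src/index.ts"]

      // After (pre-built): the app resolves the library's compiled dist output,
      // reading .d.ts declarations and .mjs files instead of source.
      "@acme/ui": ["dist/libs/ui"]
    }
  }
}
```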

The Results

| Scenario | Incremental | Baseline (from source) |
| --- | --- | --- |
| Hot build (deps cached) | ~3m 28s | ~3m 49s |
| Cold build (nothing cached) | ~14m 15s | ~3m 49s |
| Warm Angular cache, real file change | ~4m 34s | ~3m 24s |

The hot incremental build was roughly the same speed as baseline. The cold build was 3.7× slower. And with a warm Angular compiler cache and a real code change — the scenario closest to a developer’s actual workflow — incremental was over a minute slower.

Why? The Bundler Is the Bottleneck.

We profiled every phase of the Angular esbuild builder using NG_BUILD_DEBUG_PERF:

| Phase | Incremental | Baseline |
| --- | --- | --- |
| Angular compiler (create program, analyze, emit TS, diagnostics) | 67.7s | 76.5s |
| esbuild bundling (JS transform + bundle) | ~85s | ~85s |
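For reference, per-phase timings like these come from enabling the builder's debug flag on an ordinary build (the project name here is hypothetical):

```shell
# Ask the Angular esbuild builder to print per-phase performance timings.
NG_BUILD_DEBUG_PERF=1 npx ng build my-app
```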

The Angular compiler was genuinely faster with pre-built libraries — 9 seconds faster across all phases. Ivy’s locality principle works exactly as designed: the compiler reads metadata from .d.ts files without recompiling pre-built sources.

But esbuild’s bundling phase was identical. It must re-parse, tree-shake, and bundle every input file — whether that file comes from source compilation or pre-compiled dist. Pre-built .mjs files don’t get any special treatment. They go through the exact same parse → transform → bundle pipeline as freshly-compiled source.

This means the theoretical maximum improvement from incremental builds is capped at whatever the Angular compiler phases represent as a fraction of total build time. In our case: ~20-25% at best, even if 99% of libraries were pre-built.
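The ceiling is just Amdahl's law: if only the compiler phases can shrink, the overall speedup is bounded by their share of total build time. A minimal sketch (the fractions below are illustrative, not our measured numbers):

```typescript
// Amdahl's law: if a fraction `compilerFraction` of the build is the Angular
// compiler phase, and pre-building skips a share `skippedShare` of that phase,
// the overall build speedup is capped at:
function maxSpeedup(compilerFraction: number, skippedShare: number): number {
  return 1 / (1 - compilerFraction * skippedShare);
}

// Even if pre-building eliminated the compiler phase entirely (share = 1)
// and that phase were ~25% of the build, the best case is ~1.33x overall.
console.log(maxSpeedup(0.25, 1).toFixed(2)); // "1.33"
```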

And that 9-second saving was eaten by Nx task orchestration overhead — hashing and restoring hundreds of cached task outputs added 30-47 seconds to every build.

The Fundamental Problem

The mental model of incremental builds is: “skip work that was already done.” But modern bundlers don’t work that way. They don’t have a concept of “this chunk is already bundled, just concatenate it.” Every input file — whether from source or dist — goes through the full bundling pipeline.

In a compiled language like C++, pre-built libraries produce .o files that the linker concatenates. The linker is fast because it doesn’t re-parse or re-optimize the object code. But in JavaScript, there is no linker. The bundler IS the linker, and it re-processes everything.

This creates a hard ceiling: the bundler’s per-file processing time × total file count is irreducible. Pre-building libraries doesn’t reduce the total file count — it just changes where the files come from (dist vs source). In some cases it actually increases file count, because ng-packagr produces per-file ESM2022 output (one .mjs per source .ts).

What About Caching?

Angular’s esbuild builder does have a persistent cache (an LMDB-backed key-value store). With a warm cache and no real code changes, both incremental and baseline builds drop to ~2 minutes — the cache is effective for both approaches equally.

With a real code change, the cache helps baseline more than incremental because the baseline’s cache contains richer compilation state (it compiled everything from source). The incremental cache has less to work with because it only compiled non-buildable source files.

The Cold Build Problem

The cold build penalty is the killer. Every time a developer creates a new codespace, switches branches, or the CI cache expires, they pay the full cost of building every library individually. In our case: 14 minutes vs 4 minutes for baseline.

You can mitigate this with Nx remote cache or pre-baked cache images, but that adds infrastructure complexity and creates a new dependency. If the cache server is slow or unavailable, builds fall off a cliff.

What Actually Works

After this deep dive, our conclusion is that two approaches actually scale frontend build systems:

1. Fast Bundlers (esbuild, rolldown, rspack)

The most impactful optimization is simply making the single-pass build faster. Moving from webpack to esbuild cut our build time from 22 minutes to under 4 minutes — a 5× improvement with zero architectural changes.

Fast bundlers work because they attack the actual bottleneck: per-file processing speed. esbuild processes files ~100× faster than webpack because it’s written in Go, uses parallelism, and avoids the overhead of JavaScript-based AST transformations.

The lesson: invest in bundler speed, not build architecture complexity.

2. Micro-Frontends (Module Federation, iframes)

If your app is too large for even a fast bundler, the answer isn’t to pre-build libraries within a single bundle — it’s to stop building a single bundle entirely.

Micro-frontends split the application into independently deployable chunks. Each chunk has its own build pipeline, its own deployment lifecycle, and its own caching boundary. When a team changes their micro-frontend, only their chunk rebuilds. The rest of the application is served from CDN cache.

This isn’t incremental builds with extra steps — it’s a fundamentally different model. Instead of “one big build that tries to be smart about caching,” it’s “many small builds that are naturally independent.”

Module Federation (and other micro-frontend approaches) makes this practical for Angular and React applications. The compile-time coupling that makes incremental builds hard (the bundler must see everything) is replaced by runtime composition (each chunk is loaded independently).
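As a rough illustration of that runtime composition, a webpack Module Federation remote exposes an entry point that the host shell loads at runtime — so only the remote rebuilds when its team ships. All names below are hypothetical:

```javascript
// webpack.config.js for one micro-frontend (remote/module names are hypothetical)
const { ModuleFederationPlugin } = require('webpack').container;

module.exports = {
  output: { publicPath: 'auto' },
  plugins: [
    new ModuleFederationPlugin({
      name: 'checkout',            // this remote's unique name
      filename: 'remoteEntry.js',  // manifest the host shell fetches at runtime
      exposes: {
        // Only this module is rebuilt and redeployed when the team changes it.
        './Module': './src/app/checkout/checkout.module.ts',
      },
      // Framework code is shared as a singleton across host and remotes.
      shared: { '@angular/core': { singleton: true, strictVersion: true } },
    }),
  ],
};
```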

The Bottom Line

Pre-building libraries and caching build artifacts is an appealing optimization, but it hits a hard ceiling in JavaScript ecosystems because bundlers must re-process every input file regardless of origin. The theoretical maximum improvement is ~20-25%, which doesn’t justify the cold build penalty, configuration complexity, and developer friction.

If your builds are slow:

  1. Switch to a fast bundler (esbuild, rolldown, rspack). This is the highest-ROI change.
  2. If still too slow, split the app into micro-frontends. This removes the single-bundle bottleneck entirely.
  3. Don’t invest in incremental library builds unless the bundler ecosystem adds “pre-bundled chunk passthrough” — the ability to treat pre-compiled library bundles as opaque chunks that skip re-processing.

The JavaScript ecosystem doesn’t have a linker. Until it does, incremental builds will remain a well-intentioned dead end for large applications.


This analysis is based on a months-long proof-of-concept in a production monorepo with ~30,000 TypeScript source files, ~1,000 libraries, and extensive benchmarking with Angular compiler profiling. The numbers are real; the conclusion is earned.
