We Tried Incremental Builds in a Large Angular Monorepo. Here's Why We Stopped.
Incremental builds promise faster builds by pre-compiling libraries, but profiling a large Angular monorepo revealed that JavaScript bundlers re-process every file regardless of origin. The cold-build penalty and limited speedup led us to abandon incremental builds in favor of faster bundlers and micro-frontends.
The Promise
If you maintain a large frontend monorepo, you've probably had this conversation:
"What if we pre-built our libraries and cached the output? Then only changed libraries would rebuild. The app build would just consume pre-compiled artifacts. It should be way faster."
It's a compelling idea. Nx has built-in support for it. The Angular Package Format is designed for it. The tooling exists. So we tried it, thoroughly, across hundreds of libraries in a large Angular monorepo with ~30,000 TypeScript source files and 7M lines of code.
The results were… not what we expected.
What We Did
We migrated hundreds of libraries from source-compiled (everything builds together in one pass) to individually pre-built (each library compiles to dist, the app reads from dist). We used:
- SWC for pure TypeScript libraries (fast transpilation + tsc for declarations)
- ng-packagr for Angular libraries (produces ESM2022 with Ivy metadata)
- Nx task caching to skip rebuilding unchanged libraries
- esbuild as the application bundler
We built custom tooling for tsconfig path rewriting, automated the migration with an agent-driven bottom-up process, fixed dozens of issues (lodash globals, const enums, deep imports, missing exports, CJS-to-ESM conversion), and ran extensive benchmarks.
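To give a flavor of the path-rewriting step, here is a minimal sketch of the idea. This is a hypothetical helper, not our actual tooling: it points each migrated library's tsconfig alias at its packaged dist output instead of its source entry point.

```typescript
// Hypothetical sketch of tsconfig "paths" rewriting: aliases for
// pre-built libraries resolve to dist; the rest keep resolving to source.
type PathMap = Record<string, string[]>;

function rewriteToDist(paths: PathMap, prebuilt: Set<string>): PathMap {
  const rewritten: PathMap = {};
  for (const [alias, targets] of Object.entries(paths)) {
    rewritten[alias] = prebuilt.has(alias)
      ? [`dist/${alias}`] // resolve the packaged output from dist
      : targets;          // unbuilt libraries keep resolving to source
  }
  return rewritten;
}
```

The real tooling also had to handle secondary entry points and declaration maps, which this sketch omits.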
The Results
| Scenario | Incremental | Baseline (from source) |
|---|---|---|
| Hot build (deps cached) | ~3m 28s | ~3m 49s |
| Cold build (nothing cached) | ~14m 15s | ~3m 49s |
| Warm Angular cache, real file change | ~4m 34s | ~3m 24s |
The hot incremental build was roughly the same speed as baseline. The cold build was 3.7× slower. And with a warm Angular compiler cache and a real code change (the scenario closest to a developer's actual workflow), incremental was over a minute slower.
Why? The Bundler Is the Bottleneck.
We profiled every phase of the Angular esbuild builder using NG_BUILD_DEBUG_PERF:
| Phase | Incremental | Baseline |
|---|---|---|
| Angular compiler (create program, analyze, emit TS, diagnostics) | 67.7s | 76.5s |
| esbuild bundling (JS transform + bundle) | ~85s | ~85s |
The Angular compiler was genuinely faster with pre-built libraries: about 9 seconds faster across all phases. Ivy's locality principle works exactly as designed: the compiler reads metadata from .d.ts files without recompiling pre-built sources.
But esbuild's bundling phase was identical. It must re-parse, tree-shake, and bundle every input file, whether that file comes from source compilation or pre-compiled dist. Pre-built .mjs files don't get any special treatment. They go through the exact same parse → transform → bundle pipeline as freshly compiled source.
This means the theoretical maximum improvement from incremental builds is capped at whatever the Angular compiler phases represent as a fraction of total build time. In our case: ~20-25% at best, even if 99% of libraries were pre-built.
And that 9-second saving was eaten by Nx task orchestration overhead: hashing and restoring hundreds of cached task outputs added 30-47 seconds to every build.
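Putting the measured numbers side by side makes the trade-off concrete. The phase timings are from our profiling above; the arithmetic is just a worked check:

```typescript
// Net effect of pre-building, using the phase timings measured above.
const compilerFromSource = 76.5; // s, Angular compiler phases, baseline
const compilerPrebuilt = 67.7;   // s, same phases with pre-built libraries
const nxOverheadLow = 30;        // s, Nx hashing/restoring cached outputs
const nxOverheadHigh = 47;       // s, upper bound we observed

const compilerSavings = compilerFromSource - compilerPrebuilt; // ≈ 8.8s
const netBestCase = compilerSavings - nxOverheadLow;   // ≈ -21s
const netWorstCase = compilerSavings - nxOverheadHigh; // ≈ -38s
// Once orchestration overhead is counted, the "saving" is negative.
```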
The Fundamental Problem
The mental model of incremental builds is "skip work that was already done." But modern bundlers don't work that way. They have no concept of "this chunk is already bundled, just concatenate it." Every input file, whether from source or dist, goes through the full bundling pipeline.
In a compiled language like C++, pre-built libraries produce .o files that the linker concatenates. The linker is fast because it doesn't re-parse or re-optimize the object code. But in JavaScript, there is no linker. The bundler IS the linker, and it re-processes everything.
This creates a hard ceiling: the bundler's per-file processing time × total file count is irreducible. Pre-building libraries doesn't reduce the total file count; it just changes where the files come from (dist vs source). In some cases it actually increases file count, because ng-packagr produces per-file ESM2022 output (one .mjs per source .ts).
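The ceiling can be stated as a toy model. The per-file cost here is an illustrative assumption, not a measurement; the point is only that the input count, and therefore the bundling time, is the same either way:

```typescript
// Toy model of the ceiling: bundle time ≈ per-file cost × input count.
const perFileMs = 3;         // assumed average parse+transform+bundle cost
const sourceInputs = 30_000; // .ts files when bundling from source
const distInputs = 30_000;   // .mjs files from dist: same count (or more)

const fromSourceSec = (sourceInputs * perFileMs) / 1000;
const fromDistSec = (distInputs * perFileMs) / 1000;
// Both come out identical: pre-building moved the files, not the work.
```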
What About Caching?
Angular's esbuild builder does have a persistent cache (an LMDB-backed store on disk). With a warm cache and no real code changes, both incremental and baseline builds drop to ~2 minutes; the cache is equally effective for both approaches.
With a real code change, the cache helps baseline more than incremental because the baseline's cache contains richer compilation state (it compiled everything from source). The incremental cache has less to work with because it only compiled non-buildable source files.
The Cold Build Problem
The cold build penalty is the killer. Every time a developer creates a new codespace, switches branches, or the CI cache expires, they pay the full cost of building every library individually. In our case: 14 minutes vs 4 minutes for baseline.
You can mitigate this with Nx remote cache or pre-baked cache images, but that adds infrastructure complexity and creates a new dependency. If the cache server is slow or unavailable, builds fall off a cliff.
What Actually Works
After this deep dive, our conclusion is that two approaches actually scale frontend build systems:
1. Fast Bundlers (esbuild, rolldown, rspack)
The most impactful optimization is simply making the single-pass build faster. Moving from webpack to esbuild cut our build time from 22 minutes to under 4 minutes: a 5× improvement with zero architectural changes.
Fast bundlers work because they attack the actual bottleneck: per-file processing speed. esbuild processes files ~100× faster than webpack because it's written in Go, uses parallelism, and avoids the overhead of JavaScript-based AST transformations.
The lesson: invest in bundler speed, not build architecture complexity.
2. Micro-Frontends (Module Federation, iFrames)
If your app is too large for even a fast bundler, the answer isn't to pre-build libraries within a single bundle; it's to stop building a single bundle entirely.
Micro-frontends split the application into independently deployable chunks. Each chunk has its own build pipeline, its own deployment lifecycle, and its own caching boundary. When a team changes their micro-frontend, only their chunk rebuilds. The rest of the application is served from CDN cache.
This isn't incremental builds with extra steps; it's a fundamentally different model. Instead of "one big build that tries to be smart about caching," it's "many small builds that are naturally independent."
Module Federation (and other micro-frontend approaches) makes this practical for Angular and React applications. The compile-time coupling that makes incremental builds hard (the bundler must see everything) is replaced by runtime composition (each chunk is loaded independently).
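For readers unfamiliar with Module Federation, a minimal remote configuration looks roughly like this. The names, paths, and shared packages are illustrative, not from our repo:

```typescript
// webpack.config.ts for a hypothetical "dashboard" micro-frontend.
import { container } from "webpack";

export default {
  plugins: [
    new container.ModuleFederationPlugin({
      name: "dashboard",
      filename: "remoteEntry.js",
      // Only this team's code is bundled here; the shell application
      // loads remoteEntry.js at runtime instead of compiling it in.
      exposes: {
        "./Module": "./src/app/dashboard/dashboard.module.ts",
      },
      // Framework singletons are shared at runtime rather than
      // re-bundled into every remote.
      shared: {
        "@angular/core": { singleton: true, strictVersion: true },
        "@angular/common": { singleton: true, strictVersion: true },
      },
    }),
  ],
};
```

Each remote built this way has its own bundler invocation, which is exactly the caching boundary that incremental builds failed to create inside a single bundle.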
The Bottom Line
Pre-building libraries and caching build artifacts is an appealing optimization, but it hits a hard ceiling in JavaScript ecosystems because bundlers must re-process every input file regardless of origin. The theoretical maximum improvement is ~20-25%, which doesn't justify the cold-build penalty, configuration complexity, and developer friction.
If your builds are slow:
- Switch to a fast bundler (esbuild, rolldown, rspack). This is the highest-ROI change.
- If still too slow, split the app into micro-frontends. This removes the single-bundle bottleneck entirely.
- Don't invest in incremental library builds unless the bundler ecosystem adds "pre-bundled chunk passthrough": the ability to treat pre-compiled library bundles as opaque chunks that skip re-processing.
The JavaScript ecosystem doesn't have a linker. Until it does, incremental builds will remain a well-intentioned dead end for large applications.
This analysis is based on a months-long proof-of-concept in a production monorepo with ~30,000 TypeScript source files, ~1,000 libraries, and extensive benchmarking with Angular compiler profiling. The numbers are real; the conclusion is earned.