Webpack Fast Refresh vs Vite: What Was Faster for ilert‑ui

This article shares what felt fastest in the day‑to‑day development of ilert‑ui, a large React + TypeScript app with many lazy routes. We first moved off Create React App (CRA) toward modern tooling, trialed Vite for local development, and ultimately landed on webpack‑dev‑server + React Fast Refresh.
Scope: Local development only. Our production builds remain on Webpack. For context, the React team officially sunset CRA on February 14, 2025, and recommends migrating to a framework or a modern build tool such as Vite, Parcel, or RSBuild.
Qualitative field notes from ilert‑ui: We didn’t run formal benchmarks; this is our day‑to‑day experience in a large route‑split app.
Mini‑glossary
A few helpful terms you will encounter in this article.
- ESM: Native JavaScript module system browsers understand.
- HMR: Swaps changed code into a running app without a full reload.
- React Fast Refresh: React’s HMR experience that preserves component state when possible.
- Lazy route / code‑splitting: Loading route code only when the route is visited.
- Vendor chunk: A bundle of shared third‑party deps cached across routes.
- Eager pre‑bundling: Bundling common deps up front to avoid many small requests later.
- Dependency optimizer (Vite): Pre‑bundles bare imports; may re‑run if new deps are discovered at runtime.
- Type‑aware ESLint: ESLint that uses TypeScript type info – more accurate, heavier.
Why we left CRA
Problem statement: ilert‑ui outgrew CRA’s convenience defaults as the app matured.
Here are the reasons that pushed us away from CRA:
- Customization friction: Advanced webpack tweaks (custom loaders, tighter split‑chunks strategy, Babel settings for react-refresh) required ejecting or patching. That slowed iteration on a production‑scale app.
- Large dependency surface: react-scripts brought many transitive packages. Installs got slower, and security noise grew over time without clear benefits for us.
Goals for the next steps:
- Keep React + TS.
- Improve time‑to‑interactive after server start.
- Preserve state on edits (Fast Refresh behavior) and keep HMR snappy.
- Maintain predictable first‑visit latency when navigating across many lazy routes.
Why Vite looked like a better solution
During development, Vite serves your source as native ESM and pre‑bundles bare imports from node_modules using esbuild. This usually yields very fast cold starts and responsive HMR.
What we loved immediately
- Cold starts: Noticeably faster than our CRA baseline.
- Minimal config, clean DX: Sensible defaults and readable errors (a minimal setup sketch follows this list).
- Great HMR in touched areas: Editing within routes already visited felt excellent.
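For reference, this is the kind of minimal setup we mean; an illustrative sketch, not ilert‑ui's actual config:

```ts
// vite.config.ts – a minimal React + TypeScript dev setup (illustrative sketch)
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  // @vitejs/plugin-react wires up the JSX transform and React Fast Refresh in dev
  plugins: [react()],
});
```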
Where the model rubbed against our size
In codebases with many lazy routes, first‑time visits can trigger bursts of ESM requests, and when new deps are discovered at runtime, dependency‑optimizer re‑runs that reload the page. This is expected behavior, but it made cross‑route exploration feel uneven for us.
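To make that concrete, here is the shape of a typical lazy route (the names are hypothetical, not ilert‑ui's actual files). On the first visit in a dev session, the browser fetches the route module and each of its source imports as individual ESM requests (pre‑bundled deps come from the optimizer cache), which is where the request bursts and occasional optimizer re‑runs come from:

```tsx
// routes.tsx – a hypothetical lazy route, for illustration only
import { lazy, Suspense } from "react";
import { Route } from "react-router-dom";

// Loaded only when the route is first visited; its own imports
// (charts, editors, deep store modules, ...) are fetched at that point too.
const AlertSourcesPage = lazy(() => import("./pages/AlertSourcesPage"));

export const alertSourcesRoute = (
  <Route
    path="/alert-sources"
    element={
      <Suspense fallback={<div>Loading…</div>}>
        <AlertSourcesPage />
      </Suspense>
    }
  />
);
```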
Qualitative field notes from ilert‑ui
Methodology: qualitative observations from daily development in ilert‑ui.
Our repo’s shape
- Dozens of lazy routes, several heavy sections pulling in many modules.
- Hundreds of shared files and deep store imports across features.
What we noticed
- First‑time heavy routes: Opening a dependency‑rich route often triggered many ESM requests and sometimes a dep‑optimizer re‑run. Cross‑route exploration across untouched routes felt slower than our webpack setup that eagerly pre‑bundles shared vendors.
- Typed ESLint overhead: Running type‑aware ESLint (with parserOptions.project or projectService) in‑process with the dev server added latency during typing. Moving linting out‑of‑process helped, but didn’t fully offset the cost at our scale – an expected trade‑off with typed linting.
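For context, this is roughly what switching on type‑aware linting looks like in a flat ESLint config; an illustrative typescript‑eslint excerpt, not ilert‑ui's actual setup:

```js
// eslint.config.js – illustrative typed-lint setup with typescript-eslint (v8+)
import tseslint from "typescript-eslint";

export default tseslint.config(
  // The *TypeChecked configs require type information, which is what makes
  // typed linting both more accurate and noticeably heavier.
  ...tseslint.configs.recommendedTypeChecked,
  {
    languageOptions: {
      parserOptions: {
        // Either projectService: true or an explicit project: ["./tsconfig.json"]
        // tells the parser to load TypeScript program information.
        projectService: true,
        tsconfigRootDir: import.meta.dirname,
      },
    },
  },
);
```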
TL;DR for our codebase: Vite was fantastic once a route had been touched in the session, but the first visits across many lazy routes were less predictable.
Why we pivoted to webpack‑dev‑server + React Fast Refresh
What we run:
- webpack‑dev‑server with HMR.
- React Fast Refresh via @pmmmwh/react-refresh-webpack-plugin and react-refresh in Babel (a minimal sketch follows this list).
- Webpack SplitChunks for common vendor bundles; filesystem caching; source maps; error overlays; ESLint out‑of‑process.
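A minimal sketch of how those dev‑only pieces fit together (filenames and values are illustrative assumptions, not our exact config):

```js
// webpack.dev.js – minimal sketch of the dev setup described above (illustrative)
const ReactRefreshWebpackPlugin = require("@pmmmwh/react-refresh-webpack-plugin");

module.exports = {
  mode: "development",
  devtool: "eval-source-map",
  cache: { type: "filesystem" }, // persistent cache across dev-server restarts
  devServer: { hot: true },      // HMR; the error overlay is on by default
  module: {
    rules: [
      {
        test: /\.[jt]sx?$/,
        exclude: /node_modules/,
        use: {
          loader: "babel-loader",
          options: {
            presets: ["@babel/preset-react", "@babel/preset-typescript"],
            // react-refresh/babel injects the hooks Fast Refresh needs
            plugins: ["react-refresh/babel"],
          },
        },
      },
    ],
  },
  plugins: [new ReactRefreshWebpackPlugin()],
};
```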
Why it felt faster end‑to‑end for our team:
- Eager vendor pre‑bundling: We explicitly pre‑bundle vendor chunks (React, MUI, MobX, charts, editor, calendar, etc.). The very first load is a bit heavier, but first‑time visits to other routes are faster because shared deps are already cached. SplitChunks makes this predictable.
- React Fast Refresh ergonomics: Solid state preservation on edits, reliable error recovery, and overlays we like.
- Non‑blocking linting: Typed ESLint runs outside the dev server process, so HMR stays responsive even during large type checks.
Receipts – the knobs we turned
```js
// webpack.config.js
module.exports = {
  optimization: {
    minimize: false,
    runtimeChunk: "single",
    splitChunks: {
      chunks: "all",
      cacheGroups: {
        "react-vendor": {
          test: /[\\/]node_modules[\\/](react|react-dom|react-router-dom)[\\/]/,
          name: "react-vendor",
          chunks: "all",
          priority: 30,
        },
        "mui-vendor": {
          test: /[\\/]node_modules[\\/](@mui\/material|@mui\/icons-material|@mui\/lab|@mui\/x-date-pickers)[\\/]/,
          name: "mui-vendor",
          chunks: "all",
          priority: 25,
        },
        "mobx-vendor": {
          test: /[\\/]node_modules[\\/](mobx|mobx-react|mobx-utils)[\\/]/,
          name: "mobx-vendor",
          chunks: "all",
          priority: 24,
        },
        "utils-vendor": {
          test: /[\\/]node_modules[\\/](axios|moment|lodash\.debounce|lodash\.isequal)[\\/]/,
          name: "utils-vendor",
          chunks: "all",
          priority: 23,
        },
        "ui-vendor": {
          test: /[\\/]node_modules[\\/](@loadable\/component|react-transition-group|react-window)[\\/]/,
          name: "ui-vendor",
          chunks: "all",
          priority: 22,
        },
        "charts-vendor": {
          test: /[\\/]node_modules[\\/](recharts|reactflow)[\\/]/,
          name: "charts-vendor",
          chunks: "all",
          priority: 21,
        },
        "editor-vendor": {
          test: /[\\/]node_modules[\\/](@monaco-editor\/react|monaco-editor)[\\/]/,
          name: "editor-vendor",
          chunks: "all",
          priority: 20,
        },
        "calendar-vendor": {
          test: /[\\/]node_modules[\\/](@fullcalendar\/core|@fullcalendar\/react|@fullcalendar\/daygrid)[\\/]/,
          name: "calendar-vendor",
          chunks: "all",
          priority: 19,
        },
        "vendor": {
          test: /[\\/]node_modules[\\/]/,
          name: "vendor",
          chunks: "all",
          priority: 10,
        },
      },
    },
  },
};
```
```ts
// vite.config.ts - Vite optimizeDeps includes we tried
import { defineConfig } from "vite";

export default defineConfig({
  optimizeDeps: {
    include: [
      "react",
      "react-dom",
      "react-router-dom",
      "@mui/material",
      "@mui/icons-material",
      "@mui/lab",
      "@mui/x-date-pickers",
      "mobx",
      "mobx-react",
      "mobx-utils",
      "axios",
      "moment",
      "lodash.debounce",
      "lodash.isequal",
      "@loadable/component",
      "react-transition-group",
      "react-window",
      "recharts",
      "reactflow",
      "@monaco-editor/react",
      "monaco-editor",
      "@fullcalendar/core",
      "@fullcalendar/react",
      "@fullcalendar/daygrid",
    ],
    // Force pre-bundling of these dependencies
    force: true,
  },
});
```
Result: This helped some cold starts, but for our repo it didn’t smooth out first‑visit latency across many lazy routes as much as eager vendor chunks in webpack.
What we tried to speed up Vite (and what we didn’t)
What we tried in Vite
Run ESLint in a separate process
What it does: Lints in the background instead of blocking the dev server.
Impact: Faster feedback while editing.
Enable a filesystem cache
What it does: Reuses build results across restarts.
Impact: Quicker cold starts and rebuilds.
Pre-bundle third-party code (vendor split)
What it does: Bundles libraries like React once and keeps them separate from app code.
Impact: Less work on every save; snappier HMR.
These tweaks made Vite feel better – but they weren’t enough to solve our bigger performance issues, which is why we evaluated Webpack.
Things we could have tried
More aggressive optimizeDeps tuning
Why we skipped: Can help large projects, but needs careful profiling and ongoing dependency hygiene. The time cost outweighed the likely gains for us.
“Warm crawl” on server start
What it is: A script that visits routes at startup to pre-load modules and caches (a related built-in Vite option is sketched after this list).
Why we skipped: Extra complexity and inconsistent payoff in real projects.
Pin versions for linked packages
What it is: Lock versions in a monorepo to reduce Vite’s re-optimization churn.
Why we skipped: Useful in some setups, but adds maintenance overhead; not worth it before a larger rework.
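Had we pursued the warm‑crawl idea, Vite’s built‑in server.warmup option (available in recent Vite versions) would likely have been the first thing to try instead of a crawler script. The globs below are assumptions, not our real paths:

```ts
// vite.config.ts – illustrative use of server.warmup instead of a crawl script
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    warmup: {
      // Pre-transform route entry points when the dev server starts,
      // so first visits pay less on-demand transform cost.
      clientFiles: ["./src/App.tsx", "./src/routes/**/*.tsx"],
    },
  },
});
```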
Pros and cons (in our context)
Vite – pros
- Blazing cold starts and lightweight config.
- Excellent HMR within already‑touched routes.
- Strong plugin ecosystem and modern ESM defaults.
Vite – cons
- Dep optimizer re‑runs can interrupt flow during first‑time navigation across many lazy routes.
- Requires careful setup in large monorepos and with linked packages.
- Typed ESLint in‑process can hurt responsiveness on large projects; better out‑of‑process.
Webpack + Fast Refresh – pros
- Predictable first‑visit latency across many routes via eager vendor chunks.
- Fine‑grained control over loaders, plugins, and output.
- Fast Refresh preserves state and has mature error overlays.
Webpack + Fast Refresh – cons
- Slower cold start than Vite, and a heavier initial page load.
- More configuration surface to maintain.
- Historical complexity (mitigated by modern config patterns and caching).
Quick performance checks you can run locally on your own project

These checks are quick, human-scale benchmarks — no profiler required. Use a stopwatch and DevTools to compare real interaction delays and perceived smoothness.
Cold start
How to test: Restart the dev server and measure the time from running npm run dev to the first interactive page load (a stopwatch works fine; see the small helper script after this list).
Observation: Vite typically starts faster on a cold boot. Webpack, when using its filesystem cache, remains acceptable once warmed.
First-time heavy route
How to test: Open a dependency-rich route for the first time. Watch DevTools → Network and Console for optimizer runs or request bursts.
Observation: Vite may occasionally trigger re-optimization and reloads. Webpack’s vendor chunks tend to make first visits steadier.
Cross-route navigation
How to test: Navigate through several untouched routes and note responsiveness until each becomes interactive.
Observation: Vite improves after initial loads (as modules are cached). Webpack stays consistently predictable across routes.
Linting impact
How to test: Compare ESLint running in-process versus in a separate process. Measure typing responsiveness and HMR smoothness.
Observation: Running ESLint out-of-process kept the dev server responsive and maintained smooth HMR in both setups.
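If you want something slightly more repeatable than a stopwatch for the cold‑start check, a small polling script like this can time the first successful response from the dev server. It is an illustrative helper, not part of ilert‑ui, and it measures time‑to‑first‑response rather than full interactivity:

```js
// measure-cold-start.js – start this right before `npm run dev` (Node 18+ for global fetch)
const url = process.argv[2] || "http://localhost:3000";
const start = Date.now();

async function poll() {
  try {
    const res = await fetch(url);
    if (res.ok) {
      const seconds = ((Date.now() - start) / 1000).toFixed(1);
      console.log(`Dev server responded after ${seconds}s`);
      return;
    }
  } catch {
    // server not listening yet – keep polling
  }
  setTimeout(poll, 250);
}

poll();
```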
Balanced guidance – when we would pick each
Choose Vite if:
- Cold starts dominate your workflow.
- Your module graph isn’t huge or fragmented into many lazy routes.
- Plugins – especially typed ESLint – are light or run out‑of‑process.
Choose webpack + Fast Refresh if:
- Your app benefits from eager vendor pre‑bundling and predictable first‑visit latency across many routes.
- You want precise control over loaders/plugins and build output.
- You like Fast Refresh’s state preservation and overlays.
Conclusions
Both Vite and Webpack are excellent. Given ilert‑ui’s current size and navigation patterns, webpack‑dev‑server + React Fast Refresh delivers the tightest feedback loop for us today – based on qualitative developer experience, not micro‑benchmarks. We’ll keep measuring as our codebase evolves and may revisit Vite or a framework as our constraints change.
