Alexander Garcia
The final part of our bundle optimization series. Learn how to reduce image sizes by up to 96% using WebP conversion, Sharp, and automated optimization scripts. Includes a complete Node.js image optimization tool you can use in your own projects.
Alexander Garcia is an effective JavaScript Engineer who crafts stunning web experiences.
After optimizing our JavaScript bundles in Part 1 and Part 2, our total bundle size was still 8.5 MB. Why?
dist/tiki-social-project.jpg   7456.6 kB   # 😱 7.5 MB!
dist/me-dark.jpeg               731.6 kB
dist/me-light.jpeg              746.4 kB
dist/card-bg.png               1361.1 kB
dist/me-chair.png               244.0 kB
dist/blog-head.png              252.6 kB
A single image was larger than all our JavaScript combined. Time to fix that.
Digital images are massive by default. Cameras and design tools export high-resolution files tuned for editing, not for delivery over the network, so a single photo can easily run to several megabytes.

For a portfolio site, these are the wrong tradeoffs. We need images that look sharp at the size they're actually displayed while staying small enough to load quickly, even on slow connections.

WebP is a modern image format from Google that provides noticeably better compression than JPEG and PNG at comparable visual quality, along with support for transparency and animation.

Can we just use WebP everywhere? We can, but it's good practice to provide fallbacks:
<picture>
  <source srcset="image.webp" type="image/webp" />
  <img src="image.jpg" alt="Fallback for older browsers" />
</picture>
However, with 97%+ support, I'm comfortable using WebP directly for my portfolio site.
Rather than manually converting images, I built a Node.js script using Sharp (a high-performance image processing library).
npm install --save-dev sharp
I created scripts/optimize-images.mjs:
import sharp from "sharp"; import { readdir, stat } from "fs/promises"; import { join, extname } from "path"; const IMAGE_DIR = "./static"; const MAX_SIZE_KB = 500; // Compress images larger than 500 KB async function getImageFiles(dir) { const files = await readdir(dir); const imageFiles = []; for (const file of files) { const filePath = join(dir, file); const stats = await stat(filePath); if (stats.isFile()) { const ext = extname(file).toLowerCase(); if ([".jpg", ".jpeg", ".png"].includes(ext)) { imageFiles.push({ path: filePath, size: stats.size, name: file, }); } } } return imageFiles; } async function optimizeImage(imagePath) { const ext = extname(imagePath).toLowerCase(); const image = sharp(imagePath); const metadata = await image.metadata(); console.log(`Optimizing: ${imagePath}`); console.log( ` Original: ${metadata.width}x${metadata.height}, ${metadata.format}` ); if (ext === ".jpg" || ext === ".jpeg") { // Convert JPEG to WebP const webpPath = imagePath.replace(/\.jpe?g$/i, ".webp"); await image .resize({ width: Math.min(metadata.width, 1920), withoutEnlargement: true, }) .webp({ quality: 85 }) .toFile(webpPath); const webpStats = await stat(webpPath); console.log( ` WebP: ${webpPath} (${(webpStats.size / 1024).toFixed(2)} KB)` ); } else if (ext === ".png") { // Optimize PNG or convert to WebP for photos const optimizedPath = imagePath.replace(".png", ".optimized.png"); await image .resize({ width: Math.min(metadata.width, 1920), withoutEnlargement: true, }) .png({ quality: 85, compressionLevel: 9 }) .toFile(optimizedPath); const optimizedStats = await stat(optimizedPath); console.log( ` Optimized: ${optimizedPath} (${(optimizedStats.size / 1024).toFixed( 2 )} KB)` ); } } async function main() { console.log("Finding large images...\n"); const images = await getImageFiles(IMAGE_DIR); const largeImages = images.filter((img) => img.size / 1024 > MAX_SIZE_KB); if (largeImages.length === 0) { console.log("No large images found!"); return; } console.log(`Found ${largeImages.length} large images:\n`); largeImages.forEach((img) => { console.log(` ${img.name}: ${(img.size / 1024).toFixed(2)} KB`); }); console.log("\nOptimizing...\n"); for (const img of largeImages) { await optimizeImage(img.path); console.log(""); } console.log( "Done! Review the optimized files and replace the originals if satisfied." ); } main().catch(console.error);
{ "scripts": { "optimize-images": "node scripts/optimize-images.mjs" } }
npm run optimize-images
Output:
Finding large images...
Found 4 large images:
card-bg.png: 1329.23 KB
me-dark.jpeg: 714.49 KB
me-light.jpeg: 728.95 KB
tiki-social-project.jpg: 7281.86 KB
Optimizing...
Optimizing: static/card-bg.png
Original: 1000x1000, png
Optimized: static/card-bg.optimized.png (275.75 KB)
...rest
Done!
WebP conversion and compression reduced image sizes by up to 96%: the largest offender, tiki-social-project.jpg, went from 7,456.6 kB to a 251.6 kB WebP.
Total image savings: ~9.4 MB
# After image optimization
dist/static/image/tiki-social-project.webp   251.6 kB   # Was 7.5 MB!
dist/me-dark.webp                             66.9 kB   # Was 731 kB
dist/me-light.webp                            67.1 kB   # Was 746 kB
dist/card-bg.png                             282.4 kB   # Was 1.3 MB
dist/me-chair.png                            244.0 kB   # Kept original
dist/blog-head.png                           252.6 kB   # Kept original

Total: 7506.4 kB   # Was 17.3 MB!
After generating WebP images, I updated my imports:
import tikiSocialProject from "@static/tiki-social-project.jpg"; <img src={tikiSocialProject} alt="Tiki Social Club" />;
import tikiSocialProject from "@static/tiki-social-project.webp"; <img src={tikiSocialProject} alt="Tiki Social Club" loading="lazy" />;
For critical images, use <picture> for better fallback support:
import meLight from "@static/me-light.jpeg";
import meLightWebp from "@static/me-light.webp";
import meDark from "@static/me-dark.jpeg";
import meDarkWebp from "@static/me-dark.webp";

export function Hero() {
  const { theme } = useTheme();

  return (
    <picture>
      <source
        srcSet={theme === "light" ? meDarkWebp : meLightWebp}
        type="image/webp"
      />
      <img
        src={theme === "light" ? meDark : meLight}
        alt="Alex Garcia"
        loading="lazy"
      />
    </picture>
  );
}
What's happening: the browser reads the <source> entries in order and uses the first one whose type it supports (type="image/webp"); browsers that can't decode WebP skip the <source> and fall back to the JPEG in the <img> tag.

For larger images, serve different sizes for different screen widths:
import tikiSocialProject from "@static/tiki-social-project.webp"; <img srcSet={` ${tikiSocialProject}?w=400 400w, ${tikiSocialProject}?w=800 800w, ${tikiSocialProject}?w=1200 1200w `} sizes="(max-width: 640px) 400px, (max-width: 1024px) 800px, 1200px" src={tikiSocialProject} alt="Tiki Social Club" loading="lazy" />;
How it works: srcSet lists the available image widths (the 400w/800w/1200w descriptors), sizes tells the browser how wide the image will be displayed at each breakpoint, and the browser downloads only the candidate that best matches the layout width and device pixel density. The ?w= query parameters don't do anything on their own, though; this requires a build-time image optimization plugin or a CDN like Cloudflare Images to actually produce the resized variants.
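If you'd rather not add a plugin or a CDN, the same Sharp approach from the script above can pre-generate the width variants at build time. A minimal sketch; the resizeVariants helper, the output naming, and the 400/800/1200 widths are my own illustration, not part of the script above:

import sharp from "sharp";

// Hypothetical helper: writes tiki-social-project-400.webp, -800.webp and -1200.webp
// next to the source file so they can be referenced from srcSet.
async function resizeVariants(inputPath, widths = [400, 800, 1200]) {
  for (const width of widths) {
    const outputPath = inputPath.replace(/\.(jpe?g|png)$/i, `-${width}.webp`);
    await sharp(inputPath)
      .resize({ width, withoutEnlargement: true })
      .webp({ quality: 85 })
      .toFile(outputPath);
    console.log(`Wrote ${outputPath}`);
  }
}

await resizeVariants("./static/tiki-social-project.jpg");

With the variants on disk, the srcSet entries can point at the -400/-800/-1200 files directly instead of relying on ?w= query parameters.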
I initially tried using @rsbuild/plugin-image-compress:
import { defineConfig } from "@rsbuild/core";
import { pluginImageCompress } from "@rsbuild/plugin-image-compress";

export default defineConfig({
  plugins: [
    pluginImageCompress({
      jpeg: { quality: 85 },
      png: { quality: 85 },
      webp: { quality: 85 },
    }),
  ],
});
Result: Build errors with JPEG files. The plugin uses @squoosh/lib which had compatibility issues with my setup.
Lesson learned: Sometimes a custom script is more reliable than a plugin. The Node.js script gives you full control over quality settings and output paths, clear logging of exactly what changed, and no dependency on a plugin's bundled codecs playing nicely with your toolchain.
For production apps with user-uploaded images, consider a CDN:
<img src="https://images.example.com/cdn-cgi/image/width=800,quality=85,format=webp/tiki-social.jpg" /
<img src="https://example.imgix.net/tiki-social.jpg?w=800&q=85&auto=format" /
import Image from "next/image"; <Image src="/tiki-social.jpg" width={800} height={600} alt="Tiki Social Club" />; // Automatically optimizes, lazy loads, and serves WebP
For static sites like mine, the build-time script approach works great.
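To keep the optimization from being forgotten, the script can also be wired into the build with npm's pre-script convention. A sketch assuming the optimize-images script from earlier; the rsbuild build command is just a stand-in for whatever your build script actually runs:

{
  "scripts": {
    "optimize-images": "node scripts/optimize-images.mjs",
    "prebuild": "npm run optimize-images",
    "build": "rsbuild build"
  }
}

npm runs prebuild automatically before build, so every production build starts from optimized images.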
Total bundle size reduction from 22.8 MB to 7.5 MB (67% improvement)
Tested on Lighthouse (simulated slow 4G):
Lighthouse metrics on simulated slow 4G - up to 74% faster load times
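To reproduce this kind of measurement, one option is the Lighthouse CLI, which applies simulated slow 4G throttling with mobile emulation by default (the URL below is a placeholder):

npx lighthouse https://example.com --only-categories=performance --output=html --output-path=./report.html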
✅ Convert JPEG/PNG to WebP for photos
✅ Use PNG for graphics/logos with transparency
✅ Resize images to max display size (1920px width for most screens)
✅ Set loading="lazy" on below-the-fold images
✅ Use <picture> for critical images with fallbacks
✅ Compress images at 80-85 quality (sweet spot for web)
✅ Use responsive images with srcset for different screen sizes
✅ Automate optimization in your build process
✅ Measure before/after with Lighthouse
✅ Test on slow 3G/4G connections
Over this 3-part series, we optimized the JavaScript bundles (Part 1), added route-based code splitting (Part 2), and converted the heaviest images to WebP (this part).
Final result: A portfolio site that loads 68% faster with 67% less data.
Previous: Part 2 - Route-Based Code Splitting
Full series: