I recently ran into a bug with my static hosting provider, Cloudflare. For a short time, they were serving JavaScript files with an incompatible compression to Safari users.
All my lazy-loaded webpack bundles were returning errors for any users on iOS devices.
[Error] Unhandled promise rejection: SyntaxError: Invalid character '\ufffd'
What the hell is '\ufffd'? It's U+FFFD, the Unicode replacement character, which a decoder substitutes when it hits bytes it can't make sense of.
After a bit of digging, it turned out to be a bug where Cloudflare was serving Zstd-compressed responses to Safari, which didn't support that encoding. The browser couldn't decompress the bundles and ended up trying to parse garbage bytes as JavaScript. So, instead of getting my lovely web app, users were getting a big red error message.
It was pretty simple to fix in Cloudflare. I just needed to tweak a couple of settings to turn off Zstd compression for all users, at least until it's compatible with all browsers.
And that worked great for new users.
But users who had already visited the site and downloaded the code with the incompatible Zstd compression still got the same error. Because the browser had already downloaded (and cached!) the incompatible code.
The browser didn't know to fetch new code. The filename was the same, only the compression had changed.
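This is by design: hashed bundles are usually served with long-lived, immutable caching, so the browser never even revalidates the file. A typical (hypothetical) response header set for a hashed bundle looks like this:

```
Cache-Control: public, max-age=31536000, immutable
```

With headers like these, the only way the browser fetches fresh code is if the filename itself changes.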
They could have done a hard reload in the browser to get the latest version, and the site would have worked fine, but website visitors don’t necessarily know to do that.
They just think that the site is broken.
Not cool!
I needed a way to fix my site for those users.
Why would you need to force change the [contenthash]?
I’ve written before about cache busting a React app, and how I use [contenthash] to add a unique hash based on the content of each bundle. So when I update the app, the browser gets the new code, because the file name has changed for that bundle.
const path = require('path');

module.exports = {
  entry: {
    app: './src/app.js',
  },
  target: ['web', 'es5'],
  output: {
    filename: '[name].[contenthash].js',
    chunkFilename: '[name].[contenthash].bundle.js',
    path: path.resolve(__dirname, 'dist/app'),
    clean: true,
  },
  // ...
};
The problem here was that the content hadn’t changed.
The fact is, [contenthash] shouldn’t change if your code doesn’t change. That’s the whole point of it! You want the browser to cache the bundle, and for as long as the code is the same, it can happily use that bundle. Unless, of course, that bundle is throwing an error!
And this is really the only case I have come across where I needed all my hashes to change so that users could access the latest version, with the correct compression, even though the code had not changed.
The latest bundles would be served with the correct compression, compatible with the iOS Safari browser, but the browser didn’t know it needed to fetch a new bundle. It just thought, "I already have that file cached", and happily returned an error to the user. And I realised that as long as the [contenthash] stayed the same, the browser would keep reloading the same cached bundle, and users would keep getting an error message.
The only way to fix it would be to change the contenthash.
At first, I did a couple of minor code updates, but that only changed the contenthash for the bundles that included the change, not for every bundle. Because I use code splitting to keep my bundle sizes reasonable, there were quite a few bundles, and there’s no easy way to know which code is in which bundle, since some bundles share components. And some of my bundles are vendor bundles, whose code I can’t change at all.
And while I did consider changing the names of all my webpack chunk filenames, it seemed wrong, since the only thing that should change in my filenames for cache busting is the [contenthash].
So I needed a way to change the [contenthash], without changing the content.
Tricky!
How to force change the [contenthash]
There doesn't seem to be a way to tell Webpack to create new [contenthash]es for all your files. I could switch to the full hash ([fullhash]), but that changes on every deploy, so all the caching benefits of [contenthash] would be lost.
After spending some time in the docs, I found out that you can change the length of [contenthash]. And that could help me! I realised that if I changed the length, all my bundles would get new names, and I could get around the caching issue.
Everyone gets new code.
Which does mean refreshing the code for every user, but it's a one-time thing. Going forward, the hashes still stick around as long as the content doesn’t change.
So my fixed bundles can still be cached by the browser.
Super!
Here’s how you do it:
const path = require('path');

module.exports = {
  entry: {
    app: './src/app.js',
  },
  target: ['web', 'es5'],
  output: {
    filename: '[name].[contenthash:16].js', // 🆕
    chunkFilename: '[name].[contenthash:16].bundle.js', // 🆕
    path: path.resolve(__dirname, 'dist/app'),
    clean: true,
  },
  // ...
};
Instead of [name].[contenthash].bundle.js, which gives you the default contenthash length of 20, you can use [name].[contenthash:16].bundle.js for a length of 16, or [name].[contenthash:8].js for a length of 8.
And once I changed the length, every filename changed. And with that, I never saw that pesky [Error] Unhandled promise rejection: SyntaxError: Invalid character '\ufffd' error again. 🎉
I don't usually see any need to change the length of [contenthash], unless you like or need shorter filenames. But if, for some reason, your files get corrupted (or compressed incorrectly), changing the length of [contenthash] can help users fetch all new bundles without needing to update every bundle to create a new [contenthash].
Happy coding!