Personally I think we should just leave it to users to vote with their feet. If I think a site is too slow to load or too slow to use, I'll stop using it. If I'm happy to wait for 10 MB of JS to run a web app, then let me.
YES - I hate Google, their AMP team, or anyone else trying to tell us how to "fix" the web.
It used to be innocuous when Google had 20% browser market share, but now they act like they own how users should experience the web. Like the uBlock Origin guy said: if Chromium keeps heading down this path, it should no longer be called a "user agent", because it's no longer acting on behalf of users.
> I presume you mean SPA, and you’re probably correct. What limits do you think are reasonable?
Yes, I meant SPA.
Well, there shouldn't be a limit in my opinion. JavaScript is powerful enough to allow a lot of heavy stuff: games, desktop-class applications, etc. Example (no longer in the Chrome Web Store): https://www.youtube.com/watch?v=MfZpRtuPu-o
The issue with SPAs is simple: you could use, say, React with the Apollo GraphQL client and the final bundle would be below 500 kB. The problem starts when you need some UI components, and the obvious move is to search for a plugin. Many React libraries use jQuery, Lodash, etc. under the hood, which increases the final bundle size. Over time it just adds up with every new feature you want to add. Example: Kendo UI for React is just a wrapper around the jQuery version, and many other libraries follow this approach.
To make things worse, the same package can be added to the client in different versions because it's a dependency of several other packages.
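To make that concrete, here's a rough sketch (only lodash is a real package here; the grid wrapper and its version conflict are hypothetical) of how the import style you and your dependencies use decides what actually ships to the client:

    // Whole-library import: bundlers generally can't tree-shake lodash's
    // CommonJS build, so all of it lands in the client bundle.
    import _ from "lodash";

    // Per-function import: only debounce and its internals get bundled.
    import debounce from "lodash/debounce";

    // Hypothetical UI component that wraps a jQuery widget under the hood:
    // your code never mentions jQuery, but it still ships to the client.
    import { FancyGrid } from "some-ui-grid";

    // And if "some-ui-grid" pins lodash@3 while the app uses lodash@4,
    // the bundler can end up including both copies side by side.
    export const onResize = debounce(() => console.log("resized"), 250);

Tools like webpack-bundle-analyzer make this kind of duplication visible, if you want to check what you're actually shipping.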
We also live in the age of isomorphic JavaScript (code that runs on both the server and the client), so there is an "extra" bundle pushed to the client so that the app becomes more responsive without relying on the server as much.
Of course, Gzip helps to reduce the bandwidth, but it doesn't solve the problem that more lines of JavaScript mean more time needed to interpret the code and more memory "wasted". On a framework like Meteor, you could reach a bundle of 10 megabytes or more very easily. A few seconds into a Meteor app on mobile iOS, the app would crash. But no one forced me to use Meteor at the time; there were plenty of better options. Mea culpa, period.
This problem of slow web pages / "slow" JavaScript is quite old. Most modern libraries/frameworks already help developers ship faster apps by having good defaults. Examples: Gatsby, the Apollo GraphQL client, etc.
Enforcing limits could bring us to a world where we need to use iframes and subdomains just to get an app running. Surely no one wants that.
I don't think a size limit is reasonable at all. According to the article, 500 kB of JS takes about 3-4 MB of RAM once it's parsed. So 500 kB costs me a bit of parsing time and a handful of MB of RAM, neither of which is really a bottleneck.
If there were some size limit X, then any SPA requiring more JavaScript would just be split up into modules that dynamically load and unload as they are needed. But that only makes the total parsing time worse, all to save single-digit MBs of RAM.
If that does anything to the problem of slow JavaScript at all, it makes it worse, because now I might have to wait for the right module to load and parse.
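For reference, this is roughly that split-into-modules pattern, sketched with a hypothetical "./reports" chunk: the upfront parse shrinks, but the first use pays a network round trip plus a parse.

    // Hypothetical on-demand module. It stays out of the initial bundle,
    // but the first click waits for the chunk to download and be parsed.
    async function openReports(): Promise<void> {
      const { renderReports } = await import("./reports"); // separate chunk
      renderReports(document.getElementById("app")!);
    }

    document.getElementById("open-reports")!
      .addEventListener("click", () => { void openReports(); });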
Correct me if I'm wrong, but I have a feeling what you describe could be prevented: even if you wanted to load/unload modules, the sum of the JS could still be limited to a certain size.
> To me, if a SPA is > 500kb of JS my assumption is it’s unnecessarily bloated.
I'm guessing you've never written something that can, for example, dynamically generate XLSX files out of filtered datasets in the browser. Sure, we could do it on the server - but our users value fast and regular turnaround time on feature updates far, far more than the site loading a little slower every few weeks when the cached JS is replaced.
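For reference, the kind of thing I mean is roughly this sketch using the SheetJS xlsx package (the row shape and the filter are made up for the example):

    import * as XLSX from "xlsx";

    interface Row { name: string; amount: number; }

    // Stand-in for whatever filtered dataset the user has built in the UI.
    const filtered: Row[] = [
      { name: "Alice", amount: 120 },
      { name: "Bob", amount: 75 },
    ].filter((r) => r.amount > 100);

    // Build a worksheet from the rows and trigger a client-side download.
    const sheet = XLSX.utils.json_to_sheet(filtered);
    const book = XLSX.utils.book_new();
    XLSX.utils.book_append_sheet(book, sheet, "Export");
    XLSX.writeFile(book, "export.xlsx"); // in the browser this downloads the file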
> I'm guessing you've never written something that can, for example, dynamically generate XLSX files out of filtered datasets in the browser.
Guess again :)
> our users value fast and regular turnaround time on feature updates far, far more than the site loading a little slower
That might be acceptable, for a time. Particularly if you're working on a product that doesn't have many (or any) backend developers. Or a backend. Or a product where the developers are more specialized in frontend. Or a product where the backend doesn't have a mechanism for handling long-running tasks.
But at some point, for features like various file exports in the browser, one might want to fix the "why can't we build X feature and delight customers as quickly server side" problem.
> one might want to fix the “why can’t we build X feature and delight customers as quickly server side”
Sure. It's just that there's a two- or three-year-long to-do list of features before we can get back to optimizing things that none of the users care about anyway.
If any of our users even notice that once or twice a release cycle the page for the web app takes a little longer to load, they haven't cared about it enough to actually mention it. By contrast, we get a constant flow of new feature requests.
I presume you mean SPA, and you’re probably correct. What limits do you think are reasonable?
To me, if a SPA is > 500kb of JS my assumption is it’s unnecessarily bloated.
Gzipped, aren't most modern frameworks/libraries < 150 kB, with many being considerably less?
Perhaps the issue isn't the proposed limits, but rather the state of most modern SPAs.