I think that's a bit misleading (disclaimer: I very much like that Bazel exists, though I think a better version of it could exist somewhere).
Certainly a lot of work goes into Bazel core to support huge workflows. But a huge amount of work also goes into simply getting tools to work in hermetic environments! Web-stack tooling in particular is so bad about this that many Bazel rulesets automatically patch the scripts generated by npm or pip just to get things working properly.
There is also incidental complexity in running Bazel itself, because it uses symlinks to support sandboxing by default. I have run into several programs whose "is file" checks decide that symlinks are not files.
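The symlink pitfall is easy to reproduce. Here's a minimal sketch (the programs I hit each had their own variant, but the shape is the same): a check built on `lstat()` inspects the link itself and concludes "not a regular file", while the link-following `isfile()` does what you'd usually want.

```python
import os
import stat
import tempfile

with tempfile.TemporaryDirectory() as d:
    real = os.path.join(d, "real.txt")
    link = os.path.join(d, "link.txt")
    with open(real, "w") as f:
        f.write("hello")
    os.symlink(real, link)

    # Follows the symlink to its target: True.
    follows = os.path.isfile(link)
    # Inspects the link entry itself: False, which breaks on
    # Bazel's symlink-heavy output trees.
    naive = stat.S_ISREG(os.lstat(link).st_mode)
    print(follows, naive)  # True False
```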
Granted, we are fortunate that much of that work happens in the open and goes beyond Google's "just vendor it" philosophy. But Docker's "put it all in one big ball of mud" strategy papers over a lot of potential issues that you have to face head-on with Bazel.
Personally I think this is what companies should do -- it guarantees hermeticity, as you say, and guards against npm repo deletions (left-pad) and supply-chain attacks. But for people who are used to just running `npm install`, it is a lot more overhead.
Personally I don't think there is much value to society in endlessly vendoring exactly the same code in various places. This is why we have checksums!
I understand that Google does this to eliminate certain stability issues (and I imagine they carry their own patches!), but I don't think vendoring is the fundamental issue compared to the practical integration problems, which are solvable but tedious.
EDIT: People do have reasons for vendoring, of course; I just don't think it should be the default behavior unless you have a good reason.
Once it works though... beautiful stuff.