Let's say you have two dependencies each requesting colors:
colors@1.0.1 - 1.0.3
colors@1.0.1 - 1.0.4
With npm, you'll end up with 1.0.3 because it satisfies both constraints. OP wants to end up with 1.0.4 if at least one dependency was tested with 1.0.4 (and to reject 1.0.5). I don't know of a way to do this with npm today.
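A minimal sketch of the "highest satisfying version" behavior described above, using a hand-rolled comparator and a hard-coded candidate list (this is an illustration of the resolution rule, not npm's actual resolver):

```javascript
// Candidate versions published for "colors" (assumed list for illustration).
const candidates = ["1.0.1", "1.0.2", "1.0.3", "1.0.4", "1.0.5"];

// Parse "a.b.c" into numbers and compare versions component by component.
const parse = (v) => v.split(".").map(Number);
const cmp = (a, b) => {
  const [x, y] = [parse(a), parse(b)];
  for (let i = 0; i < 3; i++) if (x[i] !== y[i]) return x[i] - y[i];
  return 0;
};

// A hyphen range "lo - hi" is inclusive on both ends.
const inRange = (v, lo, hi) => cmp(v, lo) >= 0 && cmp(v, hi) <= 0;

// Versions satisfying BOTH "1.0.1 - 1.0.3" and "1.0.1 - 1.0.4".
const ok = candidates.filter(
  (v) => inRange(v, "1.0.1", "1.0.3") && inRange(v, "1.0.1", "1.0.4")
);
console.log(ok[ok.length - 1]); // npm picks the highest: "1.0.3"
```

Note that 1.0.4 and 1.0.5 both fall out of the intersection, so the highest version every constraint accepts is 1.0.3.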
You might like the "overrides" field [1] added in npm v8.3, although I would recommend using it with caution: changing your dependencies' dependencies has unknown consequences... even a patch release can break everything...
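For reference, an `overrides` entry is a top-level field in package.json that forces a specific version onto transitive dependencies. A minimal sketch (the package names and versions here are made up for illustration):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "some-lib": "^2.0.0"
  },
  "overrides": {
    "colors": "1.4.0"
  }
}
```

With this in place, every `colors` in the tree resolves to 1.4.0 regardless of what `some-lib` and friends asked for, which is exactly why it deserves the caution above.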
You are reading this wrong. The OP is suggesting that if you have two dependencies that are requesting:
colors@^1.0.1
colors@^1.0.2
then npm should get you 1.0.2, instead of 1.0.4, because it's the "version as close as possible" to "the dependency version that the package was actually tested with".
OP is not suggesting that npm should ignore dependency constraints, just that the version that is picked is the closest to the tested version (among those that satisfy the constraints).
If you have a package that explicitly says it won't work with >1.0.3, installing 1.0.4 is silly.
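The proposal above can be sketched as "pick the lowest version that satisfies every range" instead of the highest. This is not an npm feature today; the caret handling below is simplified (majors >= 1 only, no prerelease tags):

```javascript
// Published versions of "colors" (assumed list for illustration).
const candidates = ["1.0.1", "1.0.2", "1.0.3", "1.0.4"];

const parse = (v) => v.split(".").map(Number);
const cmp = (a, b) => {
  const [x, y] = [parse(a), parse(b)];
  for (let i = 0; i < 3; i++) if (x[i] !== y[i]) return x[i] - y[i];
  return 0;
};

// "^1.x.y" allows >= 1.x.y while staying on major version 1.
const caretOk = (v, base) => cmp(v, base) >= 0 && parse(v)[0] === parse(base)[0];

// The two declared ranges: ^1.0.1 and ^1.0.2.
const bases = ["1.0.1", "1.0.2"];
const ok = candidates.filter((v) => bases.every((b) => caretOk(v, b)));
console.log(ok[0]); // lowest satisfying version: "1.0.2"
```

npm today would take `ok[ok.length - 1]` (1.0.4); the OP's "closest to tested" strategy takes `ok[0]` (1.0.2). Both respect the declared constraints; only the tie-break differs.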
> the standard way that this is done allows minor and point releases to be trusted.
I feel like this event (and previous ones) has taught me that one should NOT trust patch and minor version upgrades to work. Obviously we want them to, but I distinctly recall "minor" patches that broke existing behavior in the past, and that has bitten my team on multiple projects over the last several years. Pinned versions are a giant pain, but having builds suddenly stop working seems worse, because you can't plan ahead for the time to upgrade.
I've come to believe that pinned versions with an active dependency check is the way to go. A lot of the dependency checks/scans run only at build time rather than taking an ongoing approach.
If nothing else, that is a step in the direction of reproducible builds which are also in the Good Thing category.
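Concretely, "pinned versions" just means exact version strings in package.json rather than caret ranges. A sketch (the package versions here are illustrative, not recommendations):

```json
{
  "dependencies": {
    "colors": "1.4.0",
    "express": "4.17.1"
  }
}
```

The "active dependency check" half is then a scheduled job (not just a build step) that runs something like `npm outdated` and `npm audit` against the pinned set, so upgrades become planned work instead of surprise breakage.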
This is likely going to be another maturing event for NPM and the community, where they will need to decide how they want to move forward. The blind trust in a `^1.2.3` version specification is something that will likely be outgrown.
I still believe that one of the biggest problems JavaScript libraries face is the transitive dependency explosion combined with "always update" build policies; together they make the issue of a suddenly untrustworthy developer both more likely and more problematic.
https://stackoverflow.com/a/22345808 - and especially the comment.
You will find a lot of `^1.2.3` in version specifications which means everything from `1.2.3` up to (but not including) `2.0.0` is allowed.
Specifying `1.0.1 - 1.0.3` is allowed too and would provide the desired behavior - but that isn't the culture among JavaScript developers.
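A quick check of the two range styles mentioned above, using a simplified comparator (full semver also handles prerelease tags and `0.x` caret rules, which this skips):

```javascript
const parse = (v) => v.split(".").map(Number);
const cmp = (a, b) => {
  const [x, y] = [parse(a), parse(b)];
  for (let i = 0; i < 3; i++) if (x[i] !== y[i]) return x[i] - y[i];
  return 0;
};

// ^1.2.3 means >= 1.2.3 and < 2.0.0.
const caret = (v) => cmp(v, "1.2.3") >= 0 && parse(v)[0] === 1;
// 1.0.1 - 1.0.3 is an inclusive hyphen range.
const hyphen = (v) => cmp(v, "1.0.1") >= 0 && cmp(v, "1.0.3") <= 0;

console.log(caret("1.9.9"));  // true  - any later 1.x is accepted
console.log(caret("2.0.0"));  // false - the next major is excluded
console.log(hyphen("1.0.4")); // false - above the tested range
```

The contrast is the whole point: the caret happily accepts every future 1.x release sight unseen, while the hyphen range stops at the last version anyone actually tested.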
Version ranges are allowed in other dependency management systems too (e.g. Maven - https://www.mojohaus.org/versions-maven-plugin/examples/reso... ), but I rarely see them used - most often the dependency is pinned to a specific known-good version.