setup.py/Setuptools' install_requires isn't quite right. Like most dependency systems, it is unfortunately oriented towards asserting what should work, when, as package metadata, it should only attempt to indicate what is known not to work. The terms "requires" and "depends" encourage this thinking.

There is some overlap. When a package requires some library, that is the same as saying we are very sure this package will be broken if you don't have that library. But other assertions are inverted: "Requires LibFoo>=1.0" should really be phrased "we know LibFoo<1.0 won't work". Occasionally there will be brief periods of breakage, like "LibFoo==1.3b5 won't work". Phrasing these as requirements is awkward.
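A sketch of that inversion in code, assuming everything here is hypothetical (there is no such setuptools mechanism): the exclusions above ("LibFoo<1.0 won't work", "1.3b5 won't work") recorded directly as predicates over candidate versions, rather than squeezed into a requirement string.

```python
# Package metadata phrased as "known not to work": each entry is a
# predicate over a candidate version string. LibFoo and the version
# bounds come from the text; the helper names are invented.

def release_tuple(version):
    # Minimal parser for plain releases: "1.2" -> (1, 2).
    # Non-numeric parts (pre-release tags like "b5") are dropped.
    return tuple(int(part) for part in version.split(".") if part.isdigit())

KNOWN_BROKEN = {
    "LibFoo": [
        lambda v: release_tuple(v) < (1, 0),  # "we know LibFoo<1.0 won't work"
        lambda v: v == "1.3b5",               # brief breakage: this exact build
    ],
}

def is_known_broken(name, version):
    """True if this (name, version) matches any recorded exclusion."""
    return any(rule(version) for rule in KNOWN_BROKEN.get(name, ()))

print(is_known_broken("LibFoo", "0.9"))  # an old version: excluded
print(is_known_broken("LibFoo", "1.4"))  # nothing known against it
```

Note what this does not say: nothing here claims LibFoo 1.4 works, only that nothing is known against it.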

In the end, all this information doesn't tell you what does work; it just gives you a smaller space in which to search for a working set. This is why pip includes the concept of requirements, separate from any single package: it's an assertion that all those things work together towards some end (not towards any end; requirements files are not widely reusable).
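That distinction can be made concrete. A requirements file pins one combination asserted to work together for one particular application; rendering such a pinned set in pip's requirements-file syntax is trivial (the package names and versions here are hypothetical):

```python
# One concrete combination asserted to work together: a "working set"
# for a particular application, not universal metadata about any package.
KNOWN_GOOD_SET = {
    "LibFoo": "1.4",
    "LibBar": "2.0.1",
}

def to_requirements_txt(pins):
    # pip's requirements-file syntax: one "name==version" line per package.
    return "\n".join(f"{name}=={version}"
                     for name, version in sorted(pins.items()))

print(to_requirements_txt(KNOWN_GOOD_SET))
```

The exact pins are the point: the file asserts a combination, and carries no claim about any other combination.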

Ultimately the registry of "things that don't work together" and "things that do work together" shouldn't be intrinsic to a package or release. It's a growing body of knowledge: you can't know what future versions of your dependencies will do, and conflicts among three or more packages are possible while our tools only deal with pairs. So a truly superior packaging system would represent compatibility as a concept separate from any one component of a system. I do not, however, know of such a packaging system. It could well be an external advisory tool, separate from installation; if it could also advise on scenarios, then installation tools could consult it before actually installing. In time the in-package metadata could fade away as it came to be seen as unnecessary.
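One shape such an external advisory registry might take, as a sketch (the data, names, and API are all invented for illustration): conflicts recorded as whole combinations, so a three-way conflict is just as representable as a pairwise one, and an installer could consult the registry before committing to an install set.

```python
# A registry of compatibility knowledge kept outside any package.
# Each entry is a *combination* of (name, version) pairs known to
# conflict, so conflicts among three or more packages are representable.
KNOWN_CONFLICTS = [
    frozenset({("LibFoo", "1.3b5")}),                  # broken on its own
    frozenset({("LibFoo", "2.0"), ("LibBar", "1.0"),
               ("LibBaz", "0.5")}),                    # three-way conflict
]

def advise(candidate_set):
    """Return every recorded conflict contained in a proposed install set."""
    chosen = set(candidate_set.items())
    return [conflict for conflict in KNOWN_CONFLICTS if conflict <= chosen]

# An installation tool would call advise() before installing, and warn
# or refuse when the returned list is non-empty.
print(advise({"LibFoo": "2.0", "LibBar": "1.0", "LibBaz": "0.5"}))
```

Because the registry lives outside any package, new entries can be added as breakage is discovered, without re-releasing anything.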