1. I couldn't find any other videos of Ryse running on old hardware, much less a GTX 480. But here's Rise of the Tomb Raider running on a GTX 480, and some analysis of what to expect from one:
https://youtu.be/ZjA2qxbj4W4
https://youtu.be/KL--FHS2fuk
2. What you're saying is partially true. I didn't take into account that DirectX versions get outdated, and in that sense it's likely true that some older cards can't render newer games. But the part where you say devs don't "optimize" their games for all PC hardware is misleading.

Consoles (excluding this generation) were very, VERY different from their competitors. Just look at the PS3 and the Xbox 360: the PS3 had its weird Cell processor and a setup loosely similar to SLI or CrossFire, where the CPU did part of the graphics work and a relatively weak GPU did the rest. PC hardware doesn't have such drastic differences, mainly to maintain backwards compatibility. So when a dev "optimizes" their software for PC, they're really optimizing it for both new and old hardware, even if they never test it on older hardware. Sure, there are things like microarchitecture changes, but in the end PC hardware doesn't change that much.

Also, Nvidia (and most likely AMD too) release new drivers that actually improve the real-world performance of their older hardware by making better use of it. So even if devs weren't optimizing software for old hardware, the manufacturers are.