So here's the random trick of the day: say you decided to finally upgrade your monitor due to a random discussion on G+, but it turns out that you haven't upgraded your desktop in a while, so you're stuck with single-link DVI.
And the fancy new monitor is a 2560x1440 one that requires dual-link DVI to drive it, so says the documentation in big letters. What do?
Of course, you could just try to find an HDMI cable, since I suspect the machine is still new enough that it happily does HDMI at pixel frequencies high enough that it would all work fine. But you're a lazy git, and you can't find a cable anywhere. And by "anywhere" I mean "lying right there on my desk, not under a pile of paper".
So rather than waste your time with trying to find hardware you may or may not have, just say "hey, I'm not playing games anyway, so why not just drive that thing with a single DVI link at 30Hz instead of the 60Hz it wants. It's going to buffer the data somewhere to see if it needs to stretch it anyway".
And if you are that kind of lazy git, here's what you do:
Step 1: calculate the VESA timing modes for 2560x1440 at 30Hz. You could do this by hand if you were a real man, but we already covered the whole "lazy git" part. So use the "gtf" tool (no, that's not random noise, it stands for "Generalized Timing Formula"; it's part of the VESA standard for how the pixel signal timings are supposed to look).
Running "gtf 2560 1440 30" spits out the following lovely turd, bringing back bad memories of X11 config files. There's a reason we don't do them any more, but people still remember it, and get occasional flashbacks and PSTD:
# 2560x1440 @ 30.00 Hz (GTF) hsync: 43.95 kHz; pclk: 146.27 MHz
Modeline "2560x1440_30.00" 146.27 2560 2680 2944 3328 1440 1441 1444 1465 -HSync +Vsync
Yeah, G+ will completely corrupt the formatting of those two lines, but for once it doesn't really matter. It looks like noise regardless of formatting. It's not meant for human consumption.
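(For the terminally curious, it's just the standard modeline layout: the pixel clock in MHz, then the horizontal display / sync-start / sync-end / total pixel counts, the same four numbers for the vertical lines, and finally the sync polarities.)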
Step 2: tell 'xrandr' about this mode by just copying-and-pasting the numbers that gtf spit out after incanting the magic words "xrandr --newmode 2560x1440". So the command line looks something like
xrandr --newmode 2560x1440 146.27 2560 2680 ...
which will quietly seem to do absolutely nothing, but will have told xrandr that there's a new mode with those particular timings available.
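(Spelled out in full, with the numbers lifted straight from the modeline above, that's something like
xrandr --newmode 2560x1440 146.27 2560 2680 2944 3328 1440 1441 1444 1465 -HSync +VSync
i.e. everything after the quoted mode name in the gtf output.)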
Step 3: tie that mode to the list of modes that the HDMI1 output (which is what is connected to the DVI output, which you would have figured out by just running "xrandr" without any arguments whatsoever) knows about:
xrandr --addmode HDMI1 2560x1440
Again, absolutely nothing appears to happen, but under the hood this has prepared us to say "yes, I really mean that". Lovely.
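(If you want reassurance before the big moment, running plain "xrandr" again at this point should now show a 2560x1440 entry in the mode list under HDMI1.)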
Step 4: actually switch to it. This is where the monitor either goes black, spectacularly blows up, or starts showing all its pixels the way it is supposed to:
xrandr --output HDMI1 --mode 2560x1440
Ta-daa! Wasn't that easy? Never mind what the manual says about how you should use this monitor, we have the technology to do better than that. Or, in this case, worse than that, but whatever.
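Oh, and if it does go black instead, the escape hatch is equally low-tech, assuming you can still type blindly or get in over ssh:
xrandr --output HDMI1 --auto
switches the output back to its preferred mode, and
xrandr --delmode HDMI1 2560x1440
xrandr --rmmode 2560x1440
cleans up the added mode if you want to pretend none of this ever happened.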
Now, obviously any sane person would ask himself why the GTF calculations aren't something that 'xrandr' just knows about, and why this isn't just a single command to say "please switch that output to 2560x1440@30". Why all the extra steps?
The answer to that question? I have absolutely no idea. Graphics driver people are an odd bunch.
Thanks a million times, I was about to weigh options such as buying a new computer and other nonsense just because of the new monitor and my laptop's inability to display this resolution. You really saved me a lot of nerves and time, I'm gonna buy you a Slovenian beer the next time I see you! (Feb 6, 2014)
Yesterday I got a new Dell U2515H monitor because my old one died after many years. I am using a Celeron J1900 based system and I was not aware that it might be difficult to get that resolution to work with it. But with this trick it is perfect now. I was even able to get it to 55Hz with the following timings:
xrandr --newmode "2560x1440" 220.812 2560 2608 2640 2720 1440 1443 1448 1478 -hsync -vsync
Again, thanks a lot. (Nov 15, 2015)
Thanks! However, I can't get 55Hz; the max I can get is 40 :( (Dec 11, 2015)
Hello! I did this and it worked:
xrandr --newmode "2560x1440" 220.812 2560 2608 2640 2720 1440 1443 1448 1478 -hsync -vsync
BUT after restarting Ubuntu the resolution went back to full HD. I tried again, but it still doesn't work. (Jan 5, 2016)
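(One likely culprit, assuming a garden-variety X session: xrandr settings don't persist across reboots on their own, so the usual trick is to put the same commands in a startup script your session actually runs, e.g. ~/.xprofile:
xrandr --newmode "2560x1440" 220.812 2560 2608 2640 2720 1440 1443 1448 1478 -hsync -vsync
xrandr --addmode HDMI1 2560x1440
xrandr --output HDMI1 --mode 2560x1440
with the output name and timings adjusted to whatever worked for you.)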
If you have bandwidth limitations, 'cvt --reduced' will give you a reduced-blanking VESA modeline. gtf does not seem to have this option. (Mar 20, 2016)
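(For the resolution in this post that would be something like "cvt --reduced 2560 1440 60", and the resulting modeline gets fed to "xrandr --newmode" exactly as above; note that cvt seems to only accept reduced blanking for refresh rates that are a multiple of 60Hz.)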
For reasons unknown to me I cannot force it to work at frequencies higher than 40Hz on Ubuntu 17.10. But it worked happily at 49 or 50Hz a year ago when I used Arch.