So here's the random trick of the day: say you decided to finally upgrade your monitor due to a random discussion on G+, but it turns out that you haven't upgraded your desktop in a while, so you're stuck with single-link DVI.

And the fancy new monitor is a 2560x1440 one that requires dual-link DVI to drive it, so says the documentation in big letters. What do?

Of course, you could just try to find a HDMI cable, since I suspect the machine is still new enough that it happily does HDMI at pixel frequencies high enough that it would all work fine. But you're a lazy git, and you can't find a cable anywhere. And by "anywhere" I mean "lying right there on my desk, not under a pile of paper".

So rather than waste your time with trying to find hardware you may or may not have, just say "hey, I'm not playing games anyway, so why not just drive that thing with a single DVI link at 30Hz instead of the 60Hz it wants. It's going to buffer the data somewhere to see if it needs to stretch it anyway". 

And if you are that kind of lazy git, here's what you do:

Step 1: calculate the VESA timing modes for 2560x1440 at 30Hz. You could do this by hand if you were a real man, but we already covered the whole "lazy git" part. So use the "gtf" tool (no, that's not random noise, it means "Generalized Timing Formula"; it's part of the VESA standard for how the pixel signal timings are supposed to look)

Running "gtf 2560 1440 30" spits out the following lovely turd, bringing back bad memories of X11 config files. There's a reason we don't do them any more, but people still remember it, and get occasional flashbacks and PSTD:

  # 2560x1440 @ 30.00 Hz (GTF) hsync: 43.95 kHz; pclk: 146.27 MHz
  Modeline "2560x1440_30.00"  146.27  2560 2680 2944 3328  1440 1441 1444 1465  -HSync +Vsync

Yeah, G+ will completely corrupt the formatting of those two lines, but for once it doesn't really matter. It looks like noise regardless of formatting. It's not meant for human consumption.

Step 2: tell 'xrandr' about this mode by just copying-and-pasting the numbers that gtf spit out after incanting the magic words "xrandr --newmode 2560x1440". So the command line looks something like 

   xrandr --newmode 2560x1440 146.27 2560 2680 ...

which will quietly seem to do absolutely nothing, but will have told xrandr that there's a new mode with those particular timings available.
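For the record, the fully spelled-out command, with the numbers pasted verbatim from the gtf modeline above, would be the following (shown as a dry-run echo; drop the echo to actually run it):

```shell
# Full --newmode invocation; the numbers are copied straight from
# the gtf output quoted earlier in the post.
echo xrandr --newmode 2560x1440 146.27 2560 2680 2944 3328 1440 1441 1444 1465 -HSync +Vsync
```

The mode name is arbitrary; "2560x1440" here just has to match what you use in the later --addmode and --output steps.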

Step 3: tie that mode to the list of modes that the HDMI1 output (which is what is connected to the DVI output, which you would have figured out by just running "xrandr" without any arguments what-so-ever) knows about:

xrandr --addmode HDMI1 2560x1440

Again, absolutely nothing appears to happen, but under the hood this has prepared us to say "yes, I really mean that". Lovely.

Step 4: actually switch to it. This is where the monitor either goes black, spectacularly blows up, or starts showing all its pixels the way it is supposed to:

xrandr --output HDMI1 --mode 2560x1440

Ta-daa! Wasn't that easy? Never mind what the manual says how you should use this monitor, we have the technology to do better than that. Or, in this case, worse than that, but whatever.

Now, obviously any sane person would ask himself why the GTF calculations aren't something that 'xrandr' just knows about, and why this isn't just a single command to say "please switch that output to 2560x1440@30". Why all the extra steps?

The answer to that question? I have absolutely no idea. Graphics driver people are an odd bunch. 
 
I suggest the xrandr flag be --gtfo ... for generalized timing formula output, of course.
 
man, thanks for the trick but... wouldn't be easier to find the HDMI? ;)

Thanks for the post, humor appreciated
 
Now that's a nifty trick, I'll have to remember that.  And I guess 30fps should be acceptable for 2D desktop use anyway.  At least until it annoys you enough to get motivated to find an HDMI cable.
 
For extra laughs, I have a 2560x1440 monitor whose HDMI input maxes out at 1920x1200. That took me a good while to figure out... 
 
Well it's always a burden working with stuff when it comes to graphics... hehe. Brilliant move though. :)
 
Wow, those mode lines bring back memories.  LOL, glad I don't have to do that any more.
 
Wasn't it much simpler to find the HDMI cable? No it wasn't, obviously, you are Linus Torvalds! Great man! As always!
 
I bow down to your lazy old gityness.

'why this isn't just a single command to say "please switch that output to 2560x1440@30". Why all the extra steps?'

Worse is better?  :->
 
As another famous person once said: "We choose to connect this monitor with a single link DVI, not because it is easy, but because it is hard"
 
I feel that wave of nostalgia now...

The 'good old days' when we were intimate with those 'lines of turd' trying to get that incredible hi-res 19" monitor that got thrown out at work to show something. Back when such beasts were not multi-sync, but only supported one resolution. And that resolution really stretched the capabilities of your graphics hardware. "Those were the days, my friend" :-D
 
Any sane person would ask himself why the GTF calculations? Shouldn't it be: any sane person would ask himself why not just use the HDMI? Still too funny, and I learned another way to possibly spectacularly break a monitor.
 
That doesn't seem at all lazy to me. It's a mental exercise rather than physical. It's nice if you have the grey matter upstairs to compute it tho, I was lost and educated from the start. 
 
I'm even lazier, I would have gone and bought a new desktop to drive that display at its full res.
 
X11 config files! Don't talk me about X11 config files...
 
Are you all sleepless? Linus, go to bed!
 
A dropdown with all possible modes that won't implode the earth and create a giant black hole in the space your desk used to be in would be most efficient for lazy gits.
 
Ok Mr Torvalds, I chose Linux because I figured it was safe from someone hacking in and using my camera to spy on me, but you not only use your back-doors to turn on my camera, but you have awesome zoom, enhance and x-ray capabilities much farther than we mortals get... so much so you were able to find the damn cables 3 feet from me under that pile of paper AND you publish the story!?! 
 
Man, i think your idea of being lazy and mine aren't exactly the same ;-) 
 
as +Dirk Bergstrom said, HDMI is not necessarily the best option, as some older version of HDMI max out at 1920x1200. I've got such a monitor (~2009 30" dell), and DisplayPort was the best option.
 
Oh god. I swear X11 configuration was meant as some kind of test to see if you were worthy of running linux. I swear the vast majority of my time getting to know linux was spent either configuring monitors, trying to print anything, or getting any sound out of my sound-card.

Thank you to everyone who had a part in me never having to struggle with that any more (mostly)
 
Too much effort... I'll stick with my old monitor.
 
It makes sense to make Unix-like operating systems and their tools difficult. It is the best way to scare away all the nasty n00bs who ask dumb questions and don't let you work.
 
Nice work!  I am still scarred from hand editing X configs in the past.  Before getting to step (1) I would have just bought a new graphics card that has dual link DVI.
 
Monoprice?  That goes way further than digging around the desk looking for the cable you already own 10 of but are too geeky lazy to uncover.
 
X always had a steep learning path, how could you do something and not learn something about how it works internally? Wouldn't that be very user friendly?
 
The scariest thing for me as a new linux user all those years ago was x configs. Im glad they are gone. 
 
any one can tell me what does google+ do?
 
the most terrifying thing about xorg.conf is that I still expect it to function as it used to so when it doesn't work I eat my shoulder in frustration.
 
IIRC +Linus Torvalds you had a Sony Vaio C1? That is where I learnt everything I know about xorg.conf _:-D
 
op op op op oppan linus style
 
+Paul Leader sound never was a problem until pulseaudio, you must be a new user _;-p
 
A very interesting solution, not one that I would have necessarily done, but hey, if it works, who cares :D
 
Someone was obviously too lazy to read the 'how to be a lazy git' manual in full. 
 
Thank you so much for this read. Informative and hilarious.
 
I needed to hook up to a TV with a VGA connector with a sensible resolution, and did the "xrandr" dance with random modelines I found using Google -- just like the old days.  I wish I'd known about "gtf".
 
Cool, now all you have to do is get the audio to go through that HDMI cable. Hahahahaha.(Having gone through all that i've earned the right to enjoy watching you guys toil away at it... Have a goodnight!)
 
Awww, that's really bringing back some glorious memories.
That was THE way, back in the early '90s. Absolutely nothing wrong with it !
 
Haha. It wasn't long ago when I tried to get a 50hz mode on my desktop monitor just like that. I didn't get it working properly though, I gave up, and settled for a "jumpy" picture with 25/50hz material (==all tv recordings).
 
AH HA HA HA! I need to get to bed... I got court in the morning or putting a lockout on! That was funny tho! yaw too smart for me. night. thanks for the education.
 
Patches welcome :-) but I guess because gtf comes with the server and xrandr on its own nobody created libgtf. You can probably get a console with video=2560x1440@30 or something close by on the command line
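The console route mentioned above would mean booting with the kernel's video= mode parameter, roughly like this (a sketch: the connector name HDMI-A-1 is an assumption, check what your driver actually calls the output):

```
# /etc/default/grub (hypothetical connector name)
GRUB_CMDLINE_LINUX_DEFAULT="quiet video=HDMI-A-1:2560x1440@30"
```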
 
 
+Linus Torvalds According to wikipedia the requirements for dual link DVI is: "The DVI specification mandates how the dual link may be used. All display modes that use a pixel clock below 165 MHz, and have at most 24 bits per pixel, are required to use single-link mode. All modes that require more than 24 bits per pixel, and/or 165 MHz pixel clock frequency must use dual-link mode."

Thus theoretically you should have just reduced the bpp to force single-link DVI. Obviously this can mean one of two things: Wikipedia needs revision, or your screen is not as demanding as you might have thought.

Anyway, thanks for the insight on the inner workings of gfx mode settings.
 
I propose that such commands be aliased as `wtf`
... But `--gtfo` or `--clean-your-desk-you-lazy-git` args would also suffice

Actually, I've just been inspired to start using emoji and emoticons for my short args:
X-P (gag mode / print debug output)
:-D (happy mode / load **my** preference file)
4:-=I (strict / nazi mode)
8==D (dick mode / super strict mode)
....

 
I'm wildly entertained at how +Linus Torvalds actually posts stuff like this.

It's like basketball training from Michael Jordan: You know there's no chance your lazy ass will get to trying out this stuff but it's damn impressive to watch.
 
Dude, you invented Linux... just splurge on an HDMI cable. Or did this morph into a test of will and hackery?
 
Back in the days of CRTs we used to try and max out refresh rates for FPS gaming but also always over 60hz to avoid flicker and eyestrain. Wouldn't 30hz be rather bad for the eyes or is there something about TFTs that changes that? 
 
Surprising the geeks at G+ hq didn't add [code] tag support
 
hmm...and made you wonder why?...Creators do that!
 
+Andrew Meigs The thing is this. Imagine that you're sitting comfortably in front of your computer. Do you get up to go to the store? Or do you exercise your mind and fingers while still sitting comfortably? There is no competition.
 
+Nick Holloway I wish I had known about xrandr. I used to modify my config manually and then would Ctrl-Alt-Backspace and hope the settings I tried would work with my setup.  I also had to add startup parameters in LILO to make my SCSI drive work. I swear I've learned more trying to make unsupported hardware work than any other way, but I don't know how useful my knowledge is now.
 
+Alex Lee should it not be Helsingfors Syndrome? ;-)

(Helsingfors is the Swedish name of Helsinki, and as Linus is a Swedish speaking Finn...)
 
The body still needs to function..I'd pick movements?...I'm really bad at this!
 
First step should be asking yourself why am I still saying "What do?".
 
I dont use a desktop and a separate monitor so i may not have any use with this post, but i just read this completely, your writing style makes some mundane things interesting.


 
Clearly your definition of lazy and mine aren't quite the same. I would have left it on the desk not plugged in til a cable materialised. 
 
Hmm, I wonder if this trick could be used to make Mac Mini drive 2560x1440 display through DVI without having to use Apple's famously crappy MiniDP->Dual-DVI converter.
 
I had to do that once to get non-blurry fonts with the Nouveau driver.  I switched back to the proprietary drivers shortly after.
 
It was easier to buy a cable or new hw, but what's the fun in that? :-)
 
"Spits out the following lovely turd" oddly enough I'd love to have that on a shirt with +Linus Torvalds head on it, and one of those little quote bubbles. 
 
Modelines and dotclocks.  Just when you thought it was safe to go back in the pixels ....
 
that's the kind of post I'd like to hear from Linus more often :)
 
and I don't even know him...loved that comment tho!
 
+Steve Mo You do if you want 3 pages of 166 lines each displayed all at once. Maybe if this catches on, I should buy shares in Bausch & Lomb. :)
 
You know what, Linus? Fucking cry about it. "Waah~ I have 2560x1440 but can't get it without jumping through a few hoops!" I, at best, have 1366x768 resolution on my shit-ass 720p monitor, and I consider it a God-send since I'm barely trudging along on my Debian-based Asus K52F laptop. Jesus. [/bitterness]
 
sounds too simple a solution for a geek.  you should have opened up the monitor and soldered in some extra circuitry to solve this problem...
 
The move away from CRTs was a blessing and a curse. A blessing because it eliminated the flickering that resulted from running them at a low refresh rate. A curse because they could not achieve refresh rates as high when you actually wanted them.
 
Ah, and there +Linus Torvalds you prove that YOUR definition of lazy is waaaaaay different than my definition of lazy... =) That's why you are brilliant and I'm happy that you choose to contribute little bits of genius to our lives. Thanks!!!
 
Brings back the memory of blowing up crt monitors....
 
+Linus Torvalds I could have mailed an HDMI cable to you and arranged it to be delivered on your desk on top of the pile of papers.
 
Seriously though, at 30 FPS, typing might get annoying after a while.
 
I'm sure you meant PTSD. Flashback and PSTD...reminds me of a girlfriend I had once. Or was it twice?
 
Who best to follow when you are studying Linux than the creator himself :)
 
Proverb: Linux is only free if your time is worthless.
Mike L.
 
That sounds a bit like the story how I broke my first CRT-Screen (it was almost fresh, like from 1998 and it was 2006 maybe…) by guessing some numbers in the config… I was young, stupid and wanted a new screen anyway ;)

Also: Isn't "one tool for one job" part the linux philosophy? ;)
 
Proof that nothing can be over - engineered
 
+Carmelyne Thompson you wouldn't believe the sounds old CRT screens can make when you mess up those calculations. Or just try stretching them outside the boundaries of the hardware. So high pitched I probably wouldn't notice them anymore :-D
 
and this is why linux will never become a consumer product, your average user finds this challenging enough in windows...
 
Can I ask about Google Maps on mobile? I want Google to notify me when the SIM card in my mobile is changed and send me the new number. Can Google do that?
 
+Alex Wilson we are talking oldtimer geek here. Any sane person would just plug in using the cable prescribed by the manual, and it would "just work". In fact I spend far less time messing with drivers under linux. Almost everything "just works" without having to install any software from a driver cd. I even plugged in a color calibration thingie just for fun. I immediately got a new option available in the screen configuration and calibrating my screens was so easy anyone could do it.
 
@ ally mperembe
I think Google Maps can't do that. I haven't tried it yet, but try installing this app:
AVG Mobilation. It's been sending me mails when my SIM card has changed.
 
Now if only those of us that have monitors of that size could get a proper frame buffer mode from the kernel to drive that full resolution...
 
For me lazy would be trying to find the cable instead of doing what you did. 
 
Sorry, I don't know what this blog is about. So long as it looks okay, I wouldn't bother whether it has more or fewer pixels. Just find the cable. If you don't have the cable, don't use that sort of monitor.
 
+Michael Vescovo "An aitch-dee-em-eye cable". Maybe you pronounce it "Haitch", but many people don't.
 
I am still recovering from my experience with X11 configuration files during the mid-nineties. I need counseling. 
 
Mr Torvalds - How about buying a new display card? I bet you can afford one.... Or a decent laptop with a DisplayPort. BTW: most HDMI outputs do not output > 1920x1080
 
Possibly it's so convoluted to stop you doing it at all costs unless you're really desperate or mad? :P
 
"Running "gtf 2560 1440 30" spits out the following lovely turd"
I admire your way with words Sir :o)
 
+Gareth Johnson There is a difference between TFT/LCDs and CRTs. Digital displays are hold type displays, they hold the same image until the next refresh is called and the image changes instantly. CRTs (and Plasma, by the way) are pulse type displays. They ignite or pulse every pixel/line with the refresh rate and then the illumination decays until the next refresh. That is why CRT and Plasma seem to flicker at 60 Hz or less where LCD do not. 
Driving a LCD with 30 Hz might result in choppy movement (mouse cursor, scrolling and so on) but not in a flickering image.
 
+Tal Barenboim See: lazy - he was comfortably in front of his monitor, and the store is a long way away (ie further than the coffee pot or the beer).
 
Can anyone else appreciate this as humourous? Maybe I'm just too much of a nerd - rotfl
 
I so remember the X11 config madness back in the 90's, terror with every new video card
 
Hmm, why bother with all the calculations? Isn't it easier to google for the normal mode data and then scale the pixel clock?
 
You seem to be expressing my mind here, Sir Torvalds. I have been having challenges with the video driver of my computer....lol
 
BTDT:
xrandr --newmode "2560x1440@36" 160.40 2560 2592 2912 2944 1440 1472 1481 1513
xrandr --newmode "2560x1440@30" 143.93 2560 2592 3136 3168 1440 1473 1480 1513
xrandr --newmode "2560x1440@24" 110.53 2560 2592 3008 3040 1440 1473 1479 1513
In the end I decided to get a dual-link graphics card anyway because sometimes you'd notice the low refresh rate.
 
+Florian Echtler They spend so much energy in order to prevent me from using it that I don't want to disappoint. I don't use HDMI.

Same goes for Blu-rays.
 
+Linus Torvalds Interesting read, as usual... but damn, you've got way too much time on your hands!
 
Thus cool keep moving on and never give up
 
+Linus Torvalds Your LCD monitor should be fine with a "reduced blanking interval" signal - those modes you posted still have the big fat CRT-compatible blank intervals (to reposition the CRT's electron beam) included.  Cut those, and you get the same amount of useful video bandwidth at a lower clock (might even allow you to go up to a high refresh rate).  e.g. 120MHz instead of 146MHz at 30Hz

I seem to have a (hacked maybe?) copy of the X11 "cvt" utility here which does that:

root@zebedee:~# /usr/local/bin/cvt-flex  2560 1440 30 -r

  # 2560x1440 @ 30.00 Hz Reduced Blank (CVT)
  #   field rate 29.95 Hz; hsync: 43.75 kHz; pclk: 119.00 MHz
  Modeline "2560x1440_30.00_rb"  119.00  2560 2608 2640 2720  1440 1443 1448 1461  +HSync -Vsync


... it seems to use opposite HSync and Vsync polarities to your modes.  Dunno why tho'.

Likewise:

40Hz:

  Modeline "2560x1440_40.00_rb"  159.50  2560 2608 2640 2720  1440 1443 1448 1467  +HSync -Vsync

50Hz:

  Modeline "2560x1440_50.00_rb"  200.25  2560 2608 2640 2720  1440 1443 1448 1474  +HSync -Vsync



I use the same trick to drive a 1920x1080 display over a very long HDMI cable - works pretty well in general - occasionally video playback software gets surprised at the low frame rate (e.g. confused deinterlacers etc.).


HTH...

Tim.
 
Nifty - curious which monitor did you get?

I was toying with the idea of upgrading, but since I rely on a KVM HDMI switch that only does 1.3a I wasn't sure I could go there without frying something. Sounds like this could do the trick. Last attempt at running at higher rates fried the docking station and the DP->HDMI adapter.
Ell Tee
 
I am having a lot of trouble with the "lazy git" part of this.
 
"The answer to that question? I have absolutely no idea. Graphics driver people are an odd bunch." - #quotable
 
This brings back memories of getting an old desktop to talk to my new TV while building my first Myth box.
 
"please compile my linux kernel all drivers as modules except A, B, and C"
 
"The answer to that question? I have absolutely no idea"

ooooook Linus, tanks for job :)
 
Or you could just buy a new graphics card.
Unless you want to make it hard for your self.
 
Hey Linus, You wrote PSTD. I dunno 'bout that. Did you mean PTSD or mebe PMS?
 
I agree with some of the posters above, instead of halving the frame rate the smarter approach would be to reduce the data rate to match the capabilities of the cable. +Roelf Pringle +Tim Small 
 
LOLz, yes... lazy, I would have ordered a new cable/video card off the internet but i'm a completely different kind of lazy.
 
+Giuliano Peretti HDMI is not the solution in this case, as it also has single-link and dual-link.  So if your DVI doesn't support dual-link, your HDMI output also won't support it.   And while it does have slightly higher data rates than DVI on a single link, it's not enough for 60fps.
 
Are the same "lazy git" people (Torvalds dixit) who prefer cables (i.e. hard) the same people who prefer everything "virtual" (i.e. soft)? Torvalds shows how soft intelligence may even substitute for cables.
 
Oh my God... X11 modelines... I thought those had been eradicated...

Thankfully haven't had to deal with them in years, since nVidia's proprietary settings GUI is so good.
 
I was going to read the entire post but I'm something of a lazy git so ... 
 
The only sane connector for high resolution displays is DisplayPort, as I also found out when I got my 2560x1440 monitor. HDMI is designed for TVs, so it's not a good fit for anything exceeding 1080p. Dual-link DVI works in principle, but why buy a special cable sold for its weight in gold? With DisplayPort you just have to plug it in, and it works!
 
Can someone fedex this guy a cable.   or perhaps an updated workstation.
 
 
That's exactly what I was going to recommend.
 
Buying a new video card would upgrade your computer, and delay the gray hair and baldness issues. That's got to be worth something: "saving face/hair". If your computer can't handle that, slap yourself and get a reality check: buy a new computer, you're way overdue.
 
Laziness is the mother of all invention :)
 
You would rather do all that than grab an HDMI cable, and it's the Graphics Driver people that are odd?? LOL
 
Lazy? I'm not sure that word means what you think it means.
 
I still have to use xrandr on a daily basis to coerce my external monitor and my laptop to play nicely together. I want the external to the left of my laptop and sometimes docking and undocking doesn't go as smooth as it might and I have to tell the system what state I'm in. So xrandr still has its uses I would say.
 
on the bright side I now know what possessed the makers of my laptop to give me a display port but not hdmi or dvi.
 
bad memories of x11 config files, indeed.
 
woof........... nice conclusion. They are really odd and off-putting.
 
Reading this post reminds me of the "space pen" punchline -- "The Russians used a pencil."
 
Lol informative and funny thanks Linus. :-D
 
+1 for xconfig flashbacks of horror alone.  Just needed to add punched cards and I'd have to call an ambulance :).
 
+Glenn Snead the old story of the pencils just isn't true, I have read. Pencils give off carbon dust that could short electronics, so they can't be used.
 
I am going to get Sony Vaio S laptop having nvidia gt 640m le video card. Wondering will I be able to use it with my HP 2560x1440 monitor through hdmi on that resolution? I use Linux Mint 14.
 
Takes me back to when I was getting Sun Microsystems CRTs to run off of a PC VGA card.  *shudder*

Definitely going to be dealing with some PTSD issues today.  Time to hit the rum!
 
+birger monsen mine all got made into airplanes with a paper clip and a turned-down nose so a couple of elastic bands could fire them.  Could go thru a cola can with 3 paper clips attached.  The frisbee net that followed was never as good for fun.
 
That gave me a good laugh...because I would rather twiddle with the bits than get out of my chair and look in the next room for the proper cable as well!  :o)
 
Typical of software guys - they hate messing with hardware... Even if it just a cable underneath papers on one's desk.

+Linus Torvalds let me know when you get tired of the 30Hz and get one of your daughters to switch the cable for you.
 
I think the answer lies in an overelaborate, poorly-written shell script that kinda works most of the time, but does all that in one step.
Of course, there's the problem of how once one has written such a thing, the problem is solved forever, so one no longer cares, dooming humanity to an endless cycle of rediscovering obscure documentation and re-solving the problem time and time again.
Ha ha, only serious.
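A sketch of that one-step script, for the record. It assumes gtf's output format as shown in the post, the function name one_shot is made up, and the xrandr commands are only echoed as a dry run (pipe them through sh when feeling brave):

```shell
#!/bin/sh
# One-step mode setter (sketch): pipe gtf output straight in.
# Usage:  gtf 2560 1440 30 | one_shot HDMI1
one_shot() {
    out=$1
    # Pull the Modeline out of whatever gtf printed on stdin
    modeline=$(sed -n 's/^ *Modeline *//p')
    name=$(printf '%s' "$modeline" | cut -d'"' -f2)     # e.g. 2560x1440_30.00
    timings=$(printf '%s' "$modeline" | cut -d'"' -f3)  # pclk + timing numbers
    # Print the three xrandr steps (dry run)
    printf 'xrandr --newmode "%s"%s\n' "$name" "$timings"
    printf 'xrandr --addmode %s "%s"\n' "$out" "$name"
    printf 'xrandr --output %s --mode "%s"\n' "$out" "$name"
}
```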
 
I wish there had been GTF when I was configuring X11 for oddball graphics formats. Haven't killed any monitors (yet).
 
I got one of the $400 AURIA 2560x1440 monitors. Turns out the 8400GS laptop is happy to drive it at 60Hz over VGA. 240MHz dot clock means "short, well-made cables" though.
 
+John Glotzer  unfortunately the only "fix", that I am aware of, for radeon's KMS problem is to switch back to Ubuntu 10.04(if you are using that distro) and use UMS.
 
I thought that the mode line days were gone long time ago, but I must have been misinformed. 
 
Old X11 features never die, they are just subsumed into the cruft.
 
Mmm, yeah, the cable store is right down the street. ;)
 
Is this a joke?  One for my step-son who loves Linux.
 
xrandr was intentionally designed to be cryptic and hard to use. Either that, or I'm too lazy to write the obvious code to make setting a custom mode easy. Besides, don't you feel awesome for having made it work?
 
+Keith Packard in this case it was actually funny and interesting to do, and I didn't mind playing around with my new monitor settings.

But I've been on the road with a laptop that cannot sync to the external display because there's some stupid lack of EDID information or something, and then the whole thing is just a f*cking disgrace. At that point, you don't want to dick around with generating VESA timings (the biggest problem is remembering what the program(s) to do it are called) etc.

So then it really would be good to have a mode where you say

    xrandr --output=VGA --gtfo=800x600@60

or something like that. 

Of course, I could hope for gnome to just have a nice "custom mode" graphical setting thing for the display setting multi-monitor support, but that would be too complicated for users, so they've made it easy for people by making it impossible to do anything useful. Par for the course.
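A tiny wrapper in that spirit is easy to sketch. Everything here is hypothetical: the flag doesn't exist, xrandr_gtfo is a made-up shell function, and it only prints the four manual steps as a dry run:

```shell
# Hypothetical wrapper for the wished-for one-liner:
#   xrandr_gtfo VGA 800x600@60
# Splits the WxH@R spec and prints the four manual steps (dry run).
xrandr_gtfo() {
    out=$1
    spec=$2
    w=${spec%%x*}               # width, e.g. 800
    hr=${spec#*x}               # remainder, e.g. "600@60"
    h=${hr%%@*}                 # height
    r=${hr#*@}                  # refresh rate
    echo "gtf $w $h $r                   # 1: compute the modeline"
    echo "xrandr --newmode ${w}x${h} ... # 2: paste the gtf numbers"
    echo "xrandr --addmode $out ${w}x${h}"
    echo "xrandr --output $out --mode ${w}x${h}"
}
```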
 
+Vassil Panayotov I've been on Fedora way too long to change; this used to work in Fedora as well, I think. For me the nvidia driver allows me to work/play with no issues, so for now I'm not blocked. Also, you said "radeon"; your issue was with radeon but mine is with nouveau, just for clarity.
 
I'd rather read a joke that ends with "and then the priest said", but this is good to know.
 
Actually, I run dual monitors. The first is DVI and the second is VGA. My graphics card belongs in a retirement home so it doesn't get EDID information for VGA monitors. Whenever I use gtf to generate a modeline, I always get a distorted picture. So now, I use PowerStrip on windows to generate a X11 modeline and that works perfectly.

As for the answer to your comment, have you tried putting the custom mode (and the lines to use said modeline) in xorg.conf?
 
Don't know what the f**k you're talking about but it sounds amazing. You're the guy who I want to set up my surround sound.
 
That get a mac comment had me in stitches cuz I can't figure out if it's sarcasm or not.
 
Bringing back bad memories of having to mangle modelines to get monitors to behave.
 
Linus, I'm sure xrandr people would gladly accept a patch, though you'd probably have to make it something like "Please switch that output to this arbitrary resolution 2560x1440@30, yes I mean it, No I'm not an idiot" for it to be accepted.
 
Do "lazy gits" spend that much time in the terminal? :)
Ken Yee
 
Get a nice GTX 670 graphics card so you can do OpenCL on Linux :-)
 
You have way too much time on your hands! Just buy the right equipment when buying the new monitor? There is a thought.... Can you decide this? ...M..O...R....O....N!
 
i just + without even reading Linus's posts!
god of Linux!
 
"GTF" is just a few keys away from "WTF."
 
Good trick.. but you should get the right cable. You wouldn't drive a supercar on homemade biofuel other than in extreme need.
 
Haha, I have done this before!! My old monitor was an ultra mega old CRT with 1024x768 at 75Hz, but I liked to force 1280xIcantremember, so I just had to play at lowering to 60Hz to get that (xrandr didn't report that config, so I had to do the same thing)
 
lol linus you just made a fool of a lot of driver peeps and did it in style !!!
 
+Linus Torvalds
I always meant to hook up cvt (the Coordinated Video Timings spec replaced the General Timing Formula in 2002, according to wikipedia)  bits inside xrandr and make it all easy to do, but I also expected the GUI tools to take over and replace my lame command line tool (which was intended solely to make it possible for me to debug the RandR extension). Maybe someone will submit code that does this now. I know I don't care enough to bother myself; my laptop has DisplayPort, and my monitors do as well, which is plenty for 2560x1600.
 
I wonder how long it will take for Linus to realize that the headache he currently has is the result of staring at a 30 Hz refresh rate.
 
+Art Mazur I didn't think that applied to LCD panels the same as it does to CRTs. Am I misinformed?
 
:D Brilliant stuff :) Like the olden days with sync-on-green monitors and so on (let us forget them..)
 
Lol, he said "Graphics driver people are an odd bunch". Haha, you are a Linux kernel person yourself.
 
I love this post!  Your humor is unmatched.
 
+Andrew Dieffenbach seriously? You really shouldn't make little threats like that. Love to meet you:) maybe you need a good F$&@ or maybe you've never even had it yet but anyhow real tuff saying things via G+ , lets have a drink sometime and see how that goes ...kid!
 
Refresh rate affects LCDs in a very different way. You get color drift if you don't refresh often enough, and of course motion can be jerky.
 
Most monitors can easily do 24 Hz without visible drift though.
 
Modelines got you down? SaX2 to the rescue! Those were the days.

I actually have a plasma TV that overscans too much. I had to cook up a modeline for it, and it really was PTSD.
 
Sounds like another bird flipped to NVIDIA.
 
Better to get that HDMI cable from anywhere in town; I bet it would be much faster for regular people than doing what Linus suggested :)
 
Perfect.  Now will someone post this "how to" up on tldp please?  ;)
 
I used to use (I still own it) an old laptop whose panel developed a CRC error in its EDID data (the EEPROM broke), and even its own BIOS POST screen showed at 1024x768... in the top-left corner of a 1280x800 panel - everything to the right and below was garbage (actually, the last vertical and horizontal lines repeated).
I fixed it in the X config (whichever way was trendy in Ubuntu 10.04 days - I vaguely remember I didn't explicitly use GTF, just added the general whiff of the mode 1280x800@60 there; didn't need a modeline). And it ran happily, until it went into storage after being replaced by a new(er) machine...
 
a) look stupid
b) get a new videocard ....... mother board ...... mem / cpu ......... AHHHH CRAP get a new computer 
 
I have no idea what I just read, but read every word I did
 
+Linus Torvalds this is funny; I published a post about this after my beloved LCD stopped working and I had to go back to my 17" CRT for a few days. In my case, xrandr showed me 1024x768@60 or 1024x768@85. The first was killing my retina, the second was killing the colors. So I applied this to get a cool 1024x768@75. The only difference is that I got the timings with the cvt command.

+Keith Packard we love xrandr still :-)
 
Had something similar, I made it work through VGA with 1920x1440@50 ... I'll try full resolution at 30Hz next time !
 
Even if you're buying gold plated special salesman hdmi cables, surely your time is worth more than this?
 
The things we do just to make do with what we have at hand.

BTW, loved the name drop in Swordfish!
 
So, if I'm comfortable working at 15 cycles per second, does this mean we can quadruple the output resolution a cable can carry!?

This is insane, especially if the principle were applied to something that could pass a 240Hz signal at a high resolution.
 
I never knew single link dvi would work that way...
 
I'll try it to study this trick, but I think it is easier to find an HDMI cable.
 
Alas, an HDMI cable wouldn't have fixed your problem anyway, given that it's limited to the bandwidth of single-link DVI + some small amount for a SPDIF audio channel.

This is one of the major reasons everybody's slowly-but-surely migrating to DisplayPort; it can handle much higher output resolution/refresh rate combinations due to a combination of faster maximum data rate (17.28 Gbit/sec for all 4 lanes pulling in tandem vs. 10.2 Gbit/sec for HDMI) and a much reduced amount of bandwidth per pixel needed. Since DP isn't carrying all the legacy baggage of the VESA framebuffer along with it, it can get away with just shoving a bunch of image data as packetized chunks that can be lightly compressed for further efficiency.

(For the pedants, yes, I know that HDMI v1.3 and above supports much higher resolutions, but that requires the use of the new type B connector, which you're not likely to find on a probably-not-very-new graphics card.)
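To put rough numbers on the bandwidth argument, here is a back-of-the-envelope sketch. The 165 MHz ceiling is the standard single-link DVI pixel-clock limit, and the raster totals (3328 x 1465 including blanking) come from the gtf modeline in the post; the 60 Hz figure reuses the same blanking purely for scale (a real 60 Hz GTF mode needs even more):

```shell
# Pixel-clock arithmetic for 2560x1440 over single-link DVI.
# Totals including blanking are taken from the gtf modeline in the post.
h_total=3328
v_total=1465
limit=165000000                        # single-link DVI ceiling, in Hz

clk30=$(( h_total * v_total * 30 ))    # 146265600 Hz -> fits under the limit
clk60=$(( h_total * v_total * 60 ))    # 292531200 Hz -> needs dual-link/DP

echo "30 Hz: $clk30 Hz (limit $limit Hz)"
echo "60 Hz: $clk60 Hz (limit $limit Hz)"
```

Which is exactly why the 30Hz trick in the post works: the halved refresh rate squeaks in under the single-link limit, while the native 60Hz mode needs roughly twice the available pixel clock.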
 
All of u bullshit &uiors femeli butavory in u.faken live!
 
+Dax Kelson You can still put this sort of thing in your /etc/X11/xorg.conf

Section "Monitor"
        Identifier      "Acer 24"
        Modeline "1920x1200-25r"   63.00  1920 1968 2000 2080  1200 1203 1209 1215  +HSync -Vsync
        Modeline "1920x1200-50r"  127.75  1920 1968 2000 2080  1200 1203 1209 1229  +HSync -Vsync
EndSection


... a few of us probably remember when you didn't get a picture unless you had a section like this in your XF86Config (or whatever it was called).  You'll also need a corresponding "Device" and "Screen" section I think:

Section "Device"
        Identifier      "Ati Radeon R300"
        Driver          "radeon"
        Option          "Monitor-DVI-0" "Acer 24"
EndSection


Section "Screen"
        Identifier      "Default Screen"
        Device          "Ati Radeon R300"
        Monitor         "Acer 24"
        DefaultDepth    24
        SubSection "Display"
                Depth           24
                Modes           "1920x1200-50r" "1920x1200-25r" "1280x1024" "1024x768" "800x600" "640x480"
        EndSubSection
EndSection

Section "ServerLayout"
        Identifier      "Default Layout"
        Screen          "Default Screen"
        InputDevice     "Generic Keyboard"
        InputDevice     "Configured Mouse"
EndSection

... but that should get you started.  Ahh, the nostalgia...
 
I didn't consider myself a true Linux user until I went to a dual-monitor setup and had to edit xorg.conf to rotate my left screen to portrait mode.
 
Epic post.
I remember messing with the modelines trying to get my NAGA to do 800x600@60hz instead of the rated 56hz (yep, the brand was actually NAGA). This was by hand btw, like a real man.
Man those were some ugly noises when I went too far... I guess that's why my beloved NAGA ended up all blurry.
 
Sounds like we need a script to combine all that into one easy-to-use wonder (for the TRULY lazy gits out there).
 
+Linus Torvalds The easiest thing would be to order an HDMI cable on the internet, sitting right where you are. :) Good post though.
 
I was hand-tuning modelines only 2 years ago.
 
Wow! This is a nice trick for a tech survival manual...
 
+Alex Wilson you're thinking about it wrong. Linux has already won the next front and it is a consumer product in the hands of millions. Android cell phones are Linux.

Just because you and I know the frustrations of trying to rebuild a kernel, that doesn't make it a rite of passage. The days of the OS as a consumer-serviceable part are coming to an end. Linux, because of its roots, will always be accessible, but the ways a consumer can tinker will become more difficult to execute, or impractical.

The Linux-on-the-desktop argument hides the bigger picture. It doesn't matter that Linux on laptops and desktop PCs hasn't gained a larger userbase. It has attracted enough of the right people to build the next generation of devices.

There will always be another distribution just as Red Hat gave way to Canonical, so too has Canonical given way to Google. Canonical is pushing Unity across your laptop, desktop, and phone, just as Microsoft is trying to do with Windows 8; although +1 for those gestures. For this generation of devices, Google is quickly asserting dominance with its Android distribution. Time will tell if Android continues to be the number one distribution or if the Ubuntu phone gains any traction.

The next battle isn't Windows vs. Linux, it's Android vs. the world -- Linux won.
 
This is the most interesting post about the most mundane thing I've ever read.
 
How come we could drive our old 21" CRTs at resolutions higher than 1920x1080 over a simple VGA cable, yet now, with all the fancy monitors, better electronics, etc., it seems no longer possible unless you trade away some functionality?
 
Honestly, I LOLed at the "lazy git" bits :-D
 
I think you are alien. And you are trying to brain wash us.
 
This was my No.1 reaction when I first started using Linux.
Why the unnecessary complexity of both framebuffer and X11 setup, when it comes to choosing a graphics mode?
I had come from a DOS world where the hardware was much messier, but the configuration still only took a few seconds.
Since every program had to be configured, time was put aside to reduce unnecessary steps.
 
Hummmm ... I think I'll go out and buy a brand new HDMI cable ..
 
<rage type="weird" severity="low" reason="microsoft lackey"> +Alex Wilson
At least in Linux you can fix the problem. Windows users just forgive Bill Gates for some reason... Anyway, the average Windows user doesn't even know what an HDMI cable is, let alone how to use one. </rage>
 
You saved my day! The handbook of my shiny new Dell UltraSharp monitor said it couldn't support full resolution over HDMI (nor over VGA), whereas my Zenbook doesn't have DVI and officially supports only 1080 anyway. But hey, at least that's FULL HD! It's the '90s again.
 
wow.  this is old as dirt but i just got said monitor at said resolution and this is what i'm going to do since i only have HDMI on the laptop and not displayport.

thanks!!
 
Thanks a million times; I was about to weigh options such as buying a new computer and other nonsense just because of the new monitor and my laptop's inability to display this resolution. You really saved me a lot of nerves and time. I'm gonna buy you a Slovenian beer the next time I see you!