[ltp] Getting nVidia to clone display on a projector

John Jason Jordan linux-thinkpad@linux-thinkpad.org
Mon, 22 Nov 2010 15:13:47 -0800


On Mon, 22 Nov 2010 23:23:52 +0200
Micha Feigin <michf@post.tau.ac.il> wrote:

>> The problem that remains, however, is getting my laptop screen to
>> display anything other than 1680x1050. I am currently using the
>> latest nVidia driver with Fedora 13, x86_64, and it allows me to
>> select:
>> 
>> 1680 x 1050
>> 1280 x 1024
>> 1280 x 960
>> 1152 x 864 ... etc.
>> 
>> Once you install the nVidia driver, going back to the nouveau driver
>> is a pain. So I experimented instead by booting to a live CD of
>> Ubuntu Lucid. This uses the nouveau driver by default. The screen
>> came up at 1680x1050, but the monitor settings utility allowed me to
>> select from:
>> 
>> 1680 x 1050
>> 1400 x 1050
>> 1280 x 1024
>> 1280 x 960 ... etc.
>> 
>> While booted to Lucid I set the video to 1400x1050 and the screen
>> worked fine, although the aspect ratio was wrong. I did that just to
>> prove to myself that the screen itself can handle 1400x1050, even if
>> the nVidia driver doesn't think it can.
 
>First of all, you are going at this wrong. The native resolution is
>the actual number of physical pixels on the panel. If you run anything
>else you will get interpolation: the hardware scales the image to fill
>the whole panel. It won't leave black bands (as a general rule; some
>platforms will agree to letterbox when running a 4:3 resolution on a
>16:10 panel, but that's rare), but rather it will distort the image.
>
>If you are giving a presentation, it is generally better to get a
>distorted image on the laptop rather than on the projector.

>Also, if you really need that resolution on a projector you are
>missing the point, as people won't be able to see it. It is an
>expensive projector, but still, sitting 30' away from the screen, you
>can't discern that resolution.

Actually, the classroom is about 30 x 75 feet, with the projector
screen on the long side and two rows of tables. Most people will be 10
to 20 feet from the screen. 

But that is not important. I think I misled you when I said I needed as
much resolution as I can get. What I really need is screen real estate.
I will be using a DTP program displaying two letter-size pages side by
side, plus I will have three or four dialog box windows on top. The
less I have to drag things out of the way so the audience can follow
what I am demonstrating, the better. It's not a question of sharpness
of the image - a little fuzziness is OK. It's a question of needing to
have a lot of things on the screen at the same time.

>NVIDIA is not hard-coding the driver to use resolutions that all cards
>support. All those cards use very similar hardware, and there are
>several driver files in there. What happens is that the driver gets a
>list of supported resolutions and refresh rates (the EDID) from the
>projector / screen, and offers you the modes from that list that also
>fall within its supported refresh rates. You can set a range of
>refresh rates in the Monitor section, and anything outside that range
>is ruled out.
>
>There was a tool and method to create a custom EDID to fool the
>driver, but I can't remember the specifics.
>
>Have a look here and see if it helps:
>http://comments.gmane.org/gmane.linux.hardware.thinkpad/38337

I had already found that page while googling earlier. I didn't try it
because I have Windows installed only in VirtualBox, not as a dual
boot. I could run the program in Windows under VirtualBox, but heaven
only knows what it might report, if it reports anything at all. Plus,
it looked too complex for me. I'm not very smart.
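
If I understand that thread, the end product is a binary EDID file
that you hand to the nVidia driver so it stops trusting whatever the
panel reports. Judging from the nVidia README, that would go in the
Device section of xorg.conf, something like the sketch below. The
connector name and file path are guesses on my part, not anything I
have tested:

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    # Hypothetical: hand the driver a hand-built EDID for the laptop
    # panel instead of the one it reads over DDC. "DFP-0" may not be
    # the right connector name on this machine.
    Option         "CustomEDID" "DFP-0:/etc/X11/edid.bin"
EndSection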

>Section "Monitor"
>    Identifier     "Monitor0"
>    VendorName     "Unknown"
>    ModelName      "Unknown"
>    HorizSync       28.0 - 33.0
>    VertRefresh     43.0 - 72.0
>    Option         "DPMS"
>EndSection

That is what the xorg.conf file created by the nVidia installer says on
my computer too. So you're saying that the reason I get the option of
1680x1050, and nothing lower until I go all the way down to 1280x1024,
is because the nVidia driver decided that the intervening resolutions
wouldn't work at the above sync and refresh rates?

That doesn't make a lot of sense to me. Higher resolutions normally
force you to run at lower refresh rates as you approach the upper
limit of what the video card can do. So if the card can do 1680x1050,
it should have no problem doing any lower resolution. Plus, it doesn't
explain why the driver thinks it's OK to run the screen at 4:3 ratios
instead of 16:10. And it doesn't explain why the nouveau driver will
run the screen at 1400x1050 without a problem. I'd switch to the
nouveau driver for the presentation, but with the nouveau driver I
can't get xrandr to see any external monitors or projectors. It's as
if the cable weren't connected.
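
One thing I may try before the presentation, based on the options
described in the nVidia README, is telling the driver to relax its
EDID-based mode checks and to ask for 1400x1050 explicitly. Something
along these lines - untested on my machine, so treat it as a sketch:

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    # Hypothetical: ignore the EDID mode list and allow modes that
    # are not in it, so 1400x1050 is not thrown away up front.
    Option         "ModeValidation" "NoEdidModes, AllowNonEdidModes"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        # Request 1400x1050 first; fall back to the native mode.
        Modes      "1400x1050" "1680x1050"
    EndSubSection
EndSection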

I'm not saying the driver is not polling the laptop screen for EDID
information. I'm just saying that if it is doing so, it's doing a damn
bad job of interpreting the data.
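
If I get ambitious I suppose I could also dump the EDID myself and see
what the panel actually claims to support. The read-edid package is
supposed to do this - assuming it works on this hardware, something
like:

# Hypothetical check: read the panel's EDID block and decode it into
# human-readable resolutions and frequency limits (needs root).
sudo get-edid | parse-edid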