[ltp] Getting nVidia to clone display on a projector

Micha Feigin linux-thinkpad@linux-thinkpad.org
Mon, 22 Nov 2010 23:23:52 +0200


On Mon, 22 Nov 2010 11:16:23 -0800
John Jason Jordan <johnxj@comcast.net> wrote:

> On Mon, 22 Nov 2010 11:44:33 +0200
> Micha Feigin <michf@post.tau.ac.il> dijo:
> 
> >It works for me with the nvidia driver using nvidia-settings. You do
> >need to make sure that you are working within the projector's specs.
> >Most projectors these days can handle 1024x768.
> >
> >I set the second screen to TwinView mode and then set both screens to
> >1024x768, duplicating each other, and it has never failed to work yet
> >(apart from the problem of very partial support for embedded video
> >playback in PDF presentations created using beamer).
> 
> Thanks for the reply. I have made some progress since my original post,
> but I am not quite there yet. That is, I can get my laptop display to
> appear on the wall with the projector, but I can't yet get an
> acceptable resolution. The resolution is very important, as the
> presentation I am going to give is on open source desktop publishing. I
> need to squeeze pixels out of the projector until it screams.
> 
> First, to review, I have a T61 with nVidia Quadro NVS 140M, which
> normally runs the 15.4" laptop screen at 1680x1050.
> 
> The projector in the classroom is a Hitachi CP-SX1350. Hitachi specs
> say its native resolution is 1400x1050 and maximum resolution is
> 1600x1200. My (dim) understanding of the terms "native" and "maximum"
> is that it can interpret data from a computer up to 1600x1200, but will
> project no more than 1400x1050. Thus, if I were to set my computer
> screen down to 1600x1000 (preserving the 16:10 widescreen aspect
> ratio), the projector would accept this and then squash it down and
> project 1400x875, with a black band on the top or the bottom for the
> missing pixels (1050 - 875 = 175).
> 
> As an alternative, I could set my computer display down to 1400x875,
> and then my laptop display and the projector display would be the same.
> Either way, the maximum resolution I can project is 1400x1050.
> 
> The problem that remains, however, is getting my laptop screen to
> display anything other than 1680x1050. I am currently using the latest
> nVidia driver with Fedora 13, x86_64, and it allows me to select:
> 
> 1680 x 1050
> 1280 x 1024
> 1280 x 960
> 1152 x 864 ... etc.
> 
> Once you install the nVidia driver, going back to the nouveau driver is
> a pain. So I experimented instead by booting to a live CD of Ubuntu
> Lucid. This uses the nouveau driver by default. The screen came up at
> 1680x1050, but the monitor settings utility allowed me to select from:
> 
> 1680 x 1050
> 1400 x 1050
> 1280 x 1024
> 1280 x 960 ... etc.
> 
> While booted to Lucid I set the video to 1400x1050 and the screen
> worked fine, although the aspect ratio was wrong. I did that just to
> prove to myself that the screen itself can handle 1400x1050, even if
> the nVidia driver doesn't think it can.
> 
> Installing the nVidia driver creates an xorg.conf file. Toward the
> bottom of the file it lists modes as "nvidia-auto-select." I tried
> commenting out that line, then adding my own modes, and restarting X.
> The nVidia video settings utility still stubbornly refused to allow me
> anything other than the list above. I also used cvt to create
> modelines for other resolutions, then pasted them into the xorg.conf
> file, but it still made no difference. I also tried setting the
> resolution from the command line with xrandr, but the only resolutions
> it would change to were the nVidia settings. If I tried (e.g.) "xrandr
> -s 1400x1050," it would just give me an error that the resolution was
> not possible.
> 
> It is odd that installing the nVidia driver created an xorg.conf file,
> and then the driver ignores it.
> 
> I am trying to figure out where the nVidia driver is getting its list
> of resolutions from. It's certainly not taking them from the xorg.conf
> file. I assume that the nouveau driver got its list of resolutions by
> querying the monitor for its EDID data. If so, then the nVidia driver
> could not be getting its list of resolutions from EDID, else it would
> also offer 1400x1050.
> 
> When I downloaded the driver from nvidia.com I noted that the same
> driver works with quite a list of different cards. I am guessing that
> nVidia hard coded the driver to list only resolutions that all those
> cards were capable of. The driver is 260.19.21. If someone else is
> using that driver with a different size screen than mine, it would be
> interesting to see what resolutions the nvidia-settings utility offers.
> 
> Meantime, any other suggestions, corrections and observations are
> welcome. 

First of all, you are going about it wrong. Native resolution means the
number of physical pixels the projector actually has. If you feed it
anything else you will get interpolation: it will try to use its whole
native resolution. As a general rule it won't leave a black band (some
models will agree to add bands when the input aspect ratio doesn't match
the native one, but that's rare); instead it will distort the image. So a
1600x1000 input would most likely be stretched to fill all 1400x1050
pixels, not squashed to 1400x875 with bands.

If you are giving a presentation, it is generally better to have the
distorted image on the laptop rather than on the projector.

Also, if you really need that much resolution on a projector you are
missing the point, because people won't be able to see it. It is an
expensive projector, but still, sitting 30' from the screen, the
audience can't discern detail at that resolution.

NVIDIA is not hard-coding the driver to a list of resolutions that all
cards support. All those cards use very similar hardware, and the
package contains several driver files anyway. What actually happens is
that the driver receives a list of supported resolutions and refresh
rates from the projector/screen (its EDID) and offers you the subset
that also falls within the refresh rates it can drive. You can set a
range of refresh rates in the Monitor section of xorg.conf, and
anything outside that range is ruled out.
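
For what it's worth, with a RandR-capable driver (nouveau is one; if I
remember right the proprietary driver of that generation only speaks
RandR 1.1, which is probably why plain "xrandr -s" refused) the usual
way to force a mode from the command line looks like the sketch below.
The connector name VGA-1 is only a guess for your setup; use whatever
"xrandr -q" reports.

# Ask cvt for 1400x1050 timings (the modeline below is what cvt
# prints here; double-check it on your machine)
cvt 1400 1050 60
#  Modeline "1400x1050_60.00"  121.75  1400 1488 1632 1864  1050 1053 1057 1089 -hsync +vsync

# Register the mode, attach it to the output, then switch to it
# (VGA-1 is a placeholder for your external connector)
xrandr --newmode "1400x1050_60.00" 121.75 1400 1488 1632 1864 1050 1053 1057 1089 -hsync +vsync
xrandr --addmode VGA-1 "1400x1050_60.00"
xrandr --output VGA-1 --mode "1400x1050_60.00"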

There was a tool and method to create a custom EDID to fool the
driver, but I can't remember the specifics.
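
If I remember right, the nvidia driver also has xorg.conf options for
working around EDID-based validation, so you may not need a custom
EDID at all. Roughly like this (from memory, so verify against the
driver's README; the connector name DFP-0 and the file path are just
examples):

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    # Accept modes that are not listed in the panel's EDID
    Option         "ModeValidation" "DFP-0: AllowNonEdidModes"
    # ...or hand the driver a saved/edited EDID blob outright
    # Option         "CustomEDID" "DFP-0:/etc/X11/edid.bin"
EndSection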

Have a look here and see if it helps:
http://comments.gmane.org/gmane.linux.hardware.thinkpad/38337

That was needed at the time to get 1024x768 on the internal monitor,
but at the moment, with the current X server, NVIDIA driver and BIOS, I
don't have any trouble. Following is my xorg.conf file, if it helps.

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
EndSection

Section "Files"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
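    # Sync ranges the server will accept; modes outside them are dropped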
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Extensions"
    Option         "Composite" "Enable"
EndSection
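
And for the clone setup I mentioned, the TwinView bits that
nvidia-settings writes into the Screen section look roughly like this
(from memory, so treat it as a sketch and check the MetaModes syntax
against the driver README):

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    # Drive both heads as one X screen, showing the same image
    Option         "TwinView" "True"
    Option         "TwinViewOrientation" "Clone"
    # One resolution pair per metamode: "laptop, projector"
    Option         "MetaModes" "1400x1050,1400x1050; 1024x768,1024x768"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection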