[ltp] T520 Ubuntu 11.10 x64 - can't connect external monitor

Marius Gedminas linux-thinkpad@linux-thinkpad.org
Mon, 6 Feb 2012 04:50:20 +0200


On Sun, Feb 05, 2012 at 12:47:42PM -0500, Fen Labalme wrote:
> 2012/2/4 Marius Gedminas <marius@gedmin.as>
> >
> > > Will the VGA be able to drive a 1920x1200 display?
> >
> > Why not?  My Asus EeePC 900 (ancient Intel 945) drives a 1920x1080p TV
> > over VGA with sufficient 3D for all those Compiz effects.
> I dropped back to running with BIOS set to Intel Integrated Graphics and
> was able to get a 3200x1200 virtual display with the external monitor
> attached via a VGA cable (and hotplug basically works):
> 1600x900 (T520) + 1600x1200 (Samsung) = 3,840,000 pixels total
> This is lower than the native (1920x1200) resolution on Samsung (and looks
> horrible).

Do you know why this is?

What if you tried to arrange the monitors one above the other instead of
side by side?  What if you disabled the internal display and used only
VGA out -- could you use 1920x1200 then?
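Both experiments can be tried at runtime with xrandr.  A sketch -- the
output names (LVDS1, VGA1) are assumptions; check yours with `xrandr -q`:

```shell
# Stack the monitors vertically instead of side by side
# (the framebuffer then only needs to be 1920 wide):
xrandr --output VGA1 --mode 1920x1200 --output LVDS1 --mode 1600x900 --below VGA1

# Or switch off the internal panel and drive only the external monitor:
xrandr --output LVDS1 --off --output VGA1 --mode 1920x1200
```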

> What I want is a 3520x1200 virtual display:
> 1600x900 (T520) + 1920x1200 (Samsung) = 4,224,000 pixels total
> This is upsetting as the maximum resolution available from the Intel GPU is
> 2560x1600 = 4,096,000 pixels [URL1]

> URL1:
> http://software.intel.com/en-us/articles/quick-reference-guide-to-intel-p

Hey, you already got more than that!

That URL specifies the maximum resolution over a DisplayPort connection,
not the limit of the GPU itself.

> Question 1: Am I thinking about this right?  Do those numbers confirm that
> I can't get a virtual screen the size I want using only one or the other
> GPU?

No, those numbers don't limit the virtual desktop size, just the size of
each screen.
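To make the distinction concrete, a quick sketch of the arithmetic: the
side-by-side virtual desktop you want is well under the 8192x8192
framebuffer maximum that xrandr reports, even though the Samsung's mode
would exceed the 2560x1600 per-connection figure in area:

```python
# Per-screen mode limits constrain each output; the virtual desktop
# only needs to fit inside the GPU's maximum framebuffer (8192x8192).
laptop = (1600, 900)     # T520 internal panel
samsung = (1920, 1200)   # external monitor at native resolution

# Side-by-side layout: widths add, height is the taller of the two.
fb_width = laptop[0] + samsung[0]
fb_height = max(laptop[1], samsung[1])

print(fb_width, fb_height)                     # 3520 1200
print(fb_width * fb_height)                    # 4224000
print(fb_width <= 8192 and fb_height <= 8192)  # True
```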

> There's at least one clue that this is not the case from xrandr:

> > xrandr -q
> Screen 0: minimum 320 x 200, current 3200 x 1200, maximum 8192 x 8192

That looks more like a GPU limit.
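If 8192x8192 really is the limit, the layout you want should be accepted
directly.  A sketch, again assuming the output names are LVDS1 and VGA1:

```shell
# Request the full 3520x1200 virtual desktop in one call:
xrandr --output LVDS1 --mode 1600x900 \
       --output VGA1  --mode 1920x1200 --right-of LVDS1
```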

> (BTW: I had to remove the xorg.conf during the switch from Discrete to
> Integrated graphics, and even after fooling with the setup a while, no new
> xorg.conf file was created.)

Right, that's normal.  The open world of Linux graphics has progressed
to the point where almost everything is autodetectable, hotpluggable,
and configurable at runtime, so Xorg.conf is rarely needed.

NVidia's proprietary drivers are still stuck in the past in that sense
(the thought goes, I believe, that there's no point spending engineering
effort when things already work sufficiently well for them).  They do get
better 3D performance with fewer bugs, at the cost of poorer integration
with the rest of the system.

> System specs:
> T520 with i7-2760QM running Ubuntu 11.10
>    Intel HD 3000 Integrated GPU
>    Nvidia NVS 4200M w/ Optimus and 1GB DDR3 Memory
> Thinkpad monitor has 1600x900 display (HD+)
> Samsung monitor has 1920x1200 display

Marius Gedminas
The rest of the world will have to be educated by Microsoft's paperclip
or the DancingGnu (a still to be written Emacs AI tutor for beginning
users), I'm afraid.
	-- Markus Kuhn suggests an Emacs alternative to Vigor and Clippy
