[gutsy] Resolution autodetection regression on nvidia with digital flat panel.
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
displayconfig-gtk (Ubuntu) | Fix Released | Undecided | Unassigned |
Bug Description
Graphics card: MSI nVidia GeForce 8500GT
Display: Samsung SyncMaster 226BW (DFP @ 1680x1050@60Hz)
Upon a fresh install of Ubuntu 7.10 Tribe 5, the resolution is detected correctly (1680x1050 and lower are available). After installing and enabling nvidia-glx-new through the restricted driver manager and restarting, the proper resolution persists, and I verified it was configurable to that resolution through displayconfig-gtk. However, after dist-upgrading (as of Sep 19 23:49 EST) and restarting, GDM and Xorg came up in an incorrect resolution, which did not correct itself upon login. I fiddled with displayconfig-gtk but was unable to set 1680x1050, even after explicitly selecting "Generic LCD 1680x1050".
I was able to import my display's specifications into displayconfig-gtk via its "add model" feature, using the .inf driver file on the CD that came with the monitor, but it still would not let me set the resolution to 1680x1050 via the drop-down box. I resorted to modifying my xorg.conf manually: I had to remove all the modelines that had been entered and add "1680x1050" to the Modes list in the Screen section in order to return my display to the appropriate resolution.
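The manual workaround above would look roughly like the following Screen section; this is a sketch, not the reporter's actual file, and the Identifier, Device, and Monitor names are illustrative assumptions that must match the names used elsewhere in your xorg.conf.

```
Section "Screen"
    Identifier   "Default Screen"          # assumed name; match your file
    Device       "nVidia GeForce 8500GT"   # assumed Device identifier
    Monitor      "Samsung SyncMaster 226BW"  # assumed Monitor identifier
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        # Stale Modeline entries removed from the Monitor section;
        # the desired resolution is listed first so X picks it by default.
        Modes "1680x1050" "1280x1024" "1024x768"
    EndSubSection
EndSection
```

With the nvidia driver, modes not validated against the panel's EDID may still be rejected; removing bogus modelines and listing the native mode first is usually enough for a DFP running at its native 1680x1050@60Hz.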
Should be fixed in the next release.