I have the same problem: the NVIDIA closed-source driver (295.20) apparently prevents my screen from being recognized properly in gnome-control-center, while it is recognized fine when using the nouveau driver.
I also noticed that the EDID information does not show up in xrandr:
$ xrandr -q --verbose | head -20
xrandr: Failed to get size of gamma for output default
Screen 0: minimum 320 x 175, current 1920 x 1200, maximum 1920 x 1200
default connected 1920x1200+0+0 (0x161) normal (normal) 0mm x 0mm
Identifier: 0x160
Timestamp: 27868
Subpixel: unknown
Clones:
CRTC: 0
CRTCs: 0
Transform: 1.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 1.000000
filter:
1920x1200 (0x161) 115.2MHz *current
h: width 1920 start 0 end 0 total 1920 skew 0 clock 60.0KHz
v: height 1200 start 0 end 0 total 1200 clock 50.0Hz
1920x1080 (0x162) 105.8MHz
h: width 1920 start 0 end 0 total 1920 skew 0 clock 55.1KHz
v: height 1080 start 0 end 0 total 1080 clock 51.0Hz
1680x1050 (0x163) 91.7MHz
h: width 1680 start 0 end 0 total 1680 skew 0 clock 54.6KHz
...whereas the screen is correctly recognized in nvidia-settings (Apple Cinema HD), where the EDID can be acquired and later decoded with parse-edid:
$ parse-edid < ~/edid.bin
parse-edid: parse-edid version 2.0.0
parse-edid: EDID checksum passed.
# EDID version 1 revision 3
Section "Monitor"
# Block type: 2:0 3:ff
# Block type: 2:0 3:fc
Identifier "Cinema HD"
VendorName "APP"
ModelName "Cinema HD"
# Block type: 2:0 3:ff
# Block type: 2:0 3:fc
# Block type: 2:0 3:0
# DPMS capabilities: Active off:yes Suspend:no Standby:no
Mode "1920x1200" # vfreq 59.950Hz, hfreq 74.038kHz
DotClock 154.000000
HTimings 1920 1968 2000 2080
VTimings 1200 1203 1209 1235
Flags "-HSync" "+VSync"
EndMode
# Block type: 2:0 3:ff
# Block type: 2:0 3:fc
# Block type: 2:0 3:0
EndSection
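As a sanity check on the decoded modeline, the vfreq/hfreq comments follow directly from the DotClock and the horizontal/vertical totals. A quick Python sketch, with the numbers copied from the parse-edid output above:

```python
# Recompute the refresh rates from the parse-edid modeline:
#   Mode "1920x1200"  DotClock 154.0 MHz
#   HTimings 1920 1968 2000 2080  -> htotal = 2080
#   VTimings 1200 1203 1209 1235  -> vtotal = 1235
dot_clock_hz = 154.0e6
htotal, vtotal = 2080, 1235

hfreq_khz = dot_clock_hz / htotal / 1e3      # line rate: pixels per line
vfreq_hz = dot_clock_hz / (htotal * vtotal)  # refresh rate: lines per frame

print(f"hfreq = {hfreq_khz:.3f} kHz, vfreq = {vfreq_hz:.3f} Hz")
# -> hfreq = 74.038 kHz, vfreq = 59.950 Hz
```

This matches the "vfreq 59.950Hz, hfreq 74.038kHz" comment exactly, so the detailed timing block is being decoded consistently.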
I don't know whether it's worth mentioning, but ICC profiles can still be applied from the command line, although with some warnings:
$ dispwin ~/AppleCinemaHD-20120304.icc
XRandR 1.2 is faulty - falling back to older extensions
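As an aside on the "EDID checksum passed" line above: the check is simply that all 128 bytes of the base EDID block sum to 0 mod 256 (the last byte is chosen to make that so). A minimal Python sketch, using a synthetic block here rather than my actual edid.bin, so the padding bytes are purely illustrative:

```python
def edid_checksum_ok(block: bytes) -> bool:
    """A 128-byte EDID base block is valid when its bytes sum to 0 mod 256."""
    return len(block) == 128 and sum(block) % 256 == 0

# Build a synthetic 128-byte block: the fixed 8-byte EDID header,
# zero padding, and a final byte chosen so the total wraps to 0.
header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
body = header + bytes(119)                  # 127 bytes so far
checksum = (256 - sum(body) % 256) % 256
block = body + bytes([checksum])

print(edid_checksum_ok(block))              # True

# The same check on a real dump:
#   edid_checksum_ok(open("edid.bin", "rb").read(128))
```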