What a pleasant weekend I've had; no nasty meetings or politics, just some good clean hacking fun.

I spent a few hours each day poking at the modesetting branch and getting it running on my shiny new 965-based desktop system. Eric had been working on the SDVO support and had gotten that going, so I figured I'd at least get the CRT output working, which seemed like an easy enough task.

The BIOS-based modesetting code was already working, so I knew the hardware worked correctly. But, our existing CRT modesetting code was producing a nice black screen. I love modesetting code—the most common error indication is just 'sadness'; the monitor remains black and indicates that there is no signal present on the wire.

I poked around looking at what the BIOS did and how that differed from what the modesetting driver did, and made a small bit of progress thanks to an accident. I left the video clock programmed for 1600x1200 and then asked the server to display a 640x480 mode. One would expect this to leave the video clock running far too fast. But, to my surprise, the monitor happily locked onto this and reported that it was running at 85Hz. Weird. A bit of math and I discovered that somehow the clock was getting divided by 4 somewhere. Sure enough, simply multiplying the requested clock by 4 left me with stable modes across a wide range of sizes. Unfortunately, that range didn't include the high-resolution modes loved by our users; those were now out of the reach of the programmable clock. But it made the question of where the problem was a bit clearer: something was wrong with the clock register programming.
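The workaround amounts to a quick calculation. Here's a sketch; `crt_workaround_clock` is a hypothetical helper, not driver code, and the DPLL ceiling is an assumed, illustrative figure rather than the documented 965 limit:

```c
#include <assert.h>

/* Assumed maximum DPLL output frequency, in kHz -- an illustrative
 * figure, not the documented 965 limit. */
#define DPLL_MAX_KHZ 400000

/* Work around the apparent divide-by-4 between the programmed clock
 * and the pixel clock reaching the CRT: program the DPLL 4x too fast.
 * Returns the clock to program, or 0 if the multiplied clock exceeds
 * the DPLL's range -- which is why the high-resolution modes were
 * lost under this workaround. */
static int crt_workaround_clock(int target_khz)
{
    int programmed = target_khz * 4;

    if (programmed > DPLL_MAX_KHZ)
        return 0;               /* mode out of reach */
    return programmed;
}
```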

The next accidental discovery was that in pure clone mode, with both CRT and DVI connected to the same pipe, I also managed to see a working mode at the lower resolutions (640x480 and 800x600). Higher resolutions still failed. It's important to note that the DVI connector is reached through the SDVO port, which must run at high frequencies. Low-frequency modes are padded with junk and clocked faster to keep the bus stable. For 640x480 and 800x600 modes, the clock is multiplied by 4.
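That multiplier rule can be sketched as a small selection function; the frequency thresholds below are assumptions for illustration (the real cut-offs come from the SDVO bus's operating range), not datasheet values:

```c
/* Choose the SDVO pixel multiplier for a mode's dotclock (in kHz).
 * The SDVO bus must run fast enough to stay locked, so slow modes
 * are clocked up and padded with junk data.  The thresholds here
 * are assumed for illustration, not taken from the datasheet. */
static int sdvo_clock_multiplier(int dotclock_khz)
{
    if (dotclock_khz >= 100000)
        return 1;
    if (dotclock_khz >= 50000)
        return 2;
    return 4;
}
```

Under these assumed thresholds, 640x480 (25.175 MHz) and 800x600 (40 MHz) both land in the 4x bucket described above, while faster modes run with little or no multiplication.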

While I had looked at the register results for the BIOS mode setting, I hadn't seen it in action. Fortunately, action shots were available.

Dave Airlie hacked up Matthew Garrett's vbetest program and left it here. This fine piece of work executes the video BIOS and monitors all of the video device register accesses it performs. Watching it live let me see precisely which registers it thought were related to clock timing. I noticed that it set the DPLL_A_MD register when programming a pure CRT mode, which seemed a bit odd to me as that register has rather vague documentation about UDI and SDVO outputs. But one small sentence did mention CRT multipliers of some sort, so I figured I might as well give it a try. Stealing the same setting it used, the CRT now locked nicely using the normal un-multiplied clock frequency, and worked across the whole range of modes.
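So the fix amounts to encoding a pixel multiplier into DPLL_A_MD even for a pure CRT mode. A rough sketch of building such a value; the field names and bit positions below are invented for illustration and are not the documented register layout:

```c
#include <stdint.h>

/* Illustrative (invented) field positions within DPLL_A_MD. */
#define DPLL_MD_UDI_MULT_SHIFT  24
#define DPLL_MD_VGA_MULT_SHIFT  0

/* Build a DPLL_A_MD value from pixel multipliers; hardware fields
 * like this are commonly stored as (n - 1). */
static uint32_t dpll_md_value(int udi_mult, int vga_mult)
{
    return ((uint32_t)(udi_mult - 1) << DPLL_MD_UDI_MULT_SHIFT) |
           ((uint32_t)(vga_mult - 1) << DPLL_MD_VGA_MULT_SHIFT);
}
```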

Thanks Dave, thanks Matthew.

The final adventure for the weekend was to discover why my screen image was getting corrupted when I used a large frame buffer. The effect was quite mystifying: contents written to one location in memory would be duplicated at many locations on the screen. I thought it might be FIFO size issues, but exploration with an application window and a window manager demonstrated that the corruption was not just on the screen, but actually visible to the GPU and CPU as well, and appeared to be caused by multiple GATT entries mapping to the same physical page. I eventually discovered that just skipping the first 256K of video memory and not using it fixed the problem. I haven't looked into this in more detail, but it seems likely that those pages are actually mapped to the GATT table itself, and using them for other things caused the symptoms observed above. For now, I've just made the driver skip over that amount of memory, which fixes the problem I had.
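The skip itself is simple. A minimal sketch of a bump allocator that reserves the suspect region, using the 256K figure from above; everything else (names, alignment handling) is invented for illustration:

```c
/* Reserve the first 256K of video memory; those pages appeared to
 * alias the GATT itself, so nothing else may use them. */
#define GATT_RESERVED (256ul * 1024ul)

static unsigned long next_free = GATT_RESERVED;

/* Hand out offsets into video memory, starting past the reserved
 * region; align must be a power of two. */
static unsigned long fb_alloc(unsigned long size, unsigned long align)
{
    unsigned long offset = (next_free + align - 1) & ~(align - 1);

    next_free = offset + size;
    return offset;
}
```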

I also spent a bunch of time shrinking the driver to eliminate a bunch of redundant state. We're planning on moving all of the initial frame buffer configuration to common RandR code shared across drivers, so I went ahead and disabled the Intel-specific code in the driver. Yes, this means that there's no way to configure screen layout when you start the server, but you can use the RandR extension afterwards to make it do whatever you like. It's temporary; eventually we'll get the common code working. This is probably the first time X has had state that can only be set through the protocol and not in the config file.

As Eric made a merge that broke things before he took off for the weekend (strong work, Eric), I've placed my work on a separate branch for merging this week sometime. Everything here is on the modesetting-keithp branch in the xf86-video-intel repository.