Comment 13 for bug 217908

Gavin McCullagh (gmccullagh) wrote:

(In reply to comment #11)
> Then I loaded the site
> http://www.carteretcountyschools.org/bms/teacherwebs/sdavenport/artgallery6.htm

Something strikes me about the way Firefox (or is it Gecko?) deals with this page. The real cause of the issue is that the page contains a number of massive images which are displayed at a much smaller size within the page. We all know this comes down to bad web page authoring, but Firefox obviously has to cope with it; my laptop slows to a crawl loading this page.

Arguably the reason this page causes problems is that Firefox hands the full-size images to the X server and lets the video card do the resizing. As a result the server ends up caching a huge quantity of image data that will never actually be displayed (unless the user chooses to view an image at full size, which is probably quite rare).

I realise the video card will do the resize more quickly, but if Firefox resized the image in software, the X server's RAM usage would be fairly minimal, since it would only be caching the data it actually needs to display. There is often a trade-off between CPU and RAM usage, and this seems to be one of those cases.
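To put rough (purely illustrative) numbers on it: a 3000x2000 photo shown as a 300x200 thumbnail costs about 3000 * 2000 * 4 bytes, i.e. roughly 24 MB, as a 32-bit pixmap on the X server, but only about 300 * 200 * 4 bytes, i.e. roughly 240 KB, if the client sent the already-scaled pixmap. A page with a dozen such images is the difference between nearly 300 MB and about 3 MB of server-side pixmap memory.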

I would therefore be inclined to suggest an optional feature (particularly useful for remote X servers or low-RAM situations) where Firefox resizes the image in software and sends the already-scaled pixmap to the X server. Perhaps this would only kick in when the original image is above a certain size, or when the display size is below some fraction of the image size (e.g. less than half the width and height); a rough sketch of such a check follows.
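To make the heuristic concrete, here is the kind of check I have in mind. The names and thresholds are purely hypothetical (nothing from the actual Gecko tree), just a sketch in C++:

    #include <cstdint>

    // Hypothetical sketch, not actual Gecko code: decide whether to scale an
    // image in software before handing the pixmap to the X server.
    struct ImageDims {
        uint32_t srcW, srcH;  // intrinsic size of the decoded image
        uint32_t dstW, dstH;  // size it is actually drawn at in the page
    };

    // Illustrative guesses for the thresholds: pre-scale when the decoded
    // image is over ~1 megapixel, or when it is drawn at less than half its
    // intrinsic width and height.
    bool ShouldPrescaleInSoftware(const ImageDims& d) {
        const uint64_t srcPixels = uint64_t(d.srcW) * d.srcH;
        const bool hugeSource    = srcPixels > 1000000;
        const bool heavilyShrunk = d.dstW * 2 <= d.srcW && d.dstH * 2 <= d.srcH;
        return hugeSource || heavilyShrunk;
    }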

The X server often seems to do a simple point-sampled resample rather than a properly filtered resize anyway, so I would expect Firefox's software path to do the same, which should keep the CPU cost modest and the quality no worse than what we get today.
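For what it's worth, that software path is only a couple of loops. A rough, self-contained sketch (again purely illustrative, assuming packed 32-bit pixels, and not Gecko's actual image code):

    #include <cstdint>
    #include <vector>

    // Illustrative nearest-neighbour (point-sampled) downscale over packed
    // 32-bit pixels: roughly the quality the X server gives today, but done
    // client-side so only the small pixmap ever reaches the server.
    std::vector<uint32_t> NearestNeighbourScale(const std::vector<uint32_t>& src,
                                                uint32_t srcW, uint32_t srcH,
                                                uint32_t dstW, uint32_t dstH) {
        std::vector<uint32_t> dst(size_t(dstW) * dstH);
        for (uint32_t y = 0; y < dstH; ++y) {
            const uint32_t sy = uint32_t(uint64_t(y) * srcH / dstH);
            for (uint32_t x = 0; x < dstW; ++x) {
                const uint32_t sx = uint32_t(uint64_t(x) * srcW / dstW);
                dst[size_t(y) * dstW + x] = src[size_t(sy) * srcW + sx];
            }
        }
        return dst;
    }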