(In reply to comment #6)
> (In reply to comment #5)
> > This problem is confirmed in Ubuntu 10.04 when trying to open a 2.8 GB file. It
> > crashes every time I try to open the file.
> >
> > lsb_release -rd
> > Description: Ubuntu 10.04.1 LTS
> > Release: 10.04
> >
> > apt-cache policy kwrite
> > kwrite:
> > Installed: 4:4.4.2-0ubuntu2
> > Candidate: 4:4.4.2-0ubuntu2
> > Version table:
> > *** 4:4.4.2-0ubuntu2 0
> > 500 http://us.archive.ubuntu.com/ubuntu/ lucid/main Packages
> > 100 /var/lib/dpkg/status
> >
> > Downstream bug may be found at:
> > https://bugs.launchpad.net/ubuntu/+source/kdebase/+bug/620789
>
> This is a joke, right? Do you even have enough RAM to perform this?
You don't need 10 GB of RAM to open that with a text editor built to handle such large files. The solution is fairly simple: read the file into memory in chunks.
> I don't see how somebody can have a 2.8 GB text file:
Database dumps, log files, text files with lots of binary data attached (I know plenty of *nix installers that do this). There are endless use cases.
> Assuming an average A4 page has roughly 3 kB of data [1], a file of 2.8 GB
> would then be about 1 Million of A4 pages...
So? Nobody said he's going to print this.