* Handling large files with emacs lisp?
@ 2013-06-04 12:52 Klaus-Dieter Bauer
  2013-06-04 14:46 ` Jambunathan K
  2013-06-04 22:02 ` Stefan Monnier
  0 siblings, 2 replies; 4+ messages in thread
From: Klaus-Dieter Bauer @ 2013-06-04 12:52 UTC (permalink / raw)
  To: help-gnu-emacs

Hello!

Is there a method in Emacs Lisp to handle large files (hundreds of MB)
efficiently? I am looking specifically for a function that allows
processing file contents either sequentially or (better) with random
access.

Looking through the code of `find-file', I found that
`insert-file-contents' and `insert-file-contents-literally' seem to be
pretty much the lowest-level file-reading functions available to Emacs
Lisp. As files approach GB size, however, inserting the whole file
contents is undesirable, even assuming a 32-bit Emacs could handle
such large buffers.

Using the BEG and END parameters of `insert-file-contents', however,
appears to have a time cost linear in BEG. So implementing buffered
processing of large files, by keeping only part of the file in a
temporary buffer at a time, doesn't seem feasible either.
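
For concreteness, here is a minimal sketch of the kind of buffered
processing I have in mind (the function name and the 1 MB default
chunk size are arbitrary):

    ;; Sketch only: function name and default chunk size are arbitrary.
    (defun my-process-file-in-chunks (file func &optional chunk-size)
      "Call FUNC in a temporary buffer holding successive chunks of FILE.
    FUNC is called with no arguments; CHUNK-SIZE defaults to 1 MB."
      (let ((size (nth 7 (file-attributes file))) ; attribute 7 = size in bytes
            (chunk (or chunk-size (* 1024 1024)))
            (beg 0))
        (with-temp-buffer
          (while (< beg size)
            (erase-buffer)
            ;; Insert only bytes BEG..END of FILE, without conversion.
            (insert-file-contents-literally
             file nil beg (min size (+ beg chunk)))
            (funcall func)
            (setq beg (+ beg chunk))))))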

I'd also be interested in why there is this linear time dependence.
Is it a limitation of how fseek works, or of how
`insert-file-contents' is implemented? I've read[1] that fseek "just
updates pointers", so random reads within a large file, especially on
an SSD, should be constant-time, but I couldn't find further
verification.
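
This is the kind of timing check I mean (a sketch; the path is a
placeholder):

    (require 'benchmark)
    ;; "/path/to/large-file" is a placeholder.
    (with-temp-buffer
      (dolist (beg '(0 100000000 200000000))
        (erase-buffer)
        ;; benchmark-run returns (ELAPSED-SECONDS GC-RUNS GC-SECONDS).
        (message "offset %d: %S" beg
                 (benchmark-run 1
                   (insert-file-contents-literally
                    "/path/to/large-file" nil beg (+ beg 4096))))))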

kind regards, Klaus

PS: I'm well aware that I'm asking for something that likely wasn't
    within the design goals of Emacs Lisp. It is interesting to push
    the limits, though ;)

------------------------------------------------------------

[1] https://groups.google.com/d/msg/comp.unix.aix/AXInTbcjsKo/qt-XnL12upgJ



* Re: Handling large files with emacs lisp?
  2013-06-04 12:52 Handling large files with emacs lisp? Klaus-Dieter Bauer
@ 2013-06-04 14:46 ` Jambunathan K
  2013-06-05 10:47   ` Klaus-Dieter Bauer
  2013-06-04 22:02 ` Stefan Monnier
  1 sibling, 1 reply; 4+ messages in thread
From: Jambunathan K @ 2013-06-04 14:46 UTC (permalink / raw)
  To: Klaus-Dieter Bauer; +Cc: help-gnu-emacs


Maybe you can steal some stuff from here.

        http://elpa.gnu.org/packages/vlf.html

It is a GNU ELPA package that you can install with M-x list-packages
RET.
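
Or non-interactively, something like this should work (assuming the
GNU ELPA archive is enabled, as it is by default):

    ;; Assumes the GNU ELPA archive is in `package-archives'.
    (package-refresh-contents)
    (package-install 'vlf)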







* Re: Handling large files with emacs lisp?
  2013-06-04 12:52 Handling large files with emacs lisp? Klaus-Dieter Bauer
  2013-06-04 14:46 ` Jambunathan K
@ 2013-06-04 22:02 ` Stefan Monnier
  1 sibling, 0 replies; 4+ messages in thread
From: Stefan Monnier @ 2013-06-04 22:02 UTC (permalink / raw)
  To: help-gnu-emacs

> Is there a method in emacs lisp to handle large files (hundreds of MB)
> efficiently?

Not really, no.  The closest is vlf.el, available from the GNU ELPA.

> Using the BEG and END parameters of `insert-file-contents' however has
> a linear time-dependence on BEG.

I don't know of such a time-dependence.  What makes you think so?


        Stefan





* Re: Handling large files with emacs lisp?
  2013-06-04 14:46 ` Jambunathan K
@ 2013-06-05 10:47   ` Klaus-Dieter Bauer
  0 siblings, 0 replies; 4+ messages in thread
From: Klaus-Dieter Bauer @ 2013-06-05 10:47 UTC (permalink / raw)
  To: Jambunathan K; +Cc: help-gnu-emacs

Oddly, when I tried again today, I saw constant-time file access, at
~80 MB/s across the 183 MB installer of LibreOffice and at 240-290 MB/s
on a repetitive text file. The most likely explanation is a bug in my
test function (e.g. the length of the inserted text accidentally not
being constant). A bit embarrassing ^^'
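
For reference, this is the kind of measurement I mean, with the read
length now held constant (a sketch; the path is a placeholder):

    (require 'benchmark)
    ;; "/path/to/large-file" is a placeholder.
    (let ((len (* 4 1024 1024)) ; a constant 4 MB per read
          (file "/path/to/large-file"))
      (with-temp-buffer
        (dolist (beg '(0 50000000 100000000 150000000))
          (erase-buffer)
          (let ((secs (car (benchmark-run 1
                             (insert-file-contents-literally
                              file nil beg (+ beg len))))))
            (message "offset %9d: %6.1f MB/s"
                     beg (/ len secs 1048576.0))))))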

On the other hand, this shows me that Emacs Lisp is indeed usable for
general-purpose processing.

kind regards, Klaus



