Thanks for the links. They helped me understand memory management better, and it is now clear why memory consumption is so high when I open a bunch of images at the same time.

However, this still does not explain my observations during my usual workflow. In real usage, I open pdfs (or images) one by one most of the time and kill the buffers periodically. If the memory is freed when a pdf buffer is closed, I expect it to be reused when a new buffer is opened. Even though memory consumption is expected to grow in this case, it should be on the order of the largest image I open (times the maximum number of image buffers open at the same time). But that is not what I see.

I did a small test by modifying my earlier lisp code to open and close the same list of images sequentially:

#+begin_src emacs-lisp
;; Visit each .jpg in the folder one by one, killing all .jpg buffers
;; after each visit, so at most one image buffer is alive at a time.
(dolist (file (directory-files "~/Tosort/pictures&photos/" 'full ".*jpg"))
  (find-file file)
  (mapc #'kill-buffer
        (seq-filter (apply-partially #'string-match ".+\\.jpg$")
                    (mapcar #'buffer-name (buffer-list)))))
#+end_src

The resulting memory usage graph is attached. What we can see is that memory is indeed growing (as expected), though it does not grow as much as when all the images are opened together. However, the final heap size appears to be over 400MB (from smaps), which is almost half of what was observed with all the images open at the same time. Since the largest .jpg file I have in the folder is only around 5.5MB, 400MB sounds strange to me. Searching for similar memory-consumption issues, I found that some memory fragmentation problem might be happening [1].

P.S. Have there been any attempts to implement garbage collection for the C code of Emacs? I found an article [2] suggesting that using an actual GC may speed up an application in comparison with a malloc/free approach.

[1] https://stackoverflow.com/a/9069474/9196985
[2] https://www.linuxjournal.com/article/6679

Regards,
Ihor
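
P.P.S. One thing worth checking before blaming fragmentation: Emacs keeps the images of killed buffers in its image cache until `image-cache-eviction-delay' expires (300 seconds by default), so the test above may be measuring cache retention rather than heap behavior. Below is a variant of the same loop, as a sketch, that clears the image cache explicitly after killing each buffer and logs a rough estimate of Emacs's memory use from within Emacs (`memory-limit' may report 0 on some builds, in which case smaps remains the better measure):

```emacs-lisp
;; Same test as above, but with the image cache cleared on every
;; iteration so cached images of killed buffers cannot accumulate.
(dolist (file (directory-files "~/Tosort/pictures&photos/" 'full ".*jpg"))
  (find-file file)
  (mapc #'kill-buffer
        (seq-filter (apply-partially #'string-match ".+\\.jpg$")
                    (mapcar #'buffer-name (buffer-list))))
  ;; A non-nil FILTER argument clears the image caches on all frames.
  (clear-image-cache t)
  ;; Rough estimate of Emacs's memory use, in kilobytes.
  (message "After %s: ~%d KB" file (memory-limit)))
```

If the heap still grows to hundreds of megabytes with the cache cleared, that would point more firmly at fragmentation in the allocator.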