From mboxrd@z Thu Jan 1 00:00:00 1970
From: "Ami Fischman"
Newsgroups: gmane.emacs.devel
Subject: how do you track down emacs memory leaks?
Date: Sat, 1 Nov 2008 21:05:13 -0700
Message-ID: <9aa0cfde0811012105o20c51089j1cd80d81d2895a6d@mail.gmail.com>
To: emacs-devel@gnu.org
Hi emacs-devel,

I'm looking for advice on how to track down a memory leak in Emacs.  CVS HEAD builds I did on 20080702, 20080810, 20080915, and 20081025 all steadily grow in process VmRSS and VmSize to the tune of ~70-100MB per day when I use Gnus and let it automatically check for new mail.  This quickly makes Emacs sluggish for all operations, presumably because heap-walking slows way down (for GC? for consing?  I don't know).  Stefan Monnier's memory-usage.el shows growth across the board over time, but none of its numbers come close to the RSS.  For example, right now my Emacs is at 75MB of RSS (and 175MB total size) but memory-usage says:

Garbage collection stats:
((749993 . 70859) (52116 . 152) (7501 . 2362) 3678192 1020951 (676 . 756) (19003 . 1331) (136481 . 18922))

 =>    6566816 bytes in cons cells
    1254432 bytes in symbols
    197260 bytes in markers
    3678192 bytes of string chars
    4083804 bytes of vector slots
    17184 bytes in floats
    569352 bytes in intervals

Total bytes in lisp objects (not counting string and vector headers): 13304187

Buffer ralloc memory usage:
67 buffers
949032 bytes total (240671 in gaps)

(for a total of ~22.8MB).
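For what it's worth, the per-type byte figures above are just the raw garbage-collect counts scaled by per-object sizes; the sizes below are my guess for a 32-bit build (8-byte conses, 24-byte symbols, 20-byte markers, 4-byte vector slots, 12-byte floats, 28-byte intervals), and they do reproduce the printed numbers exactly.  A throwaway sketch of the arithmetic:

```python
# Sketch: recompute memory-usage.el's per-type byte figures from the raw
# `garbage-collect' stats above.  Per-object sizes are my assumption for
# a 32-bit build; they reproduce the printed figures.
GC_STATS = [(749993, 70859),   # conses    (used . free)
            (52116, 152),      # symbols
            (7501, 2362),      # markers
            3678192,           # string chars (already bytes)
            1020951,           # vector slots (4 bytes each)
            (676, 756),        # floats
            (19003, 1331),     # intervals
            (136481, 18922)]   # strings (headers not counted here)

SIZES = {"conses": 8, "symbols": 24, "markers": 20,
         "floats": 12, "intervals": 28}

def gc_bytes(stats):
    total = lambda pair: pair[0] + pair[1]   # used + free cells
    return {
        "conses":       total(stats[0]) * SIZES["conses"],
        "symbols":      total(stats[1]) * SIZES["symbols"],
        "markers":      total(stats[2]) * SIZES["markers"],
        "string chars": stats[3],
        "vector slots": stats[4] * 4,
        "floats":       total(stats[5]) * SIZES["floats"],
        "intervals":    total(stats[6]) * SIZES["intervals"],
    }
```

Either way, even counting free cells these figures don't come anywhere near the 75MB of RSS.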

I've tried using valgrind and tcmalloc, but neither works with a dumped emacs binary, and I couldn't get enough work done under a temacs binary to extract useful information.
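For concreteness, the VmRSS/VmSize numbers above come from polling /proc; this is the sort of throwaway logging I mean (a quick Linux-only sketch of my own, nothing Emacs-specific):

```python
# Rough sketch: log VmRSS/VmSize growth for a pid by polling /proc (Linux only).
import re
import time

def vm_stats(status_text):
    """Pull VmRSS and VmSize (in kB) out of /proc/<pid>/status contents."""
    stats = {}
    for key in ("VmRSS", "VmSize"):
        m = re.search(r"^%s:\s+(\d+)\s+kB" % key, status_text, re.MULTILINE)
        if m:
            stats[key] = int(m.group(1))
    return stats

def watch(pid, interval=60):
    """Print a timestamped VmRSS/VmSize sample every `interval` seconds."""
    while True:
        with open("/proc/%d/status" % pid) as f:
            s = vm_stats(f.read())
        print(time.strftime("%H:%M:%S"), s.get("VmRSS"), s.get("VmSize"))
        time.sleep(interval)
```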

So, has anyone else seen memory leaks in the last 6 months?  Anyone have any clever tips on how to snapshot the lisp-space heap at different points in time and then extract the delta, preferably summarized by allocation stacktrace?
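To be concrete about "extracting the delta": the crude version I can do myself is just a per-type diff of two garbage-collect snapshots taken some time apart (a trivial sketch below); what I'm really after is the same delta keyed by allocation stacktrace rather than by object type.

```python
# Sketch: per-type growth between two `garbage-collect' snapshots.
# This is the crude per-type version; ideally the delta would be keyed
# by allocation stacktrace instead.
TYPES = ["conses", "symbols", "markers", "string chars",
         "vector slots", "floats", "intervals", "strings"]

def used_counts(stats):
    # Collapse (used . free) pairs to used counts; bare counts pass through.
    return [s[0] if isinstance(s, tuple) else s for s in stats]

def delta(before, after):
    """Per-type growth in object counts between two snapshots."""
    return {t: b - a
            for t, a, b in zip(TYPES, used_counts(before), used_counts(after))}
```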

-a