On 09.11.23 at 06:52, Eli Zaretskii wrote:
>> Date: Wed, 8 Nov 2023 21:29:46 +0100
>> Cc: philipk@posteo.net, visuweshm@gmail.com, emacs-devel@gnu.org
>> From: Harald Judt
>>
>>> It is IMO okay to fail when sha256sum is not available and a file is
>>> larger than the available VM, so Emacs runs out of memory, if this is
>>> the situation that worries you. But these cases should be relatively
>>> rare, and so there's no reason to fail to have this feature in the
>>> much more frequent case that the files are not as large as the
>>> available VM.
>>>
>>> If the file can be read by Emacs, even if it's large, then killing the
>>> buffer after computing the hash should not have any adverse effects on
>>> memory usage of that Emacs session.
>>
>> I have started to implement the fallback to internal functions, here are my
>> results - it does even have size-limiting to avoid getting Emacs killed, which
>> I managed to do trying with a big 4 GiB ISO file:
>>
>> https://codeberg.org/hjudt/dired-duplicates/compare/main...fallback-to-internal-checksumming
>>
>> Eli, is that how you imagined it? I would be glad if someone could give it a
>> quick review.
>
> Yes, that was the idea I had, thanks.
>
> The size limitation should have its default value dependent on whether
> the build is a 32-bit (which we still support) or 64-bit. You can
> look at how we compute treesit-max-buffer-size, to figure out how to
> express the conditions for the default value.

Yes, but I wonder: why do this? There are 32-bit as well as 64-bit systems with only 2 GiB of RAM, and both might fail when trying to open a file of, e.g., 1536 MiB. Likewise, there are systems of both types with 8 GiB of RAM that can open such files without any problems. Maybe it would be possible to make the limit dependent on the amount of RAM available on the system?

Harald

--
`Experience is the best teacher.'
PGP Key ID: 4FFFAB21B8580ABD
Fingerprint: E073 6DD8 FF40 9CF2 0665 11D4 4FFF AB21 B858 0ABD
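
For illustration, here is a minimal Emacs Lisp sketch of the bitness-dependent
default Eli suggests, in the spirit of how treesit-max-buffer-size chooses its
value. The variable name and the concrete limits are hypothetical, not code
from dired-duplicates:

(defvar my-checksum-max-file-size
  (let ((mb (* 1024 1024)))
    ;; `most-positive-fixnum' is far smaller on a 32-bit build without
    ;; wide ints than on a 64-bit build, so comparing it against 2 GiB
    ;; is a cheap way to detect the bitness of the build.
    (if (< most-positive-fixnum (* 2048 mb))
        (* 256 mb)    ; illustrative conservative limit for 32-bit builds
      (* 1024 mb)))   ; illustrative roomier limit for 64-bit builds
  "Largest file, in bytes, to checksum with Emacs's internal functions.")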
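
And a sketch of the RAM-dependent alternative raised at the end of the
message, assuming memory-info is available (it may be missing or return nil
on platforms where it is unsupported, hence the fallback); the helper name
and the half-of-free-RAM heuristic are likewise only illustrative:

(defun my-checksum-max-file-size-from-ram ()
  "Return a file size limit in bytes derived from currently free RAM."
  (let* ((info (and (fboundp 'memory-info) (memory-info)))
         (free-kb (cadr info)))    ; memory-info reports 1024-byte units
    (if free-kb
        ;; Allow checksumming files up to half of the free RAM.
        (/ (* free-kb 1024) 2)
      ;; No memory information available: fall back to a fixed limit.
      (* 256 1024 1024))))

Either value could then stand in for a fixed default in the size check that
runs before visiting the file.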