Eli Zaretskii writes:

> I'm talking about basis for the 0.7% figure.

I used 0.7% * RAM because total RAM is the only reasonable metric here: what else could we use to avoid memory over-consumption on low-end machines?  Of course, this relies on the implicit assumption that memory usage scales linearly with gc-cons-threshold.  IMHO, that is a safe assumption: in practice, the real memory-usage growth is slower than linear.  For example, here is my Emacs loading data for different threshold values (the last column is gc-elapsed divided by gcs-done, i.e. mean time per GC):

| gc-cons-threshold | memory-limit | gcs-done | gc-elapsed (s) | gc time per GC (s) |
|-------------------+--------------+----------+----------------+--------------------|
|              1 MB |       523704 |      394 |   25.809423617 |        0.065506151 |
|              2 MB |        +9624 |      210 |    13.41456755 |        0.063878893 |
|              4 MB |        +1224 |      109 |    6.400488833 |        0.058720081 |
|              8 MB |        +3164 |       63 |    3.223383144 |        0.051164812 |
|             16 MB |        +5532 |       37 |    1.757097776 |        0.047489129 |
|             32 MB |       +20264 |       25 |    0.995694149 |        0.039827766 |
|             64 MB |       +59860 |       19 |    0.624039941 |        0.032844207 |
|            128 MB |      +115356 |       16 |     0.42626893 |        0.026641808 |
|            256 MB |      +171176 |       14 |    0.277912281 |        0.019850877 |
|            512 MB |      +332148 |       12 |    0.122461442 |        0.010205120 |

Also, see the attached graph.
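
For reference, the 0.7%-of-RAM idea could be written in an init file roughly like this (just a sketch, assuming `memory-info' is available and that its first element is total system RAM in units of 1024 bytes):

    ;; Sketch: scale gc-cons-threshold to ~0.7% of total RAM,
    ;; keeping the default on platforms without `memory-info'.
    (when (fboundp 'memory-info)
      (setq gc-cons-threshold
            (max gc-cons-threshold
                 ;; (car (memory-info)) is total RAM in KiB,
                 ;; so multiply by 1024 to get bytes.
                 (floor (* 0.007 1024 (car (memory-info)))))))

On a 16 GB machine this would give a threshold of roughly 112 MB, which by the table above already brings the GC count during loading down to the mid-teens.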