"Basil L. Contovounesios" writes: > Marcelo Muñoz writes: > >> Try to apply json-pretty-print to follow json: >> >> {"t": 1, "key":2} >> >> fail with the message: json-pretty-print: Bad JSON object key: t > > Here are some simpler repros: > > (json-encode '((nil . 0))) > (json-encode '((t . 0))) > (json-encode-key nil) > (json-encode-key t) > > All of these fail with json-key-format since at least as far back as > Emacs 24.5. [...] > See also https://debbugs.gnu.org/24252#26 for some precedent in > rewriting json-encode-key without relying on json-encode. > > I'm AFK until start of August, but I'll try to have a better look at > this when I get the chance if no-one beats me to it. Sorry, finally got around to this. The attached patch should fix this issue while also speeding up encoding in a backward compatible way. Using the same benchmark as https://bugs.gnu.org/40693#89 I get the following: encode canada.json old (1.409496598 96 0.7710352720000002) (1.406660968 96 0.7707586369999997) (1.406515696 96 0.7698804519999998) (1.4098724120000001 96 0.7712946) new (1.452364951 96 0.7682001569999999) (1.451790854 96 0.7712237389999999) (1.452158289 96 0.7710199420000006) (1.4520665160000001 96 0.7707500029999999) This shows that the two extra cases of funcall+cond in json-encode slightly slow down encoding of large numbers of numbers, but I doubt it's significant enough. If it is, we can probably just tweak the dispatch order in json-encode. 
encode citm_catalog.json
old
(2.7812737399999996 272 2.1942181940000003)
(2.77954628 272 2.1904517840000004)
(2.779567506 272 2.1901039010000005)
(2.778913438 272 2.189370834)
new
(0.7056556740000001 68 0.55314481)
(0.704577043 68 0.5515927839999994)
(0.702683784 68 0.5491281600000004)
(0.703850623 68 0.5503691039999996)

encode twitter.json
old
(1.427292653 148 1.1098771399999983)
(1.428440774 148 1.109535473000001)
(1.4265714 148 1.1097104909999977)
(1.426152699 148 1.110347719)
new
(0.365952034 40 0.29652698499999985)
(0.366947621 40 0.29772050399999905)
(0.36731820000000004 40 0.29776995099999937)
(0.366228327 40 0.29696426200000126)

These show that examples with more realistic objects are encoded far
faster.  Decoding performance is not affected.

This change fixes some errors in json.el and brings it a tiny bit
closer to json.c in its handling of confusable keys, but for backward
compatibility it doesn't go all the way.  For example, json-encode and
json-serialize still disagree on ((:t . 1)), (("t" . 1)), (t 1),
("t" 1), hash tables with non-string keys, etc.

WDYT?

-- 
Basil
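P.S.  To make the remaining divergence concrete, these are the kinds
of comparisons I mean (illustrative calls only; I'm deliberately not
listing the outputs, since the exact results depend on the json.el and
json.c versions in question):

  ;; Keyword vs. string keys: the two encoders normalise these
  ;; differently, so these calls need not agree on the JSON text.
  (json-encode    '((:t . 1)))
  (json-encode    '(("t" . 1)))
  (json-serialize '((:t . 1)))

  ;; The same list read as a plist vs. an alist:
  (json-encode '(t 1))
  (json-encode '("t" 1))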