Eli Zaretskii wrote on Sat., Feb 27, 2016 at 09:21:
> > From: ohki@gssm.otsuka.tsukuba.ac.jp
> > Cc: ohki@gssm.otsuka.tsukuba.ac.jp, 22815@debbugs.gnu.org
> > Date: Sat, 27 Feb 2016 08:21:39 +0900
> >
> > Eli Zaretskii writes:
> > > > From: ohki@gssm.otsuka.tsukuba.ac.jp
> > > > Cc: ohki@gssm.otsuka.tsukuba.ac.jp, 22815@debbugs.gnu.org
> > > > Date: Fri, 26 Feb 2016 19:09:57 +0900
> > > >
> > > > > I was asking why couldn't the plug-in do the conversion, e.g., by
> > > > > using libiconv? Emacs is not the only piece of software that knows
> > > > > how to convert from one encoding to another.
> > > >
> > > > I considered using libiconv once, but Emacs has the conversion
> > > > capability, so why not use it.
> > >
> > > Because it can signal an error, if the encoding you pass is not a
> > > valid coding-system that Emacs recognizes?
> >
> > Yes, it does!
> > In the course of developing my plugin,
> > I encountered an `Invalid coding system' message, and Emacs kept working
> > (no crash, no hangup).
>
> It's all too easy to get that, since Emacs coding-systems have names
> that are rarely used elsewhere. And using libiconv is easy enough.
>
> So I'm uneasy about this. What do others think?

I agree; this adds complexity without significant advantages. I'd recommend adding a wrapper for make-unibyte-string instead; then users can choose to use Emacs functions for decoding and encoding strings.