On 2014-06-17 at 04:12, Rusi wrote:
> On Tuesday, June 17, 2014 7:12:11 AM UTC+5:30, Garreau, Alexandre wrote:
>> On 2014-06-14 at 16:51, Yuri Khan wrote:
>>> Yes, LaTeX does a lot to produce a beautifully typeset printout from
>>> an ASCII source. This is not enough; I want that same beautiful
>>> typesetting on screen, in browser, in any page width I happen to have,
>>> in my favorite typeface and font size, without having to recompile the
>>> document. And at the same time, it does too much. It has to maintain,
>>> and document authors have to utilize, a multitude of workarounds that
>>> are caused by TeX not using Unicode internally.
>
>> Having something technically and typographically good like LaTeX,
>> semantic and interpreted like HTML, and language-neutral like
>> markdown/any-binary-interpreted-format would be great.
>
> Yes, it's important that we start moving to XeTeX (LuaTeX), where I can
> directly write α etc. rather than \alpha.

I know XeTeX, but I wasn't thinking of it… And yet LaTeX is not fully
language-neutral because of its command names (\emph, \textbf, \title,
\section, etc.), it isn't interpreted, and it isn't really semantic
(since it is only meant to be compiled into a graphical rendering).

> ¹ Dare I say “universal”? As math is the only language approaching
> universality known to humanity.

Well, nothing is really universal (everything needs shared knowledge,
and therefore a culture). Even math, when it isn't based on the Latin or
Greek languages, stays based on Western/Arabic/Indo-European culture and
symbols. But we can artificially make universal things, just as we more
or less did with Lojban, or TCP/IP, etc. So what we can do is invent new
pieces of culture based on the most universal things we can, while
avoiding linguistic/geographic/gender/class cultural biases.
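
For the record, the XeTeX/LuaTeX point above already works today: with
fontspec and unicode-math you can type the symbols literally instead of
their macro names. A minimal sketch (the document and package choices
are mine, just to illustrate, not something from this thread):

  % Compile with xelatex or lualatex, not pdflatex.
  \documentclass{article}
  \usepackage{fontspec}      % Unicode text fonts
  \usepackage{unicode-math}  % lets α, π, ≤, … appear literally in math mode
  \begin{document}
  Euler's identity: $e^{iπ} + 1 = 0$, and the angle α satisfies $\sin α ≤ 1$.
  \end{document}

It doesn't fix the command-name problem, of course, but at least the
source text itself is no longer restricted to ASCII.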