On Sat, May 4, 2013 at 4:58 PM, Óscar Fuentes <ofv@wanadoo.es> wrote:
> This change introduces a serious slowdown which is noticeable for large
> candidate lists (try with 10,000 elements). The slowdown happens on
> every invocation.
>
> It is obvious that having duplicate candidates makes no sense, but at
> the same time scanning the list in advance for all duplicates is
> expensive.

The only way to introduce a list with dupes is ido-completing-read (i.e. it's not an issue for files and buffers), so I think it's okay to remove dupes just once on entry.
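In other words, the fix is an order-preserving de-duplication done a single time when completion starts, instead of on every invocation. A minimal sketch of that idea, written in Python for illustration (the actual patch is Elisp, and `remove_dups` is just a hypothetical name):

```python
def remove_dups(candidates):
    """Return candidates with duplicates dropped, keeping first occurrences.

    A hash set makes this a single O(n) pass, so running it once on
    entry stays cheap even for a 10,000-element candidate list, unlike
    re-scanning the list on every invocation.
    """
    seen = set()
    result = []
    for c in candidates:
        if c not in seen:
            seen.add(c)
            result.append(c)
    return result
```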

Patch attached.


--
Le