Wow, such overreaction.
LaTeX already knows the hyphenation of most words, as has already been not only stated but demonstrated. The fact that a million words exist, the vast majority of which are almost never used, is completely irrelevant. A script could list the words in a given book that are exceptions to what the system already knows, and flag the ones known to be ambiguous (though grammar-check-like software could handle most such cases); handing that list to the book designer at book-creation time to mark just those words likely wouldn't take more than 10 minutes per book.
If the trade-off is between no hyphenation anywhere (or a non-reflowable format) and the occasional wrong pres-ent vs. pre-sent, I'd certainly put up with the latter.
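Just to sketch what that marking step might look like (the word list here is purely illustrative): the unambiguous exceptions can simply be declared once in the preamble,

\hyphenation{ana-lyses data-base manu-script}

while the handful of genuinely ambiguous homographs, like pres\-ent (the noun) vs. pre\-sent (the verb), get a discretionary hyphen (\-) typed inline at whichever break the designer decides is right for that occurrence. That's the whole "10 minutes per book" job.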
There's no reason in principle why a computer can't identify stacks and rivers and try to do something about them. LaTeX can already be programmed to completely avoid widows and orphans, though on a sufficiently small page doing so would be a bad idea.
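For instance, putting

\widowpenalty=10000
\clubpenalty=10000

in the preamble tells TeX to treat a stranded single line at the top or bottom of a page as unacceptable, so it reworks the page breaks instead. That's exactly the "completely avoid" behavior I mean, and also why it's a bad idea on a tiny page: with only a dozen lines per screen there's almost no slack left for TeX to shuffle.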
Maybe a perfect algorithm isn't possible and a computer can't do these things perfectly (though I'm not sure they're humanly perfectible either), but let me just say this...
Using the failure of perfectibility as an excuse for pooh-poohing the push for software that does these things much better than what we currently have is, well... incredibly silly.