Originally Posted by twobob
...if someone fuelled me with some decent usable (as in "I can gorram understand it enough to use it") Eink dithering code: I might be dangerous. ...
In that case, I do not see how you will understand my new code that does 32-bit pipelined dithering (4 pixels in parallel), with no dither table. It is unfinished, but my proof-of-concept code shows that it works. I hand-simulated it all on paper first, with truth tables and stuff...
I will TRY to squeeze in a few comments or something, to save twobob
a few spare braincells.
The problem is, a "simple" multi-term logical expression takes half a page of comments to describe what is going on down in the bits, especially with hidden propagation of carries and "do not care" bits. It takes longer to explain it than to just DO it...
EDIT: Although I use Karnaugh mapping and DeMorgan's Theorem to simplify the logical expressions so that they RUN faster, that refactoring removes them further from their first reality-based implementation and tends to "obfuscate" their meaning. Speed and space optimizations are not always intuitive or easy to describe.