Quote:
Originally Posted by ardeegee
sites that publish pirated copies of ebooks (or anything else) are self-policing
ROFLMAO. Self-policing doesn't work for anything, and I doubt that anybody with so few ethics that they'd knowingly distribute pirated copies of otherwise easily obtainable books has enough ethics to bother with policing a web site.
Your point about txt files is valid; I should have mentioned that viruses are as apt to come from simply visiting a malicious web site as from downloading something from it. (You *are* aware that you can't visit a web site without downloading it, right?)
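To make that concrete, here's a toy sketch (Python; the URL is just a placeholder, not any site from this thread) of what 'visiting' a page actually is: every byte of it gets pulled down onto your machine before anything gets rendered.

# "Visiting" a page = downloading it. The bytes land on your machine
# first, and only then does your browser interpret them.
from urllib.request import urlopen

with urlopen("https://example.com/") as resp:    # placeholder URL
    body = resp.read()                           # the whole page, locally
print(len(body), "bytes downloaded just by 'viewing' the page")

The browser does exactly the same thing, it just renders the bytes afterwards instead of printing a byte count.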
HTML files can contain malware. It's happened in the past, just as it's happened with PDF files. (AFAIK malicious HTML has never carried an actual virus, but that's more a matter of lack of ingenuity than of incapability. Maybe lack of intent, too, as the only HTML exploits I recall were technical demos showing that it could be done.)
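Just to show that HTML isn't inert text the way a txt file is, here's a rough Python sketch (the file name is hypothetical) that flags the obvious active-content tags in a downloaded HTML ebook. It wouldn't catch a real exploit, which is rather the point: the browser executes whatever is in there, obfuscated or not.

# Crude check for active content in an HTML file. HTML can carry script,
# iframes, plugins, etc. that the browser will happily run; a real exploit
# would be obfuscated well beyond anything this catches.
import re

ACTIVE = re.compile(r"<\s*(script|iframe|object|embed)\b", re.IGNORECASE)

with open("downloaded_book.html", encoding="utf-8", errors="replace") as f:
    hits = ACTIVE.findall(f.read())

if hits:
    print("active content found:", sorted({t.lower() for t in hits}))
else:
    print("no obvious active content (which proves very little)")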
Of course the holes, in every case except when directly executable files are downloaded, are in the readers, i.e. the browser for HTML files or the PDF reader for PDFs. Adobe doesn't control all the PDF readers and, IMO, doesn't do all that great a job of patching the holes in the ones it does control. They certainly don't seem to be as motivated to find the holes as the hackers are, so they are definitely *not* proactive about patching them, which means there is always an opportunity for somebody to slip something through.
Most browser makers seem to be much more proactive about finding & patching holes, but the content they deal with is so much less controlled that there are inevitably a lot more holes in them. Theoretically it's possible that the 'sandbox' approach will protect the user, but I remain doubtful. IMO there are probably some holes in how the sandbox is implemented in some browsers; they just haven't been found yet.
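For what it's worth, the sandbox idea boils down to something like this rough sketch (Python, Linux-only; 'render_untrusted.py' is a hypothetical renderer script, not anything from a real browser): run the untrusted rendering step in its own process with hard limits, so that when (not if) a hole is found, the damage stays contained. Real browser sandboxes use far stronger mechanisms than this, and the holes I'm doubtful about are in exactly that plumbing.

# Toy version of the sandbox idea: isolate the untrusted work in its own
# process and clamp its resources. A bug in the renderer then can't eat
# the whole machine -- assuming the sandbox itself has no holes, which is
# the assumption I doubt.
import resource
import subprocess

def clamp():
    resource.setrlimit(resource.RLIMIT_CPU, (5, 5))               # 5 s of CPU
    resource.setrlimit(resource.RLIMIT_AS, (256 * 2**20,) * 2)    # 256 MB RAM
    resource.setrlimit(resource.RLIMIT_NPROC, (0, 0))             # no forking

subprocess.run(["python3", "render_untrusted.py", "page.html"],
               preexec_fn=clamp, timeout=10)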
A P2P darknet might be safer, since you don't need to visit a 'website' at all once you have the client set up, but in that case self-policing is even more of a joke.