Thread: Who Owns Ideas?
Old 09-22-2008, 07:27 PM   #50
axel77 has learned how to read e-books
Posts: 584
Karma: 914
Join Date: Mar 2008
Device: iliad
Originally Posted by DaleDe View Post
The brain uses associative content addressable look up which a computer has never been able to duplicate. A brain has no ALU at all. Mathematics in the brain is done with table look up. Remember all the tables you had to learn as a child. The computer depends heavily on the ALU (arithmetic logic unit) portion for all decision making. While externally a computer seems to be doing things that a human does the process is totally different.

I hope we don't stray too far off topic here, but I think this shortcoming is not inherent to the "electronic" way of processing, as compared to the hybrid biological/chemical/electrical way the brain works.

It's my critique of computer science and the computing industry: at the core there has been no development for decades now. Von Neumann described the Von Neumann machine 62 years ago, and today's computers are still Von Neumann machines, with no changes to the core architecture. They got faster, a lot faster; they got more random-access memory (itself a Von Neumann concept); they were optimized with pipelining, out-of-order execution, caching, branch prediction, or whatever all the extra optimization tricks of modern CPUs are called, but the basic idea remained very much the same.

Now it seems we are hitting a physical barrier (who knows whether it can be crossed one day, but currently somewhere between 3 and 4 GHz looks like a pretty hard limit for sequential computing). And what do industry and science come up with? Not inherently different ways of computing things in parallel with electronics. Instead we put 2 Von Neumann machines onto one chip. Then 4, now 8.

IMHO it's time to start looking into alternative concepts of computing that are totally different from what we are used to computers being and how they work. Can you, for example, imagine a computer that has no main memory at all, where all of its internal memory is small calculating logic as well? A computer where a program is not traversed by one or several linear processes going down step by step, but where parts of the "active memory" get activated as needed to fulfil a given task? It's certainly hard to imagine, and even if we started on this sort of computing now, it would surely take considerable time to build computers that way that could assist us equally well with the tasks we want computers to do for us today. For one thing, we would also have to throw away all the modern OSes (Windows, Linux, Mac), as they too are fundamentally built on the Von Neumann electronic computing design....
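To make the point concrete, here is a minimal sketch (purely illustrative, with a made-up instruction set) of the stored-program model being criticized: code and data sit in one shared memory, and a single program counter steps through instructions one at a time.

```python
# Toy Von Neumann machine: program and data share one memory list,
# and a single control loop fetches and executes instructions linearly.

def run(memory):
    """Execute a tiny stored program held in `memory`."""
    pc = 0          # program counter: the one linear thread of control
    acc = 0         # accumulator register
    while True:
        op, arg = memory[pc]        # fetch the next instruction
        pc += 1
        if op == "LOAD":            # decode + execute, one step at a time
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Instructions and data live side by side in the same memory --
# the defining trait of the architecture discussed above.
memory = [
    ("LOAD", 5),    # 0: acc = memory[5]
    ("ADD", 6),     # 1: acc += memory[6]
    ("STORE", 7),   # 2: memory[7] = acc
    ("HALT", 0),    # 3: stop
    ("HALT", 0),    # 4: padding
    2,              # 5: data
    3,              # 6: data
    0,              # 7: result slot
]
result = run(memory)
# result[7] == 5
```

Every speed-up mentioned above (caches, pipelines, out-of-order execution) only hides the latency of that single fetch/execute loop; the proposed alternative would dissolve the loop itself.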

However, the argument that started this computer debate was:

Nothing comes out of a human that was not, at some point, put in.

And I do stand by that. I don't want to deny him free will, and I'm not saying he cannot produce insights of a higher magnitude than the input he was given, since people can "sort" things and "imagine" new stuff. But a guy coming up with an idea without a serious array of inputs pointing the way toward it? That idea is not coming out. It's just as unlikely as someone suddenly starting to speak Chinese who never heard it, read it, or learned it.

Last edited by axel77; 09-22-2008 at 07:33 PM.