Afterthought on binary and computers that think more like our brains

One of my options for a uni course is AI. I'm very excited about it, though I chose it simply because there weren't five software engineering choices, and AI would have plenty of the kind of programming I want to do (a little program running by itself, doing what I told it to :D).

I looked up AI, the human brain, and computers on Google just to see how close we are to mimicking the human brain, and read an article. This line (in my own words) stuck in my mind...

'Engineers are creating a computer that acts more like a human brain: they have built a processing chip that, rather than dealing in on and off, deals with a range of voltages.'

So is binary going to be replaced one day? Our brains use a richer system, so maybe binary isn't perfect; as small as it is, maybe 'nothing' and 'something' can only say so much.

(Isn't it almost unbelievable that RNA can build a computer more powerful than anything our collective minds have thought up, in only 1000 million years or whatever?)

Anyway, I was thinking of something that could describe billions of times more than 0 and 1: use ten voltages rather than on and off, and have the position of a voltage in a byte mean something, as well as the other bytes in relation to it. Think of a chess board, where a piece means something depending on what's happening around it on the board.
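As a rough sketch of that counting argument (just back-of-the-envelope; nothing here is real hardware): n positions with ten levels each can describe 10^n values, versus 2^n for plain on/off, so the gap grows fast. A tiny C++ demo:

#include <cmath>
#include <cstdio>

// Counting argument: a position with L distinct voltage levels can
// represent L values, so n of them together represent L^n values.
// A ten-level "digit" therefore carries log2(10), about 3.32 bits.
int main()
{
    for (int n = 1; n <= 8; ++n)
    {
        double binary  = std::pow(2.0, n);   // n two-level positions
        double decimal = std::pow(10.0, n);  // n ten-level positions
        std::printf("%d positions: binary %.0f values, ten-level %.0f values (%.1fx)\n",
                    n, binary, decimal, decimal / binary);
    }
    std::printf("bits per ten-level digit: %.2f\n", std::log2(10.0));
}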

So why is binary so convenient if you can say soooo much more with a tiny bit more?

Could this be the future for computers, or is it binary forever?
A signal that uses varying voltages is generally called an analogue signal. We've gone digital for all of our processing because it's reliable: a signal is either on or off. By introducing varying voltages, error in the hardware can cause different voltages to be interpreted as different values (you'll never have a circuit reach a perfect 0.7V or 0.8V; it might be 0.688V and 0.83V, respectively). This is detrimental to instruction processing, and it also makes it difficult to perfectly replicate data: you can't guarantee that the voltage being read for duplication is the proper voltage, and you can't guarantee that you'll generate the same voltage upon replication. While this can still be an issue with a binary system, it's nowhere near as much of an issue as it would be with an analogue signal.

When it comes to brains, they're structured to be good at handling error. This might make it reasonable to use a system that deals with analogue signals for brain simulation, since it could handle the errors and would benefit from the extra data bandwidth. However, for the rest of our computational devices, we will still be using a binary system since they aren't as readily prepared to handle error.
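To put rough numbers on NGen's point, here is a small simulation sketch. Everything in it is made up for the demo (the 0V-1V swing, the 50mV Gaussian read noise, the trial count); the idea is just that packing more levels into the same voltage range leaves less room for error:

#include <cstdio>
#include <random>

// Pack `levels` evenly spaced values into a 0-1V swing, add Gaussian
// read noise, decode to the nearest level, and count misreads.
double errorRate(int levels, double noiseSigma, int trials, std::mt19937& rng)
{
    std::normal_distribution<double> noise(0.0, noiseSigma);
    std::uniform_int_distribution<int> pick(0, levels - 1);
    double step = 1.0 / (levels - 1);   // spacing between adjacent levels
    int errors = 0;
    for (int t = 0; t < trials; ++t)
    {
        int sent = pick(rng);
        double read = sent * step + noise(rng);            // noisy voltage
        int decoded = static_cast<int>(read / step + 0.5); // nearest level
        if (decoded < 0) decoded = 0;
        if (decoded > levels - 1) decoded = levels - 1;
        if (decoded != sent) ++errors;
    }
    return static_cast<double>(errors) / trials;
}

int main()
{
    std::mt19937 rng(42);
    double sigma = 0.05;   // 50mV of noise, invented for the demo
    std::printf("2 levels : %.4f error rate\n", errorRate(2, sigma, 100000, rng));
    std::printf("10 levels: %.4f error rate\n", errorRate(10, sigma, 100000, rng));
}

With the same noise, the two-level signal is read back essentially perfectly, while the ten-level one is misread a significant fraction of the time.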
I think the error thing shouldn't be the only reason not to use the system; errors can be overcome, and some signals could have priorities too. Think: a signal at 0.1V and 0.001 amps could be a lower priority than one at 0.01 amps... granted the chip would get hot, but it's just an example.

It doesn't have to be analogue either; circuitry these days can be pretty precise, so if it's between 7.0 and 8.0 then it's a 7, just like the way phones turn your voice into a digital signal.
Plus, with the human brain being good at handling error, it should all be fine anyway.
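That "between 7.0 and 8.0 then it's a 7" rule is essentially what an analogue-to-digital converter does when a phone digitizes voice. A minimal sketch of the same idea (the sine wave and sample count are arbitrary):

#include <cmath>
#include <cstdio>

// Sample an "analogue" 0-9V waveform and bucket each reading into the
// nearest whole-volt level, the way an ADC buckets a voice signal.
int main()
{
    const double pi = 3.14159265358979;
    for (int i = 0; i < 16; ++i)
    {
        double t = i / 16.0;
        double analogue = 4.5 + 4.5 * std::sin(2.0 * pi * t); // 0..9 V swing
        int digit = static_cast<int>(analogue + 0.5);         // nearest level
        std::printf("t=%.3f  %.3f V  ->  digit %d\n", t, analogue, digit);
    }
}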
closed account (3qX21hU5)
The human brain is way more powerful than any computer will be in our lifetime. And personally, I'm afraid of the day we do have computers that are more powerful and can "think" like the human brain. It would not be good, trust me.

Anyway, from my understanding, no two pieces of hardware are the same. I might not totally understand what you guys are saying, but the way I look at it is like this. Let's take CPUs for example. They are all different: one chip could hit an overclock speed of, say, 4.7GHz, while another chip of the exact same model made by the exact same company can hit 5.1GHz.

So taking that as an example of how hardware can differ even when it's supposed to be the same, it would be extremely hard to do what is being proposed, just like NGen said:
By introducing varying voltages, error in the hardware can cause different voltages to be interpreted as different values (you'll never have a circuit reach a perfect 0.7V or 0.8V; it might be 0.688V and 0.83V, respectively).


Again, I'm not sure I fully grasp what's being discussed here, so feel free to correct me :)
@Zereo
I think what this thread is about is this: instead of using 5V to represent 1 and 1V to represent 0, you have a range of voltages, say 1V - 10V, so that you can use the decimal system instead of binary.
Digital (binary) circuitry is used because it is simple and uses gates. The simplest operation a digital circuit can perform is the logical "NOT" operation, which uses 1 NMOS and 1 PMOS transistor (see the CMOS schematic here: http://en.wikipedia.org/wiki/Inverter_(logic_gate)). It also only consumes power during the transients where the gates are switching from open to closed (very little).
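A toy model of that inverter, with the two transistors treated as ideal switches (a sketch, not device physics):

#include <cstdio>

// CMOS inverter as ideal switches: the PMOS conducts when the gate is
// low, pulling the output up to VDD; the NMOS conducts when the gate
// is high, pulling the output down to ground. Exactly one of them is
// on at a time, which is why almost no power flows except while switching.
bool cmosNot(bool gate)
{
    bool pmosConducts = !gate;     // PMOS: on when gate is low
    bool nmosConducts = gate;      // NMOS: on when gate is high
    if (pmosConducts) return true; // output tied to VDD -> logic 1
    return !nmosConducts;          // otherwise tied to GND -> logic 0
}

int main()
{
    std::printf("NOT 0 = %d\n", cmosNot(false));
    std::printf("NOT 1 = %d\n", cmosNot(true));
}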

For analogue values, it is true that accuracy is a problem (resistors can come with 5% tolerances at best and capacitors come with 20% tolerance), however I think the real barrier here is the complexity associated with analogue. A filter or gain may not be so hard to implement, but things get much more complicated much more quickly.

It's not so hard to create a phase offset on a waveform, or to amplify, saturate and measure waveforms, but to evaluate "if (a < b)" requires specific circuitry which cannot be provided by a generic analogue circuit. Meanwhile a processor CAN process something like this whilst remaining generic.
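As a sketch of what "generic" buys you: once the values are binary, a < b falls out of simple bit-by-bit logic that the same gates can compute for any operands (this mirrors how a digital comparator works in principle, not how any particular ALU is wired):

#include <cstdio>

// Compare two 8-bit values from the most significant bit down;
// the first bit where they differ decides the result.
bool lessThan(unsigned char a, unsigned char b)
{
    for (int bit = 7; bit >= 0; --bit)
    {
        bool abit = (a >> bit) & 1;
        bool bbit = (b >> bit) & 1;
        if (!abit && bbit) return true;  // a has 0 where b has 1 -> a < b
        if (abit && !bbit) return false; // a has 1 where b has 0 -> a >= b
    }
    return false;                        // all bits equal
}

int main()
{
    std::printf("3 < 7 : %d\n", lessThan(3, 7)); // prints 1
    std::printf("9 < 2 : %d\n", lessThan(9, 2)); // prints 0
}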

Another thing to consider is that a processor may have a million transistors. This is fine (they are small devices), but with analogue signals we would also need inductors, capacitors, relays, transformers, and resistors. These units are much larger. Our computers are going to be the size of city blocks again.
So if the brain exists, it's possible to make an artificial one ourselves. I wonder what route we should take to research this area?
So if the brain exists, it's possible to make an artificial one ourselves. I wonder what route we should take to research this area?

Scientists have been trying at this since the advent of computers (and even before if you widen the scope of this).
In my opinion you really can't compare a brain with a microprocessor (today's technology). Our brain is not an analog computer; it is rather a complex network of neurons that exchange information by chemical (across a synaptic gap) and electrical (through the neuron) means, and how exactly it works is a large field still to be explored. Microprocessors (binary), on the other hand, run cycle by cycle, changing states within registers and latches, inside the gates of the ALU or CU.
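The abstraction computer people usually borrow from that picture is very crude: a neuron as a weighted sum plus a threshold. A minimal sketch (the weights and threshold below are made up; real neurons are chemical, timing-dependent, and adaptive):

#include <cstdio>
#include <vector>

// Toy neuron: inputs arrive across "synapses" with different strengths
// (weights); the cell fires if the weighted sum crosses a threshold.
bool fires(const std::vector<double>& inputs,
           const std::vector<double>& weights, double threshold)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < inputs.size(); ++i)
        sum += inputs[i] * weights[i];
    return sum >= threshold;
}

int main()
{
    std::vector<double> in = {1.0, 0.0, 1.0};
    std::vector<double> w  = {0.5, 0.9, 0.4};      // invented synapse strengths
    std::printf("fires: %d\n", fires(in, w, 0.8)); // 0.5 + 0.4 = 0.9 >= 0.8 -> 1
}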

A computer doesn't think; at the lowest level it does simple things, and it only does what it is told to do.
AI like the human brain is a distant goal.
Talking about analog computers: there are some that run on base 4 or base 8, but on larger bases it gets really unreliable, as was said earlier.

-amhndu
I agree with NGen.

@stewbond,
Hmmm... microprocessors are mostly built using BJTs, not MOS, to my knowledge, due to speed and other reasons.

@devonrevenge
That is a big field to explore; there is research going on all over the world, and it is mostly a mystery.
So what do we know already? What's the difference between thinking and processing? I suppose this is more of a question for a computer scientist and a neuroscientist in one; either one without the other may lack insight. Where could I find such a person?
@Biscuit, wow, this stuff is really, really interesting. Have you read this?
I wonder if we will suss out the brain's coding language in our lifetimes :D
I'm gonna plug myself into my Raspberry Pi; Deus Ex, eat your heart out.

Seriously, I'm gonna make a neural network memory sim :D
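If you want a classic starting point for a memory sim, one option is a tiny Hopfield network: store a +1/-1 pattern in the weights with a Hebbian rule, then recall it from a corrupted copy. A minimal sketch (the pattern and the flipped bits are arbitrary):

#include <cstdio>
#include <vector>

int main()
{
    const int N = 8;
    std::vector<int> pattern = {1, -1, 1, 1, -1, -1, 1, -1};

    // Hebbian storage: w[i][j] = pattern[i] * pattern[j], no self-connections.
    std::vector<std::vector<int>> w(N, std::vector<int>(N, 0));
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            if (i != j) w[i][j] = pattern[i] * pattern[j];

    // Start recall from the stored pattern with two bits flipped.
    std::vector<int> state = pattern;
    state[0] = -state[0];
    state[5] = -state[5];

    // Recall: repeatedly set each neuron to the sign of its weighted input.
    for (int pass = 0; pass < 5; ++pass)
        for (int i = 0; i < N; ++i)
        {
            int sum = 0;
            for (int j = 0; j < N; ++j) sum += w[i][j] * state[j];
            state[i] = (sum >= 0) ? 1 : -1;
        }

    std::printf("stored  : ");
    for (int s : pattern) std::printf("%2d ", s);
    std::printf("\nrecalled: ");
    for (int s : state) std::printf("%2d ", s);
    std::printf("\n");
}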