The computer chip

I have tried to Google how a computer chip actually works, but I never really found anything. I smashed open a computer chip once, and all that was in there were copper circuits. How does a computer chip actually work? It's a little square with pins sticking out of it, hooked into a circuit board. Let's say this computer chip calculates something like, I dunno, how long a missile is going to take to hit its target, then triggers the warhead on impact.

I just don't see how it is possible that computer chips work, and a full-blown CPU like an Intel processor is just... I have no words for how insane that is.

From my understanding, somehow the electricity running through these chips passes through a bunch of AND/OR/XOR gates that open and close based on the signal, resulting in a 1 or a 0. I just don't see how it is possible to make a computer chip do a certain function. Like, in basic electronics, you need a computer chip to make an LED turn on or off. But a computer chip is just electricity running through the metal inside of it. How does the computer chip control that electricity?

Even though computer chips are a reality and they do what they do, I still don't believe it. And I am of the firm belief that no man created that thing; it must have been aliens. The CPU is impossible. It is just not possible to build, yet it exists.

Can someone explain to me how a computer chip works like I am a 5 year old?

I found this: https://www.quora.com/How-does-a-computer-chip-work

But I still don't see how the chip "knows" any of it. I don't see how that little chip (not a full-blown CPU) knows what a gate is. Like I said, it is just electricity running through metal; how does the chip "know" anything, except no voltage = 0 and voltage = 1? Even then, how does the chip even know how to pump out a 1 or a 0 based on the voltage?

See it's impossible, damn Aliens.
closed account (z05DSL3A)
Can someone explain to me how a computer chip works like I am a 5 year old?
The chip is filled with magic smoke, if you let the smoke out the chip stops working.
Haha, yeah, that actually is true: if you see a computer chip billowing smoke, it's probably fried.

But I agree, computer chips are magic. It is not metal and electricity encased in that really weird hard plastic; it's magical fairy smoke. Believe me, when I tried to smash a computer chip open with a ball-peen hammer, that little guy wouldn't budge. I got a small glimpse of what was inside. Computer chips are tough little suckers.

Aliens!

What is interesting, though, assuming man really did invent the computer chip: how did that even start? They somehow had the idea in the first place, somehow put little circuits into a small plastic casing, then somehow hooked it into a circuit board and somehow tested its functionality. With a monitor? They ran electricity through this thing and by some miracle a 1 or a 0 printed on the screen. Most likely you'd run electricity through some metal and nothing would happen. Then what?

I guess I have half a brain, because these guys were like, yeah, do this and that, and BOOM, we have 3D video games. Easy peasy.

Either way, nobody can deny that computer chips, central processing units, and memory are bizarre. Not many people on this planet know exactly how they work, or how to build them, especially military-grade stuff. I am sure Nvidia had to fly out to Pluto to recruit whatever things built their GPUs.
Like, in basic electronics, you need a computer chip to make an LED turn on or off.

Well, a single transistor is sufficient to turn an LED on or off. A computer chip contains several transistors. Well, more like several billion nowadays.
It's a curious topic for someone who says they have a degree in CS.

http://www.cplusplus.com/forum/beginner/183869/


And this one from someone who knows C#.NET

http://www.cplusplus.com/forum/general/181972/


Things aren't adding up, were you stoned or something?
OP, I think I see where your confusion comes from. Circuitry doesn't actually "know" anything about the function it's fulfilling. We can understand the output of computers simply because we have designed our output devices to represent that output in a manner our senses can understand.
How the computer can perform seemingly logical operations without understanding logic is simple, really.

Imagine you have a group of 10 people and a list of yes/no questions, and you want to know how those people answer as a group, following some arbitrary criterion you've defined. So this is the setup: you read out every question in your list, and each person presses one of the two buttons they have in front of them. If the group as a whole answers "yes", a light turns on; otherwise it stays off.
Suppose that your criterion is "the group says 'yes' only if everyone says 'yes'". Then you could connect up the buttons as switches in a serial circuit. Current will only flow through when all the switches are closed.
If your criterion was instead that "the group says 'yes' when anyone says 'yes'", you could connect the switches in parallel.
The circuit doesn't understand anything about what it's doing. It's performing Boolean algebra simply because of how you've built it and how electricity works. It could also be possible without semiconductors, given buttons that could simultaneously change the state of a large number of switches, to construct a circuit that would turn the light on if a majority of the people said "yes". It would have to be a really complex circuit, though.
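The series/parallel wiring above can be sketched in a few lines of C++. This is a toy model of the thought experiment, not real circuit simulation, and the function names are made up for illustration:

```cpp
#include <vector>

// Toy model: each bool is one person's button (true = pressed/closed).

// Switches in series: current flows only if ALL are closed -> AND.
bool series(const std::vector<bool>& switches) {
    for (bool closed : switches)
        if (!closed) return false;
    return true;
}

// Switches in parallel: current flows if ANY one is closed -> OR.
bool parallel(const std::vector<bool>& switches) {
    for (bool closed : switches)
        if (closed) return true;
    return false;
}

// The "majority" light: turns on when more than half say yes.
bool majority(const std::vector<bool>& switches) {
    int yes = 0;
    for (bool closed : switches)
        if (closed) ++yes;
    return yes * 2 > static_cast<int>(switches.size());
}
```

Note that none of these functions "understand" logic any more than the wires do; AND and OR fall out of the topology (all-in-a-row vs. side-by-side), which is exactly the point being made above.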

Semiconductor gates work basically the same, only instead of being built out of fingers that push buttons, you have special materials whose conductivity from terminal A to B changes depending on the current that's applied to terminal C.

They somehow had the idea in the first place, somehow put little circuits into a small plastic casing, then somehow hooked it into a circuit board and somehow tested its functionality. With a monitor? They ran electricity through this thing and by some miracle a 1 or a 0 printed on the screen.
The very earliest transistors (and vacuum tubes before that) were probably tested using oscilloscopes or even just simple light bulbs.

It is not metal and electricity encased in that really weird hard plastic
Silicon is technically a metalloid.
OP wrote:
Except no voltage = 0 and voltage = 1; even then, how does the chip even know how to pump out a 1 or a 0 based on the voltage?


This is your problem, OP. The "chip" doesn't pump out a '1' or a '0'. It turns ON or OFF depending on input. We interpret that as a 1 or 0 because it's easier to read and write that way.

OP wrote:
From my understanding, somehow the electricity running through these chips passes through a bunch of AND/OR/XOR gates that open and close based on the signal, resulting in a 1 or a 0. I just don't see how it is possible to make a computer chip do a certain function.

A lot of articles out there oversimplify this stuff to the point of being wrong, so it's easy to see where you might get confused. Doped silicon and germanium work very similarly to the way vacuum tubes do. Take a transistor acting as a switch, for example: you have a voltage difference between the collector and the emitter, but that difference can't balance out because the two poles are separated by an intervening material attached to the base. Since no current is flowing and the circuit is homeostatic (except for leakage current), the circuit is said to be "OFF", or '0'. But by applying a certain amount of voltage to the base, called the trigger voltage, the circuit is closed and said to be "ON", or at a '1'. From there it's just a matter of putting a component inline with that circuit to have it do something.

The most confusing thing about transistors is that when people explain them, they tend to explain the amplification properties at the same time because of all of the overlap. But if you keep in mind that the two use cases can be different, then you'll be in a better position to make the connections (pun intended).
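That switch behavior can be sketched as a toy model, treating a transistor as nothing more than "collector conducts to emitter when the base voltage reaches the trigger voltage". The voltage numbers are illustrative, not from any datasheet:

```cpp
// Toy transistor switch: conducts collector->emitter only when the
// base voltage is at or above an (illustrative) trigger voltage.
const double TRIGGER_VOLTS = 0.7;

bool conducts(double baseVolts) {
    return baseVolts >= TRIGGER_VOLTS;
}

// An inverter (NOT gate) built from one such switch: when the
// transistor conducts, it pulls the output low; when it doesn't,
// the output stays high.
bool inverter(bool in) {
    double baseVolts = in ? 5.0 : 0.0;  // logic 1 = 5V, logic 0 = 0V
    return !conducts(baseVolts);
}
```

The inverter is the simplest case of the point above: one switch plus wiring gives you a logic gate, with no amplification story needed.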


EDIT: P.S. What the hell is wrong with being a SysAdmin?
For a long time, math theory far outpaced our ability to actually do the computations. 100 or 200 years ago a "computer" wasn't a machine, it was a job description. There were people whose job it was to do math calculations.

The need for fast math led to all sorts of cool devices for computation. Perhaps the most famous is the slide rule, which every high school and college student had before about 1974. There were also mechanical adding machines. These could also multiply. Amazing!

Now, we're all familiar with electrical switches like a light switch on a wall. Take one step further and create a switch that is itself controlled by electricity: if voltage is present, the switch is on, and if voltage is absent, the switch is off. Put a bunch of clever people in a room for a few days and they will put together a circuit that can add numbers using these switches. You input the numbers in binary, using toggle switches to energize or ground a wire. Energized = voltage = 1. Grounded = no voltage = 0. Push these through your circuit and presto! The outputs will be the sum of the numbers. How cool is that?
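What that roomful of clever people comes up with is, in essence, a chain of "full adders". Here's a sketch in C++, with each logical operator standing in for a cluster of switches (a toy model of the classic construction, not how any particular chip lays it out):

```cpp
// One full adder: adds bits a and b plus a carry-in, producing a
// sum bit and a carry-out, using only AND/OR/XOR behavior.
void fullAdder(bool a, bool b, bool carryIn, bool& sum, bool& carryOut) {
    sum = a ^ b ^ carryIn;
    carryOut = (a & b) | (carryIn & (a ^ b));
}

// Chain 8 of them together and you can add two bytes: each adder's
// carry-out feeds the next adder's carry-in, just like carrying
// digits when adding by hand.
unsigned addBytes(unsigned a, unsigned b) {
    bool carry = false;
    unsigned result = 0;
    for (int i = 0; i < 8; ++i) {
        bool bitA = (a >> i) & 1;
        bool bitB = (b >> i) & 1;
        bool sum;
        fullAdder(bitA, bitB, carry, sum, carry);
        if (sum) result |= 1u << i;
    }
    return result;
}
```

Nothing in that chain "knows" it is adding; the carry just ripples from switch cluster to switch cluster, and the result wraps around (255 + 1 = 0) because the final carry-out has nowhere to go.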

Next, someone comes along with an all-electronic switch: it has no moving parts! And it can switch in just a fraction of a second! I'm referring to a vacuum tube. Make your circuit out of vacuum tubes and it runs much faster.

By this time, businesses and the military are going spastic, because quick calculations provide a gigantic advantage. They throw money at the problem and quickly we have things like the ENIAC computer. Because each vacuum tube is about the size of a Red Bull can, the computers are big. They also draw huge amounts of power, and those tubes burn out occasionally. If you have a thousand tubes and each burns out on average once a year, then you'll lose about three per day. This stuff is a pain!

Next comes the transistor. This is based on weird materials called semiconductors. A semiconductor will conduct electricity if you apply a charge across it, but not otherwise. Once you have this material, it's easy to create a semiconducting switch: a transistor! The transistor is way, WAY more reliable than a vacuum tube. It draws less power and it's a lot smaller. Now you can make computers with transistors. Very quickly, companies started packaging groups of transistors into standard circuits in standard packaging: AND and OR gates, multiplexers, etc. You could put these "discrete components" on a board and connect their inputs and outputs together to create more and more complex devices. Perhaps the most extreme example is the CRAY-1 supercomputer, built almost entirely out of extremely fast NAND gates.
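The reason a machine like the CRAY-1 could be built from essentially one gate type is that NAND is "universal": every other gate can be wired up from NANDs alone. A quick sketch (illustrative constructions, nothing to do with the CRAY's actual circuits):

```cpp
// NAND is the one primitive; everything below is built only from it.
bool nand_(bool a, bool b) { return !(a && b); }

bool not_(bool a)         { return nand_(a, a); }
bool and_(bool a, bool b) { return not_(nand_(a, b)); }
bool or_(bool a, bool b)  { return nand_(not_(a), not_(b)); }

// XOR from four NANDs: the classic textbook construction.
bool xor_(bool a, bool b) {
    bool n = nand_(a, b);
    return nand_(nand_(a, n), nand_(b, n));
}
```

From a manufacturing standpoint this is a huge win: stock one fast, well-characterized part in bulk and wire it into whatever logic you need.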

Using photolithography, it became possible to create smaller and smaller transistorized circuits, until eventually it was possible to combine lots of discrete components into a single integrated circuit (IC). Then you could combine ICs into a single package that was a central processing unit (CPU).

From there it's been largely a question of scale. Manufacturers were able to create smaller and smaller circuits, allowing designers to cram more and more functionality into the same physical space. The smaller circuits could also run faster and faster, until now a typical Intel CPU has around 10 BILLION transistors on it (if memory serves me right).
Learning how computers worked really took a lot of the magic out!

I feel like we're still in the stone age; nuclear power stations and flat-screen TVs and putting men on the moon ain't all that once you know, just years of refining understanding of many, many subjects.

Looking into machine learning, however, made me realize that there is no understanding of anything; a waterfall isn't a waterfall, it's just an arrangement of electronic data in our heads that only means something to me and no one else.

true story.
They are actually pretty simple and fairly low-tech devices, no magic. The chip knows how to control the flow of electricity the same way that the ground knows how to control the flow of water. The "special" enabling technology is just a basic little switch. It's only slightly more impressive, technologically, than a lamp.
Well, let's not go overboard. From a theoretical standpoint yes, CPUs are simple machines and operate on easily understandable principles, more or less. Technologically, it's many orders of magnitude more difficult to cram billions of transistors into the surface area of a fingernail than to make a lamp. If this was not the case, fabs would not cost as much as they do.

Just because something is not literally magical doesn't mean it's low-tech. A knife is low-tech. A CPU is bleeding-edge-tech.
Well, let's not go overboard. From a theoretical standpoint yes, CPUs are simple machines and operate on easily understandable principles, more or less. Technologically, it's many orders of magnitude more difficult to cram billions of transistors into the surface area of a fingernail than to make a lamp. If this was not the case, fabs would not cost as much as they do.

Just because something is not literally magical doesn't mean it's low-tech. A knife is low-tech. A CPU is bleeding-edge-tech.


I was being a little strong/exaggerating, but really what I'm saying is that the operation and technology behind the computer chip (as an invention) is not that advanced. Sure, modern computer chips are quite advanced, but most of that comes from the technology for making things small, and from incremental clever ideas and logic to make them better.

And while the logical circuitry behind modern CPUs is pretty complex and beautifully designed, the first computer chips were pretty basic, basic enough that the logical aspect could be designed by a first-year computer science student. This is not really what I consider super advanced technology. It's certainly not anything human beings couldn't pull off. Once electricity was harnessed and humans became industrialized, it was only a matter of time before we made computers.
I remember playing with flip-flops and making a binary counter circuit a good decade ago; that stuff was all that was needed to see how things progress.
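A binary counter like that is just a chain of toggle flip-flops, where each bit flips whenever the bit below it falls from 1 to 0. A toy sketch of that ripple behavior, not tied to any particular part:

```cpp
// Toy 4-bit ripple counter built from toggle (T) flip-flops.
// Each clock pulse toggles bit 0; a bit falling from 1 to 0
// "ripples" a toggle into the next bit, exactly like carrying
// when counting in binary.
struct RippleCounter {
    bool bits[4] = {false, false, false, false};

    void clock() {
        for (int i = 0; i < 4; ++i) {
            bits[i] = !bits[i];
            if (bits[i]) break;  // rose 0->1: no falling edge, stop
        }                        // fell 1->0: ripple to the next bit
    }

    unsigned value() const {
        unsigned v = 0;
        for (int i = 0; i < 4; ++i)
            if (bits[i]) v |= 1u << i;
        return v;
    }
};
```

Clock it 16 times and it wraps back to 0, which is all a hardware counter does too; strap a display onto the bits and you have the classic lab exercise.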

We had the potential to build the computers we have now in the late 50s, if we had wanted to fill the country up with valves (vacuum tubes).


I feel that the real technologically advanced stuff is in how these chips are made, so small and running at such high frequencies, as well as the physics behind doped silicon manipulating carrier polarities in diodes.

It's weird to think we have come this far over the last 100 years. Actually, just had a thought: isn't it weird that we have had airplanes for over 100 years?
It's weird to think we have come this far over the last 100 years. Actually, just had a thought: isn't it weird that we have had airplanes for over 100 years?


Not when you consider we basically went from horseback to the moon in less time.
It's a curious topic for someone who says they have a degree in CS.

http://www.cplusplus.com/forum/beginner/183869/


And this one from someone who knows C#.NET

http://www.cplusplus.com/forum/general/181972/


Things aren't adding up, were you stoned or something?


The college I went to was a for-profit, fast-track school. It was not like a normal Computer Science program. I failed Algorithms two times, and on my third attempt the teacher did my assignments for me just to pass me. I was not qualified for the school, and it is amazing I even sort of graduated. Yeah, I know C#.NET, but not on a professional level; I don't have a job doing it. I am just a lousy systems administrator making $15 an hour. They gave us cheat sheets in our Calculus classes. I failed Database Design on my first attempt. It was a total disaster. So yeah, it adds up that I have no clue how a computer chip is possible. I am not very bright. Plus, I knew a genius C# programmer at my current job who never went to college, so degrees don't say much.

EDIT: P.S. What the hell is wrong with being a SysAdmin?


Thanks for your post, it was very informative, as were all the others. But to answer your question, I don't like being a systems administrator. I take no pleasure in it; I find it boring. I just do it because it is all I know how to do to make a living.
@NuklearKrisis

The other thing that prompted me to post was the references to outer space, aliens, military weapons, and no one knowing how to design CPUs: it all seemed a bit weird.
@TheIdeasMan

I am not trolling, if that is what you are implying. I love this forum, and the last thing I would ever want to do is get a bad rep here. I had a genuine question about computer chips, and after all the great replies I got, I guess man did build this stuff. But a modern GPU that can render the amazing video games we have today? I cannot believe a little processor can do that. The people who build them are human, but with brains not of a normal human. A brain I wish I had.

Aliens is one thing, but what is wrong with military weapons and space? I am pretty sure C/C++ and Ada, plus the insane CPUs in those systems, made all the badass weapons and space software we have today.

When a fighter pilot in his F-16 looks at his MFD to lock on to a target and then fire off a missile, it is all C/C++ happening under the hood.

Plus I found everyone's response enjoyable, informative and entertaining. That is why I posted this under the Lounge.
I am not trolling if that is what you are implying.


OK, no worries. It's just that the way you combined everything in your first two posts seemed weird to me, that's all :+)

I guess your question is a bit like evolution: how can we have so many wonderful organisms (including humans) that all came from simple single cells?

https://en.wikipedia.org/wiki/Cell_%28biology%29


As others have mentioned, things started simple then became more complex over time as new things were developed on the shoulders of those things that came before.

That is the very nature of CS, big things are built from lots of small things that are combined in particular ways.

And that is also the nature of many things, whether it be furniture, buildings, bicycles, cars, or indeed the F-16 with all its systems.
Plus I found everyone's response enjoyable, informative and entertaining.


Yeah admrkrk made my day.
lol, glad to oblige. Although technology, and everyday life, seems to be moving along at a fast pace, it has actually slowed down, if you think about it. The Civil War was in the 1860s, and we were sending rockets to the moon in the 1960s. Do you think there will be such a drastic change by the 2060s?

The only thing "magical" about computer chips is that what used to take up an entire room has been shrunk down to the size of a fingernail. I am not sure if that is an exaggeration, or an understatement, but you get my point.