A CPU is driven by a clock signal - a signal that switches between logical 0 and 1 many times a second. This serves as the tick, or heartbeat, of the CPU.
For example, if the CPU has a fetch-decode-execute cycle, it will fetch an instruction on the first tick, decode it on the second tick, and execute it on the third tick. On the fourth tick it will fetch the next instruction, and so on.
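The three-tick cycle above can be sketched as a small simulation. The toy machine, its instruction names, and the idea that each stage takes exactly one tick are all simplifying assumptions for illustration - real CPUs pipeline and overlap these stages.

```python
# Toy simulation of the simplified three-tick fetch-decode-execute cycle.
# The instruction set and machine model are invented for illustration.

def run(program):
    tick = 0
    for instruction in program:
        tick += 1               # tick 1: fetch the instruction from memory
        fetched = instruction
        tick += 1               # tick 2: decode it into operation and operand
        op, operand = fetched.split()
        tick += 1               # tick 3: execute it (here we just report it)
        print(f"tick {tick}: executed {op} {operand}")
    return tick

total = run(["LOAD A", "ADD B", "STORE C"])
print(f"3 instructions took {total} ticks")
```

Under this simplified model, every instruction costs three ticks, which is why a faster clock directly means more instructions per second.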
The clock speed (the number of ticks per second) determines how many instructions the CPU can execute per second. Early microprocessors in the 1970s had clock rates of a few MHz (millions of ticks per second). Current CPUs (2017) are clocked at several GHz (billions of ticks per second).
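Putting the two ideas together gives some rough arithmetic. The 3 GHz figure and the three-ticks-per-instruction cost are just example numbers under the simplified cycle described earlier.

```python
# Rough arithmetic relating clock rate to instruction throughput,
# assuming the simplified model where each instruction takes 3 ticks.
clock_hz = 3_000_000_000          # a 3 GHz clock: 3 billion ticks per second
ticks_per_instruction = 3         # fetch, decode, execute
instructions_per_second = clock_hz // ticks_per_instruction
print(instructions_per_second)    # one billion instructions per second
```

By the same arithmetic, a 2 MHz chip from the 1970s would manage well under a million instructions per second.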
Computers are obviously a lot faster now than they were in the 1970s, and clock rates have a lot to do with that. However, you must be careful when comparing the clock rates of different modern computers.
The clock rate, often called the processor speed, only tells part of the story. If you have two computers that are very similar, except that one has a faster version of the same type of processor, then that machine will most likely run a bit faster (though not necessarily as much faster as the clock rates suggest).
But if you have two totally different computers, the clock rate might not tell you very much, because:
- Not all processors are equal - some can do a lot more work on each tick of the clock, so the processor type is important.
- The rest of the computer doesn't run as fast as the CPU. For example, without enough cache memory the CPU can be left waiting for data and won't run at full speed.
- A processor with more cores might perform better than a processor with fewer cores, even if they have the same clock speed.
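The factors in the list above can be folded into a very rough throughput estimate. All the figures below are invented for illustration, and the formula assumes a perfectly parallel workload, which real programs rarely are.

```python
# Hypothetical comparison showing why clock rate alone can mislead.
# All numbers are made up for illustration.

def rough_throughput(clock_ghz, work_per_tick, cores):
    # Billions of units of work per second, assuming the workload
    # scales perfectly across all cores (an optimistic assumption).
    return clock_ghz * work_per_tick * cores

cpu_a = rough_throughput(clock_ghz=4.0, work_per_tick=1, cores=2)
cpu_b = rough_throughput(clock_ghz=3.0, work_per_tick=2, cores=4)
print(cpu_b > cpu_a)  # True: the "slower" 3 GHz chip wins here
```

Here the 3 GHz processor comes out ahead of the 4 GHz one because it does more work per tick and has more cores - exactly the situation the list above warns about.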
So when comparing computers, it is important to look at all of these factors, not just the clock speed.