Clock rate

The clock rate of a computer chip is the frequency at which its internal clock drives the chip's operations, and it gives a rough indication of how quickly the chip can execute instructions. It is measured in hertz (cycles per second), in practice megahertz (MHz) or gigahertz (GHz); the terms clock frequency and clock speed are also used. The term most often refers to the speed of the microprocessor. Current processors have clock rates ranging from 1 GHz to more than 3 GHz.
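As a quick illustration of what a clock rate means in practice, the short Python sketch below (using example frequencies, not measurements) converts a clock frequency in hertz into the length of a single cycle, since the period of one cycle is simply 1 divided by the frequency.

    # Sketch: relate clock rate (cycles per second) to the length of one cycle.
    # The frequencies below are illustrative examples only.

    def cycle_time_ns(frequency_hz: float) -> float:
        """Duration of one clock cycle in nanoseconds: (1 / f) scaled to ns."""
        return 1.0 / frequency_hz * 1e9

    for label, hz in [("4.77 MHz", 4.77e6),
                      ("100 MHz", 100e6),
                      ("3 GHz", 3e9)]:
        print(f"{label}: one cycle lasts about {cycle_time_ns(hz):.2f} ns")

Running this shows that a 3 GHz chip completes a cycle in roughly a third of a nanosecond, while a 4.77 MHz chip needs over 200 nanoseconds per cycle.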

The clock rate of a computer does not give a direct measure of performance, and it is useful for comparing chips only when they belong to the same processor family. The speed at which programs run also depends on the amount of RAM and on its clock speed: because the processor's clock is many times faster than the memory's, the processor often has to wait for memory before it can complete an operation. Many other factors, including the clock rate of the computer's front-side bus (FSB) and the amount of processor cache, determine the speed of the system as a whole.
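One way to see why clock rate alone is not a reliable measure is the standard CPU time relation: execution time equals instruction count times the average cycles per instruction (CPI), divided by the clock rate. The sketch below uses invented figures to show that a chip with a lower clock rate but a lower effective CPI (for example, because it stalls less while waiting for memory) can finish the same program sooner.

    # Sketch: CPU time = instructions * CPI / clock rate.
    # All numbers below are hypothetical, chosen only to illustrate the point.

    def cpu_time_seconds(instructions: int, cpi: float, clock_hz: float) -> float:
        """Classic CPU performance equation."""
        return instructions * cpi / clock_hz

    instructions = 2_000_000_000  # the same program run on both chips

    fast_clock = cpu_time_seconds(instructions, cpi=2.0, clock_hz=3.0e9)
    slow_clock = cpu_time_seconds(instructions, cpi=1.0, clock_hz=2.0e9)

    print(f"3 GHz chip, CPI 2.0: {fast_clock:.2f} s")  # about 1.33 s
    print(f"2 GHz chip, CPI 1.0: {slow_clock:.2f} s")  # about 1.00 s, faster despite the lower clock

In this hypothetical comparison the 2 GHz chip wins because it does more useful work per cycle, which is exactly the kind of architectural difference that raw clock rate hides.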

History

Early personal computers, such as the IBM PC of 1981, had clock rates of around 4.77 MHz; Intel's 8085 ran at 5 MHz. As technology improved, processors with higher clock rates appeared. By 1995, Intel's Pentium chip ran at 100 MHz, that is, 100 million cycles per second. The latest processors in Intel's Pentium 4 series have clock rates well over 3 GHz.

Computers have often been marketed and compared on the basis of their clock rates alone. Such a criterion is misleading, because clock rate is not the only factor that decides performance; some architectures are fast at particular tasks but slower at others. This "megahertz myth" was widely debunked from the late 1990s onwards, when leading companies such as Intel and AMD began advertising their processors by model number rather than by clock rate.