Server Architecture: Past, Present, and Future – Part 1

Servers have evolved significantly over the past three decades. At one time, a single server filled an entire room, required its own cooling system, and ran on processors slower than those in many of today’s mobile phones. Today, servers are smaller, faster, and more energy efficient, but the most important element is still the processor.

Because server technology is constantly evolving, it is important to know the history of server processors, how they came to be, and where they are headed. Microprocessors have evolved from inefficient single-core chips into powerful multi-core multi-taskers with on-chip cache and impressive speed.

Brief Processor History

Early server processors were primarily RISC (reduced instruction set computing) chips, a design philosophy whose roots trace back to the CDC 6600 of 1964. By the standards of the time, these processors were extremely fast and revolutionary.

Later RISC-based processors would include IBM’s PowerPC chips and Sun Microsystems’ SPARC. For several decades (from the 1980s to the present), these chips have powered many of the world’s servers.

Intel’s 8086 processor introduced a new type of chip architecture, which would come to be known as x86. The 8086 was introduced in 1978 as Intel’s first 16-bit processor, initially clocked at 5 MHz, with later variants reaching 10 MHz. Although most people think of these processors as being designed primarily for desktop computers, the 16-bit and later 32-bit versions have both powered a multitude of small and large servers.
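
Although the 8086 itself long predates it, one convention the x86 line later standardized is the CPUID instruction (added in the Pentium era), which lets software query the processor directly. As a small illustration, the minimal C sketch below (assuming GCC or Clang on x86 hardware, using their <cpuid.h> helper) reads the vendor string the chip reports, for example "GenuineIntel":

    #include <stdio.h>
    #include <string.h>
    #include <cpuid.h>  /* GCC/Clang helper for the x86 CPUID instruction */

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};

        /* CPUID leaf 0 returns the 12-byte vendor string,
           split across the EBX, EDX, and ECX registers. */
        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
            fprintf(stderr, "CPUID is not supported on this processor\n");
            return 1;
        }
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);

        /* Prints, e.g., "GenuineIntel" or "AuthenticAMD" */
        printf("x86 vendor string: %s\n", vendor);
        return 0;
    }

Compiled with a command such as cc vendor.c -o vendor (the file name is just a placeholder) and run on an x86 server, this prints the vendor string the processor identifies itself with.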
