Sci-tech: Side-stepping Moore's Law

Wael Elazab

Multi-core processing: Doing more, even if you don’t need to

The wonders of technology have brought us many ways to pass and – let’s be honest – exploit our time. And now you can multiply them by four. At least that’s what Intel Corp. and Advanced Micro Devices (AMD) would have you believe.

These manufacturers have each launched multi-core chips in the past half year – both launched dual-core, then Intel launched quad-core – after decades of nothing but single-core chips. Why the change? The move beyond single-core chips has to do with the physical limits a single core runs into as we ask our computers to do more and more, in less time.

Let’s step back for a second.

The chief microprocessor in your desktop computer, referred to as the central processing unit (CPU), interprets computer program instructions and processes data. Microprocessor evolution tends to follow Moore’s Law, the observation that the number of transistors on a microchip (and, very roughly, its processing power) doubles about every 24 months, and this has held true for the most part.

Consider the CPU to be the ‘core’ of your computer.

A multi-core microprocessor combines two or more independent processor cores on a single integrated circuit, or microchip. Microchips are in all computers and most electronic devices, and the microprocessor is one of the most advanced varieties of microchip.

As people have pushed for more and more processing speed and the hardware companies have responded, the excessive heat and power consumption that come with ever higher clock speeds have become a real problem. The answer has been to put two cores on one chip, so that the tasks your computer needs to do can be split up and performed simultaneously, rather than forcing a single core to run ever faster.

A single-core machine can handle the same workload, of course, but it cannot do the tasks in parallel; it has to do them one at a time. So if memory management is taking a long time, for instance, the user feels it, because the interface stops responding – the screen freezes. A dual-core machine instantly reduces this drain on processing resources, and as a result desktop applications benefit almost by default from having more than one core to spread the load.
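To picture this in code, here is a minimal sketch (an illustration in C++, not anything from Intel or AMD) of a slow job being handed to a second thread so that the main thread, standing in for the user interface, keeps responding. On a dual-core machine the operating system can run the two threads on separate cores; on a single-core machine they would have to take turns.

    // Sketch: a slow task runs on a worker thread while the main
    // thread stays responsive. The "slow task" is a stand-in.
    #include <atomic>
    #include <chrono>
    #include <iostream>
    #include <thread>

    std::atomic<bool> done{false};

    // Stand-in for a long-running job such as memory management or file indexing.
    void slow_task() {
        std::this_thread::sleep_for(std::chrono::seconds(3));
        done = true;
    }

    int main() {
        std::thread worker(slow_task);   // heavy work goes to its own thread (and core)
        while (!done) {                  // the "interface" keeps updating meanwhile
            std::cout << "still responsive...\n";
            std::this_thread::sleep_for(std::chrono::milliseconds(500));
        }
        worker.join();
        std::cout << "slow task finished\n";
    }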

Intel’s Core 2 Duo and AMD’s Athlon 64 X2 dual-core microprocessors work in markedly different ways, but the upshot with either is that you can run two process-intensive programs at once, such as watching a DVD while streaming music onto the computer in your living room. For the most part, you can’t do this with a single-core desktop CPU without one of the applications slowing to a halt – or both slowing down just enough to spoil your fun. If the processor is intended for your home computer, to play games or otherwise, two cores are going to show a noticeable improvement.

But four cores? This is where the benefit fades for most purposes, as the jump from dual to quad on the desktop – with Intel’s brand new Core 2 Quad processor being the only option initially until AMD releases its quad-core chip – will hardly be noticed.

Quad-core chips are in theory more desirable, with much more potential computing power. The catch is that additional cores are only useful in certain environments and your desktop computer is unlikely to make proper use of four cores. A crude analogy with the motoring industry would be buying a new luxury sports car that can do 300 km/h – you will never find a road to make use of that power.

Even if a quad-core processor theoretically offers double the computing power of a dual-core one, you won’t notice it in everyday use.

In fact, aside from very demanding software like computer-aided design, 3D rendering, ray-tracing, or video-encoding tools, which are easily multi-threaded and benefit from multiple cores, applications won’t work faster with a quad-core chip. Word simply won’t spell-check quicker, and Excel won’t calculate faster. What you would see is that multiple applications running simultaneously will work properly, with each application getting a core to itself.

Where you’d see the full benefits from multi-core CPUs is when the software is coded to be multi-threaded – allowing the software to divide its processes among the available cores, thereby dramatically increasing a program’s running speed. It’s a brilliant concept and one that works really effectively, if such software is available. We need to wait until more everyday software is actually designed for multiple cores, because currently most of it isn’t, and your applications just don’t know what to do with multiple cores.
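As a rough illustration of what “multi-threaded” means in practice, the C++ sketch below divides one big job among however many cores the machine reports – two on a dual-core chip, four on a quad-core one – and then combines the partial results. The workload itself (summing a range of numbers) is just a stand-in for real work.

    // Sketch: split one job across all available cores, then merge the results.
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        const long long n = 400000000;                         // total amount of "work"
        unsigned cores = std::thread::hardware_concurrency();  // 2 on dual-core, 4 on quad-core
        if (cores == 0) cores = 1;

        std::vector<long long> partial(cores, 0);
        std::vector<std::thread> workers;

        // Give each core an equal slice of the job.
        for (unsigned c = 0; c < cores; ++c) {
            workers.emplace_back([&, c] {
                long long first = n / cores * c;
                long long last  = (c == cores - 1) ? n : n / cores * (c + 1);
                for (long long i = first; i < last; ++i) partial[c] += i;
            });
        }
        for (auto& w : workers) w.join();

        long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
        std::cout << "Used " << cores << " threads, total = " << total << "\n";
    }

The point of the sketch is the structure, not the arithmetic: unless a program is written to carve its work up like this, the extra cores simply sit idle.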

Where is this software readily available, you cry? Servers. Most server operating systems and the software they run can take advantage of multiple cores, and even multiple CPUs, as they are generally written with such environments in mind.

Quad-core and above will eventually become useful for the desktop user, but not before the majority of the software you run is programmed for it. This will take some time.

Jump on dual-core as though it were your spouse just back after a long trip, but leave quad-core alone for now.
