
Hardware revolutions

It’s interesting how the discovery or refinement of one technology snowballs, creating an avalanche of new tech breakthroughs that change the way things work.

For example, it was only a few years ago that Intel and AMD were locked in a battle of transistors to see how many they could cram onto a single chip. It was all about nanometers, or how narrowly they could etch lines of copper onto silicon without the circuits breaking down.

Then something happened. In 2003 AMD came out with the first mass-produced 64-bit CPU (central processing unit) for businesses and home consumers, followed closely by Intel. 64-bit architecture basically allows your computer to process data in bigger chunks and to address far more memory than a 32-bit chip’s 4 GB ceiling, which opens the door for burlier programs, larger files and applications, and all-around better performance.
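For the technically curious, here is a minimal sketch (the numbers are mine, not the chip makers’) of the two things 64-bit actually buys you: pointers wide enough to address more than 4 GB of memory, and arithmetic on numbers too big for a 32-bit register.

    #include <cstdio>
    #include <cstdint>

    int main() {
        // On a 64-bit build, pointers are 8 bytes wide, which is what
        // lets a program address more than the 4 GB a 32-bit chip can see.
        std::printf("pointer size: %zu bytes\n", sizeof(void*));

        // A 64-bit CPU handles this value in one register; a 32-bit CPU
        // has to juggle it as two halves, using extra instructions.
        std::uint64_t big = 5000000000ULL;  // bigger than any 32-bit integer
        std::printf("big / 7 = %llu\n", (unsigned long long)(big / 7));
        return 0;
    }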

Then companies started to give some thought to chip architecture. Rather than adding more physical chips to a system to improve performance, as animators and film editors had been doing for years with multi-processor workstations, the big chip companies started to create dual-core processors where two CPUs share the same piece of silicon, dividing up tasks and essentially allowing computers to do more things at once.
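A rough sketch of why that helps (this is generic threading code, not any vendor’s actual scheduler): to sum a million numbers on a dual-core chip, two threads can each take half the array at the same time.

    #include <cstdio>
    #include <thread>
    #include <vector>

    // Sum one slice of the array; each core can run one of these at once.
    static void partial_sum(const std::vector<int>& data,
                            std::size_t begin, std::size_t end, long long* out) {
        long long s = 0;
        for (std::size_t i = begin; i < end; ++i) s += data[i];
        *out = s;
    }

    int main() {
        std::vector<int> data(1000000, 1);
        long long a = 0, b = 0;
        std::size_t mid = data.size() / 2;

        // Two threads, one per core, each taking half the work.
        std::thread t1(partial_sum, std::cref(data), std::size_t(0), mid, &a);
        std::thread t2(partial_sum, std::cref(data), mid, data.size(), &b);
        t1.join();
        t2.join();

        std::printf("total = %lld\n", a + b);
        return 0;
    }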

The chip makers are now looking at ways to double and even quadruple that capability.

At the same time, motherboard manufacturers are doing their best to design boards that can accommodate several chips while increasing the speed and volume of data transferred between the main system components: hard drive, memory, CPU, graphics processor, cache, and input/output systems (USB and FireWire ports, Internet and wireless connections, etc.).

The Cell processor represents another new direction in architecture. Created by Sony, Toshiba and IBM at a cost of about $400 million, the chip pairs a Power Architecture core with seven sub-processors that handle the tasks assigned to them. The first model, which will be included in the Sony PS3 game console, can reportedly perform three to 12 times faster than any desktop processor, depending on the nature of the application and how the software is written.
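The shape of that design is one coordinator handing work to a pool of helpers. The sketch below mimics it with ordinary threads; the real Cell has its own specialized libraries and moves data into each sub-processor’s local memory, none of which is shown here.

    #include <cstdio>
    #include <thread>

    const int WORKERS = 7;  // stand-in for the Cell's seven sub-processors

    // Each worker processes its assigned slice of the data.
    static void worker(int id, const float* in, float* out, int count) {
        for (int i = 0; i < count; ++i)
            out[i] = in[i] * 0.5f;  // placeholder for real audio/graphics work
        std::printf("worker %d finished %d samples\n", id, count);
    }

    int main() {
        const int N = 7000;
        static float in[N], out[N];
        const int chunk = N / WORKERS;

        // The main thread plays the role of the Power core: it carves the
        // job into chunks and hands one to each of the seven sub-processors.
        std::thread pool[WORKERS];
        for (int w = 0; w < WORKERS; ++w)
            pool[w] = std::thread(worker, w, in + w * chunk, out + w * chunk, chunk);
        for (int w = 0; w < WORKERS; ++w)
            pool[w].join();
        return 0;
    }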

Last week another breakthrough in architecture was announced. Nvidia (www.nvidia.com) showed off a new graphics chip that can take on some of the computer’s general processing duties. Since almost all applications are graphics-based these days, even if it’s just in terms of the interface, it can be more efficient to handle graphics and general processing on the same chip.

One of the issues with graphics processors is that they are seldom used to their potential because of other bottlenecks in a computer as information travels from media to CPU to graphics card and out. Now, with the GeForce 8800 GTX, the same information can be processed on the graphics card itself without being packaged up and routed back through the motherboard. The new graphics chip should also reduce power consumption, and less power means less heat, which typically translates into additional performance headroom.
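Nvidia’s CUDA toolkit, announced alongside this generation of chips, is built around exactly that idea: copy the data to the card once, run every processing step there, and bring only the result back over the bus. A toy version (the two kernels are trivial stand-ins for real work):

    #include <cstdio>
    #include <cuda_runtime.h>

    // Two processing steps that both run on the graphics card.
    __global__ void scale(float* d, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) d[i] *= 2.0f;
    }
    __global__ void offset(float* d, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) d[i] += 1.0f;
    }

    int main() {
        const int N = 1 << 20;
        float* host = new float[N];
        for (int i = 0; i < N; ++i) host[i] = 1.0f;

        // One trip across the bus to the card...
        float* dev;
        cudaMalloc(&dev, N * sizeof(float));
        cudaMemcpy(dev, host, N * sizeof(float), cudaMemcpyHostToDevice);

        // ...then both steps run on the card, with no round trip through
        // the motherboard in between.
        scale<<<(N + 255) / 256, 256>>>(dev, N);
        offset<<<(N + 255) / 256, 256>>>(dev, N);

        // One trip back with the finished result.
        cudaMemcpy(host, dev, N * sizeof(float), cudaMemcpyDeviceToHost);
        std::printf("first value: %.1f\n", host[0]);  // 1 * 2 + 1 = 3

        cudaFree(dev);
        delete[] host;
        return 0;
    }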

Although this might seem like a small tweak to system architecture that will mostly be embraced by hardcore gamers, animators and film editors, AMD confirmed that they’re working on a similar system, and there are rumours Intel is doing the same. In a few generations it could even be a new industry standard.

Digital on its way

Last week I finally broke down and ordered cable, renting a digital receiver and subscribing to the digital basic package. And boy does the reception suck on my high-definition, 1080i-capable television.

Movies look crystal clear, even though the signal has to be upscaled by the television’s processor, but the cable stations can appear fuzzy, grainy, pixelated, and generally below the quality I can get on my older set.

According to the cable guy, this is normal: the equivalent of putting regular gas into a high-performance engine. However, he assured me that things are changing as television stations upgrade their broadcasts from analog to digital, producing cleaner signals.

Some countries have adopted an end-date for stations to switch from analog to digital, but the Canadian Radio-television and Telecommunications Commission has left it up to the market to decide when to make the jump. Given that most people still own normal television sets, it could be a while before I’m enjoying the kind of picture I thought I was paying for.

In the meantime there are a few things I can do to improve my picture quality. One is to stop using the coaxial cable input on my television and use AV cables instead, which carry the audio and video as separate signals rather than forcing the TV’s tuner to pull everything off a single modulated channel.

Another is to go to the splitter that divides the cable signal coming into my home and either add an amplifier or upgrade to a better-quality splitter to ensure the signal is cleaner.

Then there’s the fact that I live in a large complex. Every time the signal is split it degrades, and there are about 95 units in my complex. I hope the system will be upgraded and the signal boosted over time.
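Some back-of-the-envelope arithmetic shows why that matters. Assuming a typical two-way splitter costs the signal roughly 3.5 dB (my figure, not the cable company’s; real hardware varies), fanning out to 95 units takes about seven levels of two-way splits:

    #include <cstdio>
    #include <cmath>

    int main() {
        const double loss_per_split_db = 3.5;  // typical 2-way splitter (assumed)
        const int units = 95;

        // Reaching 95 outlets with 2-way splitters takes ceil(log2(95)) levels.
        int levels = (int)std::ceil(std::log2((double)units));
        double total_loss_db = levels * loss_per_split_db;

        // Every 3 dB is roughly half the power, so the loss compounds fast.
        double power_ratio = std::pow(10.0, total_loss_db / 10.0);

        std::printf("split levels: %d\n", levels);             // 7
        std::printf("total loss:   %.1f dB\n", total_loss_db); // 24.5 dB
        std::printf("about 1/%.0f of the original power reaches the set\n",
                    power_ratio);  // ~1/282, hence the need for amplifiers
        return 0;
    }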

If all else fails, there’s always satellite, where you receive a clean, undivided signal that is digital from the start. If only satellite dishes were allowed by my strata.