#2
16th Oct 04, 08:28 PM
Alpine
Retired Crew
Join Date: Feb 2002
Location: Run Forest, RUN!!
Posts: 3,601
Management at Intel was known in the past for doing the right thing for the right technical reasons. Things were done on merit, and not to win slideshow beauty contests. For over twenty years, Intel did the right thing technically, and it worked out in the marketplace because it was the best. It is easy to sell when you have the best thing going.

But the Pentium 4 represented a sea-change. Technical merit took the back seat to other concerns. From the outside, it looks to me like the Pentium 4 was designed to hit a number that sold well, not to be the best. This was the critical failure that is going to devastate Intel. Mike Magee came up with the term marchitecture, meaning marketing driving architecture, for a reason.

This mistake set in motion a series of goals that proved unattainable even for the brilliant engineering teams at Intel. There was no backup architecture, and further management decisions put the incredibly good Pentium M out of the running. Emergency steps are now being taken to take advantage of the Pentium M, but they are too little, too late.

AMD stops shooting itself in the foot
The other Intel problem is AMD. It has recovered from the series of self-inflicted wounds that were Palomino and Thoroughbred A, and is once again pushing Intel. When the Athlon came out years ago, Intel was pushed to the wall and the Pentium III did not have what it took. The Pentium 4 in Northwood guise did, and Intel grabbed the ball and ran so fast that AMD didn't realise what was happening. AMD had a long-standing habit of tripping over its own feet when it tried to run, and Intel just strolled on, laughing all the way to the bank.

But the K8 core gave AMD a processor that worried Intel. When it ramped raw MHz faster than Intel with a core that was not supposed to ramp fast, it was clear that something was very wrong.

Intel no longer had the luxury of time, and the engineers had to produce and do it right the first time. There was no time for a plan B. If there were problems, it would mean slipped launch dates, and no rabbit in the hat to pull out and make marketing look good.

The technical problems are the real killer. The Willamette and Northwood cores had several problems, most notably that they were probably the most aggressive circuit designs ever attempted. Elements on the bleeding edge that theoretically shouldn't have worked were made to work well. Northwood was an incredible success, and allowed Intel to claw back market share.

The Domino Theory
The cost to make such parts was immense. A big problem was the use of self-resetting domino circuits, which are very timing sensitive. There are pulses that have to arrive at a certain point at a certain time for the circuit to work. That in turn drives the next one, and the next. If one fails, they all go, and since the timing is a function of trace lengths rather than clock speed, a miss does not give the chip a low maximum frequency, it simply makes the chip fail.
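
To see why a timing miss is a hard failure rather than just a slow chip, here is a toy sketch in Python. The numbers and the model are my own invention for illustration, not anything from Intel:

[code]
# Toy model of a self-resetting domino chain. Each stage fires only if
# the pulse from the previous stage lands inside its timing window, and
# the window depends on trace delay, not clock speed, so a miss is a
# hard functional failure rather than a lower maximum frequency.
# All numbers are invented for illustration.

stages = [
    # (pulse_arrival_ps, window_open_ps, window_close_ps)
    (100, 95, 110),
    (108, 100, 115),
    (130, 100, 115),   # trace a touch too long: pulse misses its window
    (100, 95, 110),
]

for i, (arrival, open_ps, close_ps) in enumerate(stages):
    if open_ps <= arrival <= close_ps:
        continue                     # stage fires and triggers the next one
    print(f"stage {i} fails: pulse at {arrival} ps, window {open_ps}-{close_ps} ps")
    print("every stage downstream fails with it")
    break
else:
    print("chain evaluates correctly")
[/code]

The only fix is to change the arrival number, which in silicon means moving traces and transistors by hand, as described next.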

If you want to make it work, you have to change the trace lengths between the transistors, pretty much a manual job. Part of the problem is finding the parts to change. Most test equipment changes the characteristics of the circuit enough to make the reading nonsensical, so bug hunting is more black magic than science. Then you have to move the transistors a little bit, a nip here, and a tuck there.

Multiply this by a few million transistors and you have gainful employment for a lot of engineers. Move one too much in one direction, and you have problems with the surrounding transistors. It kept a lot of people very busy. By most accounts, the team size for the Netburst cores was three to four times that of a Pentium M core team.

This labour-intensive, fragile and cutting-edge process that succeeded so brilliantly in the past was not the way of the future, and it was a potentially huge impediment to progress.

Prescott was designed to use a more relaxed and robust circuit methodology, ceding some performance for a lot of forgiveness. Part of the change was a breathtakingly long pipeline built for speed. On the low end it would take a larger penalty for a branch mispredict, and instruction latency was potentially 50% longer, but the design would scale to immensely high clock rates.
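
A back-of-the-envelope model of that trade-off. The stage counts are the widely reported ones, roughly 20 for Northwood and 31 for Prescott; the mispredict rate and base CPI are my assumptions:

[code]
# Rough model: a deeper pipeline buys clock speed but pays a bigger
# branch-mispredict penalty, since a flush costs roughly one pipeline
# depth of cycles. Stage counts are the widely reported figures;
# mispredict rate and base CPI are assumptions for illustration.

def instr_per_sec(clock_ghz, pipeline_stages, mispredict_rate=0.01, base_cpi=1.0):
    cpi = base_cpi + mispredict_rate * pipeline_stages
    return clock_ghz * 1e9 / cpi

northwood = instr_per_sec(clock_ghz=3.2, pipeline_stages=20)
prescott  = instr_per_sec(clock_ghz=3.2, pipeline_stages=31)
print(f"Northwood-like at 3.2 GHz: {northwood / 1e9:.2f} G-instr/s")
print(f"Prescott-like at 3.2 GHz:  {prescott / 1e9:.2f} G-instr/s")
print(f"Prescott-like at 4.6 GHz:  {instr_per_sec(4.6, 31) / 1e9:.2f} G-instr/s")
# At the same clock the longer pipe loses; the bet only pays off if
# the clock actually ramps.
[/code]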

There were other problems with this architecture: a huge transistor count, vastly higher power consumption, and a need for twice the cache to keep up with its predecessor. But it was easier to design for, and it would ramp, boy would it ramp. All was forgiven because the light at the end of the tunnel, immense clocks to satisfy the marketing boys and girls, looked feasible.

90 Nano Engineers
Despite what Intel said at the time, there were problems with the 90 nanometre process. Many were solved, but even at its coming-out party at the Fall 2003 IDF, it was clear that done did not mean done right. Some problems did not surface at the press conference. When the Dothan chips came out and Intel hit the quoted 21 Watt envelope, many took that as a sign that the 90nm process was back on track. Prescott's ravenous power consumption was blamed on the transistor count, the pipeline stages, star alignment, or some other crackpot theory of the day. These things all contributed, but a management decision was the biggest problem.

Intel did the marchitecture thing again, and gave at least one other group a lot of say in the process. Instead of a 90nm process finely tuned to putting out the best CPU in existence, a compromise was forged, and that compromised the microprocessors.

Excessive leakage and power consumption made the chips less attractive to potential buyers, especially for Xeons where density, not destiny, is a real problem in server rooms. Prescott used too much power.

A self-imposed power cap was put in place. Gone were the days of picking a clock speed and letting that determine the power used. Now power was set in stone, and you had to get creative and do the engineering to fit the MHz into those limits. This can be done, but the problem is that AMD won't give Intel the time to work things out. There is no Plan B this time.
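
To see how hard a fixed cap bites, here is a crude model. Every constant is invented; the only real point is that voltage has to rise with frequency, so power grows much faster than linearly with clock:

[code]
# Why a fixed power budget caps clock speed. Dynamic power goes
# roughly as C * V^2 * f, and V has to rise with f, so total power
# grows much faster than linearly with clock. Every constant below is
# invented for illustration; only the shape of the curve matters.

LEAKAGE_W = 35.0        # static leakage, burned at any clock speed
K = 2.2e-8              # lumps switching activity and capacitance
V_PER_GHZ = 0.35        # crude voltage-vs-frequency assumption

def total_power(f_ghz):
    v = 1.0 + V_PER_GHZ * (f_ghz - 3.0)     # voltage needed for this clock
    return LEAKAGE_W + K * v * v * (f_ghz * 1e9)

def max_clock(power_cap_w):
    f = 3.0
    while total_power(f + 0.01) <= power_cap_w:
        f += 0.01
    return f

print(f"115 W cap -> about {max_clock(115):.2f} GHz")
print(f"150 W cap -> about {max_clock(150):.2f} GHz")
# Raising the cap 35 W buys only a few hundred MHz here. Inside a
# fixed cap, the only levers left are voltage and capacitance, which
# is exactly the "get creative" engineering described above.
[/code]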

Another thing narrowing the frequency box is the design of the Pentium 4 itself. The clock multipliers are fixed, as are the FSB settings. The steps you have are the steps you have got. Changing them means a lot of work on the controlling PLLs, a long, hard and unpleasant process.

That leaves a finite, and inadequate, number of steps within which to design a CPU, inside an ever-decreasing window of workable clock speeds. Replacing the PLLs would give a little more play, but that takes time to do.
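
The grid of buildable clock speeds is easy to enumerate. The 133 and 200 MHz bus clocks behind "FSB 533" and "FSB 800" are real; the multiplier range is a plausible guess on my part:

[code]
# The clock grid a designer gets to pick from: core clock equals bus
# clock times multiplier, both from short fixed lists. The 133 and
# 200 MHz bus clocks (quad-pumped to "FSB 533"/"FSB 800") are real;
# the multiplier range is a plausible guess for illustration.

bus_clocks_mhz = [133, 200]
multipliers = range(14, 24)

speeds = sorted({bus * mult for bus in bus_clocks_mhz for mult in multipliers})
print(", ".join(f"{s / 1000:.2f}" for s in speeds), "GHz")
# Near the top of the range the steps are a full 200 MHz apart. If a
# process tops out at, say, 3.74 GHz, the best shippable bin is still
# 3.60 GHz. New PLLs would add grid points, but slowly and painfully.
[/code]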