Catching Up with Intel: Sony Once Had the Most Beautiful Chip Dream
If someone claimed today that a Japanese semiconductor company was about to surpass Intel, most people would probably laugh it off. Although Intel was overtaken by Samsung last year, it remains one of the strongest semiconductor manufacturers on the planet. As for the Japanese players, whether it is Kioxia, which specializes in storage, Sony, which specializes in sensors, or Renesas, which has grown rapidly in recent years, none of them rank among the top 10 global semiconductor manufacturers; they have clearly fallen behind.
However, one thing everyone acknowledges is that Japanese semiconductors had their glory days. Back in the 1980s, Japanese companies dominated the industry, and the likes of Philips, Intel, and Motorola were mere also-rans next to them.
But Japanese semiconductors turned out to be a meteor streaking across the sky: a brief period of brilliance in the 1980s, followed by a collapse under the United States' pursuit and blockade. In 1985, Japan signed the Plaza Accord with the United States and the other major Western economies, which drove a sharp appreciation of the yen against the US dollar; Japan's economic bubble gradually burst, and its domestic semiconductor industry took a heavy blow.
At the beginning of 1986, the US Department of Commerce ruled that Japanese memory chips constituted unfair competition through low-price dumping and imposed a 100% anti-dumping duty on them. In June 1987, the United States passed sanctions legislation against Toshiba, canceling a series of procurement contracts and banning Toshiba products from the US market for two to five years. In 1991, the two countries signed another five-year "New Semiconductor Agreement," under which the United States demanded that foreign semiconductors reach a 20% share of the Japanese market...
This string of unequal agreements and sanctions pushed Japan's five major semiconductor giants, Fujitsu, NEC, Hitachi, Toshiba, and Mitsubishi, from prosperity into decline, ceding a market with seemingly limitless potential to the United States. By 1996, US semiconductors held more than 30% of the global market, while Japan's share had slipped below 30%, and the gap between the two countries kept widening.
To help domestic semiconductor manufacturers out of this predicament, Japanese companies attempted some self-rescue measures. In 1999, NEC and Hitachi each spun off their DRAM businesses into a new company, Elpida, and the DRAM division of Mitsubishi Electric was later merged into it as well, with the intention of taking on the American DRAM makers.
However, not every Japanese company was content to merely hold the line. Some of the more ambitious among them refused to settle for mediocrity: like King Goujian of Yue enduring hardship while plotting his comeback, they planned a bold bet that would return them to the throne of the semiconductor world.
The Conception and Birth of the Cell Processor
In 2000, riding on the worldwide success of its next-generation console, the PS2, SCE (Sony Computer Entertainment) began preparing the console after that. CEO Ken Kutaragi had a bold idea: could SCE work with American companies to create a general-purpose processor, in the mold of Intel's x86 or PowerPC, that could power not only the next console but also other digital home appliances and even servers? If it succeeded, SCE could utterly dominate the console market, and winning the next decade would no longer be a dream.
Once the idea emerged, it took root in Ken Kutaragi's mind: reshaping the semiconductor market, overtaking industry leader Intel... all manner of beautiful visions of the future seemed to be beckoning to SCE.

In 2000, Sony Computer Entertainment (SCE), Toshiba, and IBM signed an agreement to form the STI alliance and set up an R&D center in Austin, Texas. The three parties agreed that over the following four-year development cycle, Sony would provide the budget, IBM would lead processor development, and Toshiba would handle production of the eventual processors and the related memory chips.
The development dragged on for four or five years, with Sony pouring billions of dollars into R&D and nearly exhausting the savings it had accumulated from the PS1 and PS2 eras. Yet the processor, known as Cell, was slow to materialize.
That did not stop Sony's publicity machine. In 2003, Ken Kutaragi told Japan's PC Watch that a sufficient number of Cell processors linked together could match or even exceed the performance of the Earth Simulator, the NEC-built supercomputer that was then among the fastest in the world.
A processor that could supposedly simulate the Earth immediately caught people's imagination, and all eyes turned to Sony to see what was so extraordinary about a chip that could rival supercomputers.
Fortunately, the hard work paid off. In November 2004, IBM, Sony, Sony Computer Entertainment, and Toshiba revealed for the first time some key details of the long-awaited advanced microprocessor codenamed Cell, jointly developed by the four companies for next-generation computing applications and digital consumer electronics.
At the press conference, the four companies confirmed that Cell is a multi-core chip comprising a 64-bit Power processor core and multiple co-processor cores capable of massive floating-point throughput. It is optimized for compute-intensive, media-rich broadband applications, including computer entertainment, movies, and other forms of digital content.
According to the press release, the main design highlights of the Cell processor include:
- Multi-threading, multi-core structure
- Simultaneous support for multiple operating systems
- Abundant bidirectional bus bandwidth to main memory and companion chips
- Flexible on-board I/O (input/output) interfaces
- A real-time resource management system for real-time applications
- On-board hardware supporting a security system for intellectual property protection
- Built on 90-nanometer silicon-on-insulator (SOI) technology
"A large and rich content, such as multi-channel high-definition broadcasting programs and millions of pixels of digital still/moving images captured by high-resolution CCD/CMOS imaging devices, requires a large-capacity real-time media processing function. In the future, all formats of digital content will be integrated together and integrated into broadband networks, thus starting an explosive growth." Sony's Executive Vice President and Chief Operating Officer, President and Group CEO of Sony Computer Entertainment, Ken Kutaragi, said, "To access and/or browse the vast content freely and in real-time, a more advanced graphical user interface in a three-dimensional environment will become 'key' in the future. To handle such a rich application, the current PC structure has reached its limit in both processing power and bus bandwidth."
All four companies put their full weight behind the Cell processor: IBM planned to begin trial production of the Cell microprocessor in the first half of 2005 at its wafer fab in East Fishkill, New York; Sony hoped to launch broadband-content and high-definition television (HDTV) systems built on Cell in 2006; Sony Computer Entertainment intended to launch its Cell-based next-generation computer entertainment system, the PS3, to revolutionize the computer entertainment experience; and Toshiba, which envisioned a range of applications for Cell, hoped to ship its first Cell-based product, an HDTV system, in 2006.
In 2004, on the eve of Cell's unveiling, the ambitious Ken Kutaragi even approached Apple CEO Steve Jobs to pitch the Cell processor, hoping this would-be epoch-making chip could power the next generation of Macs and that the Cell ecosystem could thereby expand into personal computers and desktops.
But Jobs was having none of it. He rejected the proposal outright and made no secret of his disappointment with the Cell design, saying it was not even as good as the PowerPC chips Apple had been using for years. Everyone knows what happened next: at the following year's Worldwide Developers Conference, Apple announced its switch to Intel and x86, dashing Sony's hopes completely.
Yet Ken Kutaragi was not discouraged, because Sony still held its trump card for this console generation: the PS3. The PS1 and PS2 together had sold hundreds of millions of units worldwide; once the Cell-powered PS3 launched, whether it was Intel or Apple, everyone would have to bow before Sony.
The Uniqueness and "Power" of the Cell Processor

So much has been said about the Cell chip, and Sony, IBM, and Toshiba were all brimming with confidence, but where exactly did its power lie? Most people still had no concrete picture of it.
In 2005, development of the Cell chip was nearing completion, and the first batch of chips went into trial production. The design used a 90nm process and carried 4 PPE main cores (Power Processor Elements, described as simplified derivatives of the PowerPC 970) clocked as high as 4GHz, along with 32 SIMD co-processors (Synergistic Processor Elements, or SPEs), for a combined computing power of 1 TFLOPS. Its overall performance was on par with top desktop processors and even brushed against server-chip territory. It also integrated an XDR memory controller supporting a memory system with 25.6 GB/s of bandwidth, and its front-side bus was a 96-bit FlexIO parallel bus running at 6.4GHz (originally codenamed "Redwood" and developed by Rambus), the fastest computer bus built up to that point.
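As a rough sanity check on that 1 TFLOPS figure, assume (my assumption, not a number from the announcement) that each SPE can retire a 4-wide single-precision fused multiply-add every cycle, i.e. 8 floating-point operations per cycle. The arithmetic then lines up:

$$ 32~\text{SPEs} \times 4~\text{GHz} \times 8~\tfrac{\text{FLOPs}}{\text{cycle}} = 1024~\text{GFLOPS} \approx 1~\text{TFLOPS} $$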
However, combining 4 PPEs and 32 SPEs drove the chip's area and power consumption to very high levels, and the many-core design also hurt the eventual production yield.
IBM also offered a concrete technical characterization. As a microprocessor, Cell sits somewhere between a traditional desktop CPU (such as the Athlon 64 or Core 2 series) and a dedicated graphics processor (such as those from NVIDIA and ATI). Cell was expected to serve not only entertainment devices, high-definition displays, and HDTV systems, but also digital imaging systems (medical, scientific, and so on) and physical simulation (such as scientific and structural-engineering modeling); an all-round processor, in other words.
Specifically, the Cell processor is divided into four parts: the external input/output structure; the main processor, known as the Power Processing Element (PPE), a two-way simultaneous-multithreaded core implementing the PowerPC 2.02 architecture; eight fully functional co-processors known as Synergistic Processing Elements (SPEs); and a dedicated high-bandwidth ring data bus connecting the PPE, the I/O elements, and the SPEs, called the Element Interconnect Bus (EIB).
To achieve high-performance computing, the Cell processor connects the SPEs and the PPE over the EIB and accesses main memory and other external storage through fully cache-coherent DMA (Direct Memory Access). To make full use of the EIB and to overlap computation with data transfer, each processing element (the PPE and every SPE) has its own DMA engine. Because an SPE's load/store instructions can only touch its own local scratchpad memory, each SPE relies entirely on DMA to move data between main memory and the local stores of other SPEs. The architecture's central idea is to make DMA the primary means of on-chip data movement, maximizing asynchrony and concurrency in data processing.
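To make this concrete, here is a minimal sketch of the kind of SPE-side code this model implies, written against the MFC DMA intrinsics from IBM's Cell SDK (spu_mfcio.h). The function name, buffer size, and workload are illustrative inventions of mine, and the exact intrinsic signatures should be treated as approximate rather than authoritative:

```c
/* Illustrative SPE-side sketch: pull a block from main memory into the
 * SPE's local store via DMA, process it locally, then write it back.
 * Based on MFC intrinsics from IBM's Cell SDK (spu_mfcio.h). */
#include <spu_mfcio.h>

#define CHUNK 4096                                 /* bytes per DMA transfer */
static float buf[CHUNK / 4] __attribute__((aligned(128)));  /* local-store buffer */

void process_block(unsigned long long ea)          /* ea = effective address in main memory */
{
    const unsigned int tag = 0;

    /* DMA get: main memory -> local store */
    mfc_get(buf, ea, CHUNK, tag, 0, 0);
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();                     /* block until the transfer completes */

    for (int i = 0; i < CHUNK / 4; i++)            /* hypothetical work on the local copy */
        buf[i] *= 2.0f;

    /* DMA put: local store -> main memory */
    mfc_put(buf, ea, CHUNK, tag, 0, 0);
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();
}
```

Real Cell code would typically double-buffer these transfers so that one block is in flight while another is being computed on, which is exactly the asynchrony the DMA-centric design was meant to enable.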
In addition, the PPE, which can run a conventional operating system, has control over the SPEs: it can start, stop, interrupt, and schedule the processes running on them, and it carries additional instructions for this purpose. Unlike the SPEs, the PPE can read and write both main memory and the SPEs' local stores through standard load/store instructions.
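On the PPE side, this control relationship was exposed through the libspe2 library in IBM's Cell SDK. A minimal sketch of launching one SPE program follows; the embedded program handle name (my_spe_program) is hypothetical, and the calls are reproduced from memory of that SDK, so treat them as a sketch rather than a verified listing:

```c
/* Illustrative PPE-side sketch: load and run a program on one SPE via libspe2. */
#include <stdio.h>
#include <libspe2.h>

extern spe_program_handle_t my_spe_program;   /* embedded SPE binary (hypothetical name) */

int main(void)
{
    unsigned int entry = SPE_DEFAULT_ENTRY;
    spe_context_ptr_t spe = spe_context_create(0, NULL);   /* create an SPE context */
    if (!spe) { perror("spe_context_create"); return 1; }

    if (spe_program_load(spe, &my_spe_program) != 0) {      /* load the SPE program image */
        perror("spe_program_load");
        return 1;
    }

    /* Blocks this PPE thread until the SPE program exits; real code would
     * launch several contexts from separate threads to keep all SPEs busy. */
    if (spe_context_run(spe, &entry, 0, NULL, NULL, NULL) < 0) {
        perror("spe_context_run");
        return 1;
    }

    spe_context_destroy(spe);
    return 0;
}
```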
So although the architecture is complete on paper, an SPE is not fully autonomous; it must be started by the PPE before it can do any work. Since most of the system's computing power comes from these co-processors, which depend on DMA for all data transfer and are each limited to a small local store, the design posed an enormous challenge for software developers who had never touched Cell before: code had to be tuned very carefully to extract the processor's full potential.
In truth, the answer IBM handed in looked brilliant but was in practice dauntingly complex. Merely understanding how this processor differed from ordinary ones took real effort, and the prototype's excessive scale kept pushing mass production back again and again. In the end, Sony was forced to face reality and reluctantly cut Cell down.

The cut took away a great deal of performance. The first production Cell processor carried only a single 3.2GHz PPE main core and 8 SPE co-processors; to protect yields, one SPE was disabled, another was reserved for the operating system and audio, and games could call on only 6 SPEs. The chip integrated 234 million transistors, was manufactured on IBM's 90-nanometer SOI, low-k process, and had a die area of 221 square millimeters, putting it on roughly the same scale as Intel's dual-core Pentium D.
But there was no need to be too pessimistic. The Cell chip was never a pure CPU; it also took on some GPU-like duties. In theory the SPE co-processors could handle geometric calculations for physics, audio, and lighting, and even emulate post-processing effects the GPU did not support, such as surface subdivision and compute shaders, an early embryo of the CUDA cores found in today's NVIDIA GPUs.
Indeed, when Cell development began, Sony's plan was for one Cell to serve as the CPU and another to act as the GPU. That sounds like a fantasy, but it was not entirely impossible: manufacturers such as Leadtek later released Cell-based PCI-E cards used to accelerate video decoding.
Moreover, because IBM had kept server needs in mind from the start of development, the design was paired with 256MB of server-grade, high-performance XDR memory. Cell was therefore strong not only in floating-point computation but also in parallel and distributed computing: gather enough Cell-equipped PS3 consoles and you could assemble a supercomputer, something other desktop processors simply could not match.
By rights, with the Cell chip leading the charge, the PS3 should have been unbeatable. In Sony's original vision, dominating the console market was a given, seizing the desktop market was within reach, and the "Sony is great" in-joke seemed about to become reality.
However, all of Sony's dreams began to shatter at the moment of the release of the PS3.
The Failure and Demise of the Cell Processor
Why did it shatter? The answer lies with the PS3 itself.
As mentioned earlier, Cell could take over some GPU functions, but that did not mean it could serve as the GPU outright; graphics still had to be handed to a dedicated GPU. IBM naturally does not make GPUs, so Sony had to turn to the two major graphics vendors, NVIDIA and ATI, and in a scramble a custom RSX chip was squeezed into the PS3, based on the GeForce 7800 series with performance falling somewhere between the G70 and G72.
By then the launch date was near. The 256MB of XDR memory attached to the Cell chip could be used only by Cell itself; the RSX GPU could not share it. To get to market quickly, Sony stuffed in an additional 256MB of GDDR3 memory, adding considerably to the cost.

Not only that: to stay compatible with the earlier PS1 and PS2, Sony also crammed in an extra EE+GS chip so that previous-generation games would run perfectly on the PS3 through hardware backward compatibility, yet another costly concession.
In addition, Sony, Panasonic, and other companies had founded the Blu-ray Disc Association in 2004 to push the next generation of disc formats and compete with the HD-DVD camp. The PS3 was thus also tasked with promoting Blu-ray discs and helping Sony win the format war. Given that the previous-generation PS2 had won the market partly by supporting DVDs, including a Blu-ray drive seemed perfectly reasonable.
Add all of this up and the PS3's cost reached a terrifying level. According to teardown reports, each Cell chip cost about $89, the RSX graphics chip about $129, the Blu-ray drive about $125, and the EE+GS about $27. The chips and the drive alone came to $370, and the total cost reached an estimated $805-840. Bear in mind that the PS3's starting price was only $499, and that figure does not even include early R&D or later marketing costs. Every unit sold meant a net loss of more than $300; even Sony's deep pockets could not absorb such a mess.
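Spelling the arithmetic out with the figures quoted above (taking the $805 low end of the teardown estimate):

$$ \$89 + \$129 + \$125 + \$27 = \$370 \quad \text{(major chips plus drive)} $$

$$ \$805 - \$499 = \$306 \;\Rightarrow\; \text{a loss of over } \$300 \text{ per console, before R\&D and marketing} $$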
With chip after chip piled in, the PS3's power consumption also reached alarming levels. Playing a Blu-ray game could easily push total system power past 200W, and even sitting in the menu interface it hovered around 170W.
Still, the high price was a secondary issue. Did the Cell chip, on which Sony had pinned such high hopes, actually live up to all that publicity?
In practice, the crucial six SPE co-processors around the Cell chip offered three development modes. The mode that could extract the most from the co-processors was also the hardest to develop and optimize for: developers had to bypass the operating system and APIs and program the SPEs directly, which made development dreadfully inefficient. Only four of the six SPEs supported this mode, and from the PS3's birth to its discontinuation, few games were ever developed this way. In 2006, when dual-core processors were already the norm, the Cell chip, in general-purpose terms still essentially a single-core processor, became a nightmare for developers.
What made it more ironic was that Sony's biggest rival, Microsoft, also used an IBM processor in the Xbox 360. But Microsoft did not go through Sony's painstaking development; it simply had IBM customize the Xenon processor, which integrated three 3.2GHz PowerPC cores essentially the same as Cell's PPE main core, paired with an ATI R500-based GPU. The overall architecture was very close to a PC's, which greatly reduced development difficulty; a great many PC games needed only a simple port to land on the Xbox 360, a world of difference from the PS3.
Still, the PS3 as a whole was not without merit. Thanks to its strong support for parallel and distributed computing, it could shine elsewhere.
In 2010, the US Air Force Research Laboratory (AFRL) built a cost-effective supercomputer out of 1,760 PS3s, 168 discrete graphics processing units, and 84 coordinating servers. Codenamed the "Condor Cluster," it was used to process satellite imagery and radar data and for AI research, and AFRL opened part of its computing power to universities and research institutions. The total investment was reportedly about $2 million, with computing performance of 500 TFLOPS, at roughly one-tenth the cost and power consumption of a conventional supercomputer of equivalent performance.
In addition, Sony announced in 2007 that the PS3 had officially joined Folding@home, a distributed-computing project that studies protein folding, misfolding, aggregation, and the diseases they cause. Users could let their idle PS3s run the computing tasks Folding@home distributed. As of September 2008, the participating PS3 consoles contributed 1.2 PFLOPS of computing power, nearly 35% of the project's total at the time.

However, that was as far as Cell went. From its release until support ended, apart from the PS3, IBM servers and supercomputers, and Toshiba TVs, no other electronic product ever adopted this peculiarly designed processor, and its entry into the desktop market never materialized. A processor four to five years in the making exited the semiconductor stage in a decidedly undignified manner.
Of course, some of Cell's ideas can still be found in today's processors, whether NVIDIA's CUDA cores, AMD's APUs, or Apple's latest M-series chips. Perhaps some of their inspiration traces back to this failed chip from IBM, Sony, and Toshiba?
The technical director of Guerrilla Games, the studio behind the PS3 visual showcase Killzone, reminisced about Cell in 2021, maintaining that the processor was more powerful than Intel's CPUs: ahead of its time, but difficult to master in terms of usability and balance.
The saying "one step ahead is a genius, and two or three steps ahead often becomes a martyr" is very suitable to interpret the process of Cell. Perhaps the ambitions of Japanese manufacturers in semiconductors, and the determination to catch up with the United States, gradually disappeared with the departure of Cell.