Dual core evolves computing power

While the technology offers faster processing sans the limitations faced by single-core chips, new licensing policies are bound to cause companies a major headache.

By Caroline Denslow | Published May 8, 2005

[Image: Dual-core computing keeps Moore’s Law alive and stretches the boundaries of what can be achieved using standard architecture. It is especially useful in multitasking environments, such as data mining, mathematical analysis and Web services, and brings the digital home concept much closer to reality.]

Processor speed has long been the measuring stick by which chipmakers prove their industry dominance. However, the soaring power demands and heat problems that come with the perpetual pursuit of speed have forced chip vendors to re-evaluate their processor strategies and look at other ways of gauging performance. In the past two years, AMD and Intel have introduced 64-bit architectures, which doubled the amount of data CPUs can process per cycle. Now, the industry reaches another milestone with the advent of dual-core computing.

Multithreading is not a new concept. Technology experts discussed the possibility of building dual-core systems as far back as the 1990s, but only now have engineers found a way to make it happen. Early forms of computers with multiple processors can be found in the server environment. Intel’s Hyper-Threading (HT) technology is another classic example of giving a chip multiprocessor capabilities. The difference, however, is that HT allows a single chip to operate like two separate processors without implementing two cores on one die, whereas a dual-core processor places two physical cores on a single chip.

Developers see dual-core computing as a way to maintain the momentum of processor performance gains without the challenges that come with raising clock speeds. In his keynote address at this year’s Intel Developer Forum, chief executive Craig Barrett referred to dual core as a logical way to keep up with Moore’s Law.
“If you want to have transistor budgets in the billion or ten billion range, you have to do things a little bit different to continue to double the processing capability, the processing power, on an annual basis. In fact, going to dual core, multi-core approaches are the way you would do that…That enables all sorts of new applications and allows us to continue the basic premise of Moore’s Law: innovate and integrate,” says Barrett.

While dual-core technology has been available for high-end workstations for some time, the first dual-core processors for mainstream use only appeared this month. On April 21, AMD introduced its first dual-core processor line, the Opteron 800 series, targeted at servers with four to eight processors. In the next two months, AMD will roll out three more server chips and a desktop line. Meanwhile, in an effort to beat AMD in the multi-core race, Intel launched its dual-core chips three days ahead of AMD’s launch and is initially offering the processors for desktop use. Intel’s Extreme Edition processors (codenamed Smithfield) will initially be targeted at gamers and PC enthusiasts, but Xeon dual-core server chips will also be released in the first quarter of 2006.

As expected, key OEM customers of the two vendors publicly expressed their support for the technology. Dell and a few others have announced that they will start selling PCs containing the Smithfield chips. AMD partners Sun and HP have also declared plans to ship servers based on AMD’s processors over the next few weeks: Sun with its Sun Fire V40z servers and HP with its ProLiant BL45p and DL585 systems.

While Intel and AMD have a history of launching products targeting the same market, this time they are taking different approaches with their dual-core strategies. AMD’s initial focus will be on servers; Intel will start with dual-core chips for the PC market.
While the desktop-first approach led some industry observers to question Intel’s motives, others think it was a logical move for the chip company. Nathan Brookwood, principal analyst at Insight 64, believes Intel’s decision was driven by its strong track record in the desktop market, and that it is leveraging that reputation to deploy dual core successfully into the mainstream.

The competition between the two vendors is also expected to intensify, with both declaring plans to extend the technology to the mobile space. The first dual-core mobile processor from Intel, codenamed Yonah, is expected to come out at the end of the year, although volume production won’t start until 2006. AMD is also working on its own dual-core mobile versions, and is looking at offering the technology as part of the Athlon 64 and Turion 64 families, an AMD spokesperson says.

Multicore benefits

[Image: Database and web server applications will move to multi-core computing first, Hein Van Der Merwe claims.]

The performance of dual-core systems is expected to be considerably better than that of comparable single-core boxes. AMD claims its new Opteron processors offer up to a 90% performance boost. This is especially advantageous in multitasking environments, such as data mining, mathematical analysis and web services, according to Tarek Heiba, AMD’s regional general manager for the Middle East and Africa region.

“With dual-core processors, AMD expects to see performance increases in multitasking environments, as well as multi-threaded applications,” says Heiba. Ferhad Patel, market development manager, Intel Middle East, Turkey and Africa, agrees that the technology will enable both enterprise and consumer users to process multi-threaded applications. “From real-time business interactions to automated asset tracking, the ability of technology to transform businesses is increasing.
Many of these solutions will demand fast throughput for multiple, simultaneous transactions, which is a perfect fit for dual-core processor-based platforms,” Patel says. “Server virtualisation and consolidation, grid computing, and embedded IT capabilities are helping companies simplify their data centres, improve utilisation, and reduce total costs. Dual-core processors will give IT more granular control to support these and other critical advances,” adds Patel.

According to Hein Van Der Merwe, senior data centre architect, Sun Microsystems Middle East and North Africa, dual core will have the greatest impact on systems used in computer rooms or enterprise-style environments. “The applications that will move to dual core first are the processor-hungry environments like databases and throughput models such as front-end application server and web server environments; in other words, anywhere where high throughput in a system is required, and where lots of threads can be scheduled simultaneously and need to be serviced as quickly as possible,” says Van Der Merwe.

For desktop PC users, dual-core systems mean being able to run high-end applications as easily as any ordinary program, such as an e-mail client or word processor. They also mean working more efficiently, because users can keep working even while the most processor-intensive tasks run in the background: searching a database, rendering a 3D image, ripping and burning music files to a CD, or downloading videos off the Web. This is especially useful for PC security, since it lets users work on their machines while simultaneously running, say, a virus protection program in the background without hurting system performance or causing any disruption.

On the consumer front, Heiba sees multi-core processors pushing the concept of the digital home much closer to reality.
By expanding the role of PCs, dual core will allow these systems to take on new tasks, including serving as the hub for digital entertainment in the home. “As media-centric PCs move into the mainstream, multi-core processors can serve as the foundation for the digital home. Combined with new applications designed for consumer environments, multi-core processor-based media centre PCs will be capable of simultaneously serving different media sources to multiple rooms in the home,” explains Heiba. “This will enable consumers to experience richer features and more functionality, especially for applications like digital media and digital content creation,” he adds.

“PCs can already be wirelessly integrated with cable set-top boxes and TVs throughout the home. With a multi-core processor-based PC serving as the hub of a wireless home network, dad can be surfing the Web in the living room, while his daughter is downloading and playing MP3 audio files in her bedroom, and his son is playing a game on an appliance in the kitchen,” says Heiba.

In the long term, dual-core processing will help enterprises maximise their IT investments and prolong the shelf life of their systems. “Multi-core computers have the ability to run today’s applications as well as tomorrow’s more complex applications, which means that the hardware will retain its value over time,” explains Heiba. “Next-generation software applications will require the performance capacity provided by multi-core processors. Software destined to break barriers in the user experience, like voice recognition and artificial intelligence, will be possible with multi-core processors,” Heiba continues.

Licensing woes

[Image: Microsoft will treat dual-core as one processor and will charge licensing as such, says Mauro Meanti.]

One problem users will have to deal with when it comes to dual-core systems is software licensing.
In the past, software licenses were charged per system, or per processor. With dual-core, the question arises of how software vendors will charge for licenses. Some vendors, like Oracle, will require users to pay a license for each core, while others, such as BEA, plan to raise pricing by 25%. “At BEA we follow a price per physical CPU, which was implemented a long time ago, whereby a dual processor is charged on a premium of 1.25%,” says Diyaa Zebian, general manager, BEA Middle East and Egypt. “We currently have customers in production on dual processors that enables them more processing power with cheaper software cost; however that may also result in a complication in the licensing scheme,” Zebian adds.

According to Ayman Abouseif, Oracle’s senior marketing director for the Middle East and Africa, Eastern and Central Europe, the company will offer customers two kinds of pricing models. “For its technology products, Oracle has two primary pricing models. Customers can choose between Named User Plus and Per Processor pricing models based on their specific needs,” Abouseif says. “Named User Plus is ideal for organisations with discrete and countable user populations. For uncountable populations, processor licensing is required. The Processor pricing model is based on the number of processors a customer has installed and the number of those processors that the customer has operating. This model is easily measured, a fact that makes costs transparent for our customers,” he adds.

Abouseif says Oracle’s licensing structure was modified to keep up with developments in the industry. “Software licensing models evolve as IT environments continue to evolve. Driven by the propagation of n-tier architectures, our current licensing models grew from concurrent user, named user single server and named user multi-server licensing models to processor and named user plus models.
As the software landscape continues to transform, we anticipate that software licensing will continue to transform along with it,” Abouseif explains. The company is also looking at adding employee-based licensing to its existing pricing models, making its licensing structure flexible depending on the size of a company. “In the near term, we expect to augment our existing pricing models with employee-based licensing, which essentially is an enterprise model based on the size of a company. Again, customers are looking for predictability and low costs and we are committed to providing them with pricing models that meet their needs,” comments Abouseif.

Chipmakers, on the other hand, recommend that software licenses be based on the established simultaneous multithreading (SMT) processor-licensing model, which allows existing software to run on multi-core processors without changes. Microsoft is one of the companies adopting this policy. “From a licensing standpoint Microsoft considers a multi-core processor a single processor, independent from the number of cores it contains. After looking at the recommendations from our industry partners in the chip and hardware manufacturing space we determined that a per processor — not per core — licensing model is the right model for customers,” declares Mauro Meanti, general manager, Server and Tools Group, Microsoft Europe, Middle East and Africa.

According to Meanti, Microsoft’s decision was influenced by several factors. “Microsoft has not charged for chip processor improvements in the past and we’re not going to start now. Enterprise computing does not have to be expensive. Customers should benefit from advancements in chip technology without having to pay more for the same software,” Meanti explains. He believes that by maintaining its multi-core licensing policy, Microsoft will be able to continue to drive value in data centres.
“We’re committed to providing flexible licensing programmes that help customers achieve their IT goals and maximise their ROI, as well as help accommodate the dynamic needs of their IT environments. This is an example of that commitment,” Meanti says.

But as with any new technology, analysts suggest users approach dual core with caution. In an online advisory, Gartner advised enterprise users to study the dual-core proposition carefully before they start investing in it. “Consider the dual-core platform for early deployment to workstation users if the resulting productivity increase promises a meaningful business benefit. Prepare for full-scale deployment in 1Q06,” Gartner advises.
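A back-of-the-envelope comparison makes the stakes of these licensing models concrete. The figures below are hypothetical (a notional US$10,000 per license unit, a server with two dual-core chips); only the three charging rules, per core (Oracle), per CPU with a dual-core premium (BEA) and per processor (Microsoft), come from the article.

```python
# Hypothetical list price and machine shape; only the three charging
# rules are taken from the article's description of vendor policies.
LIST_PRICE = 10_000          # notional cost of one license unit (US$)
sockets = 2                  # two physical processors...
cores_per_socket = 2         # ...each with two cores
cores = sockets * cores_per_socket

per_core_cost = cores * LIST_PRICE                  # Oracle-style: bill every core
per_cpu_premium_cost = sockets * LIST_PRICE * 1.25  # BEA-style: 25% dual-core premium
per_processor_cost = sockets * LIST_PRICE           # Microsoft-style: cores are free

print(per_core_cost, per_cpu_premium_cost, per_processor_cost)
# 40000 25000.0 20000
```

On this hypothetical machine, the same software costs twice as much under per-core licensing as under per-processor licensing, which is exactly the headache enterprises are being warned to plan for.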
