Thursday, April 3, 2008

Programming for multicore chips a challenge

Adding cores could create challenges for programmers writing code that lets applications work effectively with multicore chips.

Agam Shah
PC World
Wednesday, April 2, 2008; 11:19 AM

Adding more cores is desirable to meet growing computing demands, but it could create more challenges for programmers writing code that enables applications to work effectively with multicore chips.

As technology develops rapidly, one challenge for developers is adapting to programming for multicore systems, said Doug Davis, vice president of the digital enterprise group at Intel, during a speech Tuesday at the Multicore Expo in Santa Clara, California. Programmers will have to transition from writing for single-core processors to multiple cores, while future-proofing their code so it stays current as additional cores are added to a computing system.

Programming models can be designed to take advantage of multithreading (including Intel's Hyper-Threading), which exploits the parallel processing capabilities of multiple cores to boost application performance cost-effectively, Davis said. Intel is working with universities and funding programs to train programmers to develop applications that solve those problems, he said.
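To make the idea concrete, here is a minimal, hypothetical sketch (not Intel code, and not tied to any specific programming model Davis described) of the kind of change involved: a CPU-bound task split into chunks and spread across cores with Python's standard `concurrent.futures` module. The `count_primes` workload is invented for illustration.

```python
# Hypothetical illustration: parallelizing a CPU-bound task across cores.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """Count primes below `limit` by trial division (deliberately CPU-bound)."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def parallel_count(limits):
    # Each chunk runs in its own worker process, so the operating system
    # can schedule the work onto separate cores.
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(count_primes, limits))

if __name__ == "__main__":
    # Four independent chunks of work, one per worker process.
    print(parallel_count([20000, 20000, 20000, 20000]))
```

On a single-core machine the same code still runs correctly, just without the speedup, which is one way code can be future-proofed against a varying core count.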

Intel, along with Microsoft, has donated $20 million to the University of California at Berkeley and the University of Illinois at Urbana-Champaign to train students and conduct research on multicore programming. The centers will tackle the challenge of writing programs for multicore processors that carry out more than one set of instructions at a time, a technique known as parallel computing.

Beyond future-proofing code for parallelism, adapting legacy applications to work in new computing environments that take advantage of multicore processing is a challenge coders face, Davis said. Writing code from scratch is the ideal option, but it can be expensive.

"The world we live in today has millions of lines of legacy code ... how do we take legacy software and take advantage of new technology?" Coders will need to deliver what's best for their systems, Davis said.
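One common low-risk route for legacy code, sketched here as a hypothetical example (the `process_record` function is invented, not from the article), is to leave the per-item routine untouched and parallelize only the driving loop:

```python
# Hypothetical sketch: parallelize a legacy loop with minimal change.
from multiprocessing.dummy import Pool  # thread-backed pool, same API as Pool

def process_record(record):
    # Stand-in for untouched legacy logic.
    return record * 2

records = list(range(10))

# Before: results = [process_record(r) for r in records]
# After: the same function, mapped across a pool of workers.
with Pool(4) as pool:
    results = pool.map(process_record, records)

print(results)  # map preserves input order
```

Because only the loop changes, the legacy function itself never needs a rewrite, which matters when the original code is expensive to replace from scratch.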

Every major processor architecture has changed quickly at the pace described by Moore's Law, which predicts that transistor counts, and with them processor performance, roughly double every two years. Now, however, the challenge is to deliver that performance within a defined power envelope. Power consumption is driving multicore chip development, and programmers need to write code that works within that envelope, Davis said.

Adding cores to a chip is a more power-efficient way to boost performance than cranking up the clock frequency of a single-core processor, Davis said: it increases performance while holding power consumption down.
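A back-of-the-envelope calculation shows why. Using the textbook dynamic-power model P ~ C·V²·f, with the common first-order assumption that supply voltage must scale roughly with frequency, the numbers below are purely illustrative and not from the article:

```python
# Sketch of the textbook dynamic-power model P ~ C * V^2 * f.
def relative_power(cores, freq_scale, volt_scale):
    # Power relative to one core at base frequency and voltage.
    return cores * (volt_scale ** 2) * freq_scale

# Option A: one core at double the clock, voltage scaled up with frequency.
one_fast_core = relative_power(cores=1, freq_scale=2.0, volt_scale=2.0)

# Option B: two cores at the base clock and base voltage.
two_base_cores = relative_power(cores=2, freq_scale=1.0, volt_scale=1.0)

print(one_fast_core)   # 8.0x baseline power for ~2x throughput
print(two_base_cores)  # 2.0x baseline power for ~2x throughput (if parallel)
```

Under these assumptions, doubling throughput by adding a core costs roughly a quarter of the power of doubling the clock, provided the software can actually use both cores.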

In 2007, about 40 percent of desktops, laptops and servers shipped with multicore processors. By 2011, about 90 percent of PCs shipped will be multicore systems, and almost all Windows Vista PCs shipping today are multicore, Davis said.

Intel is also working on Polaris, an 80-core research chip that delivers teraflop-level performance.

"We're not only talking about terabit computing, but the terabyte sets [of data] we can manage," Davis said. Users are consuming and storing tremendous amounts of data now, and within a few years the total should reach zettabytes, he said.

The next "killer" application for multicore computing could be tools that enable the real-time collection, mining and analysis of data, Davis said. For example, military personnel using wearable multicore computers could simulate, analyze and synthesize data in real time to show how a situation will unfold, without putting those personnel at risk, he said.

"These types of applications have taken weeks to do ... now these types of applications are literally running in minutes," Davis said.

As cores are added, the performance boost may also enable new applications, Davis said. The oil and gas industry will demand one petaflop of computing capacity in 2010, compared to 400 teraflops in 2008, to cost-effectively collect seismic data, compare it with historical data and analyze it. Explorers can now collect and analyze that data much faster than in the past, Davis said.

Wednesday, March 12, 2008

Remote Backup Services


From Woodland Hills, California:
03-12-08:

WHAT IF DISASTER WERE TO STRIKE YOUR BUSINESS?

HOW MUCH IS YOUR BUSINESS DATA WORTH?

It's more than just electric charges on a magnetic medium; it's the lifeblood of your business. It holds all of your budgets, product designs, contracts, sales information, emails and all the other business correspondence that makes your business successful.

WestCoastComputer is proud to partner with IBackup Solutions, the leader in remote backup solutions.

PC World says of IBackup, "Of the 17 services we tried, our favorite backup service is IBackup." PC Magazine's Editor's Choice review calls the service "very intuitive and the best of the bunch."


LEARN MORE & FREE TRIAL.


With IBackup for Windows, you can back up and restore interactively or schedule regular online backups for Windows desktops, laptops and servers. It has the look and feel of the native Microsoft Windows Explorer, coupled with powerful scheduling and logging features.

IBackup for Windows automatically selects critical data (like Microsoft Outlook, Outlook Express, My Documents, Financial Files and other Office files) with its file selection wizards for backup. Advanced features include Open file Backup, System State backup, MS SQL Server, MS Exchange Server and Oracle Server backups.



General Features

· Incremental and compressed backups that greatly reduce network bandwidth usage by transferring only the modified portions of a file
· 128-bit Secure Socket Layer encryption during transmission
· Backup/restore using mirror and relative path options
· Easy-to-use wizards for interactive backups, restores and scheduling
· Provision to restore from earlier snapshots of files maintained in the IBackup account
· Provision to regulate Internet bandwidth usage with the Bandwidth Throttle feature
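The incremental-backup idea above can be sketched in a few lines. This is a hypothetical illustration of the general block-level technique, not IBackup's actual protocol: hash each fixed-size block, compare against the hashes from the previous run, and transfer only the blocks whose hashes changed.

```python
# Hypothetical sketch of block-level incremental backup.
import hashlib

BLOCK_SIZE = 4096

def block_hashes(data: bytes):
    """SHA-256 hash of each fixed-size block of the file contents."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old_hashes, new_data):
    """Return (index, block) pairs that would need to be re-uploaded."""
    new_hashes = block_hashes(new_data)
    return [(i, new_data[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE])
            for i, h in enumerate(new_hashes)
            if i >= len(old_hashes) or h != old_hashes[i]]

old = b"a" * BLOCK_SIZE + b"b" * BLOCK_SIZE
new = b"a" * BLOCK_SIZE + b"c" * BLOCK_SIZE  # only the second block changed
delta = changed_blocks(block_hashes(old), new)
print(len(delta))  # 1 -> a single 4 KB block goes over the wire
```

For a file where only a small region changed, this sends one block instead of the whole file, which is the bandwidth saving the feature list describes.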

Scheduling Features


* Automatic selection of the most common user data and application data types
* ‘Automatic Power off’ option after a scheduled backup (works with Windows 2000/NT/XP)


Advanced Features


* Supports Open file Backup for most common application data types
* Allows backup of critical system-related components using ‘System State Backup’
* Ability to back up MS SQL Server databases without stopping the database services
* Online backup of MS Exchange Server databases and mailboxes without interrupting the running MS Exchange Server services
* Ability to back up Oracle Server databases without stopping the Oracle Server database services
* Supports one-way sync from the local machine to the IBackup account


System Requirements


* Windows Vista, XP, 98, Me, NT/2000, Windows 2000 Server, Windows 2003 Server, Windows 2003 Web Edition
* Internet Explorer 5.0 or higher
* 64 MB RAM; 10 MB free hard disk space for the installed program, 20 MB or more recommended for local caching



LEARN MORE & FREE TRIAL












