64-bit processors are becoming the standard for systems ranging from desktop computers to scalable servers. Traditionally, most business computer systems have run on 32-bit processors. The main advantage of 64-bit processors is the far greater amount of memory they can address. A 32-bit processor can address at most 4 gigabytes (2^32 bytes) of RAM, and that space is split between the operating system and any running applications. A 64-bit system can use a terabyte (1,000 GB) of RAM or more.1 This lets users work with much more data and analyze larger, more complex problems than ever before. For example, users with spreadsheets over 2 GB in size (which is not uncommon) simply could not open those files on a 32-bit system, but a 64-bit system handles them with no problem.2
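The 4 GB ceiling falls directly out of pointer width, which a short calculation makes concrete (an illustrative sketch in Python, not part of any Palisade product):

```python
# Address-space arithmetic behind the limits described above.
GIB = 2 ** 30        # one gibibyte, in bytes
addr_32 = 2 ** 32    # bytes addressable by a 32-bit pointer
addr_64 = 2 ** 64    # theoretical ceiling for a 64-bit pointer

print(addr_32 // GIB)   # 4 -- the familiar 4 GB limit
print(addr_64 // GIB)   # 17179869184 GiB, i.e. 16 exbibytes
```

In practice, operating systems expose only a fraction of the 64-bit theoretical range, but even that fraction dwarfs the 4 GB available to a 32-bit process.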
Windows Vista and Windows 7, as well as Office 2010, are available in 64-bit versions. Nearly half of the world’s PCs already run 64-bit versions of Windows.3 More and more PCs ship with 64-bit processors and 64-bit Windows as the default operating system. And as the price of memory continues to fall, it is increasingly affordable to include large amounts of RAM to take advantage of the computing power 64-bit systems offer.
More Memory, Cheaper Hardware
The ability to access much more memory lets 64-bit systems process more data per clock cycle, which in turn enables more scalable, higher-performing computing configurations.4 It also allows for the development of far more sophisticated models and software applications.
The expansion of 64-bit processors comes at a time when hardware prices continue to fall. Virtually all PCs have shipped with multicore processors for some years now, and the price of RAM has fallen steadily. This means that very powerful desktops, servers, or clusters of multiple computers can be assembled relatively inexpensively, bringing what was once considered supercomputing power to the mainstream business audience.
As a result of these trends, High Performance Computing (HPC) systems are becoming commonplace. These HPC systems are clusters of multicore or multiprocessor servers working in parallel to crunch more data, faster and more often. Combined with 64-bit architecture, HPC systems deliver a dramatic improvement in performance.
64-bit computing also allows for more efficient multitasking, stress testing, data encryption, and other functions where 32-bit systems fall short. And 64-bit systems support large files more easily. For instance, a 4 GB file (which is not unusual) cannot be fully memory-mapped on a 32-bit system; only portions of the file can be accessed at a time. On a 64-bit system, the entire file can be mapped and accessed at once.5
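The mapping limitation can be illustrated with a minimal Python sketch (the function name is our own, and the behavior described in the docstring assumes a file near or above the 4 GB mark):

```python
import mmap


def map_whole_file(path):
    """Map an entire file into the process address space.

    In a 32-bit process, mapping a file near or above 4 GB fails
    because the mapping cannot fit in the address space; a 64-bit
    process can map the whole file and access any byte directly.
    """
    with open(path, "rb") as f:
        # Length 0 means "map the entire file".
        return mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
```

Once mapped, the file behaves like an in-memory byte sequence, which is what makes whole-file access on 64-bit systems so convenient.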
Palisade’s Support for 64-bit Computing
Palisade’s @RISK and DecisionTools Suite analytical software have been written to take advantage of the increased processing power of the 64-bit versions of Excel 2010 and Windows 7. Users with large models will likely notice improved speed in simulation and optimization run times, enabling them to run larger simulations or perform analyses more frequently. In addition, computationally intensive analyses such as stress testing, advanced sensitivity analysis, and combined simulation and optimization will run more efficiently. Furthermore, 64-bit @RISK and DecisionTools Suite enable the creation of larger models than previously possible. Given the amount of memory available to 64-bit systems, there is virtually no limit to the size or complexity of spreadsheet models or VBA applications running in spreadsheets.
1. What Is 64-bit Computing? (Webopedia)
2. 64-bit editions of Office 2010 (Microsoft TechNet)
3. Windows 7 boosts 64-bit computing (ComputerWorld)
4. Microsoft 64-Bit Computing (Windows Server 2008 R2)