In the first article in this series I discussed CPUs. In the second, I looked at GPUs. In this final article in the series I’ll look at the other components as well as the operating system that we recommend.

Memory

At the moment our software is still 32 bit, although we have built and tested 64 bit versions already. Because it is 32 bit, the most RAM it can use at one time is 2 GB (on a 32 bit Windows system) or 4 GB (on a 64 bit Windows system). (Technically, it is possible to use up to 3 GB on a 32 bit Windows system using the /3GB switch in the boot.ini file, but this is likely to cause problems if you are using a graphics card with lots of memory.)
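The limits above can be captured in a small lookup table. This is just a sketch: the large-address-aware note and the 8 TB figure are standard Windows behaviour rather than anything specific to our software.

```python
# Maximum virtual address space available to an application, in GB.
# The 32 bit figures are those discussed above; note that a 32 bit app
# only gets the full 4 GB on 64 bit Windows if it was linked with the
# large-address-aware flag.
ADDRESS_SPACE_GB = {
    ("32-bit app", "32-bit Windows"):        2,
    ("32-bit app", "32-bit Windows + /3GB"): 3,     # boot.ini switch; risky with big-VRAM cards
    ("32-bit app", "64-bit Windows"):        4,
    ("64-bit app", "64-bit Windows"):        8192,  # 8 TB user address space on Windows 7 x64
}

def max_address_space_gb(app, windows):
    """Look up the usable address space for a given app/OS combination."""
    return ADDRESS_SPACE_GB[(app, windows)]
```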

At the moment, 4 GB of RAM costs about A$44, including GST. (A$1 = US$1 right now.)

In light of that, having less than 4 GB of RAM seems like a false economy. In fact, we recently upgraded our development PCs from 6 GB of RAM to 12 GB of RAM — an upgrade that costs A$120 at the moment — because even though each individual program might only be able to use 4 GB of RAM, there’s nothing to stop you from running multiple copies of each program at the same time. Furthermore, the more RAM you have in your system, the more the OS can use to cache your disk accesses, which are about three orders of magnitude slower than RAM accesses.
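To illustrate why the extra RAM pays off in disk caching, here is a back-of-envelope calculation. The 4 GB "in use" figure is an assumption for the sake of the example, and the 60 MB per decompressed image comes from the arithmetic later in this post.

```python
# How many decompressed images could the OS keep in its disk cache?
total_ram_mb = 12 * 1024   # our upgraded development PCs
in_use_mb = 4 * 1024       # assumed: the OS plus running applications
image_mb = 60              # one decompressed 21-megapixel image
cached_images = (total_ram_mb - in_use_mb) // image_mb
print(cached_images)       # 136 images served from RAM instead of disk
```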

So, if you have a PC with dual memory channels, I would recommend at least 4 GB of RAM and preferably 8 GB. If you have a PC with three memory channels, I’d recommend 6 GB of RAM and preferably 12 GB.

Hard Disk

Given how important the CPU, RAM, and graphics card are to our software, it is perhaps a little surprising that the speed of the hard disk isn’t all that important. The software tends to load the data at the start and save it all at the end (the way you use Word or Excel) rather than constantly reading from and writing to a database (the way Access does). There are some exceptions (e.g. the writing of .ori files during relative-only point generation, and 3DM CalibCam reading images on demand when they can’t all fit in memory at once), but the overall rate of data transfer is relatively low. Some customers even keep their projects on a network drive and do all their disk accesses over the network! (Personally, I like to copy the project locally.)

Of course, having a large hard drive can be beneficial, given the volume of data being used. Even with JPEG compression, images from modern DSLRs are in the 10-20 MB range each, and a large project can have hundreds of images in it. The DTMs generated from those images can easily add up to a few GB. Fortunately hard disks are down to below A$50 per 1000 GB now, so 1–2 TB of storage in a PC is unexceptional.
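A quick sum shows why 1–2 TB is comfortable. The image count and DTM total below are assumptions, chosen to be consistent with the figures above.

```python
# Rough size of one large project.
num_images = 300                 # "hundreds of images" (assumed count)
jpeg_mb = 15                     # middle of the 10-20 MB range
dtm_gb = 3                       # "a few GB" of generated DTMs (assumed)
project_gb = num_images * jpeg_mb / 1024 + dtm_gb
print(round(project_gb, 1))      # about 7.4 GB per project
```

Even at that size, a 1 TB drive holds well over a hundred projects.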

Operating System

Earlier I mentioned that 2 GB was the limit for 32 bit applications running on 32 bit versions of Windows. There was a time when this was considered a lot of memory, but cameras produced much smaller images then!

The Canon EOS 5D Mark II produces 21 megapixel images. When saved in JPEG format these may only take 15 MB on disk, but the software can’t use the images when they are in that format — it’s only for storage. When it reads those images into RAM, it must decompress them into raw RGB format, and that requires three bytes per pixel, or just over 60 MB per image.
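The 60 MB figure falls out directly from the arithmetic:

```python
def raw_rgb_mb(megapixels, bytes_per_pixel=3):
    """Decompressed RGB size in MB: one byte each for R, G and B per pixel."""
    return megapixels * 1_000_000 * bytes_per_pixel / (1024 * 1024)

print(round(raw_rgb_mb(21), 1))  # 60.1 -- "just over 60 MB"
```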

Merge a handful of those images together and you’re talking about 100 megapixels and 300 MB per image. Now load two of those into memory and process them to generate a DTM. Now create a texture from one of those images so the graphics card can drape it over the DTM. Pretty soon you’ve got a problem — not necessarily because you have actually run out of memory, but because the memory you do have has become fragmented, and no remaining chunk is big enough to satisfy the next big allocation request the software makes. In our experience, 32 bit Windows starts reporting allocation failures once about 1.5 GB of that 2 GB limit has been used.
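Fragmentation is easy to demonstrate with a toy model. The sketch below treats the address space as a line of megabytes and finds the largest free hole; the allocation layout is invented for illustration and has nothing to do with how the actual Windows heap lays out memory.

```python
def largest_free_block(space_mb, in_use):
    """in_use: list of (start_mb, size_mb) allocations; returns the biggest free hole."""
    holes, prev_end = [], 0
    for start, size in sorted(in_use):
        if start > prev_end:
            holes.append(start - prev_end)   # gap before this allocation
        prev_end = max(prev_end, start + size)
    holes.append(space_mb - prev_end)        # gap after the last allocation
    return max(holes)

# A 2 GB address space with three 300 MB images scattered through it:
in_use = [(200, 300), (800, 300), (1500, 300)]
total_free = 2048 - 3 * 300                # 1148 MB free in total...
print(largest_free_block(2048, in_use))    # ...but the biggest hole is only 400 MB
```

So an attempt to allocate one more 500 MB buffer fails even though more than a gigabyte is nominally free.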

With 64 bit Windows things look very different. The theoretical limit has only doubled, but the practical limit increases from about 1.5 GB to about 3.5 GB, meaning well over twice as much data can be handled — just by using a 64 bit version of Windows.
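In terms of the 300 MB merged images above, the difference is stark. The limits are as quoted; the division is mine.

```python
merged_image_mb = 300
practical_32bit_mb = 1.5 * 1024    # 32 bit app on 32 bit Windows
practical_64bit_mb = 3.5 * 1024    # the same 32 bit app on 64 bit Windows
print(int(practical_32bit_mb // merged_image_mb))  # 5 merged images at once
print(int(practical_64bit_mb // merged_image_mb))  # 11 merged images at once
```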

It is still possible to run out of memory, so a 64 bit version of the software is still worthwhile, but the maximum practical project size is much larger, and most people using 64 bit Windows are unlikely ever to hit that limit.

For this reason we recommend using a 64 bit version of Windows rather than a 32 bit version.

But which version?

When Vista first came out there was a lot of negative publicity. However, we adopted it immediately and had no real problems with either our software or the other software we use.

Likewise, we switched to Windows 7 as soon as it came out and there weren’t any compatibility problems with our software at all.

Since we use 64 bit Windows 7 (Ultimate) on our development machines, that’s a pretty safe bet — it’s the one most likely to have been tested! But we also officially support Vista and Windows XP, so any of those should be fine, with one caveat — the 64 bit version of Windows XP was never widely supported by hardware vendors, so unless your hardware is very mainstream you may have difficulty obtaining drivers for it.

(We actually conducted a customer survey exactly a year ago to see what our customers were using, so we could determine whether it was safe to raise our minimum supported OS version. Overwhelmingly, the response was Windows XP. Strictly speaking, we asked what the minimum supported OS should be, so someone able to run both Windows XP and Vista would still answer Windows XP — but most respondents commented that they weren’t able to use anything newer than Windows XP. One even said that in an organisation with 2000 PCs, all were running Windows XP except one that was running Vista! Interestingly, we have had quite a few emails recently from customers’ IT departments asking whether our software is compatible with Windows 7, so it seems that many are now making the transition.)

Summary

It’s probably worth summarising the recommendations from all three posts in one place, so here they are:

CPU

Intel Core i7, i5, and i3, and AMD Phenom II, with more expensive = better. (Note that when comparing workstation versions of CPUs with their desktop counterparts, like Intel Xeon vs Core and AMD Opteron vs Phenom, the more expensive = better rule does not necessarily apply.)

GPU

NVIDIA and AMD/ATI, with more expensive = better. (Note that when comparing workstation versions of GPUs with their desktop counterparts, like NVIDIA Quadro vs GeForce and AMD FirePro vs Radeon, the more expensive = better rule does not necessarily apply.)

The one I would advise against is Intel, and this is worth emphasising because a lot of notebooks come with Intel embedded graphics. The ones we have seen have very poor and out-of-date OpenGL support, and are so slow as to be unusable anyway.

RAM

At least 4 GB (A$44!); more will increase productivity.

Hard Disk

Not especially important, but plenty of storage is useful — 2 TB for A$99!

OS

64 bit strongly recommended, Windows 7 preferred, Vista and XP supported.

Customers often ask us to advise which PC to pick from a set of options, and we are more than happy to oblige. If you need any advice, don’t hesitate to ask!