On 64-bit architectures, the address space encompasses 16 exabytes. Mac OS X 10.5, for example, can address that much virtual memory, although it "only" supports 4 terabytes of physical RAM.
Given that my first computer had only 4K of RAM, this is an amazing number of bytes. I thought my first hard drive, a healthy 5 MB, was plenty big at the time.
To put an exabyte into perspective: the jump from my first hard drive to a terabyte (you can buy a terabyte drive these days for around $300 US) is roughly the same order of magnitude as the jump from a terabyte to an exabyte. Each time hard drive technology improves you hear people wonder what good so much space could possibly be, yet we never seem to have any trouble finding a use for it.
A single movie compressed on a DVD is around 5 GB, an HD movie around 25 GB. Working with raw HD data (even that is usually compressed somewhat) you need much more. So imagine you need 500 GB to store a modest HD movie during editing; in an exabyte you could work on 2 million HD projects at a time.
In the early '90s a copy of Deltagraph (Mac) was around 2.5 MB and shipped on several floppy disks. An exabyte could hold roughly 400 billion copies.
The MMOG I play (Battleground Europe) uses about 700 MB of RAM during gameplay. An exabyte would have room for roughly 1.4 billion sessions' worth of that data.
Google's data for its search engine is apparently around 1 petabyte or so, a mere 0.1% of an exabyte. In a 64-bit address space you could fit it 16,000 or so times.
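The arithmetic behind these comparisons is easy to check. A minimal sketch in Python, using the rough figures from above (500 GB per HD project, 2.5 MB per Deltagraph copy, 700 MB per game session) and taking "exabyte" as the decimal 10^18 bytes:

```python
# Back-of-envelope checks for the storage comparisons above.
# "Exabyte" here is the decimal 10**18 bytes; the full 64-bit
# address space is 2**64 bytes (about 16 binary exabytes).

EB = 10**18
ADDRESS_SPACE = 2**64               # bytes addressable with 64-bit pointers

hd_projects   = EB // (500 * 10**9) # 500 GB per HD editing project
deltagraphs   = EB // 2_500_000     # 2.5 MB per copy
game_sessions = EB // (700 * 10**6) # 700 MB of RAM per session
petabytes     = ADDRESS_SPACE // 2**50  # 1 (binary) PB datasets that fit

print(hd_projects)    # 2000000           -> 2 million HD projects
print(deltagraphs)    # 400000000000      -> 400 billion copies
print(game_sessions)  # 1428571428        -> ~1.4 billion sessions
print(petabytes)      # 16384             -> "16,000 or so" times
```

The decimal/binary distinction (10^18 vs. 2^60) is why some of these figures wobble by 10-15% depending on who is counting.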
It's a number so big it seems pointless to consider: who would ever use this much data, either on disk or in virtual memory? Yet technology continues to discover reasons to use more and more storage. I can imagine that some day the division between permanent storage (disk) and RAM will vanish; everything you work with will exist in a single address space. Seen that way, an exabyte isn't as far off as it appears.
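A hint of this disk/RAM blurring already exists in memory-mapped files: the operating system maps a file's contents into the process's address space, so file bytes are read and written as ordinary memory. A minimal sketch using Python's standard `mmap` module (the scratch file and its contents are made up for illustration):

```python
import mmap
import os
import tempfile

# Create a small scratch file to map (contents are just an example).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello, exabyte world")
    path = f.name

with open(path, "r+b") as f:
    mem = mmap.mmap(f.fileno(), 0)  # map the whole file into memory
    print(mem[:5])                  # b'hello' -- reading memory reads the file
    mem[:5] = b"HELLO"              # writing memory writes the file
    mem.flush()
    mem.close()

with open(path, "rb") as f:
    print(f.read())                 # b'HELLO, exabyte world'

os.remove(path)
```

Scale that idea up to a 64-bit address space and the line between "on disk" and "in RAM" becomes mostly a caching detail the OS worries about, not the programmer.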
The nasty fly in this exabyte ointment is of course software. How do we develop software that can take advantage of an almost limitless address space, not to mention tens, hundreds, or even thousands of processors, or enormous grids of such machines? Somehow software evolution has to speed up or all that hardware potential will be wasted.
Another fly people don't think about (but Google does) is power: at some point you have to provide juice for all those bytes.
So this is mere speculation for now; we still have a long way to go before a 64-bit address space seems tight. Perhaps by then we can simulate the human brain and let the computer figure out what to do next.
Of course, by the time 128-bit address spaces are needed, we might all be obsolete and it won't matter.