I started using my new work dev machine this week. The key features are 3GB of memory and 2x160GB SATA drives arranged as RAID0, so it performs reasonably well. Complete with a fast processor, something like this should be considered a standard developer configuration these days. The old machine had only 512MB of memory, which is nowhere near enough for development when you want to run several instances of Visual Studio, the application you're working on (in my case a large server application), Office apps, browser instances, Virtual PC, and an aggregator.
The graphics card is a Radeon X800 XT, which should hopefully be sufficient for the graphics requirements of Longhorn. When the beta arrives sometime this year (?) I'd like to experiment with installing it onto an external USB 2.0 drive so that I can boot from that when I want to play with Longhorn, yet leave my development environment untouched on the drives inside the machine.
I chose a 20" LCD monitor, which looks very good at 1600x1200. I experimented with dual 19" monitors at 1280x1024 but found that my preferred Visual Studio layout on one monitor (Output/Find Results/etc. on the left, 80 columns of code in the middle, and Solution Explorer on the right) was too cramped, and I didn't want to keep glancing at the other monitor when working on code. Walking round the office I did cringe a bit to see people working with Visual Studio maximised on a 1600x1200 screen, with code extending across the full width of the screen. I don't particularly like working with lines of code stretching to 200-odd characters, or the excessive levels of indentation that become too much of a temptation. What happens when developers move to 30" screens?
The only aspect of development that still drags is ClearCase. It is horrendously slow over a broadband/VPN connection, and I can't begin to understand why tasks as simple as checking out a single file take so long. Maybe the original developers were averse to premature optimization.