As I previously noted, Apple's 20-inch iMac has a life-cycle carbon footprint of 870 kg CO2e, of which 382.8 kg comes from production and 435 kg from a 4-year use phase (the rest presumably from transport and recycling). Apple doesn't explain its methodology, though. As a sanity check, I looked at a 2004 paper by Eric Williams, which uses a hybrid (process + economic input-output) LCA method to compute the energy intensity of computer production. Mapping his electricity consumption to the US average grid (even though little of that production actually takes place here) and all direct fossil fuel use to natural gas (which might be too optimistic), and using our internal LCI data for electricity and natural gas (from CarbonScopeData), I get 321.44 kg CO2e for a desktop computer with a 17-inch CRT monitor and a 733 MHz processor produced in 2000 -- very close to the iMac's production footprint. The iMac, of course, runs a 3.06 or 3.33 GHz processor: much better performance at nearly the same footprint as the older machine, assuming the methodologies are comparable.
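To make that mapping concrete, here's a minimal sketch of the sanity check. The energy figures below are illustrative placeholders, not Williams' actual numbers, and the emission factors are rough public values standing in for our CarbonScopeData LCI data; I picked the placeholders only so the output lands near the 321 kg result.

```python
# Sanity-check sketch: production energy -> kg CO2e.
# ELECTRICITY_KWH and DIRECT_FUEL_MJ are placeholders, not the actual
# split from Williams (2004); the EF values are rough public figures,
# not our CarbonScopeData LCI data.

ELECTRICITY_KWH = 400.0    # placeholder: electricity used in production
DIRECT_FUEL_MJ = 1500.0    # placeholder: direct fossil fuel, mapped to natural gas

EF_US_GRID_KG_PER_KWH = 0.60   # US average grid, illustrative
EF_NATGAS_KG_PER_MJ = 0.056    # natural gas, illustrative

footprint = (ELECTRICITY_KWH * EF_US_GRID_KG_PER_KWH
             + DIRECT_FUEL_MJ * EF_NATGAS_KG_PER_MJ)
print(f"Production footprint: {footprint:.1f} kg CO2e")  # ~324 kg
```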
Next question: assuming an average useful lifetime of 4 years, does it make sense to replace your computer with a more energy-efficient machine before the 4 years are up? Using the iMac example, suppose you bought a newer computer (consuming 20% less energy in the use phase, with similar production-phase energy use and carbon emissions) after only 3 years with the existing one. It would take 4 more years -- the entire use phase of the new computer -- just to break even on total carbon emissions. This assumes the old computer is recycled or sits in your garage, rather than being used by someone else.
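Here's a short sketch of one way the arithmetic works out. The framing is an assumption on my part: "break even" means matching the average annual footprint of simply running the existing machine for its full 4 years, and the new machine's production footprint is taken to equal the old one's.

```python
# Break-even sketch using the iMac numbers from above.
# Assumption (mine): "break even" = matching the average annual footprint
# of using the old machine for its full 4-year life; the new machine's
# production footprint is assumed equal to the old one's.

PRODUCTION_KG = 382.8                        # kg CO2e, production phase
USE_PER_YEAR_OLD = 435.0 / 4                 # 108.75 kg CO2e/yr over 4 years
USE_PER_YEAR_NEW = 0.8 * USE_PER_YEAR_OLD    # 20% less energy: 87 kg CO2e/yr

# Baseline: keep the old machine for its full 4 years.
baseline = (PRODUCTION_KG + 4 * USE_PER_YEAR_OLD) / 4   # ~204.5 kg CO2e/yr

def replace_early_annual(n_new_years):
    """Average annual footprint if we swap after 3 years, then
    run the new machine for n_new_years."""
    total = (PRODUCTION_KG + 3 * USE_PER_YEAR_OLD     # old machine, 3 years
             + PRODUCTION_KG + n_new_years * USE_PER_YEAR_NEW)
    return total / (3 + n_new_years)

for n in range(1, 6):
    print(n, round(replace_early_annual(n), 1), "vs baseline", round(baseline, 1))
```

Under these assumptions, the replace-early path is still a hair above the baseline at n = 4 (205.7 vs 204.5 kg CO2e/yr) and only catches up just past the 4-year mark, consistent with the break-even claim above.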
Computer production is very energy- and carbon-intensive, so it makes good carbon sense to keep using our old computers as long as possible before buying new ones.