A couple of weeks ago I was compelled to open the case on my balky computer and dig into its guts. My goal was diagnosing a long-running and progressively worsening series of program crashes and operating-system reboots, all of which were crimping my productivity and putting my data at risk.
It took more hours than I would have liked, but in the end I had my culprit: a bad stick of DDR2 memory, now upgraded and replaced. Along the way I also updated the BIOS for my computer, stress-tested and reconfigured various bits of hardware and software, and killed several trojans and a dormant worm.
I am now suffering no computer ills. My machine is running like an electronic top. I’m confident going forward that I have a stable platform from which to work, and that’s no small comfort given that I hope to do a great deal of writing over the next nine months. My computer is, after all, my workshop, and I don’t need a workshop that blinks out at random intervals.
Optimus Perfecticus
While diagnosing my computer problems I ran a series of tests, including MemTest86+, which proved decisive. To run that program I had to download and install it, which I managed after a couple of faltering attempts to decipher the geek-speak instructions.
While performing this relatively simple task I found myself confronting an age-old debate that seems almost intrinsic to human existence:
When should you hire someone to do a job for you, and when should you do it yourself?
The answer, always, is found at the intersection of time and money. How much will it cost, and how long will it take, either to pay someone to solve the problem or to do it yourself? (Here I'm assuming that the goal is not one of self-satisfaction, but simply solving a problem by the most effective means.)
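To make that intersection concrete, here is a rough sketch of the calculation in Python. The function name, dollar figures, and hourly rate below are purely hypothetical, chosen to illustrate the comparison rather than to document any actual repair:

    # A minimal sketch of the time-versus-money tradeoff described above.
    # All names and numbers here are hypothetical illustrations.

    def should_hire(pro_cost, pro_days, diy_hours, my_hourly_rate, diy_days):
        """Return 'hire' or 'DIY' based on total cost and elapsed time."""
        diy_cost = diy_hours * my_hourly_rate   # what my own time is worth
        if diy_cost <= pro_cost and diy_days <= pro_days:
            return "DIY"                        # cheaper (or equal) and no slower
        if pro_cost <= diy_cost and pro_days <= diy_days:
            return "hire"                       # cheaper (or equal) and no slower
        # Otherwise cost and time pull in opposite directions; let the
        # tighter constraint of the moment decide.
        return "hire" if pro_days < diy_days else "DIY"

    # Example: a shop charges $120 and takes 3 days; doing it myself takes
    # 5 hours spread over 1 day, and my working time is worth $30 an hour.
    print(should_hire(pro_cost=120, pro_days=3, diy_hours=5,
                      my_hourly_rate=30, diy_days=1))

In this made-up case the shop is cheaper but slower, so the decision turns on whichever constraint, time or money, is tighter at the moment.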