Is It Wrong To Tweak Your Device?


Barbara Krasnoff, Contributor

October 4, 2007


My first computer (back in 1983) was a Compaq Portable, a 28-pound DOS machine with two 5.25-inch floppy drives and a 9-inch display. I was delighted with my new purchase -- until I found the small sticker on the back of the machine that said if I opened it (to, say, add memory), I would void the warranty. Say what?

I immediately called support (which in those days was staffed by real techs who didn't have scripts to follow). The guy on the other end of the line admitted that yes, according to the Powers That Be, only authorized repair staff were supposed to open the box. What he "couldn't" tell me was the exact place to press down on the case to release the cover -- and he "didn't" tell me twice, to make sure I got all the details.

The tradition of users being able to tweak their hardware goes back well past computers, to one of the first mass-market technological products: the automobile. From the beginning, owners were tweaking their machines, adding features to improve performance and ease of use. Even today, when automobiles are so computerized that you need a degree in engineering to understand how they work, people still like to get their hands dirty -- make their own repairs, try a different brand of oil filter, add some cool detailing. And automobile manufacturers understand that -- if you fry your electrical system with that snazzy new stereo system, the warranty won't cover it, but it will still take care of your faulty transmission.

So when computers started to appear on desktops, many of those purchasers followed the same tradition -- and, after some initial hesitation, it was encouraged by many manufacturers. Desktops were engineered so that it would be easier for users to add memory, expansion cards, or internal drives, and support resources included instructions on how to add them properly.

There were exceptions. A few -- the first compact Macs, for example -- were contained in a single sealed unit, and users who wanted to mess with their computers had to find a way to pry the case open (using a tool that, I'm told, many termed a "Mac cracker"). This was, obviously, not encouraged by the manufacturer.

Perhaps that explains the reaction to a recent blog post written by my colleague Alexander Wolfe covering Apple's action against iPhone buyers who had either unlocked their iPhones or added unauthorized applications. He reported on a possible class action suit against Apple by purchasers who objected when Apple not only wiped the offending software from their iPhones during an upgrade, but rendered the hardware useless.

I was very surprised -- not only by the number of people who commented on the issue (something around 600 when I last checked), but by the opinion held by many that iPhone users who tweak their phones against Apple's orders -- including uploading something as trivial as screenshot software -- deserve to have their iPhones "bricked" because they voided their warranty.

Thinking about it, however, I came to the conclusion that it may be simply a matter of a difference in cultures -- and in attitudes toward technology.

Apple's excellent systems -- and I have too many friends who have switched to Macs to doubt the worthiness of the technology -- have always been aimed directly at consumers: Buy it, turn it on, you're done. PCs, on the other hand, were originally geared toward business users who had IT staff to support them; as a result, individual purchasers without that support needed to be able to handle at least some fundamental tasks themselves. You needed to have some familiarity with DOS commands (and a bit of BASIC coding didn't hurt); you needed to be familiar with such things as drivers and optimization; you needed to tweak software so it would work with all your other software.

Of course, things have changed since then. Microsoft has been trying to make things a lot more user-friendly (and a lot more, well, Microsoft). And many Mac users are savvy and sophisticated tech tweakers. But I suspect that the difference in culture remains (although each culture has, to some extent, crossed over into the other camp). It would explain why many people feel it is wrong to alter a piece of hardware that they've purchased, and why others feel it is part of the unspoken agreement between buyers and sellers that digital devices should be modifiable, as long as those modifications are relatively benign.

Or maybe it's just a simple difference of opinion.
