I just read an article debunking - or in a lot of cases failing to debunk - several long-standing tech myths.
I was left somewhat unsatisfied, tired, and cranky by this.
I think this is in part due to the fact that the article's author turned to a member of the MANAGEMENT TEAM of one of Best Buy's "Geek Squad" crews for his tech answers - and said individual delivered a load of hooey roughly the size of the Chrysler Building.
Right.
So, without further ado, I am answering his "myths" myself, based on my extensive love affair with all things computer-related and my (fairly good) level of education with regard to the subjects being discussed.
Myth #1: When you reboot your computer, you must wait 15 seconds before turning it back on.
This is, at least for computers after the advent of Windows, complete bullshit. The article gives all kinds of wacko theories about why this used to be so, and even more wild-eyed theories came from the comments section, but here's the real deal.
Back in the olden days - say, Apple II days - your computer's hard drive was crazy inefficient. (Not to mention small. 1 MB? DAMN!) In order to maintain operating speed - typically 3,600 RPM back then; the 5,400 RPM you know came later - the hard drive's actual discs, called platters, would spin up when the computer booted and spin down to a stop when you turned it off. In between, two important things happened: first, the platters stayed in motion at a constant rate of speed, and second, the read/write heads moved around to retrieve data.
Now, for those of you who have no idea how your computer works, or what it does, think of your hard drive like a stack of CDs, with very small spaces between each disc. The read heads are like a fork, with the tines inserted between the discs.
As the discs turn, the read heads use tiny, tiny magnets to both read and write data to and from the discs.
Right.
Back then, if you turned your computer off - just boom, off - the platters would still be spinning, and the read heads would still be retracting to their "parked" positions, for a few seconds after your computer powered down. Rebooting during that process would cause the read heads to reactivate before they were clear of the sections of the platters that contained data, potentially overwriting, changing, corrupting, or otherwise damaging the things stored on your hard drive. Fifteen seconds is a long time, but it gave a margin of safety even for the most ENIAC-like (look it up, lazybones) monstrosities.
Times have changed, baby. First, the new motor technologies used in modern hard drives - brushless, miniaturized DC motors; ever see that Dyson ad? - let the platters spin up and down much more quickly than before. Second, the hard drive controllers have improved enormously; they can deactivate the read/write heads much more quickly, and without damage to the data even if the heads are caught outside their "parked" position when it happens. During the time your computer takes to turn ITSELF off and back on, the drives are easily able to recover without damage.
So, given that, why does your computer get all goofy and have to run a file check after a power outage?
Because any data operations going on when that one lightning strike happens don't complete correctly. Lightning is, for all intents and purposes, instantaneous, and even assuming the power surge doesn't physically damage the components of your computer, one thing it does for sure and certain is interrupt file operations mid-character, which leaves all those partly written - or partly altered - files unfinished. And your file system maintains an index (that's the file allocation table, chief - or NTFS's Master File Table on newer Windows; not the registry, whatever you've heard) of where all your files are supposed to be and what they're supposed to look like, an index that was last updated when the interrupted file operations STARTED. So when Windows goes looking for all those files, what it finds is incoherent, incomplete gobbledygook, which causes Windows to start looking for something to take the bitter aftertaste and woeful headache away.
...Thus, "your disk needs to be checked for consistency." Windows is comparing what it actually found on your hard drive to the list of what it was supposed to find, and fixing whatever it can.
Right.
Myth #2: Megapixels matter. (For digital photography.)
Now, the Geek Squad guy claimed that once you get above about 3 megapixels, the megapixel count really doesn't matter.
I'm going to both call bullshit, and simultaneously use this forum to explain one thing they always do in movies that is ALSO completely bullshit, and why it is.
See, in a digital image of ANY kind, you have two options. There IS data, or there is NOT data.
The higher the resolution of your camera's sensor - that's the megapixels, literally millions of pixels - the more data it records in a single image.
Now, you've all seen that bit in the movies where they have some grainy, hideous image that they need to get a better look at, and they wave their magic computer wand at it - "enhance. Enhance. Enhance." - and like magic, the image suddenly becomes clear and shows that the real killer was standing right next to them the whole time.
That's not LIKE magic.
It IS magic.
Because if the original image didn't have the data to show that detail in the first place, it doesn't have it now, either.
Digital enhancement software has come a long way in the last few years, and it works by the same process DVD players use to "upscale" to HD resolution.
Here's how it works. Prepare for "fuck layman's terms, do you speak English?"
Standard TV resolution is what's called 480i, or 480-interlaced. That means your TV draws 704 pixels horizontally by 480 pixels vertically. (I'm leaving out the interlacing part as irrelevant, along with where the designations came from and the physical means by which old TVs drew the picture.)
HD resolutions are 720p, 1080i, and 1080p.
Translated: 1280 pixels wide by 720 high, or two different flavors of 1920 pixels wide by 1080 high.
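Quick pixel math, to put those numbers in perspective (Python, but it's really just multiplication):

    # Pixels per frame, standard def vs. full HD:
    sd = 704 * 480        # 337,920 pixels
    hd = 1920 * 1080      # 2,073,600 pixels
    print(f"1080p pushes {hd / sd:.1f}x the pixels of standard def")  # ~6.1x

Six times the pixels, and your DVD only has the small number. Hold that thought.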
Trust me when I say that if you have a TV over 26", 1080p is what you want.
Now, you take a big, fancy HDTV, and hook your upscaling DVD player to it. The DVD player starts the movie, and is faced with a problem.
...There's not 1920x1080 pixels worth of data on a DVD for it to display.
So what it does is multiply it out, and guess, based on the colors of the pixels it DOES have, what the pixels between them would look like if the DVD was actually giving it 1920x1080 pixels to begin with.
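Want to see the guessing in action? Here's a minimal sketch in Python of the simplest version of it - straight-line interpolation along one row of pixels. Real upscalers use fancier math, but the invent-the-in-between-values idea is exactly this:

    # Stretch one row of pixel brightness values by `factor`, inventing
    # the in-between pixels as weighted averages of their real neighbors.
    def upscale_row(row, factor):
        out = []
        for i in range((len(row) - 1) * factor + 1):
            pos = i / factor                     # where this output pixel falls on the source row
            left = int(pos)                      # nearest real pixel to the left
            right = min(left + 1, len(row) - 1)  # and to the right
            t = pos - left                       # how far between the two we are
            out.append(round(row[left] * (1 - t) + row[right] * t))  # the guess
        return out

    print(upscale_row([0, 100, 50], 2))  # [0, 50, 100, 75, 50] - every other value is invented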
Does it look good? Yes.
Is it perfect? HELL no.
But when you're watching a movie, you don't notice the mistakes, because it repeats the entire process 60 times a second, and if it makes a goof, it's gone literally before your eye can physically perceive it.
But working with still photos, that's not the case. Any mistakes are stuck there - and each time you zoom in further and ask the computer to enhance it, those mistakes are multiplying.
Snicker.
Which, ultimately, means that the more "enhancing" you do, the farther the image you're viewing gets from reality.
...Because the computer is INVENTING DATA THAT'S NOT THERE, BASED ON A GUESS.
If your camera actually takes in that much detail and records the image at a higher megapixel rating, well hell, you can zoom in a lot farther before the computer even starts making mistakes, can't you? Now, a 2 megapixel camera is already recording at roughly full HD resolution - 1920x1080 works out to about 2.1 million pixels - but when you're working with still pictures, again, your eye has nothing to distract it and can thus ferret out detail much more easily. A higher-end, but not nearly top-of-the-line, 12 megapixel camera records images at around 4096x3072 pixels, which means you can zoom in a hell of a long way on that picture, and see some really impressive detail, before you have to start asking the computer to turn those huge Arkanoid blocks into something meaningful through silicon guesswork.
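The zoom headroom is easy to put a number on, too - same Python, same example sizes as above:

    # How far you can crop into a frame and still fill a 1080p screen
    # with real, un-guessed pixels:
    cam_w, cam_h = 4096, 3072   # ~12 megapixel still
    scr_w, scr_h = 1920, 1080   # 1080p display
    zoom = min(cam_w / scr_w, cam_h / scr_h)
    print(f"~{zoom:.1f}x zoom before the guessing starts")  # ~2.1x

A 2 megapixel shot gives you ~1.0x - zero headroom - which is exactly why the count matters.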
So megapixels count, and the Geek Squad guy doesn't know his ass from his elbow.
(Surprising no one.)
Myth #3: You have to run your NiCd battery all the way down before you recharge it.
Ok, regardless of the claptrap the article spouts, this is TRUE.
But misleading. See, the actual wording is more specific than most people notice.
Nickel-cadmium (NiCd) batteries - the old-school kind you used to see in those '80s cordless phones roughly the size of a brick - actually require this: drain the battery completely before recharging, because partial charging damages the battery and shortens its life drastically.
Here's the problem with this.
See, we're all terrified of having the power go out at the wrong moment and losing our cell phone, digital camera, photo keychain, camcorder, PDA, laptop, you name it, just at the wrong moment when it's desperately needed.
So we top up our batteries as often as possible.
Now, two things happened in response to this.
First, the tech support guys started telling everyone to drain their batteries completely before recharging, every time or as close to it as humanly possible.
Second, the companies manufacturing batteries figured out that our use habits were destroying the batteries early from partial charging, and they developed batteries that like to be partial-charged.
Thus, we have two newer types of rechargeable batteries: nickel-metal hydride (NiMH) and lithium-ion (Li-ion).
Both types like to be partial-charged, and take no damage whatsoever from sitting in the charging cradle for weeks on end while you go on vacation to Cabo Yellowstone McDonald's. (Look, the economy sucks for everyone, ok?)
BUT, and this is a big but, both NiMH and Li-ion batteries are actually damaged - permanently - by fully discharging. Each time you let your laptop or cellphone run all the way out - just the way a lot of tech guys have been telling you for years - you shorten the service life of the battery DRASTICALLY.
Don't do it.
Check your battery type. For NiCd, run the sucker until it's bone dry and quits; for NiMH and Li-ion, charge away; it keeps them happy.
As usual, the Geek Squad guy is a dumbass.
Myth #4: You can put a keyboard you've spilled coffee on in the dishwasher.
OK, this "myth" is also true. The Geek Squad guy blathered some bullshit about how it's only corded keyboards, which is utter tripe, this one's true, but there's a catch.
In order for them to work afterwards, you have to wash them WITHOUT SOAP, and you must let them dry out completely before use.
Think "hanging over your shower curtain rod for two days to drip dry" completely.
Here's why on both those caveats.
Soap does not rinse out. Soap leaves residue, residue holds water, and water with anything dissolved in it conducts electricity. Electricity getting where it's not supposed to makes your keyboard short-circuit and burn out, permanently.
Water is the whole problem, in fact. If your keyboard isn't bone dry when you plug it back in, it will short-circuit and burn out, permanently.
Geek Squad Guy tried to claim that water temperature makes some difference, but in fact this is also a load of horse puckey. Wash it in glacial streams or the fucking hot tub, as long as it's dry before you use it again, you're good to go.
Myth #5: Anything stored digitally will last longer than its analog counterpart.
Really? I've got vinyl discs from the '50s that still play perfectly, but I also have CDs from four years ago that are useless.
See, things that are stored in digital form are forever dependent on having backups. Stored on your hard drive? What, you've never had a hard drive fail? Good luck with that.
Stored on CDs / DVDs? Good luck with that, too; the cheap, generic media available at consumer level typically begins to degrade within 3 years.
Stored on flash media? Good luck with that; all it takes is one good power surge and your precious childhood memories are fucking gone.
"Stored" on the internet? Better hope Facebook's server never, ever crashes and you don't lose your photos of your drunken antics with a bong because of THEIR hard drive failing.
Digital data has to be stored somewhere, just like analog.
The simple fact is that no matter the technology, there is essentially no way to guarantee your files survive long-term besides active conservatorship - in other words, moving them forward from older media to newer ones as the new stuff becomes available, before the old stuff goes obsolete.
See, if you recorded the perfect jam session on an 8-track back when they were the thing, and never moved it forward to CDs or your MP3 player, and you're an average schmoe without the bucks to buy enormously expensive equipment, once your 8-track deck dies, those tracks are gone forever. (You idiot! The file is fucking gone! It's just gone!)
Because there's no such thing as an 8-track to CD converter, or at least not at the price range you or I could likely afford.
Just like in a couple of decades, there won't be anything to convert your precious old Blu-Ray movies to something more common.
Which means, analog or digital, regardless of physical media, preservation has to be a constant effort to transfer those files forward onto new media and maintain backups. And with analog, the tiny errors in copying things over and over compound until the copy is useless anyway; digital copies can at least be bit-perfect - but only if you actually check them instead of assuming.
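Checking is cheap, too. A minimal sketch in Python - the filenames are made up for illustration:

    # Prove a digital copy is bit-perfect by comparing checksums.
    import hashlib

    def sha256_of(path):
        # Hash the file in chunks so big files don't eat all your RAM.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256_of("jam_session.flac") == sha256_of("backup/jam_session.flac"):
        print("copy verified, bit for bit")
    else:
        print("copy corrupted - recopy it before the original dies")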
So for once, I agree with the article.
...In the section where they don't quote the Geek Squad Guy, go figure.
Myth #6: Turning a computer on and off regularly is bad for it.
For once, the Geek Squad Guy gets one at least partly right here, and points out that the computer is actually designed to be able to turn on and off without taking physical damage.
However, he then goes on to blather about how it's a good idea to turn it off every day, and says - actually fucking says - that computers "need their rest time."
No, they don't, you fucking moron; they're machines, not people.
Hell, they're not even squirrels.
The reason older computers needed to be rebooted fairly regularly is that the Windows registry - the big database where Windows and every program you run keep their settings - isn't perfect, and its in-memory state accumulates mistakes while the machine runs. When you reboot, Windows throws that state away and loads a fresh copy of the registry from disk, so whatever obvious bullshit had crept in gets discarded.
Which is why 90% of the errors that happen in Windows XP can be fixed by rebooting the machine. Most of the time - assuming it wasn't a PEBKAC error in the first place (Problem Exists Between Keyboard And Chair) - what happened was that something in the registry got fucked up, and a program went looking for some part of its data there, couldn't find it - or found the wrong thing - and crashed, taking everything else with it.
Windows XP is not known for making applications function in their own sandboxes.
So when you reboot, the registry refresh fixes the mistake, and when you load that program again, it works fine.
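If that's too abstract, here's a toy model of the whole cycle in Python - assuming, as the happy case, that the copy on disk is still good:

    # Toy model: running programs consult an in-memory copy of the
    # registry; garbage creeps into that copy; a "reboot" reloads it.
    import copy

    disk_registry = {"app_data_path": "C:/Users/you/AppData"}  # known-good copy on disk
    memory_registry = copy.deepcopy(disk_registry)             # what running programs consult

    memory_registry["app_data_path"] = None  # something scribbles garbage in memory

    def launch_app(registry):
        path = registry["app_data_path"]
        if path is None:
            raise RuntimeError("app crashed: can't find its data")
        print(f"app loaded its data from {path}")

    try:
        launch_app(memory_registry)          # the 90%-of-XP-errors case
    except RuntimeError as err:
        print(err)

    memory_registry = copy.deepcopy(disk_registry)  # the "reboot"
    launch_app(memory_registry)              # works fine again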
So, with XP and earlier, rebooting once in a while is a good idea, just to keep your computer from developing a creeping psychosis as its registry gradually accumulates garbage - the digital equivalent of all that photo "enhancement" from earlier.
Right.
But that's entirely, one hundred percent, thanks to the way the Windows Registry functioned prior to Windows Vista. (Boo, hiss, spit.)
Linux and Unix don't have that problem; Windows Vista and Windows 7 don't have that problem, or at least not nearly to the same degree; even Mac OS X can hang in there without rebooting every night.
In fact, when you get right down to it, most of the time when this is a problem, it IS ultimately a PEBKAC error; the user is doing things they shouldn't be doing.
I haven't rebooted my computer - running Windows XP, mind you - in at least two months, and it's humming along fine, thank you very much.
So yeah, turning your computer on and off regularly isn't BAD for it; it's just mostly totally unnecessary and time-consuming for no good reason.