Have you ever found yourself staring at a TV commercial, unsuccessfully trying to understand what the perky little nerd on the screen is saying? Why is it that tech retailers throw around acronyms and technical terms without explaining what they mean? Take, for example, the word “digital.”

What is “digital,” anyway? Ask that question of many supposedly tech-savvy people and you’ll likely get a blank stare, followed by something like, “Uh, I don’t know. Better?”

Why, though, is digital better? We’ve had the notion of “digital equals better” hammered into our brains ever since the popularization of digital recording and CDs back in the 1980s. We accept that digital is “better,” but better than what?

The word digital comes from the Latin digitus, meaning finger, the same root that gives us the word digit, since fingers are used for counting in discrete steps. In most practical applications, “digital” means that information, such as a sound or image, is represented by a series of discrete numbers, as opposed to “analog” information, which is represented by a continuous range of values.
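For the technically curious, here is a rough sketch, written in the Python programming language, of what “a series of discrete numbers” looks like in practice: a sound wave measured at a handful of moments in time. The tone and the sampling rate below are made up purely for illustration (an audio CD takes 44,100 such measurements every second).

    import math

    # Turn one second of a continuous (analog) tone into digital information
    # by measuring it at a few discrete moments in time.
    sample_rate = 8      # measurements per second (tiny, just for illustration)
    frequency = 1        # a 1 Hz tone
    samples = []
    for n in range(sample_rate):
        t = n / sample_rate                            # the moment being measured
        value = math.sin(2 * math.pi * frequency * t)  # the wave at that moment
        samples.append(round(value, 3))                # keep it as a plain number

    print(samples)   # the "sound" is now just a list of numbers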

Think of a digital image as being like a picture printed in a newspaper; when you look at it closely, you can see that it is made up of thousands of tiny dots. On a computer screen, the dots are called “pixels,” and each one of them can be represented by a number. The analog opposite would be an artist’s oil painting on canvas, done with continuous, uninterrupted strokes of a brush.
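Here is a hypothetical (and very tiny) example of that idea in Python: a four-by-four “photo” in which every pixel is nothing more than a brightness number, 0 for black up to 255 for white. The numbers are invented for illustration.

    # A made-up 4x4 grayscale image: each pixel is just a brightness number.
    image = [
        [  0,  64, 128, 255],
        [ 64, 128, 255, 128],
        [128, 255, 128,  64],
        [255, 128,  64,   0],
    ]

    # Print the grid with text characters so the "picture" shows up in a
    # terminal: dark pixels print as blank space, bright pixels as a '#'.
    shades = " .:#"
    for row in image:
        print("".join(shades[pixel // 64] for pixel in row))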

Whether or not digital is actually “better” is still the subject of some debate. Our eyes and ears are not digital; they do not interpret the information they receive as numbers. Instead, as analog sensors, they detect things as a continuous, uninterrupted flow. For a detailed explanation of our “analog” brains, visit theness.com/neurologicablog/?p=246.

Many musicians and recording studio experts prefer the sound of analog recordings, such as reel-to-reel tapes or vinyl records, to the sound of digital CDs. Likewise, many photographers prefer old-school spools of film over new-fangled digital cameras.

The challenge of digital is to represent things the way our analog eyes and ears perceive them. The main issue in the digital vs. analog debate is “resolution,” a term describing how many numbers are used to represent information. High resolution means that more numbers are used; low resolution means that fewer are used.

Regarding digital photography, the numbers are visually represented as “dots” on a printed page or “pixels” on a computer screen. The abbreviations “DPI” and “PPI” (“dots per inch” and “pixels per inch”) are used to indicate resolution. The more dots or pixels packed into each inch, the more realistic the digital image seems to our analog eyes. Likewise, a 10-megapixel digital camera, which captures roughly 10 million pixels per image, can produce more detailed-looking pictures than a 1-megapixel camera.
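As a back-of-the-envelope check, here is that arithmetic in Python. The image dimensions below are an assumed example, not taken from any particular camera.

    # An assumed 10-megapixel image size: roughly 10 million pixels per photo.
    width, height = 3872, 2592
    print(width * height)              # 10,036,224 pixels -- about 10 megapixels

    # Print resolution: at 300 pixels per inch (a common print quality),
    # that photo covers roughly a 12.9 by 8.6 inch print.
    ppi = 300
    print(width / ppi, height / ppi)   # about 12.9 by 8.64 inches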

The main advantage of digital media is that numbers can easily be manipulated by computers. That is, after all, what computers do: “crunch” numbers. This makes it very easy to edit, copy, duplicate and distribute digital items, because all you are really copying is a bunch of numbers, and not the actual musical notes or artist’s brushstrokes. As such, a digital copy of a song is identical to the original, without problems such as tape hiss (that faint “sssss” you can hear in the background of audio cassette tapes) or vinyl record surface noise (scratches, pops and clicks). The question remains, though: does it sound “better?” Is digital audio really “better” than old-fashioned analog audio? Some people say yes, some people say no. Most folks can’t tell the difference.
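To see why a digital copy can be identical to the original, here is one last tiny Python sketch; the “song” below is just a handful of made-up sample numbers.

    # Copying a digital recording means copying numbers -- nothing more.
    original_song = [12, -7, 30, 25, -3, 0, 18]   # made-up audio samples
    copy_of_song = list(original_song)            # duplicate the recording

    # The copy can be checked against the original, number by number.
    print(copy_of_song == original_song)          # True: identical, no hiss, no pops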

If you’ve survived this column up until now, you deserve some sort of prize. In the meantime, I’ve been looking at TV commercials about new cell phones and wondering, “Is 4G better than 3G? What is a ‘G,’ anyway? Do I want more Gs, or fewer Gs?”

Hmmm, I feel another column coming on.