In the 1970s, Xerox made an engineering decision that shaped the course of technology more than most of us appreciate. Not only did they design the first graphical interface for a personal computer, the remnants of which we all still use today, but to power it they used something pre-built and readily available: a television set. Both Apple and Microsoft copied this computer for their extremely successful products, and ever since, the screen technology behind escapist entertainment (television) and productivity (personal computing) has been unified, never to diverge. As a screen-addicted millennial, a professional software designer for 12 years, and now an independent researcher who studies screens full-time, I'm writing this article to make the case that it's time to revisit that decision. It's the reason many of us struggle with technology more than we'd like.
The year is 2020 and the technology we built to help us is having unintended side effects. Most teenagers feel addicted to their smartphones, and they spend 4.5 hours of their day on apps they often immediately regret using. Psychologists summarizing technology research have written about how smartphones are destroying a generation, with greater use consistently correlating with less happiness and well-being in "real" life. Ten percent of the population feels addicted to television, the most rapidly adopted invention in the history of America, which quickly became the single greatest use of our free time, to the detriment of our social relationships. Our children, unsurprisingly, now want to be YouTube-famous vloggers instead of astronauts. We are likely one of the loneliest generations to ever walk the Earth, and this is unfortunate because social connection is the key to our lasting happiness. Most of the time, though, we're not on Earth to notice much. We use screens for 11 hours per day on average, and much of that is obligatory.
Despite the alarms from psychologists and even technologists themselves, we all recognize that our lives and our children's lives will involve similar amounts of screen time. Parents want their children to be computer literate as much as they want them to understand finances from a young age. Their jobs and future prospects depend on it. We can't stop using screens, nor should we. The kernel of digital technology is pure; it's only been obscured. Computers, tablets, and smartphones were supposed to be "bicycles for our mind", extending our communicative and creative capabilities in ways we couldn't otherwise imagine. For the most part, they succeed; I can write this from an office and have it appear in the pockets and on the desks and coffee tables of billions of people. The problems we have with them, fortunately, are manageable and have very little to do with the content or software we use. To improve our situation, though, we'll need to understand where those problems come from and face our accidental addictions in the process. It all comes back to the screen.
Unless you have an e-reader with a web browser, I humbly suggest you print out this article. I know this is inconvenient and you probably won’t want to do it, but I’ll explain why the difference matters later on.
“The greatest trick the devil ever pulled was convincing the world he didn’t exist.”
The greatest trick screens ever pulled was convincing us they don’t exist. This visual trick is what makes us “lose ourselves” when we watch or use them. It’s the true “magic of television” and why we consider film or the Internet escapist and transportive. It’s what simultaneously makes screen content fun, immersive, confusing, and addictive. It’s also why my and other technologists’ attempts at fixing screen technology have all focused on software and content: the hardware is intentionally, literally invisible to us. To fully appreciate conventional screens and their side effects, we need to understand this trick.
Conventional screens are light bulbs we use to create images*. Your smartphone’s screen is nothing more than millions of tiny red, green, and blue light bulbs dimming and brightening in unison to create an image. Since we now look at screens for 11 hours per day on average, it’s fair to say we now stare at light bulbs for 11 hours per day.
You may find yourself intuitively feeling "weird" about that last statement. In fact, it is weird. Humans, like most animals, evolved to take in all visual information from objects that reflect light, not emit it. Before screens, the only light sources we stared at directly were fires, which tend to be similarly hypnotic, especially when it's dark. And fires can't even play Netflix.
If you've taken a physics class you might remember that light is light, whether it's reflected from an object or emitted by one. This is true, but the visual qualities of light-reflecting (or simply "reflective") objects are very different from those of light-emitting ones. In reflecting light, reflective objects tell us much about themselves and our environment. We're able to get a sense of the texture of objects without having to feel them, just based on how they reflect light. The way shadows fall on them informs us of their shape and how they're positioned 3-dimensionally in space. In visual art this is called lighting/shading. It's one of the primary reasons you can look at a 2-dimensional painting and see a 3-dimensional scene. Lighting/shading is one of the mechanisms informing humans' spatial perception, and it's the one screens disrupt the most.
Screens meant for television, the non-reflective light bulb machines that they are, don't follow predictable lighting/shading patterns, because the goal of "tele-vision" demands otherwise. If the scene being broadcast on television is bright, it should appear just as bright in the room it's broadcast to, regardless of how close the television is to a light source. The colors should be accurate to the broadcast scene regardless of the color temperature of the room it's viewed in.
The result of this approach is an image that is perfectly distinguished from everything else around it. The brightness of a screen’s image has nothing to do with the brightness of the room it’s in. The color temperature of the image is completely independent as well. We can’t even cast a shadow on the image, because shadows can only appear on reflective objects. Without these shading cues, a screen’s image stops looking obviously flat despite other mechanisms in our visual systems informing us otherwise. The images stop looking like they’re “on” the screen altogether, appearing instead as if they’re coming to us from somewhere else. In fact, how much you agree with that statement is one way researchers gauge immersion or “presence” in screen-based content.
Through this design, the screen itself starts becoming a window or portal to someplace else rather than an updating depiction of the same place. The hardware becomes invisible and even irrelevant, just as a window is only a visual obstruction to what's behind it. In the case of screens, though, the "world" behind it or coming to us is usually far more interesting than the real one. And even if it's not, we can usually change it at will. Is it any wonder most young smartphone users suffer from "nomophobia", the fear of temporarily being without their screens?
This is the basis of how “tele-vision” and now computer, tablet, and smartphone screens work. Inventors of screens hoped for visual teleportation to complement the audio teleportation of radio and the tele-phone. Nobody consciously decided that this transportive quality should apply to the Internet, or that video games should feel like a world. The people who invented television hadn’t heard of personal computing and the people who started using television sets for personal computers hadn’t heard of the Internet. Even researchers who study screens today tend to openly take these effects for granted. It’s up to us now to decide what we want social apps, the Internet, video games, porn, etc. to look and feel like. But before we decide, we should at least be aware of the side effects of the current light bulb model, because this television illusion is a powerful one.
Side effect #1: Addiction
As we've now all seen, it turns out that when you give a human being a portal to another world that they can change at will, they can become fairly addicted to it. People used to have to practice meditation, pray, volunteer, or take an unpredictable psychedelic drug just to lose their pesky selves. Now we can forget all our real-life baggage and "check out" from uncomfortable situations at a moment's notice with an escape hatch in our pocket. Research confirms what psychologists had long speculated: it's this escapism that makes screen-based content addictive.
Studies show that the escapism effect simultaneously makes almost everything on screens (including social media) far more enjoyable but also puts us at much greater risk for both Internet addiction and gaming addiction. Brain scans of Internet addicts and video game addicts show the same dopamine receptor changes, which seemed to surprise even the researchers. A psychiatrist in California who specializes in screen disorders also believes that screens themselves are problematic, and in 2012 proposed a new diagnosis called "Electronic Screen Syndrome", suggesting that parents take all screens away from their children for a few weeks. Parents who enact these "screen fasts" or "digital detoxes" often find their very young children going through screen withdrawal behavior, even though we know kids at those ages understand little about what they see on screens.
Our addictions to technology are legitimate, but if we fail to acknowledge hardware's influence in them, we'll be playing an indefinite cat-and-mouse game with software. Our current software solutions are well-intentioned but akin to an alcoholic putting a lock on a liquor cabinet they know the combination to. Withdrawal (incessant cravings to do anything on a screen) is a necessary part of recovering from any addiction, so if we try to address our screen time without ever experiencing withdrawal, we're probably just coping. Children often exhibit withdrawal symptoms only when all screens are taken away from them; I never experienced it myself until I switched to alternative screens with the intention of never going back.
Side effect #2: Persuasion
Researchers are consistently surprised (or delighted) by how much we trust screen content, and it's thought to be because we interpret it as direct experience. When we lose ourselves in screens we're more likely to make impulse (unplanned/rash) buying decisions, better remember a brand, and view it more favorably. The more we watch shows demonstrating sexual permissiveness, the more likely we are to hold sexually permissive values, even after controlling for things like religious beliefs and our parents' attitudes towards sex. A large topic in media studies research, called Cultivation Analysis, is concerned with the tendency of screen-based media, traditionally television, to uniquely shape our world views over time. Researchers have been surprised to find how readily we apply television situations to our own lives, even when they come from shows we recognize as fictional.
Screen content can also traumatize us for years, despite our conscious understanding of it as staged or fake. Alfred Hitchcock received letters after the release of Psycho from parents whose daughters could no longer shower after witnessing the film’s infamous shower scene. Beach attendance decreased significantly after the release of Jaws (called the “Jaws effect”), and many viewers reported “enduring problems” with swimming even 7+ years after seeing the film. Presumably, those same viewers are consciously aware that what they watched was a fictional movie and not real life, and even the shark depicted in that fictional movie was clearly fake and unrealistic. They’ve also likely been informed, like many of us after viewing the movie, of their infinitesimal real-life chances of being attacked by a shark. Despite all this, the screen memory wins.
Side effect #3: Living in virtual “reality”
Our questions about screens and how they affect us or our children are understandable, but in my opinion they usually miss a much bigger picture. We tend to ask, for example:
“How does TV or an iPad affect my child’s brain?”
“What does it do to my child’s brain to believe from a young age that this world isn’t the only one?”
I've been away from normal screens for almost a year now. I understand their tricks better than most, I've spent hundreds of hours of my life studying them, and yet I still think of the Internet as a place. As I type this on an old reflective laptop I know consciously that I'm using the Internet, but it doesn't feel like the Internet. The Internet I remember is a place that's still out there, somewhere, waiting for me. I haven't made it less real, I simply continue to ignore it. But because I can't shake this feeling, I still find myself motivated by it. I try to do things to end up in the Internet because that's where successful people end up. If I could just write well enough or produce something tweet-able enough, if I could just give a talk good enough for YouTube or the news, I could end up there too. Then I'll know I'll have made it. I'm really no different than those kids who now want to be YouTube famous instead of traveling the galaxy; light bulb worlds are my "final frontier".
When I look at the television, I want to see me staring right back at me. We all want to be big stars, but we don’t know why, and we don’t know how. But when everybody loves me, I want to be just about as happy as I can be.
— "Mr. Jones", Counting Crows
Experts have chimed in on this virtual world side effect, but it's an experience that's difficult to categorize, and one that, until recently, wasn't possible to have without a mental disorder or psychedelics. Psychologists claim it's why boys spend so much time on porn and video games and then fail to learn comparable skills in real life. Similar claims have been made about young girls flocking to social media worlds rather than real ones. Gamers have reported feeling more interested in their virtual wives and children than their real families. Media researchers notice how we yell at actors on screens as if they were actually in front of us, and how people with voyeuristic tendencies seek out reality TV as if they were actually looking at people through the glass and not just light bulbs in their living rooms.
Most questions remain unasked or unanswered, though. For example, how much more consequential is an “artificial” social reward — a Facebook “like” — when we’re mental inhabitants of Facebook-land? (To get an idea, try printing out your Facebook feed next time you want to read it.) What does it do to our adolescents’ brains to borderline-hallucinate sex with porn stars rather than look at depictions of them in magazines? Only time will reveal the full psychological consequences television screens have had on us.
Moving forward/Reflective screens
Obviously, I can't convince anybody to want something different for themselves or their children. And to be fair, the effects of light-up screens could have potentially therapeutic uses that I'll cover in another article. If screen technology is something you or your children struggle with, though, I can point to an alternative. It's the reason my view of screen technology changed completely and why I've spent so much time studying screens since. Screens don't have to confuse us, persuade us, or addict us. They don't have to look like "worlds" at all. A screen's only requirement is to display dynamic visual information, and light bulbs are not the only way to accomplish this. Light-up screens are only one type of screen, and although alternatives are rare, they do exist; they're called reflective.
Reflective screens curtail the unintended side effects of light-up screens by being more like what we evolved to look at. They're not perfect; they'll never reproduce colors as well as light-up screens, for the same reasons printed materials often can't, but they do have many benefits over light-up screens. They tend to be better for our vision, they don't give us flickering light-induced headaches or migraines, don't keep us up at night, perform better outdoors, improve the battery life of a device by a factor of 2 or 3, and are easier to focus on. Currently they're found mostly in e-readers like the Kindle and Nook, but for eye health reasons we're starting to see them being used in tablets, computer monitors, and even smartphones. They're mostly limited to black and white right now, but color alternatives exist that could theoretically be put into a future iPad. I maintain a list of these products on the recommendations page.
Most of these reflective technologies have failed to gain traction in the marketplace despite years of R&D. This isn't surprising given the addictiveness of current screens. Unprocessed food initially tastes worse if we're used to a steady diet of stimulating, "conventionally" prepared products. Similarly, if we're not educated about the side effects of conventional screens, our knee-jerk reaction to reflective screens is likely to be misplaced disappointment or even disgust (I've seen both when demoing them). Light-up television screens offer some advantages in terms of colors, but we pay a significant psychological price by using them. Now we get to decide what's important to us and our children. Let's forge ahead and become more intentional about our technology. Thank you sincerely for reading.
*LCDs are more nuanced than this, but because 99.9% of the LCDs on the market are transmissive with indiscernible backlights, it’s fair to characterize them this way as well.