Larry Kilham Blog
Before we get to computers, let’s see some examples of how truth is handled in everyday life.
“When your head says one thing and your whole life says another, your head always loses,” proclaimed
Humphrey Bogart as Frank McCloud in Key Largo (1948). The “whole life” Bogart refers to is all the information relating to an issue that you have accumulated over a lifetime. It is a huge amount of information distilled to an essence, which may well closely approximate the truth about the issue.
You are unlikely to find and express your version of the truth unless you are unfettered in an environment of liberty. José Martí, the Cuban philosopher and poet (1853-1895), got to the heart of the matter when he wrote, “Liberty is the right of every man to be truthful…” Ironically, the absence of liberty today in Cuba shows up as the lack of truthfulness there.
This same relationship is manifest in a wide variety of human endeavors. For example:
Politics – Politicians must, at the least, trade actions originally based on truth for actions based on expediency. As the saying goes, “Politics is the art of compromise.”
Science – Almost all scientists are concerned about criticism and advancement, so they prefer peer-acceptable results over pathbreaking scientific advances. They become intellectually trapped in universities and related research institutions. Albert Einstein (1879-1955), on the other hand, did his most creative work while employed as a clerk in the Swiss patent office from 1903 to 1906. His productivity seems to have diminished after he joined universities later on.
The Military – Generals and admirals must trade their best judgment about military strategy, the truth about when, where, and how to win, for survival at the hands of their political masters. The various American conflicts after World War II are good examples of this.
Now, about the computers getting involved in the search for truth:
Artificial intelligence (AI) introduces a new factor into the realm of truth. Computers that think with AI do not have biases against seeking and stating the truth. Human biases against the truth include emotion, egotism, and self-preservation.
As access to databases becomes comprehensive, as seems likely with Google search, and as sophisticated search and analysis systems such as IBM’s Watson mature, computers will be ever more able to separate non-truths from truths.
How will a computer know a truth to be the truth when it produces it? People will become more concerned about this because, in a world of big data, they might come to trust only what can be verified electronically.
Recently I was eating a sandwich in the food court of a shopping center. A homeless man settled near me, squatted down, and started rummaging through his collection of things—assorted rags, plastic bottles, scraps of paper—and a smartphone. Ignoring me, he started tapping its screen. Startled, and intrigued by where his cyber journey might be taking him, I asked, “Who are you contacting?” and he answered, “I use it for everything.” It was his entire life.
Social media is apparently the most popular use of smartphones, with Facebook installed on 70% of them. Google and other search engines, the portals to the world’s knowledge, are installed on fewer smartphones—about 58% in early 2015. That share may increase now that search engines can be queried by voice. The average user checks their smartphone more than 100 times a day.
The situation is more pronounced with children and teens. They look first to smartphone sources for advice and guidance, and only then, if time permits, to their parents and teachers. They may be bright, but they are self-absorbed to the exclusion of everything else.
What is going on? Are we falling into an inescapable black hole? This is a key discussion in my new book, The Digital Rabbit Hole.
Let us imagine today’s version of a classic story…
Alice was so excited about visiting Europe for the first time, but she quickly became tired of sitting by her sister on the flight to London. Her sister was absorbed in a boring book with no pictures or music. “I don’t know why people read books,” thought Alice, “when they can see everything in color and sound on their smartphone.”
She snuggled down in her seat, grasping her glowing smartphone and began listening to it through her tiny earphones. She glanced now and then at the distant clouds to see if she could see one that looked like a sheep or a giraffe. Suddenly a white rabbit appeared on the windowsill.
He took a smartphone out of his vest, glanced at it attentively, and said, “Be quick, follow me, or we will miss the tea.” Alice jumped up and, excited for a little adventure, ran after him. The rabbit tapped his smartphone screen, and Alice’s smartphone screen came to life with a live video of some people and creatures sitting around a picnic table having tea.
“Hurry up,” he said, as he disappeared down a hole under a hedge. Alice followed and found herself falling weightlessly, with the wall of the tunnel fading out of view. “Is there a bottom?” she wondered. She was so absorbed by it all that she forgot to be afraid.
In this new world, Cyberland, Alice could find no places to eat, no malls, only some strangers sitting around a picnic table having tea. Then, boom! Alice hit the ground. She struggled to her wobbly feet and scraped her head on the roof of a space with no walls in any direction.
A button appeared on her smartphone labeled “click here.” Alice clicked without thinking about what could happen next and found herself shrinking. The rabbit appeared again. “You are as tall as me!” Alice cried. “So?” he said. “Hurry, we’re late!”
This Alice in Cyberland scenario is no longer fantasy. More and more people—almost all of the younger generations—are falling down digital rabbit holes. We all make forays into digital places where our friends can be found, or information can be gathered, or adventures and discovery await.
For centuries, social groups, books, libraries, songs, movies, and other media fulfilled those functions, but engaging with them was optional. Now we have the Internet, which is not optional. It is a digital rabbit hole we fall into and cannot escape. The door and window to this infinite cyberland is the smartphone.
There are two basic reasons why this trend is happening and will become pervasive and controlling:
1- Technology – Perpetual digital connection to everything makes it too easy to accept an apparent answer rather than devise one of our own.
2- Human nature – least action, convenience, “good enough,” “alive enough,” irrelevance, distractions.
See more at the Rabbit Hole page.
Larry Kilham is a graduate of MIT’s Sloan School of Management, holds three patents, and has founded two high-tech companies. Many of his product designs required innovative use of computers, and as early as the 1960s he was researching artificial intelligence (AI).