Not only are we slaves to technology; we're also slaves to technology's opinion of us.
By now, you've seen the Google Autocomplete Embarrassment Map of the United States, which shows Google's attempt to guess what you're going to ask about a particular state. I live in Kentucky. When you type, "Why is Kentucky so..." Google guesses the next word: "poor."
Why are we celebrating a machine that perpetuates stereotypes? Kentucky is poor, Tennessee is racist, and Nebraska is boring? Is this what people really think of these states?
Or is something else at work here?
As a journalism professor, I teach my students to look for stories that shatter stereotypes. Otherwise, we become part of the problem the novelist Chimamanda Ngozi Adichie identified as "the danger of a single story."
When we allow a single narrative to dominate, and we don't let people tell their own stories, we end up with a one-sided view of a region, a people, or a culture. Google is contributing to the problem of the single story. "Poor" may be, statistically speaking, the most searched-for word when people type that phrase into Google. But what does that really mean? The most searched-for word might win with a whopping 2 percent of all searches containing that syntax. Google is not going to reveal its algorithm, so we'll never know.
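The arithmetic behind that objection can be sketched with a toy frequency model. The numbers and completions below are invented for illustration; Google does not publish its query logs or its ranking algorithm, and this is not a claim about how Autocomplete actually works:

```python
from collections import Counter

# Hypothetical completion counts for a prefix like "why is kentucky so ..."
# (invented numbers; the real distribution and algorithm are unknown).
TOTAL_QUERIES = 100_000  # assumed total queries containing the prefix

completions = Counter({
    "poor": 2000,        # 2% of all queries with this prefix
    "humid": 1900,
    "beautiful": 1800,
    "cheap": 1700,
    # ...thousands of other completions share the remaining ~92%...
})

def top_suggestion(counts, total):
    """Return the most frequent completion and its share of all queries."""
    word, n = counts.most_common(1)[0]
    return word, n / total

word, share = top_suggestion(completions, TOTAL_QUERIES)
print(word, f"{share:.1%}")  # prints: poor 2.0%
```

The point of the sketch: a purely frequency-ranked suggestion can "win" with a sliver of a plurality, yet it is presented to every user as the single story about a place.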
As Internet critic Jaron Lanier would point out, it's funny how Google uses everybody's information without their permission, but doesn't give you access to its own information.
I don't think we can hide behind the argument that Google Autocomplete is simply generating guesses based on input from legions of other people.
"Hey, don't blame Google! It's just performing an algorithm!"
In other words, it's the Bill Murray defense from Ghostbusters: "Back off ... I'm a scientist."
Remember, this is the same logic behind autocorrect, which has probably done more harm than good and has even inspired websites dedicated to the hilarity and humiliation it causes.
Why is Kentucky so poor?
I don't know. Maybe it's because our people have been exploited by powerful companies extracting the state's natural resources. Maybe it's because there are negative cultural forces at work that continue to harm each successive generation, defying efforts by the government to correct the problem.
But here's another question. Why do we allow machines to determine what we think about ourselves?