Two pale little girls sit with their parents at the dinner table, chanting a strange blessing. “The closer we are to the family benchmark, the closer we are to creating the perfect family,” they repeat in unison over digital counters blinking next to their plates.
Yet the parents are dour, and the girls are glum. The scene, from the 2016 short film “The Quantified Self,” depicts dinnertime as just another joyless measurement event in a day that revolves around collecting bio-data. From sleep quality to blood oxygenation to spousal argumentation rate, the characters live their lives — and make all their decisions — strictly by the numbers.
To put it another way, the movie asks what it means to listen to data. Even the soundtrack casts tracking as something bleak and menacing. Romain Collin, who composed the film’s score, affirmed the impression: “It’s only two drums carrying the story, like the zeros and ones of the digital world this family is trapped in.”
Is that the case for the real-life quantified selfers among us, the bio-hackers and self-trackers stepping — and stepping, and stepping some more? Does running the numbers on yourself reduce your world to the gray winter landscape of the film? In the words of a Radiohead song, does it make you “fitter, happier, more productive”?
“I got in darn close to 70,000 steps over Memorial Day weekend,” Matt Coddington told me when we met one afternoon in early June at Humana’s Digital Experience Center downtown. He reached that impressive total by installing a swimming pool at home while listening to eight full-length podcasts. “I was exhausted coming into work on Tuesday, but I love the feeling that I’ve gotten a lot done.”
Coddington has been step-counting since 2010, longer than the other self-trackers I talked to. I surmised that his dedication might have something to do with a passion for technology that is as much personal as it is professional. (Before our meeting, he had texted me: “Wearing gray shirt and Google Glass.”) The 51-year-old is a mobile product manager for Humana by day, and he heads a mobile device enthusiasts’ group in his free time. But he said that his tracking habit is not about the gadgetry involved.
“It’s just the feeling it gives me that I’m making the right choice,” he said. The Jawbone Up app paired with his Android Wear watch guides many of his daily decisions, from eating a small salad so he can immediately jump on the treadmill at lunch to walking down 25 flights of stairs from meetings in the conference room at the top of the Humana Tower.
The link Coddington drew between personal data and personal choices was intriguing, given how many decisions of every kind now rest on data.

Enter Big Data
Almost any time a computer seems to know the right answer to a question, whether we misspell a word but still get the search results we were looking for or a website’s recommendations turn up our next favorite book or movie, it is relying on data. But not just any kind of data: it’s Big Data. In essence, Big Data is about applying algorithms to very large datasets in order to identify patterns and make extrapolations.
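To make the pattern-finding concrete, here is a minimal sketch in Python using invented numbers; real Big Data systems run far more elaborate algorithms over millions of records, but the core move is the same: measure how strongly two quantities vary together.

```python
# A minimal sketch of data-driven pattern-finding (invented numbers,
# not a real study): compute the Pearson correlation between two series.

steps = [4200, 6100, 8300, 9800, 11500, 12200]  # daily step counts
sleep = [5.9, 6.4, 6.8, 7.1, 7.4, 7.6]          # hours of sleep that night

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"correlation: {pearson(steps, sleep):.2f}")  # near 1.0: the two rise together
```

Note what the calculation does not say: nothing here establishes that walking causes better sleep. It only registers that the two numbers move together.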
Several trends have converged to create the Big Data phenomenon. For one, more data is being generated and captured than ever before. From business supply chains to consumer shopping preferences, infectious disease outbreaks to the shares racked up by a viral video, along with the personal health and fitness data collected by wearable trackers and smartphones, if something matters in business or society, it will be quantified and turned into a data point.
Another factor in the rise of Big Data is the expanded digital storage capacity and processing power of computers. This allows huge volumes of data not only to be kept handy but also to be monetized, as when social media companies sell their users’ personal information to advertisers.

Relying on data has both promise and peril
On the positive side, data-based inferences can provide a useful corrective for flawed human thinking, particularly our bias toward seeing the world in terms of cause and effect. “If I had dinner and had an upset stomach I would immediately think it was what I ate, although statistically speaking it’s far more likely I’d get bugs from shaking hands with someone,” explained Viktor Mayer-Schönberger, co-author of “Big Data: A Revolution That Will Transform How We Live, Work, and Think.” “I see the world through my own eyes, through my experiences, preferences, beliefs, ideologies and prejudices — all I’m trying to do is verify that my beliefs are correct.” By contrast, data does not have anything to prove. It does not come with a ready-made story.
What Big Data can do is show correlations between different factors that lead to new insights or predictions. Machine translation is a good example. Google Translate was created by performing statistical analysis on more than a trillion words — a substantial portion of the content in every language on the internet — and finding probabilities that certain words and phrases in one language would correspond to those in another. Although the Google Translate service does not know the languages it is translating — rules of grammar are not part of the algorithm — it can identify equivalents between words based on the statistical likelihood of where and how they occur in text and speech. If you’ve ever used Google Translate, you know that it’s far from perfect. But it’s good enough, enough of the time, to provide a useful service.
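To illustrate the statistical idea (though not Google’s actual system, which is vastly larger and more sophisticated), here is a toy Python sketch that “learns” word correspondences from an invented three-sentence parallel corpus. It asks which French words co-occur with an English word more often than chance would predict:

```python
from collections import Counter, defaultdict

# A three-sentence "parallel corpus." (Invented examples; the real service
# was trained on more than a trillion words.)
pairs = [
    ("the cat sleeps", "le chat dort"),
    ("the dog sleeps", "le chien dort"),
    ("the cat eats",   "le chat mange"),
]

cooc = defaultdict(Counter)  # cooc[english_word][french_word] = co-occurrence count
en_totals = Counter()        # French tokens seen alongside each English word
fr_counts = Counter()        # overall French token frequencies

for en, fr in pairs:
    fr_tokens = fr.split()
    fr_counts.update(fr_tokens)
    for e in en.split():
        cooc[e].update(fr_tokens)
        en_totals[e] += len(fr_tokens)

fr_total = sum(fr_counts.values())

# Score candidates by how much more often they co-occur with the English
# word than chance alone would predict: p(f | e) / p(f). No grammar involved.
for e in cooc:
    best = max(cooc[e], key=lambda f: (cooc[e][f] / en_totals[e]) / (fr_counts[f] / fr_total))
    print(f"{e} -> {best}")  # the -> le, cat -> chat, sleeps -> dort, ...
```

The script knows no grammar and no vocabulary; the pairings fall out of co-occurrence statistics alone. Like the real service, the approach is imperfect but often good enough.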
That is one of the big tradeoffs of Big Data. It can provide knowledge — albeit without understanding. It can identify patterns — but without meaning. For both of those reasons, it cannot contextualize the individual. Big Data algorithms assign people to categories and classifications based on factors such as where they live and work, the things they buy, read and watch, and the contacts in their social networks. You, as a person, become relevant only by being correlated with others who are statistically similar.
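A toy nearest-neighbor classifier makes the mechanism plain. The profiles and segment labels below are invented for illustration; real systems weigh thousands of signals, but the logic is the same: a new person simply inherits the majority label of the known profiles they most resemble.

```python
from collections import Counter

# Invented profiles: (city, favorite genre, follows tech news?), each already
# assigned a marketing segment. Real systems use thousands of such signals.
people = [
    (("austin", "sci-fi", True),  "early adopter"),
    (("austin", "sci-fi", False), "early adopter"),
    (("omaha",  "drama",  False), "mainstream"),
    (("omaha",  "sci-fi", False), "mainstream"),
]

def similarity(a, b):
    """Count how many attributes two profiles share."""
    return sum(x == y for x, y in zip(a, b))

def classify(profile, k=3):
    """Give a new person the majority label of the k most similar known profiles."""
    nearest = sorted(people, key=lambda p: similarity(profile, p[0]), reverse=True)[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# The algorithm has never met this person; it only knows whom they resemble.
print(classify(("austin", "drama", True)))  # -> "early adopter"
```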
Also, computer calculations cannot account for the way human beings experience life. Which brings us back to trackers.

‘Inspires me to hit my numbers’
While a wearable device can quantify your movements, it can’t provide information on the quality of your day. “The benefit of getting data like this is if it can be used in service of doing activities you’re willing to do that are enjoyable,” said Art Markman, professor of cognitive psychology at the University of Texas at Austin and author of “Smart Change: Five Tools to Create Sustainable Habits in Yourself and Others.” “If it’s only to be focused on the numbers, you’re going to give up at some point. People have a very hard time staying motivated in the long term doing things they don’t enjoy.”
But there is something about making the numbers personal — my stats, my goals — that lends the data an interest all its own. For the self-trackers I talked with, measuring their daily activities certainly seemed to provide a kind of enjoyment.
Brian Wallace, 39, president of infographics design agency NowSourcing in Shelbyhurst, started using a Fitbit about three years ago in order to lose weight and stay active. Although he’s since switched to using an app on his phone, he said step counting has become part of his daily workflow. “I know I better carry my phone around, or I won’t get tracked — that inspires me to hit my numbers.”
Wallace has the Fitbit-branded Aria smart scale at home, which pairs with the Fitbit to record his weight and BMI, but said that tracking food or water intake or monitoring sleep requires too much manual effort for no clear benefit. He criticized the sleep tracker on the Fitbit in particular as being intrusive without providing useful information: “It pretends like it’s tracking your sleep and waking you up, but it’s really just an alarm that shakes your wrist.”
He decided to concentrate on steps because “I find that just looking at a very narrow and focused slice of data is good for me.” His daily goal is the 10,000 steps set by default on most devices and apps.
Why 10,000? It is the equivalent of walking about five miles, the American Heart Association’s recommendation for a baseline level of healthy activity. It’s also a kind of magic number: While very large, it can still be grasped. “From a psychological standpoint, 10,000 steps creates a doable challenge,” said Markman. “Most people who are not already runners clock in at around 5,000 to 6,000 steps a day, so if they realize they are already over halfway there, it seems like something achievable.”
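The five-mile figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes an average stride of roughly 2.5 feet, which of course varies with height and pace:

```python
# Back-of-the-envelope check of "10,000 steps is about five miles."
# Assumption: an average stride of roughly 2.5 feet (varies by person).
STRIDE_FEET = 2.5
FEET_PER_MILE = 5280

def steps_to_miles(steps, stride_feet=STRIDE_FEET):
    return steps * stride_feet / FEET_PER_MILE

print(f"{steps_to_miles(10_000):.1f} miles")  # about 4.7 miles
print(f"{steps_to_miles(70_000):.0f} miles")  # about 33 miles
```

By the same rough math, Coddington’s 70,000-step holiday weekend covered something like 33 miles.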
Competition is another lever of motivation that goes along with tracking.
Jackie Wagner, 24, an intensive-care unit nurse at Norton Brownsboro Hospital, found by checking her phone’s pedometer that she was reaching 10,000 steps most days just on her rounds at the hospital. But getting a Fitbit led her to boost her activity level. “Before, I didn’t pay any attention to how many steps I was taking and would be more sedentary outside of work,” she said. “Now if I’m just sitting on the couch flipping through TV channels, and I get a notification saying so-and-so is catching up to you, then I think, hey, maybe I better get up and take a walk.”
Game mechanics provide additional incentive to exercise by raising the social stakes. Teams, points and leaderboards aren’t just about competing to win; they also create a sense of accountability. People are more likely to show up when someone else expects them to.
At the same time, getting social also means that private data becomes public. “Even people you don’t do challenges with but are friends with through the app, it shows them how many steps you take a week,” said Wagner. Did that make her at all uncomfortable? “I had never really thought about it before. I think it’s just they can see how many steps I take, not what I eat or how many hours I sleep or how many glasses of water I drink.” As for Fitbit’s or Apple’s corporate window into her private life, she said, “I never looked into the kind of data they collect. I’m one of those people that just kind of rolls with it.”
Brian Wallace made the point that to the extent that self-tracking is voluntary, it doesn’t feel like surveillance. “If you told people there’s a new government program where you’re required to register where you live, where you went to school, who your friends are, and then you’re also required to take pictures of yourself and your activities and upload them on a regular basis, people would be outraged. But once you tell them that it’s Facebook, they’re like, ‘Oh, well that’s OK.’”
Sensors, trackers and the idea of tracking itself have already become so much a part of our world that we accept them at face value. They are the tools we use to become fitter, happier and more productive.
I didn’t doubt Brian Wallace when he told me that being active has helped him get more done at work without having to stay late or rely on coffee. Or Jackie Wagner, when she said she has more energy than she used to, not to mention Matt Coddington’s Herculean feats over long weekends. But I also found myself thinking a lot about Horace Fletcher.
Fletcher was a late 19th-century diet and health guru whose focus (one might say obsession) was on chewing. His slogan was, “Nature will castigate those who do not masticate.” Fletcherizing denoted the practice of chewing food very, very thoroughly — far past the point when most people would deem it time to swallow. So how could you know when you had masticated enough? By counting: either in terms of an absolute number, on the order of 100 chews per bite, or holding to an average rate, such as one chew per second.
Fletcherizing was one of the first big diet fads to sweep America. People diligently counted their chews for decades. Then we started counting calories. Today we count steps.
Again and again we seem to come back to the peculiar attempt to marry self-improvement with the industrial model of manufacturing, with its emphasis on standardizing processes and optimizing output. Are we better people (better Americans?) to the extent that we produce and consume more efficiently? To gauge that kind of progress, we need benchmarks.
Maybe the tyrannical self-surveillance regimen followed by the family in “The Quantified Self” is not an inevitable outcome. But the implications of turning life into data points can still give us something to chew on.