Deliberate Rest

A blog about getting more done by working less

Category: Memory

I just have something in my eye while watching the Google India-Pakistan reunion ad…

This Google India video is a piece of storytelling genius.

As the Indian Express explains, the ad

talks about separation, friendship and the celebration of a reunion. It follows the story of two childhood friends separated during Partition and who find each other after six decades with help from their grandchildren through Google.

The scriptwriter for the ad, Sukesh Kumar Nayak of Ogilvy & Mather, says he was pleasantly surprised when a tech giant like Google specified in their brief that the only thing they wanted to see was how meaningful the search engine is in real life. "Our entire life revolves around Google, it is our instant response to something we don't know. But we wanted to dig deeper, and make the connection between real life and Google, magical," says Nayak.

I think it also works because many of us have had the experience of finding friends from college or high school (or perhaps even earlier) through Facebook or Google or other means, so there's a ring of familiarity to the ad. Most of us aren't separated by grand historical tragedies for quite as long, but still the ad manages to strike a near-perfect balance between sentimentality and technical feasibility.

The fact that the grandchildren are the ones doing most of the searching also works: yes, it tends to reinforce the stereotype that Old People Don't Know About Computers, but in this instance it adds another layer of drama and sentiment to the story, as well as a little potential tension over whether the grandkids will eventually get together.

Memoto film on lifelogging

The Swedish startup Memoto— you might have heard about their super-successful Kickstarter campaign last year– recently released a video about lifelogging, which is available on Vimeo:


Lifeloggers from Memoto on Vimeo.

I’ve been doing some research on lifelogging– I talk about Gordon Bell’s pioneering MyLifeBits project in my book– and it seems to me that the work of people like Bell and Steve Mann is finally getting out into the marketplace. But as with all such academic projects, the commercial versions will seem like fragments of the original vision.

For example, Bell and Mann both had dedicated hardware for lifelogging– in Bell’s case, the SenseCam, which he wears around his neck and which takes chest-height pictures.

Memoto is betting that people who already have smartphones, Instagram, etc., will see the value in owning and always wearing a smaller version of SenseCam– that they will have missed just enough little everyday moments, or cool events where they couldn’t get out their cameras, to want to have something that takes pictures automatically.


Memoto Pre-order Product Video from Memoto on Vimeo.

Personally, I’m skeptical. I suspect that the smartphone occupies enough bodily space already, and that the demand for yet another camera will be pretty small.


My nephew at the Apple Store, Soho

On the other hand, the back-end systems that lifelogging envisioned, where you’re able to create timelines or analyses based on lots of pieces of information, might be attractive to people who are already using Facebook, Twitter, Pinterest, TripIt, Mint, Fitbit, etc. For those users, yet another system that requires very little attention, but which pulls together and creates some new value from the data they’re already creating on different platforms and storing in different silos, could be appealing.
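To make that concrete, here's a minimal sketch of the kind of aggregator I have in mind (just a toy in Python, with made-up fetcher functions standing in for each service's real API or export): it pulls events out of several silos and merges them into one chronological timeline.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Callable, Iterable, List

    @dataclass
    class Event:
        timestamp: datetime   # when the moment happened
        source: str           # which silo it came from ("photos", "checkins", ...)
        summary: str          # a one-line, human-readable description

    def build_timeline(fetchers: Iterable[Callable[[], List[Event]]]) -> List[Event]:
        """Pull events from every silo and merge them into one chronological stream."""
        merged: List[Event] = []
        for fetch in fetchers:
            merged.extend(fetch())   # each fetcher wraps one service's export or API
        return sorted(merged, key=lambda e: e.timestamp)

    # Entirely hypothetical fetchers; a real version would call each service's API.
    def fetch_photos() -> List[Event]:
        return [Event(datetime(2013, 6, 1, 9, 30), "photos", "Breakfast at the farmers' market")]

    def fetch_checkins() -> List[Event]:
        return [Event(datetime(2013, 6, 1, 8, 15), "checkins", "Morning run, 5 km")]

    if __name__ == "__main__":
        for event in build_timeline([fetch_photos, fetch_checkins]):
            print(event.timestamp, event.source, event.summary)

The merging itself is trivial; the interesting design question is what new value the combined timeline creates, and whether it can do so without demanding yet more of your attention.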

Of course, I’m also interested in the question of whether these kinds of systems can be designed to make users more mindful– more mindful of their days, of the passage of time, of what matters in their lives, and so on. I think the objections that people like Abigail Sellen have raised to traditional lifelogging systems are worth taking seriously, as I talk about in the book. In particular, the idea that digital and human “memory” are essentially interchangeable when you’re talking about autobiographical memory seems to me plain wrong. Memories of a first date, or of the feeling of an open-air market you explored when you spent a summer abroad (I know, first world memories), aren’t ones you simply look up, the way you recall a phone number or your bank balance. They’re partly recalled, but also re-experienced.


fishermen in salvador, bahia, 1972

And show me a system that can capture and replay what it’s like to be in a spice or fish market.

market in budapest

“the object on Facebook that I am most ambivalent about is me”

Slate contributor Steve Kolowich writes about going through– and often deleting– old Facebook posts, likes, and messages:

A wall post, comment, or "like" stops being useful once everyone involved is done enjoying the fleeting rush of having been publicly acknowledged, in some lightweight way, by another person. Yet Facebook holds on to these data points indefinitely, working them over for information, like old confidants turned informants.

In the end, however, that righteousness rings hollow. Only you can see your Activity Log. And besides, Facebook only cares to understand your past in a way that will help predict what advertisements you might click on in the future. It doesn't care that you used to post self-indulgent status updates while sniping at other people's grammar. It doesn't care that you posted in support of gay rights while trading homophobic jabs with your friends. It doesn't care that you shamelessly flirted with other women on their walls while your girlfriend was posting notes on yours, writing in Swedish, counting down the days until you would visit her in London.

We care. That is what makes Activity Log so discomfiting. We dread being taken out of context. But a lot of context can be too much to bear.

I had a similar experience when I saw a demo of Futureful, a great app that searches the Web for things it thinks you're interested in. To do that, it dives into your Facebook and Twitter accounts (with your permission) to see what you post about. As I write in my book about seeing it analyze me:

I’ve always assumed that my Facebook and Twitter pages reflect my true self. But when I look at the Futureful algorithm’s profile of my interests—its sense of who I am, its snapshot of what I look like online—I’m puzzled at first, then actually alarmed.

The person Futureful thinks I am is very interested in politics; most of its recommendations are from partisan American Web sites or European news sources. (To the program’s credit, most of them are new to me; the system is doing what it’s supposed to.) According to Futureful, I’m also very cynical. It thinks I like to read about corruption, scandals, and disasters caused by shortsightedness and greed. There’s nothing about history, design, computer science, or futures. Nothing about Buddhism or religion. Nothing about science. This person is an observer of the follies and stupidity of mankind, a cybernetic H. L. Mencken.

If I were talking to this person at a party, I’d concoct an excuse to get away from him.

It actually inspired me to change the way I use those services.

Anyway, go read the Kolowich piece. Well worth it.

“‘Information overload’ once referred to the difficulty of absorbing intelligently the data produced by others”

Gregg Zachary has a short, Proustian think-piece on the "remembrance of everything past" in IEEE Spectrum:

“Information overload” once referred to the difficulty of absorbing intelligently the data produced by others. Now we face the peril of choking on our own. “Many of us no longer think clearly,” insists Silicon Valley futurist Alex Soojung-Kim Pang, because of our compulsive attachment to the digital world. In a new book, The Distraction Addiction (Little, Brown and Co.), Pang argues that humans can kick their habit with the aid of engineers who create “contemplative” computing experiences that enable humans to “restore focus and concentration.”

Alas, our distraction addiction could well worsen. In the future, while Google Glass and its inevitable copycats record the outer world, ever-cheaper sensors will chart our inner biological world.

Go read the whole thing.

“Even in small doses, mindfulness can effect impressive changes in how we feel and think”

Maria Konnikova, author of the soon-to-be-published book Mastermind: How to Think Like Sherlock Holmes, has a piece in the New York Times about the benefits of mindfulness. (Clearly Viking/Penguin’s marketing/PR people are on the job!) No one who’s read two or more posts here will be surprised by the argument:

Though the concept originates in ancient Buddhist, Hindu and Chinese traditions, when it comes to experimental psychology, mindfulness is less about spirituality and more about concentration: the ability to quiet your mind, focus your attention on the present, and dismiss any distractions that come your way. The formulation dates from the work of the psychologist Ellen Langer, who demonstrated in the 1970s that mindful thought could lead to improvements on measures of cognitive function and even vital functions in older adults.

Now we’re learning that the benefits may reach further still, and be more attainable, than Professor Langer could have then imagined. Even in small doses, mindfulness can effect impressive changes in how we feel and think — and it does so at a basic neural level….

An exercise in mindfulness can also help with that plague of modern existence: multitasking. Of course, we would like to believe that our attention is infinite, but it isn’t. Multitasking is a persistent myth. What we really do is shift our attention rapidly from task to task. [ed: Ah, switch-tasking, my old nemesis, we meet again!] Two bad things happen as a result. We don’t devote as much attention to any one thing, and we sacrifice the quality of our attention. When we are mindful, some of that attentional flightiness disappears as if of its own accord.

All absolutely right on, but I admit the first line catches me a bit: there’s a long tradition of contemplative practice in Christianity, but for various reasons, mindfulness and contemplation in the West are viewed more through the prism– or follow the model– of Asian contemplative practice. You’re more likely to find reference to Thich Nhat Hanh than Thomas Merton in discussions of contemplative practice.

Not that there’s anything inherently wrong with this, but I suspect there are people for whom a discussion of the benefits of mindfulness that starts with The Seven Storey Mountain or Richard Foster’s Celebration of Discipline or (for our English readers) Abbot Christopher Jamison’s fabulous book Finding Sanctuary would feel more familiar.

Of course, I think this because one of the things I’m starting to work on now is a version of the contemplative computing argument more grounded in the Christian contemplative tradition. (And one of the benefits of growing up in a household that was entirely unaffiliated with any faith is that it makes me totally unapologetic about investigating and appropriating ideas from all these traditions.) Watch this space.

Google is a search engine, not a Free Will Destruction Machine

The Memory Network has published a new essay by Nick Carr on computer versus human memory. This is a subject I’ve followed with great interest, and when I was at Microsoft Research Cambridge I had the good fortune to be down the hall from Abigail Sellen, whose thinking about the differences between human and computer memory is far subtler than my own.

Carr himself makes points about how human memory is imaginative, creative in both good and bad ways, changes with experience, and has a social and ethical dimension. This isn’t new: Viktor Mayer-Schönberger’s book Delete: The Virtue of Forgetting in the Digital Age is all about this (though how successful it is is a matter of argument), and Liam Bannon likewise argues that we should regard forgetting as a feature, not a bug.

The one serious problem I have with the piece comes after a discussion of Betsy Sparrow’s work on Internet use and transactive memory:

We humans have, of course, always had external, or “transactive,” information stores to supplement our biological memory. These stores can reside in the brains of other people we know (if your friend Julie is an expert on gardening, then you know you can use her knowledge of plant facts to supplement your own memory) or in media technologies such as maps and books and microfilm. But we’ve never had an “external memory” so capacious, so available and so easily searched as the Web. If, as this study suggests, the way we form (or fail to form) memories is deeply influenced by the mere existence of outside information stores, then we may be entering an era in history in which we will store fewer and fewer memories inside our own brains.

To me this paragraph exemplifies both the insights and shortcomings of Carr’s approach: in particular, with the conclusion that “we may be entering an era in history in which we will store fewer and fewer memories inside our own brains,” he ends on a note of technological determinism that I think is both incorrect and counterproductive. Incorrect because we continue to have, and to make, choices about what we memorize, what we entrust to others, and what we leave to books or iPhones or the Web. Counterproductive because thinking we can’t resist the overwhelming wave of Google (or technology more generally) disarms our ability to see that we still can choose to use technology in ways that suit us, rather than using it in ways that Larry and Sergei, or Tim Cook, or Bill Gates, want us to use it.

The question of whether we should memorize something is, in my view, partly practical, partly… moral, for lack of a better word. Once I got a cellphone, I stopped memorizing phone numbers, except for my immediate family’s: in the last decade, the only new numbers I’ve committed to memory are my wife’s and kids’. I carry my phone with me all the time, and it’s a lot better than me at remembering the number of the local taqueria, the pediatrician, etc. However, in an emergency, or if I lose my phone, I still want to be able to reach my family. So I know those numbers.

Remembering the numbers of my family also feels to me like a statement that these people are different, that they deserve a different space in my mind than anyone else. It’s like birthdays: while I’m not always great at it, I really try to remember the birthdays of relatives and friends, because that feels to me like something that a considerate person does.

The point is, we’re still perfectly capable of making rules about what we remember and don’t, and of making choices about where in our extended minds we store things. Generally I don’t memorize things that I won’t need after the zombie apocalypse. But I do seek to remember all sorts of other things, and despite working in a job that invites perpetual distraction, I can still do it. We all can, despite the belief that Google makes it harder. Google is a search engine, not a Free Will Destruction Machine. Unless we act like it’s one.

The future of memory, explored in crystal

If you’re in London at the end of the month, check out the Digital Crystal exhibit at the Design Museum.


ice crystals, via flickr

The Guardian explains the premise:

It is, Deyan Sudjic believes, one of the biggest challenges facing designers today: “What do we do now that the digital world is destroying the physical object?”… [M]aterial objects were almost becoming “an endangered species” in the digital world. “But as a species we are programmed to want to leave things behind, to remember people through things and that is why we have asked designers to think about other ways than the obvious ones.”…

That dilemma, in societies where we don’t really need wristwatches and where we have fewer printed photographs in our albums and books on our shelves, is addressed in a new exhibition at the Design Museum being staged in conjunction with Swarovski.

From the Web site:

For this exhibition, the Design Museum and Swarovski are collaborating to challenge designers to explore the future of memory in the fast developing digital age. Working with some of Swarovski’s previous commissions, alongside a new generation of designers, this exhibition examines the changing nature of our relationship with objects and even with time. Digital Crystal asks some of the most exciting talents in contemporary creativity to explore ways in which we can recover that lost connection we have with things, the result is 15 unique installations giving you a glimpse of the future of memory.

I think that we underestimate the difficulty of creating art that has anything really useful to say about technological or scientific issues, but this exhibit sounds like it’s worth seeing.


crystal palace pub (bath, england), via flickr

But read Abigail Sellen’s work on lifelogging while you’re taking the Underground to the Museum. Get off at Tower Bridge and walk across the Thames. The Museum will be on your left.

Abigail Sellen on lifelogging

Just came across an interview on CBC Spark with my former Microsoft Research colleague Abigail Sellen. She talks about lifelogging and what's wrong with the idea of total memory capture. You can download the mp3, or listen through the Web site.

Nora Young does a great job with technology interviews. I had a good time with her a couple years ago, talking about RFID.

Is “culture of distraction” an oxymoron? Joe Kraus on Slow Tech

Joe Kraus, a partner at Google Ventures, posted a talk last week about Slow Tech which I highly recommend. Here's the abstract:

  1. We are creating and encouraging a culture of distraction where we are increasingly disconnected from the people and events around us and increasingly unable to engage in long-form thinking. People now feel anxious when their brains are unstimulated.
  2. We are losing some very important things by doing this. We threaten the key ingredients behind creativity and insight by filling up all our “gap” time with stimulation. And we inhibit real human connection when we prioritize our phones over the people right in front of us.
  3. What can we do about it? Is this path inevitable or can balance be restored?

It should be obvious that I'm very much in agreement with 1) and 2), and have spent the last year working on an answer to 3) that is, in effect, "yes yes, a thousand times yes," as someone somewhere in Jane Austen said (my wife is rereading that last line and probably rethinking having married me).

It makes me wonder, though, if there is such a thing as a "culture of distraction." Not to take anything away from Kraus' talk, but is that an oxymoron? Culture brings to mind things that require a lot of concentration, the accumulated creative thinking and craft work of generations, thousands or millions of person-years. No one who is distracted can make a lasting contribution to their culture. Indeed, part of what scares all of us who worry about distraction is that a "culture of distraction" is a wasteland, the human equivalent of a television, tuned to a dead channel.

[h/t to Eugene Kim, who I hope to actually meet in person one day. And yes, I can mangle Jane Austen, but recite William Gibson from memory.]

What is Contemplative Computing?

The term "contemplative computing" may sound contradictory or complicated, but it's really pretty simple. Information technologies promise to make us smarter and more efficient, but all too often end up being distracting and demanding. Contemplative computing shows how we can use them to be more focused and creative.

Contemplative computing is something you do, not a service you use or a product you consume. It involves deepening your understanding of how minds and information technologies work together, becoming more mindful of how you interact with technologies, and discovering ways of using them better.

There's a great Buddhist saying (echoed in virtually every religion) that pain is inevitable, but suffering is a choice. What the Buddha meant was we all face setbacks, get sick, and lose loved ones; but we can choose how we respond even to these difficult events. Likewise, I argue that in today's high-tech world, connection is inevitable, but distraction is a choice. The purpose of my book is to show you that the choice exists– and if you feel overwhelmed by smartphones and email and social media, how to make different choices.

My book on contemplative computing, The Distraction Addiction, will be published by Little, Brown & Co, and will hit the bookstores in August 2013. Until then, these videos offer the easiest way to get a feel for the whole project.

Contemplative Computing, Lift11, Marseille France, July 2011.

This is an overview of the whole contemplative computing project, and the best 20-minute introduction to what I'm doing. There's also a transcript of the talk available.

Secrets of the Blogging Forest Monks, TEDxYouth@Monterey, November 2011.

This talk is about the issue of digital distraction, and how Buddhist monk bloggers and media entrepreneurs manage to spend hours a day online without suffering the ill effects that the rest of us consider a natural consequence of being online. It's a bit rushed at the end, as I was running low on time.

There are also a couple exercises I did with the audience that might not make a lot of sense on video.
