Deliberate Rest

A blog about getting more done by working less


Signals, complexity, and knowledge vs data in baseball

Okay, first a caveat: I go to a few Oakland A’s games every season, but I’m not a baseball fanatic; I married into the game. My wife agreed to move with me to Chicago 20 years ago when I found an apartment within walking distance of Wrigley Field. Proposing to her also helped, but I’m not sure which was the more important factor.

So I’m not a baseball expert by any means, but I still found this article about how “The Rockies Believe They Have an Unbreakable Code,” and more generally how signaling is turning into a point of contention over who makes decisions about pitches, pretty interesting.

Last Sunday, the Washington Nationals broadcast noticed an unusual card sheathed in clear plastic on a wristband that was adorning the left arm of Rockies catcher Chris Iannetta…. The wrist card wasn’t just something the Rockies implemented Sunday. Iannetta has worn the card since the Rockies’ season began in Arizona….

While other teams might have already done something similar, this wristband — or, at least the scope of it — seemed to be unusual in nature. And with the amount of information available in today’s game, when it’s possible to know every batter’s performance against every pitch type in every count, I have wondered if teams would try and get more information to catchers and players on the field.

Mariners’ outfielders are carrying cards in their back pockets to help with positioning this season. In college baseball, the coaches — lacking the trust in their young battery — often call the vast majority of pitches, which can slow things down.

So what’s on the card?

Iannetta said the information is mostly related to controlling the running game, and he explained some of the mechanics of the process. “It’s just a random three-digit number that corresponds to a sign and then we have 10 different cards with random numbers,” Iannetta said. “As soon as they [the MASN broadcast] zoomed in… we heard about it and switched cards immediately. We switched to a different card with a whole new set of numbers. There’s no way to memorize it. There’s a random-number generator spitting out a corresponding number [for the cards], and the coaches have the same cards.”

So the signals are no longer part of a language that each team possesses, or that evolves between specific pitchers and catchers; they’re now more like coded signals in the military.
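To make the mechanics concrete, here’s a minimal sketch of how a card system like the one Iannetta describes could work. Everything in it is hypothetical (the sign names, the number of cards, the codes themselves); the point is just to show why a code that a broadcast camera catches becomes worthless the moment the cards are switched.

```python
import random

# Hypothetical signs; a real card would cover pickoffs, pitchouts, throws, etc.
SIGNS = ["fastball", "curveball", "slider", "changeup", "pickoff", "pitchout"]

def make_card(signs, codes_per_sign=3):
    """Build one card: several random three-digit codes for each sign."""
    numbers = random.sample(range(100, 1000), len(signs) * codes_per_sign)
    card = {}
    for i, sign in enumerate(signs):
        for code in numbers[i * codes_per_sign:(i + 1) * codes_per_sign]:
            card[code] = sign
    return card

# Coach and catcher carry identical copies of the same deck of ten cards.
deck = [make_card(SIGNS) for _ in range(10)]
active = 0  # which card is currently in play

def call_sign(sign):
    """Coach flashes any code on the active card that maps to the sign."""
    return random.choice([code for code, s in deck[active].items() if s == sign])

def read_sign(code):
    """Catcher looks the code up on the same active card."""
    return deck[active][code]

code = call_sign("pickoff")
assert read_sign(code) == "pickoff"

# If a broadcast zooms in on the wristband, switch cards:
active = 3  # every code an observer has seen is now meaningless
```

There’s no structure to memorize or crack; the only “key” is knowing which card is in play, which is exactly why the team could simply switch cards when the broadcast zoomed in.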

What’s also interesting about this, though, is that it foreshadows a change in who controls the calls: the catcher or the coaches.

In college, the coaches already do so, mainly to keep the game moving. But in Major League Baseball, there’s a potential competition between two very different kinds of expertise.

On one hand, the catcher possesses a lot of on-the-ground knowledge. They know how confident each hitter is when they come to the plate, can observe how forcefully they’re hitting, whether they’re in the zone or distracted, etc. They also know their pitcher’s performance. “On any given night,” the article says, “the pitcher is going to have a different feel, a different comfort level, with certain pitches that will be unique to that contest.” Put that together, and a good catcher can really shape the play.

On the other hand, coaches now have a staggering amount of data about players, and there’s a temptation to use that to shape the game by allowing coaches to call more of the pitches.

But as the signaling system gets more complex, and you can change it up more often, there’s an opportunity to either use it to deliver more information to catchers, or to displace the knowledge of catchers entirely and rely on big data.

What’s important here is that it’s not that one form of knowledge or expertise is necessarily superior to another; they’re probably best seen as incommensurate, as forms of knowledge so different that comparing them is like comparing apples and manta rays. But you can imagine coaches deciding that they want more control over the game, or thinking that big data will let them use less experienced catchers, or simply being super-impressed at the statistics and all the cool things you can do with them. In other words, adopting what looks like a data-intensive and rational system for reasons that actually aren’t that rational.

“What people are angry about… is that we no longer feel in control of the technology in our lives”

In my book The Distraction Addiction I talked about how humans have evolved to have incredibly powerful relationships with technologies, starting with hand axes a million years ago, and continuing down to the present; how our relationships with technologies are among the most powerful we have; and that the challenge with today’s technologies was not to learn to live without them, but to learn to use them better. This meant recognizing the power of those relationships; thinking more deeply about them; and re-learning how to use them well, rather than being used by them.

Arianna Huffington has a piece about “The Anger at the Heart of the Facebook Hearings” that echoes this:

What people are angry about, and what’s truly fueling this moment, is that we no longer feel in control of the technology in our lives. That feeling of losing control has been building steadily for the last several years, as our lives have become both more dominated by technology and more dependent on technology. It’s the feeling that the pace of our lives, and the next thing on our to-do list, is no longer up to us. It comes via the endless screens and algorithms we’re immersed in. And we know that the feeling of autonomy is one of the single most important factors in our happiness. But we’re feeling less and less autonomous.

I think this has a lot of truth to it, though there is real ill-feeling toward companies, not just toward technologies and our relationships with them. Control is one of the things we instinctively use to measure the trustworthiness of a technology; it’s also something we need in order to use technologies well.

So it makes sense that the perception that a company is designing its product to elude our control inspires suspicion and hostility. We’ve coevolved with technologies, and expect to be able to use them to extend our cognitive and physical abilities; when that relationship is broken, it’s a big problem for us.

“he failed to do [his job] because he was distracted by tweeting”

Hollywood, California Adventure

So the next time you think you can handle checking your email while driving, think back to the 2016 Oscars when La La Land was incorrectly announced as best picture instead of Moonlight. As the Hollywood Reporter explains in their new “Oral History of Oscar’s Epic Best Picture Fiasco,” it all came down to digital distraction.

Roombot and meeting scheduling

In my study of how companies shorten their workdays, one of the things I’ve consistently seen is companies shortening meetings, and doing a number of things to make meetings more effective: requiring pre-circulated agendas and goals, sharing background material beforehand, having walking or standing meetings, and making sure that conference call phones and other tech are running smoothly before the meeting is scheduled to start, so you don’t spend the first 10 minutes looking for dry-erase markers or punching in conference codes.

They also use tools to signal when meeting times are up, or when the group only has a few minutes left. The most popular tools are kitchen timers and smartphone alarms (unless your company bans devices in meetings, which is another popular thing), but a couple have taken a more high-tech approach: using Philips Hue lightbulbs and some locally-sourced code to have the room itself signal when you should start wrapping up.

I first heard about this tool at IIH Nordic, a Copenhagen-based SEO firm that moved to a 4-day week, but others use it, too. Philadelphia design firm O3 World calls theirs RoomBot, and explains how their system works in this video:

It’s a cool system, but the important thing is to have some kind of external tool that announces when your time is up.
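Neither company’s actual code appears in these posts, but the general idea is easy to sketch. Here’s a minimal, hypothetical version in Python using the open-source phue library; the bridge IP address, light IDs, meeting length, and color values are placeholders you’d adapt to your own setup.

```python
import time
from phue import Bridge  # third-party Hue library; assumes a Hue bridge on your LAN

BRIDGE_IP = "192.168.1.2"   # placeholder: use your bridge's address
MEETING_LIGHTS = [1, 2]     # IDs of the bulbs in the meeting room

def run_meeting_timer(minutes=25, warning_minutes=5):
    """Turn the room amber when a few minutes remain, red when time is up."""
    bridge = Bridge(BRIDGE_IP)
    bridge.connect()  # press the button on the bridge the first time this runs

    time.sleep((minutes - warning_minutes) * 60)
    # Warning phase: shift the bulbs to amber.
    bridge.set_light(MEETING_LIGHTS, {"on": True, "hue": 8000, "sat": 254, "bri": 200})

    time.sleep(warning_minutes * 60)
    # Time's up: turn the room red.
    bridge.set_light(MEETING_LIGHTS, {"on": True, "hue": 0, "sat": 254, "bri": 254})

if __name__ == "__main__":
    run_meeting_timer(minutes=25, warning_minutes=5)
```

A kitchen timer does the same job; the advantage of wiring it into the room is that nobody has to play timekeeper, and the signal is impossible to ignore.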

“Exploitation is encoded into the systems we are building”

Writer and artist James Bridle has a long, but rather amazing and disturbing, piece arguing that “Something is wrong on the internet.” Specifically, he’s talking about how kids’ videos on YouTube have turned super-strange and -dark, thanks to the weird profitability of kids’ videos, their low production standards, efforts to hit the right SEO and keyword notes, etc. The result, he says, is that

Automated reward systems like YouTube algorithms necessitate exploitation in the same way that capitalism necessitates exploitation, and if you’re someone who bristles at the second half of that equation then maybe this should be what convinces you of its truth. Exploitation is encoded into the systems we are building, making it harder to see, harder to think and explain, harder to counter and defend against. Not in a future of AI overlords and robots in the factories, but right here, now, on your screen, in your living room and in your pocket.

I’ve been thinking a lot recently about the future of automation and work, and whether it’s possible to avoid the kinds of race-to-the-bottom, exterminate-the-worker imperatives that seem to be implicit in so many automation projects today, so this is a bracing argument.

It goes on, after walking through a number of examples of videos that are literally nightmarish:

To expose children to this content is abuse. We’re not talking about the debatable but undoubtedly real effects of film or videogame violence on teenagers, or the effects of pornography or extreme images on young minds, which were alluded to in my opening description of my own teenage internet use. Those are important debates, but they’re not what is being discussed here. What we’re talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatise and disturb them, via networks which are extremely vulnerable to exactly this form of abuse. It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal.

This, I think, is my point: The system is complicit in the abuse.

And right now, right here, YouTube and Google are complicit in that system. The architecture they have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale. I believe they have an absolute responsibility to deal with this, just as they have a responsibility to deal with the radicalisation of (mostly) young (mostly) men via extremist videos — of any political persuasion. They have so far showed absolutely no inclination to do this, which is in itself despicable. However, a huge part of my troubled response to this issue is that I have no idea how they can respond without shutting down the service itself, and most systems which resemble it. We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I’ve used in this essay.

I spent a little time looking at some of these videos, and they are beyond weird. They combine Second Life-level clunky animation; the kinds of repetition that adults find irritating and toddlers love; that distinctive kids’ music; and extremely strange cuts and changes of scene. About four minutes into one of the videos, the scene shifted from a totally anodyne house to a graveyard in which familiar toys sing a song about how sugar is bad, only they have flayed zombie heads; it was exactly the kind of thing that your mind would cook up as a nightmare.

“One day without notifications changes behaviour for two years”

Several years ago, Spanish telecommunications firm Telefónica ran something called the Do Not Disturb Challenge.

It’s one of a number of such events that have been sponsored by schools, civic organizations, and groups interested in helping people regain control over their devices.

At the time, it looked like kind of a failure. Even after the researchers scaled it back from a week to 24 hours (“[W]e couldn’t recruit anybody to take part,” one of them told New Scientist. “We just got empty, horrified stares. And so eventually we backed down to 24 hours.”), only about 30 people signed up. (The researchers explained their preliminary findings in a 2015 article.)

However, New Scientist notes, “two-thirds of the participants said they would change how they managed their notifications.” The researchers have gone back to the participants and talked to them about their smartphone use and attitudes towards notifications, and found something really interesting, as they report in a new article (with the somewhat discouraging title “Productive, Anxious, Lonely: 24 Hours Without Push Notifications“).

New Scientist reports that “half had actually stuck with this goal two years on, suggesting that even a short, enforced holiday is a powerful intervention.” But as the researchers put it in their article,

The evidence indicates that notifications have locked us in a dilemma: without notifications, participants felt less distracted and more productive. But, they also felt no longer able to be as responsive as expected, which made some participants anxious. And, they felt less connected with one’s social group.

It’s really interesting that digital sabbaths can have a long-term effect on behavior.

The other thing I would note is that it’s possible to customize notifications so that you’re still accessible to the people who really matter, but aren’t disturbed by messages about how the online retailer you visited 6 months ago is having 20% off everything. I talk in this article about how to reset your notifications so your phone does what it’s supposed to do (keep you accessible to people who count) and not what app makers and retailers want. It’ll help your phone pass what I call the “zombie apocalypse test,” keeping you connected to the people you’d call during the zombie apocalypse, and no one else.

Martin Pielot and Luz Rello, “Productive, Anxious, Lonely: 24 Hours Without Push Notifications,” in Proceedings of MobileHCI ’17 (Vienna, Austria, September 4–7, 2017).

Is work-life balance a myth?

Yesterday I was on an episode of Al Jazeera’s “The Stream” to talk about work-life balance, rest and technology.

It was interesting doing a TV show, especially via Skype from my garage office. This is what it looks like behind the scenes:

About to go on Al Jazeera's "The Stream" to talk about technology and work-life balance.

I have a second screen, and the webcam drops down in front of it, so when I look at a Skype conversation I can come closer to making eye contact (i.e., staring at the camera rather than the display). I also had the names of the other participants written on a Post-it and stuck to the screen, as there are few things more embarrassing than forgetting your host’s name!

The studio-grade mic is one I bought a couple years ago, and I’m constantly surprised at how good it sounds.

Finally, I had a pair of earbuds that looped behind my head; I avoid the 1960s NASA mission control look when I can.

Most of the lessons I’ve learned doing radio apply to television appearances, but there are two differences.

First, you’ve gotta be really still. In lots of radio interviews I’m on Skype or my phone, and I can wander around the kitchen as I talk. I’m one of those people who likes to move as they talk or think (embodied cognition in action!), but you don’t have this outlet when you’re on TV. You gotta stand really still.

In fact, next time I’m going to make sure to sit down, because that’ll be easier to sustain for half an hour.

Second, never take your eyes off the camera, even if a wolverine is growling at your ankle. Even a brief look away is noticeable. It’s really striking.

But I’m learning.

“The Stream” is also an interesting show because it’s one of those that incorporates feedback from social media, which meant I had several Twitter exchanges with people after the show.

Cyberloafing, work, and recovery

This study came out a couple of months ago, when I was working on the revisions to Rest, so I didn’t write about it then, but it’s still quite timely: it’s a project by Arizona State researchers to measure “cyberloafing” (i.e., using work time and resources for things other than work) and the efficacy of countermeasures against it.

Here’s the abstract:

The goal of this study is to explore and analyze the effectiveness of a possible countermeasure to the so-called “cyberloafing” problem involving a technical solution of Internet filtering and monitoring. Through a multi-theoretical lens, we utilize operant conditioning and individuals’ psychological morals of procedural justice and social norms to study the effectiveness of this countermeasure in addressing the associated agency problem and in promoting compliance with an organization’s Internet usage policies. We find that in addition to the blocking module, confirmation and quota modules of an Internet filtering and monitoring system can prevent shirking and promote better compliance through employee empowerment and attention resource replenishment.

The idea is that while there are sites people definitely need as part of their work (workplace compliance rules, training materials, stock prices, etc.), there’s plenty of material that’s of questionable utility, eats up bandwidth, or can actually raise liability issues for a company. But simply banning sites is a less effective deterrent to cyberloafing than getting people to identify what kinds of material are useful and what kinds aren’t.

[Citation: Jeremy Glassman, Marilyn Prosch, Benjamin B.M. Shao, “To monitor or not to monitor: Effectiveness of a cyberloafing countermeasure,” Information & Management 52:2 (March 2015), 170–182.]
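The paper frames its countermeasure as three modules: blocking, confirmation, and quota. As a purely illustrative sketch (not the researchers’ actual system), the decision logic might look something like this; the site lists, daily quota, and confirmation prompt are all invented for the example.

```python
from datetime import date

BLOCKED = {"gambling.example.com"}       # blocking module: never allowed
CONFIRM = {"news.example.com"}           # confirmation module: allowed after an explicit OK
QUOTA_SITES = {"social.example.com"}     # quota module: allowed up to a daily time budget
DAILY_QUOTA_MINUTES = 30

usage = {}  # (site, date) -> minutes used today

def check_request(site, minutes=1, confirm_callback=None):
    """Decide whether a request goes through, mirroring the three modules."""
    if site in BLOCKED:
        return False

    if site in CONFIRM:
        # Nudge the user to pause and explicitly confirm the visit.
        return bool(confirm_callback and confirm_callback(site))

    if site in QUOTA_SITES:
        key = (site, date.today())
        used = usage.get(key, 0)
        if used + minutes > DAILY_QUOTA_MINUTES:
            return False  # budget exhausted for today
        usage[key] = used + minutes
        return True

    return True  # everything else is treated as work-related

print(check_request("social.example.com", minutes=10))  # True: within today's quota
print(check_request("gambling.example.com"))             # False: blocked outright
```

The interesting part of the finding is the second and third branches: leaving some of the decision in the employee’s hands (confirming a visit, spending down a quota) promotes compliance through empowerment rather than prohibition alone.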

Greyscale your smartphone screen to make it less compelling

I hadn’t heard of this idea until this Atlantic video from James Hamblin:
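For anyone who wants to try it without hunting through menus: on iPhones the option lives under the accessibility color-filter settings, and on Android a common route is toggling the accessibility color-correction (“daltonizer”) setting over ADB. Here’s a rough sketch in Python that shells out to adb; the setting names and values are the commonly cited ones and may vary by Android version, so treat them as assumptions.

```python
import subprocess

def set_grayscale(enabled: bool) -> None:
    """Toggle Android's color-correction filter to grayscale over adb.

    Assumes adb is installed, USB debugging is enabled, and a device is connected.
    """
    commands = [
        ["adb", "shell", "settings", "put", "secure",
         "accessibility_display_daltonizer_enabled", "1" if enabled else "0"],
        ["adb", "shell", "settings", "put", "secure",
         "accessibility_display_daltonizer", "0"],  # 0 = monochromacy (grayscale)
    ]
    for cmd in commands:
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    set_grayscale(True)    # drain the color out of the screen
    # set_grayscale(False) # and bring it back when you need it
```

The nice thing about grayscale is that it’s easy to reverse when you genuinely need color (maps, photos), so it works more like a speed bump than a wall.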

The Internet of Things is just another way to distract you

One of the things you always, and I mean always, hear about the Internet of Things and smart home devices is that they “just work.” They’re all like these magic autonomous robots that’ll connect themselves to your wifi, then go do their thing, yet also be totally unobtrusive and intuitive (whatever those two words mean). Sounds cool, right?

Of course, the reality is very different, as this essay from IoS explains. The light went on sometime around the point when the author’s Internet-enabled thermostat stopped working whenever the wifi connection was lost (and “The only way to control the gadget is via the app, so when it breaks you’re really screwed”), and it came time to update their Philips Hue light bulbs: “When the first firmware update rolled around, it was exciting, until I spent an hour trying to update lightbulbs. Nobody warned me that being an adult would mean wasting my waking hours updating Linux on a set of lightbulbs, rebooting them until they’d take the latest firmware. The future is great.”

In other words, things work great until they don’t, at which point all the wheels come off. Further, as we’ve learned recently, connected devices are “connected” to the fates of their companies, in a way that “dumb” devices are not. If the company that made your hammer or pants goes belly-up, that doesn’t affect your ability to pound nails or cover up your naughty bits. But that’s not the case with smart home devices.

A one-time purchase of a smart device isn’t a sustainable plan for companies that need to run servers to support those devices. Not only are you buying into a smart device that might not turn out to be as smart as you thought, it’s possible it’ll just stop working in two years or so when the company goes under or gets acquired.

The Internet of Things right now is a mess. It’s being built by scrappy startups with delusions of grandeur, but no backup plan for when connectivity fails, or consideration for if their business models reach out more than a year or two — leaving you and me at risk.

Just another indicator of how technologies of the future could turn out to be really distracting.
