Sunday, June 28, 2015

Beware The Listening Machines: Civic Hacking & Society

Are big-picture social issues something civic hackers should be concerned about and working on?

A narrowly focused, easily tackled, and non-controversial example of a civic hack is “Is It Recycling Week?” -- a smartphone app that tells you whether you need to put out the recycling bin tonight for trash pickup in the morning. Pretty much everybody can agree this is an appropriate thing for civic hackers to work on.

A broadly focused, hard-to-even-start, and very controversial example of a civic hack is working to improve your life and the lives of fellow citizens with regard to how they’re impacted by ‘Listening Machines.’

Or, to expand the concept of civic hacking even more broadly, you could call this hacking society itself, especially with respect to new technology, or new uses and impacts of technology.

Working on the issue of Listening Machines, or the bigger picture of hacking current US society and policy to prevent egregious technology blunders, is so daunting and complex that most people, if they even think about it, wouldn’t contemplate trying to improve the system.

I’ll back up for a minute and point you to the article that prompted this post. It explains what Listening Machines are and suggests the need for a conversation (and action) about preventing unnecessary disasters caused by new technologies or new applications of technology. “Beware the Listening Machines,” an article from the director of the Center for Civic Media at MIT, paints a discomforting picture of where the innocent-seeming ‘surveillance technology’ of kids-toys-that-listen and smartphones-that-you-talk-to are taking us.
When dolls [, smartphones] and friendly robots can listen and respond to what people say, where's the line between personal assistance and mass surveillance? 
...my friend Kate Crawford invited me to a daylong “Listening Machine Summit,”...What's a listening machine? The example on everyone's lips was Hello Barbie, a...doll that will listen to your child speak and respond in kind...a Mattel representative introduced the newest version of Barbie by saying: ‘Welcome to New York, Barbie.’ The doll, named Hello Barbie, responded: ‘I love New York! Don't you? Tell me, what's your favorite part about the city? The food, fashion, or the sights?’
Barbie accomplishes this magic by recording your child’s question, uploading it to a speech recognition server, identifying a recognizable keyword (“New York”) and offering an appropriate synthesized response. The company behind Barbie’s newfound voice, ToyTalk, uses your child’s utterance to help tune their speech recognition, likely storing the voice file for future use. And that’s the trick with listening systems. If you can imagine reasons why you might not want Mattel maintaining a record of things your child says while talking to his or her doll, you should be able to imagine the possible harms that could come from use, abuse, or interrogation of other listening systems...
As one of the speakers put it...listening machines trigger all three aspects of the surveillance holy trinity:
  • They're pervasive, starting to appear in all aspects of our lives.
  • They're persistent, capable of keeping records of what we've said indefinitely.
  • They process the data they collect, seeking to understand what people are saying and acting on what they're able to understand.
To reduce the creepy nature of their surveillant behavior, listening systems are often embedded in devices designed to be charming, cute, and delightful: toys, robots, and smooth-voiced personal assistants. 
...If a robot observes spousal abuse, should it call the police? If the robot is designed to be friend and confidant to everyone in the house, but was paid for by the mother, should we expect it to rat out one of the kids for smoking marijuana? Despite the helpful provocations offered by real and proposed consumer products, the questions I found most interesting focused on being unwittingly and unwillingly surveilled by listening machines. 
...companies invent new technologies and bring them to market. Consumers occasionally react, and if sufficient numbers react loudly enough, government regulators investigate and mandate changes. There’s a sense that this is the correct process, that more aggressive regulation would crush innovation...But this is a model in which regulation is a very modest counterweight to market forces. So long as a product is on the market, it’s engaged in persuading people that a new type of behavior is the new normal. When Apple brought Siri to market, it engaged in a multi-front campaign to persuade people that they should regularly speak to a computer to make appointments, order dinner, check traffic conditions, and seek advice. Apple was able to lower barriers to adoption by making the product a pre-installed part of their very popular phone, making it available for free, and heavily advertising the new functionality...people talk to their phones and share sensitive information with them, and that's just the way things are now...Apple has already won: We're talking to our phones, sharing our lives, generating terabytes of data in the process. The problem with this approach to regulation is that we rarely, if ever, have a conversation about...do we want a world in which we confide in our phones? And how should companies be forced to handle the data generated by these new interactions? 
...These questions...aren't regulatory questions, but policy ones. The challenge is figuring out how, in our current, barely functional political landscape, we decide what technologies should trigger pre-emptive conversations about whether, when, and how those products should come to market...We need a better culture of policymaking in the IT world. We need a better tradition of talking through the “whethers, whens, and hows” of technologies like listening machines.  And we need more conversations that aren’t about what’s possible, but about what’s desirable.
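To make the quoted pipeline a little more concrete, here is a minimal, purely hypothetical sketch in Python of the record/transcribe/keyword-match/respond loop described above. The keyword table, the stub transcribe() function, and the utterance log are all my own illustrative assumptions, not ToyTalk's or Mattel's actual code.

```python
# Hypothetical sketch of a "listening machine" response loop.
# This is NOT ToyTalk's or Mattel's code -- just an illustration of the
# record -> transcribe -> keyword-match -> canned-response pipeline
# described above, plus its quiet side effect: every utterance is stored.

from datetime import datetime, timezone

# Illustrative keyword -> response table (my own example phrases).
RESPONSES = {
    "new york": "I love New York! Don't you? Tell me, what's your favorite "
                "part about the city? The food, fashion, or the sights?",
    "school":   "School sounds fun! What did you learn today?",
}

FALLBACK = "That's so interesting! Tell me more."

utterance_log = []  # in a real product, this would sit on a server, indefinitely


def transcribe(audio_clip: bytes) -> str:
    """Stand-in for a cloud speech-recognition call (assumed, not a real API)."""
    # A real system would upload audio_clip to a recognition service here.
    return audio_clip.decode("utf-8")  # pretend the audio is already text


def respond(audio_clip: bytes) -> str:
    """Match the transcript against known keywords and pick a reply."""
    text = transcribe(audio_clip).lower()

    # The "persistent" part of the surveillance trinity: keep everything.
    utterance_log.append((datetime.now(timezone.utc).isoformat(), text))

    for keyword, reply in RESPONSES.items():
        if keyword in text:
            return reply
    return FALLBACK


if __name__ == "__main__":
    print(respond(b"Welcome to New York, Barbie."))
    print(f"Stored utterances so far: {len(utterance_log)}")
```

Even in this toy version, the part that raises the policy questions isn't the keyword matching; it's the ever-growing utterance log.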
If you want a glimpse of where the Hello Barbie could lead, check out the fictional-but-highly-believable story of Purza the pukha in Anne McCaffrey’s book “The Rowan.”
“Purza is a pukha and she's been mine a long time,” the Rowan answered, hefting the pukha behind her in a proprietary way... “A specially programmed stabilizing surrogate device,” the Rowan explained. “It's not a stuffed toy.”
Purza is a cuddly, soft, stuffed creature given to a young girl to help her recover from a traumatic disaster that killed all her relatives and anyone who knew her. The pukha listens to the girl and talks back to her about her concerns and thoughts. The Rowan and Purza have long, involved conversations. Purza is fictional, but if psychologists and aid workers had reasonable-cost pukhas available to them in 2015, they would no doubt be giving them to young survivors of landslides in Turkey or tsunamis in Indonesia that wipe out almost everyone in a small, remote village, or to many other young victims of trauma scenarios you might think of. But even if we could (can?) make pukhas, should we?

Regulations, government policies, and societal trends regarding listening machines have huge implications for the Internet of Things (IoT), smartphones, virtual assistants, robotics, smart cities, government and civilian drones, self-driving (and always-listening) vehicles, and numerous other technologies.

I don’t know what the role of civic hackers should be regarding listening machines. Nor do I have a firm opinion on whether or how civic hacking should be a significant force in improving US policies regarding new technologies and new technological impacts in our lives.

But a guiding principle for civic hackers is to be engaged with their community and their government and to take personal responsibility for improving situations where they feel they can contribute. So when your city or county is considering using technology to prevent crime by monitoring who is using public spaces, when local law enforcement sets up traffic cams that capture and store the license plate numbers of every vehicle coming into your town to monitor traffic patterns, and when your local K-12 school proposes GPS technology for tracking where students are to 'ensure the safety of your children,' make sure you get involved with those conversations and decisions.

Did you hear that, Barbie?

(I know you're listening too, Siri, Google Now, Alexa, Cortana, and Pepper...)

------------------------------------------------------

XKCD has also commented on dealing with technology run amok.

------------------------------------------------------

I'll periodically update the list below to make civic hackers aware of disturbing/amazing articles about listening machines subtly becoming an integral and accepted part of our lives, and about other ubiquitous technologies invading and controlling them.

Amazon Echo:
Amazon's Echo Speaker: Hey, This Thing Is Remarkably Smart 
Amazon Echo, a.k.a. Alexa 
Code that changes how we (can) act:
Demystifying the algorithm: who designs your life?

*****
