Sunday, July 12, 2015

Yin & Yang Of Civic Hacking

One concept that t’ai chi and mindfulness meditation reinforced deeply in me over the past two years is that of yin and yang. Life requires balance, and nothing is all good or all bad.

Civic hacking is not all good.

If civic hackers promise you a rose garden, remember that rose bushes have thorns.

Roses look and smell wonderful, but they also require the right soil, sunlight, nutrition, pest control and TLC. Roses don’t bloom 365 days per year, and not all roses have a delightful fragrance.

(I personally would only buy roses that have a strong and distinct fragrance, one that makes you want to periodically bring your face near the blossom, close your eyes, and inhale slowly through your nose to enjoy its deeply evocative scent...)

In the past couple months, I’ve written posts about all the benefits and good karma that civic hacking can bring us. In today’s post, I’ll cover downsides that should be considered by those doing the hacking and by the non-hackers.

To make sure you DON’T get the impression I’m trying to convince you civic hacking and open data are bad things, watch this TEDx video:

[Embedded TEDx video]
The TEDx video above shows how open data and civic hacking can clearly be good. But for a look at the other side of the coin, in “Open data -- the dark side,” Alan Patrick says:
“History shows us that in the early days of any new online technology’s life, over-optimism about benefits is always rife. History also shows us that the Dark Side is nearly always underestimated. My aim today is to show that the Dark Side of Open Data is real, serious, and under-estimated - and could cause a major backlash.”
What these two viewpoints tell us is that, like most things in life, there is more than one way to look at and judge civic hacking.

Civic hacking can be bad, or can seem to be bad, for a variety of reasons:

  • Different Criteria For Determining Good vs Bad
  • Unintended Consequences
  • Wasted Time For Civic Hackers And Government Data Providers
  • Real Harm To The Civic Hackers
  • Intentional Abuse

Different Criteria For Determining Good vs Bad

A prime example of different criteria is the issue of transparency about potentially-controversial government actions. Civic hackers are generally pro-transparency, while elected officials and government administrators will usually prefer to not publicize any information regarding government activities that are or might be controversial. And the government viewpoint is not always ‘wrong,’ because 100% transparency is not always good or helpful. I certainly don’t want 100% transparency for my every thought, word and action.

But in the big picture, because governments make and enforce rules that affect the lives of their citizens, government work and interactions should be held to a higher standard than the work and interactions of private citizens and people in non-governmental organizations.

Another example of good/bad criteria is short term vs long term. The civic hacker may have a long term goal in mind when trying to improve a situation. However, improving a situation requires change, and people are resistant to change. If the initial result of the civic hacking change is uncomfortable for the government people involved, they may not be in favor of that change. The current focus on police-community relationships in places like Ferguson and Baltimore is definitely uncomfortable for law enforcement people in those cities and in many other cities. But having civic hackers create meaningful visualizations of criminal justice data might highlight a law enforcement issue that most reasonable people will agree needs to be changed, in spite of the discomfort caused by that change.

Putting low-quality data in open data sets falls partly into this ‘different criteria’ aspect of judging whether you think something is good or bad. When civic hackers, government workers or the general public react to and make decisions based on low-quality data, they often make wrong decisions. If nobody figures out that the data is low-quality, having that data open is a bad thing. Most of the time, someone will dislike that bad thing and try to figure out why the wrong decision was made. When they figure out the reason was low-quality data, the long term result will be high-quality data, with which good decisions can be made. So even though the civic hack initially seemed to have bad results, it cleaned up low-quality data and improved the overall situation.

The other lesson regarding the possibility of low-quality open data is that data quality needs to be baked into the open data process. The potential for low-quality data is never a sufficient reason not to make a data set open.
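To make “baking quality in” a little more concrete, here’s a minimal sketch of the kind of sanity check a data publisher or civic hacker could run before releasing or using a data set. The column names, sample data and thresholds are all invented for illustration; this is not any real open data portal’s format.

```python
import csv
import io

# Hypothetical open-data rows; columns and values are invented for this example.
SAMPLE = """facility,lat,lon,inspection_date
City Hall,44.26,-88.41,2015-06-01
,44.27,-88.40,2015-06-02
Library,91.00,-88.39,2015-06-03
"""

def quality_report(text):
    """Return (row_number, problem) pairs for two basic checks:
    required fields are not blank, and coordinates are in valid ranges."""
    problems = []
    for i, row in enumerate(csv.DictReader(io.StringIO(text)), start=1):
        if not row["facility"].strip():
            problems.append((i, "missing facility name"))
        lat, lon = float(row["lat"]), float(row["lon"])
        if not (-90 <= lat <= 90) or not (-180 <= lon <= 180):
            problems.append((i, "coordinates out of range"))
    return problems

# Row 2 has a blank facility name; row 3 has an impossible latitude.
print(quality_report(SAMPLE))
```

Even a crude report like this, run automatically when a data set is updated, catches the kind of low-quality data that otherwise leads to wrong decisions downstream.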

Unintended Consequences

In my mind, unintended consequences are why some civic hacks will turn out badly. Unintended consequences are also why we have many of the cybersecurity problems with digital systems and the Internet. The people who designed, built and expanded computing systems and the Internet didn’t plan for ill-intentioned people to abuse those systems the way some humans will always abuse whatever they can. Power corrupts; absolute power corrupts absolutely.

But we can’t let the fact that everything new, especially complex technology, has unintended consequences stop us from creating new things, investigating the wonders of our planet and the universe, or exploring how civic hacking can improve our lives and our cities. We just need to plan ahead for negative consequences of new things, like civic hacking. We need to have an open dialogue about the good and the bad aspects of open data and civic hacks. And above all, we need to learn lessons from the distant and recent history of computing and the Internet. Excellent cybersecurity needs to be the foundation of civic hacks (which means we need to understand what excellent cybersecurity is and we need to have civic hackers with the ability to build security into civic hacks).

The developers of baby monitors envisioned happier parents and safer babies when they designed their products and wrote the software for them. They didn’t envision these monitors’ cameras broadcasting videos of their babies to everyone on the Internet after the monitors had “...their cameras hacked into. One stream shows a baby in Virginia, another fast asleep in Utah, and a third in her crib in Florida. Not only does the website show these unfiltered images, they also provide the exact coordinates of the location, complete with links to a map.”

The Google employees who created Google Earth and Google Maps likely imagined their new technological wonder would be used for planning vacation trips, learning about parts of the world one would never be able to travel to, and for pizza delivery drivers to figure out how to deliver your order while it’s still piping hot. The Googlers didn’t imagine terrorists using their products to plan deadly attacks on civilians in hotels, theaters and colleges in Mumbai, India, or on UK troops in Iraq.

So when civic hackers are working on transportation hacks and governments or transit agencies are making real-time GPS data for buses available as open data, we need to be thinking about security and potential negative consequences. At the same time, civic hackers need to understand that Google Maps is unlikely to be shut down to prevent terrorists from using it to perpetrate unimaginable horrors. There needs to be both local and national discussion about whether open data on real-time bus locations is a good thing, and people involved in the discussions need to understand that different people will have widely-differing opinions. Our civic hack designs need to account for as many of the potential abuses as possible before the hacks are released. We also need to monitor for abuse of those civic hacks and react swiftly and intelligently when they are abused.
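One mitigation sometimes discussed for real-time location feeds is to coarsen the coordinates and delay publication, so the open data is still useful for rider apps and analysis but much less useful for tracking a specific vehicle live. Here’s a minimal Python sketch of that idea; the function names, parameters and numbers are all hypothetical, not any transit agency’s actual API:

```python
import time

def coarsen_position(lat, lon, precision=3):
    """Round coordinates so the published location is only accurate to
    roughly a city block (~100 m at 3 decimal places)."""
    return round(lat, precision), round(lon, precision)

def delayed_feed(readings, delay_seconds=60, now=None):
    """Release only bus positions that are at least delay_seconds old,
    so the public feed can't be used to follow a vehicle in real time.
    `readings` is a list of (timestamp, lat, lon) tuples."""
    now = time.time() if now is None else now
    released = []
    for ts, lat, lon in readings:
        if now - ts >= delay_seconds:
            released.append((ts,) + coarsen_position(lat, lon))
    return released
```

Whether a 60-second delay or block-level rounding is the right trade-off is exactly the kind of question the local and national discussions mentioned above need to settle; the code only shows that the knob exists.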

For more on unintended consequences of technology, read "What Technology Wants" by Kevin Kelly.

One negative unintended consequence, Intentional Abuse, is so impactful that it is broken out and discussed separately below.

Wasted Time For Civic Hackers And Government Data Providers

Some people think civic hacking is bad and that civic hackers are wasting their time because they believe:
  1. Civic hackers shouldn’t be giving away their valuable time trying to fix the government’s problems
  2. Too many civic hacks are not widely used by citizens or other intended audiences
  3. Too many civic hacks are never completed or aren’t maintained and improved over the long term.
In circumstances where they agree with reasons 2 and 3 above, government data providers may feel they’re wasting time and money providing open data for civic hacking.

Real Harm To The Civic Hackers

Civic hackers are treading on dangerous ground when they work with government data. They have only the best of intentions and are working to improve life for themselves and their fellow citizens. But they run the real risk of serious jail time, financial loss and worse.

Civic hackers are subject to these risks because of vaguely written laws concerning digital data and computer use, because prosecutors abuse those vaguely written laws, because politicians, bureaucrats and law enforcement often have an incomplete understanding of the technology, and because of the difficulty of dealing effectively with cybercriminals who are trying to be destructive or to steal money or information.

Compounding problems with computer use laws that have been around for years, recently-passed cybersecurity legislation could:
“...affect hackers who expose flaws in software that are used to exploit systems, such as the release over the past weeks by Google researchers of holes in Microsoft software that were unpatched...someone driving a person to a coffee house where they committed an illegal hack could be considered a member of organized crime…[the new cybersecurity legislation could be used to charge you with a federal felony for] staying on a computer in a public library for 31 minutes if the terms of use say they can be used for a maximum of half an hour at a time...”
The above dangers to well-intentioned hackers are real; they just haven’t been put to the test in court yet, so I can’t link you to actual cases where the new laws were misused by prosecutors. But below are three cases of real-life overzealous prosecution under vaguely-written computer abuse laws.
“Jeremy Rubin, a 19-year-old MIT student in Massachusetts, developed a computer program called Tidbit with some classmates as part of the Node Knockout Hackathon in November 2013. Tidbit allows users to mine for Bitcoins on a client's computer as a replacement for traditional advertising. Tidbit was presented as a proof of concept and won the award for having the highest innovation score at the hackathon. In December 2013, the New Jersey Attorney General's office issued a sweeping subpoena to Rubin and Tidbit, seeking Tidbit's source code, documents and narrative responses about how Tidbit worked, which websites it was installed on and the Bitcoin accounts and wallet addresses associated with Tidbit...”
“Aaron Hillel Swartz...was an American computer programmer, entrepreneur, writer, political organizer and Internet hacktivist who was involved in the development of...the organization Creative Commons...and the social news site, Reddit...Swartz's work also focused on civic awareness and activism. He helped launch the Progressive Change Campaign Committee in 2009 to learn more about effective online activism. In 2010, he became a research fellow at Harvard University's Safra Research Lab on Institutional Corruption, directed by Lawrence Lessig...On January 6, 2011, Swartz was arrested by MIT police on state breaking-and-entering charges, after surreptitiously installing a computer in an Institute closet which he set to systematically download academic journal articles from JSTOR. Federal prosecutors later charged him with two counts of wire fraud and 11 violations of the Computer Fraud and Abuse Act, carrying a cumulative maximum penalty of $1 million in fines, 35 years in prison, asset forfeiture, restitution, and supervised release. Swartz declined a plea bargain under which he would have served six months in federal prison. Two days after the prosecution rejected a counter-offer by Swartz, he was found dead in his Brooklyn apartment, where he had hanged himself...”
Another person prosecuted by the US government for violating computer abuse laws is cynical about the potential pitfalls of civic hacking, as quoted in “When the government approves of hacking”:
“...So, the government is providing public APIs [application programming interface] for civic hacking day. I queried a public API once, AT&T even admitted that it was ‘published’ data. The government asserted that after the fact they can declare a given access to data anyone makes public ‘unauthorized’ and have you thrown in prison...If they don’t like the app you make from the text they publish, they reserve the right to imprison you for five years per count,” he added. Like Swartz, Auerheimer...used published information in a way the government deemed illegal...”
If you’re a civic hacker in NE Wisconsin, you’re unlikely to end up in the same situation as the above three people. But some of the laws they were prosecuted under could be used to prosecute you if the government chose to do so.

If we want people to be involved with civic hacking and to be safe from misguided or malicious prosecution, the computer abuse laws need to be revised in light of the beneficial work of cybersecurity researchers and civic hackers.

Intentional Abuse Of Civic Hacks And Open Data

Intentional abuse is probably the negative aspect of civic hacking that most government workers will cite as the reason not to make data open and not to encourage or allow civic hacking. Intentional abuse is also a consequence that most civic hackers don’t think deeply enough about, for two reasons. The first is that civic hackers are focused on using their work to solve problems and make good things happen. They probably don’t have extensive experience with using technology to accomplish bad or illegal things, and probably don’t have a lot of friends or acquaintances who live in the darknet culture.

[Image: NE Wisconsin Cybersecurity Initiative]
The second reason civic hackers don’t think deeply enough about intentional abuse and how to prevent or minimize it is that they aren’t extensively trained in cybersecurity. Most civic hackers can’t write extremely secure code because they don’t know how, especially if their code is complex, provides a high level of convenience, or interacts with insecure code from other people. So civic hacks need to involve people well trained in cybersecurity, as well as people who know the history of cybersecurity incidents and can imagine how civic hacks and open data might be abused.

Cybersecurity, as far as I can tell, has not been a high priority for the civic hacking community in the US. As I suggested in “Cybersecurity: A New Horizon For Civic Hacking?,” I think it’s time for that to change. (And YOU can help create that change…)

If you’ve read this far, you may be surprised I want to be a civic hacker and may wonder why I want others in NE Wisconsin to be civic hackers. You’re almost certainly wondering why anyone else would want to be a civic hacker.

I believe that, when civic hacking is done well, its benefits far outweigh the negatives.

Many people die or are badly injured in auto accidents, people spend too much money on cars, and cars cause a lot of pollution, but I don’t wish cars had not been developed.

Cybertheft, child porn, terrorist communications and ubiquitous government panopticonic surveillance of its citizens are enabled by computers and computing technology, but I don’t wish computers had not been developed.

Cars, computers and civic hacking have a dark side and can cause bad things to happen.

But I think cars, computers and civic hacking are worthwhile, and I choose to use them to do good things.

*****
