ARE WE IMPROVING OUR CITIES AT THE EXPENSE OF PRIVACY?

Big Data is already changing the way cities think about infrastructure, congestion, regulation, and planning. For the most part, we laud the changes ahead because they hold the promise of making our lives better. But has everyone stopped to consider the threats to privacy?

After crunching the numbers on London’s notorious traffic problem, data scientists were surprised by what they found. Worsening congestion wasn’t due to more people driving their cars around the city. In fact, vehicle count was down!

The uptick in congestion was actually due to two unexpected factors: more delivery vans and increased roadworks. It turns out the rise in online shopping has caused a corresponding rise in offline city traffic, as delivery vehicles crowd the streets.

From that insight, planners were able to make suggestions to the city’s mayor: “What’s needed is for the new mayor to ease off excessive roadworks, build new river crossings, devise a plan for managing freight and revisit measures to control congestion, including charging.”

It’s improvements like this that drive the vision for a better future that’s made possible by Big Data. Making our cities better places to live is about as immediate a cause as any, and it touches the lives of the majority, no matter where you are: London, Calgary, Los Angeles, or Bombay.

The Possibilities for Better Urban Living Are Exciting to Imagine

All this comes as exciting news because, despite decades of studying urban congestion, scientists have not arrived at workable solutions for improvement. Carpooling, HOV lanes, and congestion pricing have all fallen short of the goal of easing delays caused by too much traffic.

Meanwhile, cities are less and less livable because of traffic congestion.

But with real-time information available through GPS on drivers’ devices, plus other IoT sources of Big Data, scientists can apply machine learning to build on-demand re-routing tools that make traffic move faster. None of this was feasible in the past: making snap decisions on raw, real-time data simply wasn’t possible without the AI computing power and IoT data sources we now have.
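The on-demand re-routing idea can be sketched with a toy example: treat the road network as a weighted graph whose edge weights are current travel times, and recompute the shortest path (here with Dijkstra’s algorithm) whenever a live congestion feed changes a weight. The road names and travel times below are entirely hypothetical, and real systems use far richer traffic models; this is only a minimal illustration of the principle.

```python
import heapq

def shortest_route(graph, start, end):
    """Dijkstra's algorithm over a graph whose edge weights are
    current travel times in minutes. Returns (total_time, path)."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        time, node, path = heapq.heappop(queue)
        if node == end:
            return time, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, minutes in graph.get(node, {}).items():
            if neighbour not in seen:
                heapq.heappush(queue, (time + minutes, neighbour, path + [neighbour]))
    return float("inf"), []

# Hypothetical road network: free-flow travel times in minutes.
roads = {
    "A": {"B": 5, "C": 9},
    "B": {"D": 4},
    "C": {"D": 2},
    "D": {},
}

print(shortest_route(roads, "A", "D"))  # fastest route under free-flow: A-B-D

# A live feed reports roadworks on the B->D link; re-route on the fly.
roads["B"]["D"] = 20
print(shortest_route(roads, "A", "D"))  # the detour A-C-D is now faster
```

The point is that the route recommendation changes the moment the data does, which is exactly what static planning tools could never do.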

Nature recently published an exhaustive and fascinating study suggesting that artificial intelligence applied to the world’s traffic congestion could increase the benefits of car ownership(2). The study claimed that small changes in the routes drivers chose could reduce congestion-related delays by as much as 30 percent.

These are the types of improvements the general public thinks about when they envision how Big Data and AI can improve our cities and, by extension, our lives.

But Where’s All the Data Coming From?

But sometimes we forget that the improvements we crave could come at the expense of privacy. Easing city congestion gets a yes-vote from everyone, but how about when the solution requires you to fork over personal data?

Such improvements are made possible only when artificial intelligence (AI) is applied to the vast stores of data that exist on everything from city surveillance cameras to the GPS-enabled smartphones of private citizens.

The darker side of our rosy AI-enabled future reveals serious ethical concerns over collecting all that data, especially when it’s collected and stored by governmental agencies like city planning boards and local governments. These concerns include:

  • how the data is collected (is there consent?)
  • the security of the data once it’s collected
  • function creep of the data

Let’s Look at Function Creep

The ethics of the collection of personal data and how it’s stored and secured is a common topic, especially with recent and upcoming privacy legislation both here in Canada and abroad. But function creep isn’t always front-of-mind because its ramifications seem further down the line from the present need for security and privacy.

But function creep is a huge privacy concern: it occurs when data is collected for one stated purpose and then used for another. Surveillance cameras, for example, are said to be installed for safety and security, and police officers can tap into them during crime investigations to find clues.

But if footage happens to catch private citizens’ private, legal behavior (which it does, of course) and it’s used against them, that’s function creep. What if, when you’re smoking outside your office building, your insurance company acquires that data and uses it to justify a hike in your insurance premiums?

Data Can be Unknowingly Collected & Shared Inappropriately

There may well be pure intentions behind the collection of data from private citizens to ease city traffic congestion. Cities may not have ulterior motives and may even apply top-notch security measures to their data processes. But other entities may benefit from accessing this data, and they are apt to find ways to get it. Chances are, they’ll be using that data for something other than what consent was given for.

It’s unrealistic to think we can roll back time and put a halt to the collection and use of personal data. The possible advantages are spectacular to imagine (less congestion, cleaner cities, better quality of life). And this is just within the realm of city planning. There are other Big Data environments where AI stands to make tremendous improvements: digitalized medical records, for example.

Most people by now are aware of what’s possible and many are already willingly giving up their personal data via their smartphones, just for small conveniences like mobile banking, better navigation, and staying in touch on social media.

But we have to be realistic and keep in mind that hackers exist, and unintentional leaks happen even when there’s no culprit lurking in the wings to snatch up valuable personal data. Function creep occurs, and abuse of data happens as well. City governments that aim to benefit their citizens through the use of Big Data must consider close regulation, seek informed consent, and initiate legislation that would punish those who violate it.

This article was originally published on irishtechnews.ie, where it can be viewed in full.
