Introduction

Becoming Smarter About Smart Cities

 

Recognising smart city privacy risks and the inherent threat these bring along, it is important to build technical and legislative structures that defend our privacy, says Marin Ivezic.

  • Client

    Cyber-Physical Systems Security Institute

  • Services

    To find ways to use technology for better sustainability within urban areas

  • Technologies

    Internet Of Things (IoT)

  • Dates

    06/07/2018


Description

Our species has moved out of the trees onto the savannah – just to build concrete trees instead, in the form of cities. Although this may be a nutshell view of human history, the movement towards making our living places habitable is taking a new turn. We no longer live just in the physical world; we now also live in the cyber-physical one, and this transforms how we live, work, and play in the form of the ‘smart city’ built on our own data.

 

The concept of the ‘smart city’ is recent. Reportedly, it goes back to a conversation between the Clinton Foundation and Cisco in 2005, in which the tech vendor was asked to find ways to use technology for better sustainability within urban areas – in other words, to build a “connected urban development program”. From this, the modern concept of a smarter place to live and work evolved, and data is at the root of its growth.

 

We Built This City on Data

The smart city is built on the data we generate. These data, however, may have a complicated life cycle that spans hardware, household goods, cars, trucks, cameras, smartphones, fridges and even refuse bins. This is our cyber-physical future – a future that promises improvements in sustainability, economics, and environmental controls. Whatever we touch in our lives and work can be a conduit for data that then becomes part of the wider city data network. The goal is to gather, analyse and apply all these data in ways that produce greater efficiency and make our lives better. Such goals, however, will not be achieved automatically. Use models must be carefully scrutinised to ensure the data gathered do not inadvertently create new dangers.

 

When Good Data Turns Bad

We humans often try to run before we can walk. We plough ahead with technological innovations without considering the negative side-effects that may accompany them. Security of the Internet of Things (IoT) is a case in point. Internet-enablement of devices was rushed to market by most manufacturers, with security treated as an afterthought. A good example was the Samsung Smart Fridge, which, in 2015, was found to be vulnerable to a man-in-the-middle attack. Since then, inherent security flaws in IoT devices have enabled some of the biggest cyberattacks in history, such as the Mirai botnet. And now, vulnerabilities in our Wi-Fi networks, such as the KRACK attack, add fuel to an already stoked fire. Cyber-kinetic risk, where the digital world penetrates our physical one, is the future playground of hackers. New methodologies for finding and mitigating these risks will test security professionals to the limit.

 

In addition to security implications, Internet-enablement and remote data collection pose privacy risks. Privacy concerns have dogged IoT devices built to sit in our homes – and spawned parents’ common nightmare that their children’s toys may secretly be recording conversations. And now, technology expands into our own bodies: in November 2017, the FDA approved pills containing sensors that allow patients’ drug use to be tracked. While comprehensive data are needed to build better cities, those data can also reveal sensitive knowledge about our personal lives.

 

Smarter Living – Smarter Risks?

Advances in technology coupled with population rise and urbanisation challenges have created a perfect storm for the development of the smart city. Using data wisely is a way to manage increasingly difficult urban situations from traffic congestion to waste management.

 

Inherent risks, however, exist in moving to smart living. Let’s look at just three smart city areas that have privacy issues:

 

Autonomous cars, vacuum cleaners, and personal privacy. Cars that understand the environment and our needs can alleviate traffic congestion and make that nightmare journey to work quicker. But these cars also know all your movements. Fancy a quick detour to a less than salubrious locale? Your connected vehicle will know all about it. In the home, Internet-enabled devices such as Amazon Alexa can be turned into listening posts. Even your vacuum cleaner can become a spy: Check Point recently found that the LG SmartThinQ app could be hacked, giving an attacker control of the connected robot vacuum’s camera and the ability to spy on the home. Several US states are adopting laws around the use of autonomous cars, but legislation around the data their use generates is still nascent.

 

Drones, satellites, and surveillance. Back in 2007, Google’s worldwide mapping caused an uproar because of perceived privacy issues. Use of drones for widespread monitoring presents even deeper concerns, because drones are far more efficient, autonomous and inconspicuous than the cars Google used. Drones can serve important purposes in a smart city, including monitoring transportation and pollution. But drones that sweep across our skies have obvious implications for civil liberties such as freedom of assembly. It isn’t far-fetched to imagine drones being used for mass surveillance – a concern raised by the EU’s Article 29 Data Protection Working Party in its opinion on drones. In another example, Singapore plans to track all cars to allow dynamic management of toll charging and parking. That system monitors by satellite rather than by drone, but the implications for tracking individuals are identical.

 

Smart waste management and impersonal privacy – the issue of re-identification. Waste management is a vital area of smart city life as our population explodes into the mid-21st century. Monitoring water, energy, noise, and air can help improve conditions and reduce pollution. The data generated and processed for this can be impersonal and aggregated. Sophisticated re-identification techniques, however, can take what seems like anonymous data back down to the individual level. Harvard University’s Data Privacy Lab has shown just how easy it is to re-identify individuals, even from supposedly anonymised data: researchers demonstrated how publicly available news items about hospitalisations, combined with a ZIP code, could identify an individual. In the smart city, cyber-kinetic attacks, big data, and surveillance will make anonymisation tricky, to say the least. Even our television viewing can be tracked: researchers recently discovered that smart meters that record a home’s electricity consumption can enable third parties to detect what a household watches by precisely measuring power draw.
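The linkage technique behind such re-identification is simple to sketch. The records and names below are entirely invented for illustration; real attacks work the same way at scale, joining an “anonymised” dataset with public records on quasi-identifiers such as ZIP code, birth date, and sex:

```python
# Sketch of a "linkage attack": names have been stripped from the first
# dataset, but the quasi-identifiers left behind are enough to join it
# with a public record set. All data here is fabricated for illustration.

# De-identified records: names removed, quasi-identifiers retained.
deidentified = [
    {"zip": "02138", "dob": "1954-07-29", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "dob": "1982-03-14", "sex": "M", "diagnosis": "asthma"},
]

# Public records (e.g. a voter roll) that still include names.
public = [
    {"name": "Alice Example", "zip": "02138", "dob": "1954-07-29", "sex": "F"},
    {"name": "Bob Example",   "zip": "02139", "dob": "1982-03-14", "sex": "M"},
    {"name": "Carol Example", "zip": "02140", "dob": "1990-01-01", "sex": "F"},
]

def quasi_id(record):
    """The (zip, dob, sex) triple acts as a quasi-identifier."""
    return (record["zip"], record["dob"], record["sex"])

# Index the public records by quasi-identifier, then join.
index = {quasi_id(p): p["name"] for p in public}
reidentified = [
    {"name": index[quasi_id(r)], "diagnosis": r["diagnosis"]}
    for r in deidentified
    if quasi_id(r) in index
]

for row in reidentified:
    print(row["name"], "->", row["diagnosis"])
```

With only three quasi-identifying fields, every “anonymous” record in this toy set resolves to a named person – which is why simply deleting names from smart-city datasets is not, by itself, anonymisation.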

 

As we generate more data, and as we combine and analyse these data, privacy concerns increase. These data can provide valuable insights to help us manage the complex nature of our growing population and urbanisation needs. We should not, and cannot, allow this pressing need for efficiency to undermine our privacy, though. Privacy is sacrosanct and, once lost, hard, if not impossible, to reclaim. As individuals, as organisations, as government bodies, we all need to understand and appreciate what data privacy means. We can make smarter cities, but, in doing so, we must also build in both technical and legislative structures that defend our privacy. Our next big step in human history needs to land on solid ground.

(Author’s disclaimer: Postings on this site are my own and don't necessarily represent PwC positions, strategies or opinions.)

Author

Marin Ivezic

Marin Ivezic is a PwC Partner for Cybersecurity and Cyber-Kinetic Security & Resilience, and Chair of the Cyber-Physical Systems Security Institute. He specialises in preventing and defeating threats through which cyberattacks could turn people’s physical lives upside-down. His work, says Marin, revolves – in the best cases – around identifying risks and preventing cyberattacks, and – in the worst cases – around undoing the damage when businesses fail to recognise vulnerabilities before someone exploits them.
