Tuesday, March 26th, 2019

Navigating the digital landscape beyond data protection regulations

In the past decade, emerging technology and growing participation in online social networking have come to shape how we as a society engage with one another. With Data Privacy Day recently behind us, it’s important to remember that digital privacy, and by extension privacy as a whole, carries different meanings for different people. Contrasting approaches to understanding privacy have resulted in a myriad of data privacy laws across the globe. Most recently, the European Union’s (EU) implementation of the General Data Protection Regulation (GDPR) over the past year, and its impact on enterprises as much as on society at large, has come to redefine the narrative of digital privacy as we know it today.

The commodification of the digital self

Beyond the guidance found in regulation, it is understood that, through the choices we make as individuals, we are ultimately responsible for how our data can be used by companies and services. As societies continue to develop, some more slowly than others, it’s important to recognize that we now reside in an increasingly digitized world. As part of a digitally literate society, we should exercise discretion in our engagement with “free” services such as social networks or online applications in order to mitigate the risks of being hacked or unknowingly surveilled.

Fundamentally, the right to privacy can be understood as an exchange: as we log on, connect, and contribute on third-party systems, we actively engage in a voluntary transaction in which our data is commodified as a product. Consumers should bear in mind that when the product is free, they themselves are the product. As the real world continues to coalesce with the virtual, we come to live more of our lives online, easily forgetting that digital literacy is deliberately learned, not passively acquired. What is the true cost of digital privacy and, most importantly, who has the right to decide that?

A “smart” nation

Becoming digitally literate means exercising our right to privacy online with awareness of how and what we choose to share. Despite their rallying cries for social connectivity, tech companies are companies all the same and should be held accountable for how they collect, use, and disseminate personal data. Governments and institutional bodies are similarly accountable to the public, and while many have drafted policies and issued legislative measures to protect users online, the onus is also on them to equip society with the right skills and knowledge to navigate the digital landscape.

If we look to Singapore, in its aspiration to become a “smart nation” the country has actively created a Digital Literacy and Participation program. Operated by the Info-communications Media Development Authority (IMDA), the all-ages program aims to ensure that the country’s citizens are able to use technology optimally, in order to advance its agenda of nationwide digitization. Having recently announced a model framework for how Artificial Intelligence (AI) can be used ethically and responsibly by businesses in the country, the government has committed to a delicate balance in its approach to regulation, one that encourages rather than stifles innovation. With the first framework to provide sophisticated yet readily implementable guidelines for businesses using AI, Singapore seeks to establish itself as a leader in innovative technology.

With AI, the challenge is being equipped with sufficient data to make accurate, data-driven decisions. Like all forms of technology, there are ethical considerations at play when data is used as the basis of automated decision-making. As such, companies and authorities should be held accountable when data is used, or misused, to draw incorrect conclusions. Coupled with the steady development of decentralized technologies, we can hope to see an even greater standard of transparency in how data is managed, authenticated, and verified, advancing a mentality that prioritizes growth and innovation, but never at the expense of personal reputation and safety.

A state of regulations

Against the backdrop of these digitization efforts, Singapore’s existing legislation on data privacy and protection is centered on the Personal Data Protection Act (PDPA). In the past few years, organizations have been reprimanded for misusing data, and those found in breach of the act can be fined up to S$1 million. Though comprehensive, the PDPA, like all data protection regulations, is very much a work in progress; passed in 2012 and brought into force in 2014, it will take time to see whether the regulations are revised accordingly as the country continues to work towards its goal of becoming a smart nation.

In comparison, if we return to the EU’s GDPR, we see some of today’s most stringent data privacy legislation. Europe may be celebrated for its stance on data protection and has in fact levied penalties on foreign tech giants, such as Facebook, that have failed to comply with its requirements. However, given the scope and reach of GDPR, is protectionist legislation truly the answer? Though GDPR addresses the needs of residents within the EU, it fails to consider that the digital world is devoid of geographical borders. With that in mind, perhaps Singapore, with its proactive creation of a digital literacy scheme, should serve as a model for the EU.

As we continue to occupy both physical and digital spaces, perhaps legislation alone is simply insufficient to address the greater issue of digital literacy, which can shape our understanding and exercise of digital privacy on the world stage.

Associate Professor Keith B. Carter, National University of Singapore School of Computing