Published: April 14, 2020

Authors of this Market Insight report: Paige Bartley, Matt Aslett, Csilla Zsigri, and Jeremy Korn

Introduction

Necessity is the mother of invention, so it is not surprising that in periods of global tension and adversity, technological progress intensifies. The Allied code-breaking advances closely associated with Alan Turing and Bletchley Park research in the UK during WWII saved countless lives and set the stage for momentous global advances in computing throughout the twentieth century. Following the World Trade Center attacks in 2001, the US government began to formalize and consolidate public data collection and analysis, however controversial.

But there can be a darker side to these advances. The delicate balance of civil liberties in democratic societies can easily be disrupted by global turbulence. This grows truer as data becomes the lifeblood of economies and governmental functions; with citizens potentially objectified as sources of data, the drive to collect that personal information at any cost can become an issue of business survival for the private sector and an issue of control for the public sector. In times of uncertainty, common global threats historically have often been – and likely will continue to be – justification to amass and analyze ever more personal data.

The 451 Take

There's no motivation to quietly forgo personal freedoms quite like the urgency of social galvanization against a common cause. When citizens are rallied against a defined and shared enemy – whether that be ideological, biological or otherwise – they typically have a higher tolerance for infringement of personal rights and liberties.

More insidiously, times of national and global duress can be used opportunistically by both the public and private sectors to impose policies and practices that would otherwise be unpopular with the public, given the relative smokescreen of distraction afforded by a major crisis. In short, with many global citizens more concerned about where their next monthly rent payment or roll of toilet paper may come from, it is much easier to pass legislation and allow practices that normally would face harsher public scrutiny. The world is rallying around data collection and analysis as a means to understand the pandemic and allocate appropriate resources in the face of the novel coronavirus; however, in doing so, caution must be exercised. Without scrutiny and care, data privacy rights that have been painstakingly slow to forge in the modern era could be undermined.

 

One Step Forward, Potentially Two Steps Back

Since roughly 2016, global data privacy rights have been coalescing into their modern form. In almost all cases, this development was overdue and required consideration of factors such as contemporary cloud and hybrid data management. Simply put, many of the initial data privacy laws, often created in the 1980s, fail to provide adequate guidance today because they never foresaw a future in which compute is distinct from storage. Therefore, it is not surprising that modern data privacy policy seeks to set personal data protection rules that are appropriate for the virtualized, cloud era.

The proverbial headliners of this movement include the General Data Protection Regulation (GDPR), which went into effect in 2018 in the EU, and the California Consumer Privacy Act (CCPA), which went into effect in 2020 in the US. We have examined some of the fundamental differences between the two regulations, but what they have in common is more important. As nations around the world look to modernize data protection and data privacy laws, common principles emerge: transparency of personal data use, limited scope of data processing, data minimization policies and the right of individuals to update or remediate incorrect information.

Today, data privacy issues are frequently reported by enterprise organizations as a leading source of tension and concern. In 451 Research's Voice of the Enterprise: Data & Analytics, 2H 2019 survey, 24% of respondents report 'data privacy concerns' as a barrier to becoming more data-driven as an organization, a barrier second only to budget. Meanwhile, according to 451 Research's Voice of the Connected User Landscape, Consumer Spending, December 2019 (Leading Indicator) survey, 94% of consumers with an opinion on the matter are either 'somewhat' or 'very' concerned about whether the companies they do business with adequately protect their personal data. Organizations are eager to leverage data to its fullest potential but are wary of violating customer expectations or undermining the customer experience. There is often inherent unease as businesses try to realign their strategy with proliferating privacy requirements and consumer expectations.

Amid this private sector tension, organizations are looking for release valves: opportunities to increase data leverage rather than moderate it. A pandemic presenting immense economic stressors provides the motive for organizations to do things with data that they might not otherwise do, sometimes at the cost of privacy. In the public sector, a dim view of governmental motives might suggest that crises are the perfect opportunities to consolidate control and exploit data from a distressed and confused populace. In both the public and private sectors, these ingredients potentially form a recipe for undermining – or at least slowing the progress of – personal privacy rights.

Public Sector: Motives Under Pressure

The overarching motive of governments during a public health crisis should be, and typically is, to minimize mortality and morbidity from a common biological threat to citizens. In the modern era, there is more desire for accurate data collection and analysis in this regard than ever before. Governments want to understand the geographic movement of individuals, the progression of symptoms and rates of contagion. But using data to understand a public health threat is typically just the means to an end: controlling the negative effects. Controlling the negative effects of a public health threat often requires changing or restricting individual behavior. In democratic societies, there can be inherent friction with expectations of personal liberties. Many nations have mechanisms for expanding and consolidating the powers of heads of state in times of emergency, which increases this friction.

In the EU's GDPR, a benchmark was set for privacy expectations regarding the use of sensitive categories of personal data, such as biometric data, health-related data and genetic data. The blanket rule is that processing of these data types is prohibited by default unless special exemption criteria are met. One such exemption, set out in Article 9(2)(i), is for 'reasons of public interest in the area of public health,' which lowers the bar to data processing. In this sense, governments are allowed to work with protected data, so long as that processing can be reasonably justified in the name of the common public interest: a potential slippery slope. The concept of 'scope creep' can expand even the most well-intended laws and government practices beyond their original intent over time.

In the EU, the European Data Protection Board issued guidance on March 19 clarifying personal data use related to the novel coronavirus outbreak and spread in the EU, a measure intended to help prevent the scope creep that might occur via aggressive interpretation of existing laws. However, the guidance necessarily defers to the laws of individual member states in many cases regarding specific personal rights. Contemporary legislative examples suggest national law regarding privacy can be relatively quick to change in times of national distress.

Perhaps the most critical observation of government actions during times like these comes from the trend to use periods of public duress to implement policy that will extend long past the expected period of turbulence. One notable example was the implementation of the USA PATRIOT Act in fall 2001 following the 9/11 terrorism attacks, which expanded and formalized government powers to collect and analyze data from the public. While many parts of the law were scheduled for sunset in late 2005, over the last 15 years administrations led by both major political parties in the US have acted to expand and extend provisions. Today, the questioned efficacy of the USA PATRIOT Act and related policy is a point of criticism in the privacy and civil liberties communities.

There is also a phenomenon in which times of public distress are used to advance policy that might otherwise be more publicly divisive, or at least more visible. One such example currently in the US is a bipartisan bill introduced to the Senate in March 2020 – the EARN IT Act – which by most contemporary interpretations would likely weaken online encryption, further eroding general public mechanisms for privacy. Positioned as a law to protect children from online exploitation, the bill is receiving little widespread attention at a time when most citizens are fully preoccupied with their own everyday health and basic necessities and media coverage is focused on COVID-19. Washington state additionally passed a facial recognition technology law – allowing restricted use – at the end of March 2020 that otherwise may have faced more public scrutiny.


Private Sector: Measures to Persevere

In the private sector, it is already established that well-managed and well-leveraged data is a principal competitive advantage regardless of industry. Particularly for organizations operating in the B2C space, consumers' personal data is a valuable commodity to amass and analyze at scale. At a time when many businesses are seeing drop-offs in consumer engagement and demand, there is an underlying desire to maximize and sustain data collection, as well as preserve basic business functions such as employee productivity, to ensure that the lights can be kept on until economic recovery. In a recent report, we examined the potential for the current public health crisis to widen the gap between the enterprise 'haves' and 'have-nots' with regard to data and analytics capabilities. For a company caught in survival mode, data 'hoarding' might seem like an attractive coping mechanism.

In 451 Research's Voice of the Enterprise, Digital Pulse: Coronavirus Flash Survey, a total of 63.2% of organizational respondents report they are already experiencing, or expect to experience, a loss or reduction of customer demand due to ongoing coronavirus circumstances. Meanwhile, 63.4% of respondents report that their organization is currently experiencing – or expects to experience – a reduction in employee productivity. Together, these pressures create a nexus leading to potentially painful corporate outcomes. This survey, administered in early March, likely underestimates the true magnitude of these effects because the virus has continued to spread and intensify in the US.

In short, many businesses are taking aggressive cost-cutting measures while also feeling the pressure to increase control and consolidation of all available data resources: customer and employee data alike. Concerns over lost employee productivity are especially prevalent as many businesses transition to a full or mostly remote-work model. In fact, many organizations expect the shift to remote work will persist in the long term; when asked in the same Voice of the Enterprise, Digital Pulse: Coronavirus Flash Survey, 37.8% of respondents replied that expanded/universal work-from-home policies would likely be a long-term or permanent outcome of changes in policy related to coronavirus.

With a more remote workforce, businesses seek to keep a closer eye on employees, ostensibly to ensure compliance with industry regulations, but also to ensure levels of productivity when individuals cannot be monitored in person. While existing research suggests that remote workers may be just as productive as their office-based peers under typical circumstances, or perhaps even more so, many organizations are understandably leery in a time of uncertainty. So interest in employee monitoring technology, such as remote webcam monitoring on company-issued devices, has risen. In the US, unlike under the EU's GDPR, employees are afforded few federal workplace protections for data privacy: a key point of potential friction when the employee workplace is increasingly the home.

Well-intended data collection efforts in frontline industries, too, may have the potential for future losses of privacy if not carefully approached. A leading healthcare system with hospitals in NYC recently launched a smartphone app to collect data on personal COVID-19 symptoms, demographics and exposure history. In the official press release for the app, no explicit mention was made of HIPAA compliance for the data, or of how exactly the data might be anonymized and securely shared for research by public and private bodies. This is not to say this particular app doesn't have such data protections; however, in a time of public fear and confusion, those protections – when they exist – are often not communicated. A health-related app without adequate data protections under current circumstances would have the potential to be particularly harmful to individuals and would create opportunities for abuse.


Perhaps Not Doomsday, Today

Not all data privacy examples arising from the novel coronavirus outbreak are stories of exploitation or nefarious intent. Harkening back to other turbulent eras such as WWII, pressure can help hasten some progress. Efforts such as MIT's Private Kit app seek to help individuals understand their COVID-19 risk exposure without personal location data leaving the individual's smartphone unless explicit consent is given, in which case the data is anonymized. This emphasis on privacy by design still depends on individuals truthfully reporting their own COVID-19 status for the eventual benefit of others, illustrating the delicate balance that societies face with data use and personal freedoms.
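As a concrete illustration of this pattern, the following hypothetical sketch (not drawn from the actual Private Kit code) shows a consent-gated, on-device location diary: the trail never leaves the device by default, and any export is explicitly opted into and coarsened to reduce re-identification risk.

```python
# Hypothetical illustration (not MIT's actual Private Kit code) of the
# privacy-by-design pattern described above: the location trail is stored only
# on the device and is shared only with explicit consent, in anonymized,
# coarsened form.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class LocationPoint:
    lat: float
    lon: float
    timestamp: int  # Unix seconds


class LocalLocationDiary:
    def __init__(self) -> None:
        self._trail: List[LocationPoint] = []  # never leaves the device by default

    def record(self, point: LocationPoint) -> None:
        self._trail.append(point)

    def export_for_health_research(self, user_consents: bool) -> Optional[list]:
        """Return an anonymized trail only if the user explicitly opts in."""
        if not user_consents:
            return None
        # Coarsen coordinates and timestamps so the shared trail is harder to
        # re-identify; no name, device ID or account ID is ever attached.
        return [
            {
                "lat": round(p.lat, 3),              # roughly 100 m precision
                "lon": round(p.lon, 3),
                "hour_bucket": p.timestamp // 3600,  # hour-level granularity
            }
            for p in self._trail
        ]
```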

Even some governmental initiatives to track and monitor the spread of coronavirus have attempted to emphasize a privacy-oriented approach to data leverage, as is the case with the TraceTogether app developed in collaboration with the Government Technology Agency of Singapore. The app's BlueTrace protocol combines both centralized and decentralized models for contact tracing and relies on Bluetooth proximity records of participating devices, stored on the device itself, rather than GPS location data. App users who subsequently test positive for COVID-19 then have the option to share their data with Singapore's Ministry of Health. Perhaps the biggest remaining privacy concern was the decision to make the technology supporting the app freely available to developers across the world, who may not implement the same safeguards.
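The underlying pattern can likewise be sketched in code. The following simplified, hypothetical example – again, not the actual BlueTrace implementation – shows devices exchanging rotating pseudonymous identifiers over Bluetooth, logging encounters locally, and releasing the log to a health authority only after a positive test and with explicit consent.

```python
# Hypothetical sketch of the general contact-tracing pattern described above
# (not the actual BlueTrace implementation): devices broadcast short-lived
# pseudonymous IDs over Bluetooth, log nearby encounters locally, and upload
# the log to the health authority only after a positive test and explicit consent.

import secrets
import time
from typing import List, Optional


class ContactTracingClient:
    def __init__(self) -> None:
        self.encounter_log: List[dict] = []  # stays on the device
        self._current_temp_id: Optional[str] = None
        self._temp_id_expiry: float = 0.0

    def broadcast_id(self) -> str:
        """Return a short-lived pseudonymous ID to advertise over Bluetooth."""
        now = time.time()
        if self._current_temp_id is None or now >= self._temp_id_expiry:
            self._current_temp_id = secrets.token_hex(16)  # rotates regularly
            self._temp_id_expiry = now + 15 * 60           # e.g., every 15 minutes
        return self._current_temp_id

    def record_encounter(self, observed_temp_id: str, signal_strength: int) -> None:
        """Log a nearby device's temporary ID locally; no location is captured."""
        self.encounter_log.append({
            "temp_id": observed_temp_id,
            "signal_strength": signal_strength,
            "timestamp": time.time(),
        })

    def share_with_health_authority(self, tested_positive: bool,
                                    user_consents: bool) -> Optional[List[dict]]:
        """Release the local log only on a positive test AND explicit consent."""
        if tested_positive and user_consents:
            return list(self.encounter_log)
        return None
```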

Customers and employees alike are wary of perceived privacy violations. Now is just a difficult time for individuals to act, given basic personal stressors such as job and resource instability. Businesses that have the strategic resources to do so will likely succeed by focusing not on data hoarding but on broader data governance strategy: strategy that will help them achieve proactive initiatives such as analytics as well as more reactive ones such as data privacy compliance.


Jeremy Korn
Associate Analyst

Jeremy Korn is an Associate Analyst for the Data, AI & Analytics Channel at 451 Research, where he covers artificial intelligence and machine learning in the enterprise. In particular, he focuses on the legal and ethical challenges raised by these emerging technologies. In addition, Jeremy helps lead the Voice of the Enterprise: AI and Machine Learning survey, which provides qualitative insights into AI adoption, use cases and infrastructure.

