The Business Times

Your data, my business: Why data privacy is especially hazardous for startups

Claudia Chong
Published Fri, Jan 24, 2020 · 09:50 PM

HOW do you add contacts to your smartphone? Do you just fill in a name and phone number and call it a day?

It matters, because it could reveal more about you than you suspect. If you go beyond the required fields and also fill in an address, job title, email and birthday, it means you're more likely to be organised or a perfectionist, explains alternative credit scoring firm LenddoEFL.
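How might such a signal be computed? Here is a minimal sketch, assuming the feature is simply the share of optional contact fields filled in - a hypothetical illustration, not LenddoEFL's actual feature definition:

```python
# Hypothetical sketch: score a contact entry's "completeness" as the share of
# optional fields filled in (not LenddoEFL's actual feature definition).
OPTIONAL_FIELDS = ("address", "job_title", "email", "birthday")

def contact_completeness(contact: dict) -> float:
    """Return the fraction of optional fields the user bothered to fill in."""
    filled = sum(1 for field in OPTIONAL_FIELDS if contact.get(field))
    return filled / len(OPTIONAL_FIELDS)

# A user who adds an email on top of the required name and phone scores 0.25.
print(contact_completeness({"name": "Ben", "phone": "91234567", "email": "b@x.co"}))
```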

And what do you think your smartphone calendar might reveal about you? Say, if you have a habit of scheduling activities? Do your events point to you being a social animal, or a workaholic?

All this information - including location data, social media activity, the kinds of mobile apps downloaded, and browsing history - helps determine how likely a person is to repay their debt. Alternative credit scoring for those who lack a credit history is just one example of how even your most innocuous habits are being gathered and analysed to glean insights.

So you are urged to give that data up. Singapore-based LenddoEFL - one of the pioneer startups in the credit scoring space - analyses up to 12,000 variables using its machine learning algorithms to generate a credit score in less than three minutes, according to information on its website. In the past four years, it has assessed five million users across more than 15 countries.

As the world becomes increasingly digitised, companies are amassing large amounts of data and becoming increasingly reliant on it to fuel their businesses. The phrase often uttered is "data is the new oil", thought to have been coined in 2006 by British mathematician Clive Humby.

E-commerce firms rely on data to streamline operations by analysing transactions and shopping behaviour. Healthtech firms are only able to provide personalised care by collecting sensitive information on patients. And with each patient treated, each delivery order fulfilled, and each customer satisfied, many companies grow and refine their machine learning models with the reams of data recorded.

And as the data economy grows, so do discussions around privacy. The issues are problematic for all businesses dealing with data (and who doesn't?), but for startups, they are especially hazardous. Many tech-driven startups - young companies still in the process of scaling up their business - use data to the same extent as big corporates, or more. But are the safeguards in place?

While LenddoEFL has an ambitious goal of achieving financial inclusion for 1 billion people (and profiting off it), it is worth questioning how it collects and handles data.

In October 2018, advocacy group Privacy International published an article highlighting its concerns about the data Lenddo collects through Facebook.

For instance, it can analyse the history of interactions between the user and their social media connections - being wished "happy birthday" on Facebook on the same day every year, by people with whom the user has a close relationship, indicates a high chance that the date of birth provided is real.
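A toy version of that check might look like the sketch below - a hypothetical illustration of the idea, not Lenddo's actual code, assuming a feed of dated posts flagged by closeness of the relationship:

```python
# Hypothetical sketch of the birthday-consistency check described above:
# the claimed birth date looks genuine if close connections wish the user
# "happy birthday" on that same month and day in at least two different years.
from datetime import date

def birthday_looks_real(posts, claimed_month_day):
    """posts: iterable of (post_date, is_close_friend, text) tuples."""
    years_wished = {
        d.year
        for d, close, text in posts
        if close
        and "happy birthday" in text.lower()
        and (d.month, d.day) == claimed_month_day
    }
    return len(years_wished) >= 2

posts = [
    (date(2018, 3, 14), True, "Happy birthday!"),
    (date(2019, 3, 14), True, "happy birthday, old friend"),
]
print(birthday_looks_real(posts, (3, 14)))  # True
```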

This would mean that during the user verification process, Lenddo also collects information on people who have a relationship with that user but might not have consented to their data being collected by Lenddo, argued Privacy International, citing further evidence that supports their concern. The group had sent queries to LenddoEFL about the extent to which they gather and analyse data on people who were not their customers. As of Jan 21, 2019, Privacy International had not received a response. When The Business Times contacted LenddoEFL, management declined to comment.

Now you see me

The importance of data in keeping businesses humming today has also raised questions about whether consumers understand how valuable it actually is. A quick way to get a sense of it? The next time you encounter a website pop-up about Internet cookies, instead of automatically clicking "Accept" or ignoring it, try clicking "Learn more" to view the potentially long list of third-party platforms the site shares data with.

US politicians have proposed a bill requiring companies to tell consumers how they make money off their data and how much it is worth. "The overall lack of transparency and disclosure in this market have made it impossible for users to know what they're giving up," Virginia Senator Mark Warner said in a statement last June.

On the flip side, author and tech entrepreneur Antonio Garcia Martinez dismissed thinking of data in the same way that one regards oil, or suggesting that people should receive "dividends" when companies use their data (as California's governor Gavin Newsom has done). This shows a misunderstanding of how Internet giants like Google operate, said the former Facebook employee in a WIRED article.

One thing's for sure - whether companies like it or not, consumers are becoming more sensitive to the implications of granting access to their data. Singapore-based technology and data lawyer Charmian Aw says bad actors who gain access to that data can cause serious damage.

"To top it off, data which is personal to an individual that is misused could cause all kinds of harm to that individual, which could be very significant and often irreparable," says Ms Aw, a lawyer at Reed Smith.

Lim Chong Kin, Drew & Napier's head of telecommunications, media and technology, says individuals who have their personal data compromised could be at risk of fraud and identity theft. They may also suffer psychological or emotional harm, loss of business or employment opportunities, or damage to their reputation or relationships.

Insufficient focus on data privacy - the control of how personal data is used - imperils not just individuals but the startups themselves.

Prof Mohan Kankanhalli, director of the NUS Centre for Research in Privacy Technologies (N-CRiPT), points out that startups are often constrained by limited financial resources and engineering expertise. With growth and revenue among a startup's key performance metrics, one might argue that data privacy is not exactly high on its list of priorities.

Startups also have to be nimble and might - to use tech-speak - "pivot" to a new business model quickly, says Prof Kankanhalli. What they may not realise is that under Singapore's Personal Data Protection Act (PDPA) 2012, they can only use collected data for the purposes that its users have consented to; if there is now a new purpose for the data, consent has to be sought again.

In an age when data is mined by a horde of companies, individuals should more actively exercise their right to withdraw consent at any time, he notes.

In the book 99 Privacy Breaches to Beware Of, authors Kevin Shepherdson (CEO of Singapore-based data privacy software and consulting firm Straits Interactive), William Hioe and Lynn Boxall discuss how lax data privacy practices could throw a wrench into a startup's plans.

In their consultancy work, the authors came across a startup subscription business operating in Europe and Asia that had built up a healthy revenue stream. The founder was about to sell the business, when discussions ground to a halt.

As it turned out, the startup had not satisfied data privacy and protection requirements when signing up customers, forcing the founder to start on a lengthy and costly process to fix the issues and make the business saleable.

With powerful tech in their hands, some startups play fast and loose with "the creepy line" - a term made famous in 2010 by ex-Google CEO Eric Schmidt.

"The Google policy on a lot of things is to get right up to the creepy line and not cross it," he had said, implying that the company would not do anything considered too insidious.

But this creepy line shifts. Only a few years ago, it would have been disturbing to think of companies being able to monitor our every move and word, or that we would be the ones enabling it ("Alexa, play Despacito"). Last week, The New York Times ran a story about a secretive facial recognition startup called Clearview AI, whose powerful technology has helped US law enforcement solve shoplifting, credit card fraud and murder cases.

For a month, the company dodged interview requests from the Times. But when the journalist requested police officers to run her photo through the Clearview app, the officers soon got phone calls from company representatives asking if they were talking to the media - "a sign that Clearview has the ability and, in this case, the appetite to monitor whom law enforcement is searching for".

Where is the creepy line drawn now?

Competitive edge

In a sign of how important data privacy has become, some startups tout it as an edge, trumping peers that flail under its burden.

One of South-east Asia's most valuable marketing tech startups, Near, draws data from the mobile apps people use and combines it with location data to understand consumer behaviour. It processes information from over 1.6 billion monthly users across 44 countries. One of its clients, gym chain Virgin Active, boosted walk-ins by 82 per cent after Near served targeted mobile ads to specific audiences in the vicinity of its gyms.

Near co-founder and chief revenue officer Shobhit Shukla says the company only shares aggregated analytics information with customers and partners. Additional steps it takes to ensure the data's privacy, security and authenticity include stripping the data of any personal identifiers (known as "anonymising") and hashing it.
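In practice, those two steps can be as simple as the sketch below - a minimal illustration of dropping direct identifiers and hashing a device ID, with the field names and salt invented for the example (Near has not published its pipeline):

```python
# Minimal sketch of anonymisation by stripping and hashing (illustrative only).
import hashlib

SALT = b"keep-this-secret-and-rotate-it"   # hypothetical server-side salt
DIRECT_IDENTIFIERS = {"name", "email", "phone"}

def anonymise(record: dict) -> dict:
    """Drop direct identifiers and replace the device ID with a salted hash."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["device_id"] = hashlib.sha256(
        SALT + record["device_id"].encode()
    ).hexdigest()
    return cleaned

print(anonymise({"name": "Ana", "device_id": "A1B2", "visits_gym": True}))
```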

Near is also in the process of getting all its app partners, regardless of region, to follow the European Union's GDPR (General Data Protection Regulation) standard: each partner must state explicitly, in its terms and conditions, that it works with third-party platform Near, and the purpose for which user data is collected, with the user's consent.

The more granular the data gets, the more the startup needs to look at privacy and security, Mr Shukla says. "We view privacy as a competitive advantage."

It's an attractive proposition for both consumers and businesses. Public companies in the US that mishandled data saw an average stock price fall of 5 per cent immediately after disclosing a data breach, while 31 per cent of consumers axed their relationship with the offending company and moved towards more trusted organisations, according to a 2017 Ponemon Institute study.

Singapore-based credit scoring startup CredoLab is another firm hoping to edge out competitors by paying keen attention to privacy. It uses smartphone metadata to score users. Using metadata means that, for instance, information about a digital image - its size, resolution, or date of creation - can be assessed without seeing the image itself.
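As a rough illustration of the principle (a sketch, not CredoLab's technology), filesystem metadata alone already describes a file without exposing its contents:

```python
# Sketch: describe a file using only metadata, never reading its bytes.
import os
from datetime import datetime, timezone

def file_metadata(path: str) -> dict:
    st = os.stat(path)                      # size and timestamps, no content
    return {
        "size_bytes": st.st_size,
        "modified": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat(),
        "extension": os.path.splitext(path)[1].lower(),
    }

print(file_metadata(__file__))              # works on the script itself
```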

CEO Peter Barcak says the company has signed up 62 lenders across 19 countries, with major markets being Indonesia, Vietnam, and the Philippines. More than US$1 billion worth of loans have been disbursed in the past two years based on the startup's credit scoring technology.

The largely untapped opportunity of providing credit to the unbanked and underbanked has led to the proliferation of alternative credit scoring startups like CredoLab and Lenddo. Within the current regulatory environment, credit scoring startups operating in Singapore are not subject to any regulation pertaining specifically to their credit reporting business activities, though they can be within the purview of the PDPA.

But when a new regulatory framework, the Credit Bureau Act 2016, comes into force, alternative credit scoring startups will have to apply for a licence to continue their credit reporting business.

The act specifies that they must, among other things, safeguard the confidentiality, security and integrity of credit information. The act will also apply to startups that carry out credit reporting for profit or gain, even if it is not part of their core business. No definite date has been given for when the act will come into force.

That said, Drew & Napier's Mr Lim believes data privacy goes beyond complying with regulations. "Ultimately, startups should not see data protection as a mere compliance exercise but rather, recognise how good data practices and policies can contribute to, and even create, business value."

Some companies take it even further - they focus on data privacy itself as their "product".

Australia's Data Republic, which is backed by Singapore Airlines and Singtel Innov8 (the telco's venture arm), allows companies to securely manage data-sharing projects online.

When a company shares data with another, it usually disappears into the data centre of the other firm. The first company cannot see who touches the data or who has access to it, explains Raju Bhupatiraju, the general manager and head of sales for Asia-Pacific.

One of Data Republic's products, called the Senate Platform, puts the data in a secure common workspace with governance controls. It also ensures that data is exchanged without personally identifiable information, and allows parties to negotiate licence terms and kickstart projects in weeks instead of months.

"Imagine the data economy of Singapore becomes so good that you have jobs created, you have new streams created for the company, and the consumer can benefit from hyper personalisation," says Mr Bhupatiraju.

Data Republic began operating in Singapore 14 months ago and is projecting revenue from its Singapore operations to grow five to ten times in 2020. It is also working on a tool that would allow companies and individuals to manage consent.

Others in the data privacy space include New York-based OzoneAI, whose vision is one where users have the power to sell anonymised, granular data directly to advertising firms, and London-based Privitar, which helps enterprises stay compliant with data regulations.

But what about startups that eventually fail? What happens to all the data they have collected?

"The PDPA continues to apply to a failed startup, including the retention limitation obligation which requires an organisation to cease to retain personal data if there is no legal or business purpose for doing so," says Ms Aw.

Safety first

As much as companies should lay down rules for how they collect and use data, the first line of defence in preserving privacy is good security, Prof Terence Sim writes in an N-CRiPT blog post. "If security is lax, you can forget about privacy. The bad guys hack in and help themselves to all your data and you're doomed," says the research centre's principal investigator.

Some 80 per cent of the 90 organisations that breached the PDPA between 2016 and August 2019 had violated its protection obligation; 85 per cent of those breaches were due to negligence, while the rest resulted from cyber attacks. The findings were uncovered by the Data Protection Excellence (DPEX) Centre, Straits Interactive's autonomous research and education arm.

The lack of data protection policies and the failure to obtain individuals' consent were the other two most common PDPA obligations breached.

"Based on our security assessment experience, those startups whose founders have never dealt with cybersecurity pay little or no attention to cyber risks at all," says Ilya Sachkov, CEO and founder of Group-IB, a Singapore-based cybersecurity company.

The group's senior threat intelligence analyst Feixiang He highlights that many startups adopt innovative and disruptive processes whose unproven nature could increase their systemic cyber risks.

"Startups also have high growth and turnover rate of human talent. It may introduce insider threats to the companies, unless cybersecurity is taken seriously from day one. Insider threat refers to cases where internal employees leak data to external parties."

Mr Sachkov recommends that startups appoint experts with an extensive cybersecurity background to their advisory board. When on a tight budget, bug bounty programmes can be used to find and fix security vulnerabilities, he says.

Fashion e-commerce firm Zilingo engages a team of "white hat hackers" - also known as ethical hackers - every three to six months to uncover vulnerabilities in the company's systems. The company's chief technology officer Dhruv Kapoor says that of Zilingo's more than 100 employees, about a tenth work either exclusively or partially on security.

It's in the culture as well, he says. "Our company culture requires us to think about privacy and security from the very start, when there is a new product idea or the start of a business plan."

For instance, Zilingo recently launched a service that would allow merchants to check on the progress of an order placed at a factory - but not before working with partners to create a framework where factories' data is securely shared in a way that does not compromise privacy.

Anonymity and machine learning

Tech bosses might sleep better at night believing that current methods of ensuring data privacy are foolproof, but emerging threats tell us otherwise.

Take, for instance, the widespread belief that anonymising data can help protect the data owner's privacy. It doesn't, says Reza Shokri, a researcher at N-CRiPT who specialises in data privacy and trustworthy machine learning.

A skilled attacker who has external information about individuals can combine that information with the anonymised data to re-identify people and reconstruct the whole data set, he explains. This is all the more feasible if the attacker has access to data from the many apps people use every day.
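The classic version of this "linkage attack" needs nothing more than a join on quasi-identifiers, as the toy sketch below shows (invented data, for illustration only):

```python
# Toy linkage attack: join an "anonymised" table against a public register
# on quasi-identifiers to put names back onto sensitive rows.
QUASI_IDS = ("postcode", "birth_year", "gender")

anonymised = [
    {"postcode": "529", "birth_year": 1984, "gender": "F", "diagnosis": "asthma"},
]
public_register = [
    {"name": "Alice Tan", "postcode": "529", "birth_year": 1984, "gender": "F"},
]

def reidentify(anon_rows, known_rows):
    for a in anon_rows:
        for k in known_rows:
            if all(a[q] == k[q] for q in QUASI_IDS):
                yield k["name"], a["diagnosis"]   # anonymity is gone

print(list(reidentify(anonymised, public_register)))  # [('Alice Tan', 'asthma')]
```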

The truth is, startups - and Big Tech firms - exist in a world where bad actors are becoming more sophisticated. For example, depending on how a machine learning model was constructed, it is possible for an adversary to "reverse engineer" the model and uncover the underlying data on which it was based.
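One family of such attacks, membership inference, exploits the fact that models often behave differently on data they were trained on. The sketch below shows the bare intuition - a hypothetical threshold test, not any specific published attack:

```python
# Bare-bones membership inference intuition: models tend to be more confident
# on records they memorised during training, so unusually high confidence
# hints that a record was in the training set.
def likely_training_member(model_confidence: float, threshold: float = 0.95) -> bool:
    return model_confidence >= threshold

# A model outputting 0.99 on one record and 0.70 on another:
print(likely_training_member(0.99), likely_training_member(0.70))  # True False
```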

Prof Shokri, together with his colleague Prof Daniel Seng from the National University of Singapore's Faculty of Law, recently worked on a paper that looked at different types of machine learning algorithms and whether they comply with the GDPR and other privacy regulations. They found that the machine learning models were not built and trained in a way that preserves privacy.

"So if you give access to the machine learning model, we argue that you are essentially giving the whole data away. The privacy risk is equivalent," he says.

N-CRiPT has been studying techniques in differential privacy, a response to privacy attacks, which has been used by organisations such as the US Census Bureau. Instead of directly sharing raw or anonymised data, an organisation can construct a quantitative model that captures the characteristics of the original data. It can then sample some records from that model, and release those sample records to a business partner.

But the model has to be constructed in a way that does not reveal sensitive information - for instance, by injecting "noise" into it so that specific individual details are concealed without destroying high-level statistics.
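For a single counting query, the standard way to inject such noise is the Laplace mechanism. A minimal sketch of the idea follows - it illustrates the general technique, not N-CRiPT's specific models:

```python
# Laplace mechanism for a counting query: one person joining or leaving the
# data changes a count by at most 1, so noise of scale 1/epsilon masks any
# single individual's contribution.
import numpy as np

def dp_count(true_count: int, epsilon: float) -> float:
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Smaller epsilon means stronger privacy but noisier released statistics.
print(dp_count(1000, epsilon=0.1))   # e.g. 987.3 - close, but deniable
```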

The centre has also been looking into ways to scientifically quantify the value of data to an organisation, which could have applications in insurance and risk management.

Prof Shokri hopes to highlight that data privacy is a science. "One lesson we can learn is that not everything that is intuitive can effectively provide protection," he says.

"Privacy needs to be protected by design...the whole chain of computations done on your data needs to be carefully designed so that they preserve the privacy of data. And this is the difficult part. This is what privacy experts work on, which is designing privacy mechanisms with mathematical rigour."
