Australian attitudes about what kind of personal data should be collected and used differ strongly from the realities of the big data marketplace, a report indicates, but many consumers feel powerless to do anything about it.
The report from the Consumer Policy Research Centre (CPRC), titled Consumer Data and the Digital Economy, warns that companies big and small are operating with such a poor degree of transparency that Australians can’t tell if they’re making a fair trade with the data they’re giving up.
The report includes the results of a national survey of 1004 people, which found 95 per cent wanted the ability to “opt out of certain data collection practices” and 91 per cent agreed that companies should collect only the information currently needed to provide the service.
Of data not directly needed to provide a service, Australians were most protective of their phone contacts, messages and device identification numbers being shared (with 87, 86 and 84 per cent of respondents respectively saying such sharing made them uncomfortable), despite this all being data that is regularly collected by all manner of companies, from social media sites to loyalty point platforms.
The study also found 94 per cent of those surveyed did not read privacy policies. Of those who did, two-thirds indicated they still signed up even though they felt uncomfortable. Of those, 73 per cent said it was because accepting the terms was the only way to access the service.
The report recommends regulatory reform that ensures consumers have a legitimate choice when it comes to their data being collected and shared.
At a CPRC event to launch the report, Australian Competition and Consumer Commission chairman Rod Sims said big businesses such as Google and Facebook were benefiting from the use of data but it was not clear that consumers were.
“Their whole business model … completely depends on them having data about you, and then largely selling that data off,” he said. “That business model is only going to grow because both those companies need to grow to justify their share price.”
Mr Sims said the data “genie is out of the bottle” and that the community and regulators needed to decide what to do about it.
“We can’t just let it happen, we’ve got to work out what the response is,” he said.
Consultation in Australia is under way to provide a Consumer Data Right, which will give greater access to data held by utility companies, banks and telcos, as well as the ability to port the data to another provider if the consumer chooses.
Lauren Solomon, the chief executive of the CPRC and co-author of the report, told Fairfax Media the Consumer Data Right was “a step in the right direction” but that the set of data it covered was “very narrow” compared to data reforms internationally.
“What we’re seeing with GDPR [General Data Protection Regulation] in the EU or the California Consumer Privacy Act, we’re seeing them being much more economy-wide protections for consumers,” she said.
Examples of this include the right to erasure (or right to be forgotten), added protections for the processing of children’s data and initiatives such as requiring websites to display a “do not sell my personal information” button.
“A lot of the reforms we’re seeing in Europe and California go to the heart of greater transparency, but also greater options for consumers when it comes to how their data’s being collected,” Ms Solomon said. “But without those two fundamental things, it’s very difficult for Australians to make a choice about what suits their preferences when it comes to data sharing.”
Another recommendation for reform in the report is to give consumers greater access to the profiles and scores being built and held about them. This data is used by companies to assess consumers who are looking for a loan or access to some other product, influencing the price they are offered or, in some cases, whether they can access a product at all.
Internationally, bodies such as the Algorithmic Justice League in the US examine the discriminatory impact of algorithmic bias, in an attempt to make sure incorrect data doesn’t hurt consumers and that algorithms don’t unfairly reinforce stereotypes and discrimination.
“It’s something I think we need to be thinking about [in Australia],” Ms Solomon said. “And the extent to which regulators are able to get transparency or a line of sight to what algorithms might be influencing the sorts of products that consumers are presented with is quite important.”
Australia’s human rights commissioner, Ed Santow, who was also at the event, said he was concerned that people were “held over a barrel” by only being given access to vital services – such as people with disabilities using online shopping – on the condition of handing over their private information.
Mr Santow also pointed to an artificial intelligence system used in the US to decide if defendants awaiting trial could receive bail, which was found to be biased against African Americans, and a Microsoft AI chatbot, which, via machine learning, became anti-Semitic less than 24 hours after being unleashed on Twitter, as examples of how AI and data analytics could lead to discriminatory outcomes.