identity management

Google is making autofill on Chrome for mobile more secure


Google today announced a new autofill experience for Chrome on mobile that will use biometric authentication for credit card transactions, as well as an updated built-in password manager that will make signing in to a site a bit more straightforward.


Chrome already uses the W3C WebAuthn standard for biometric authentication on Windows and Mac. With this update, the feature is also coming to Android.
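
For a sense of what that standard looks like in practice, here is a minimal TypeScript sketch of a WebAuthn re-authentication request, the kind of user-verification step a browser or site can run before revealing a stored card. The function name, option values and flow are illustrative assumptions, not Chrome's internal autofill code.

```typescript
// Minimal sketch of a WebAuthn user-verification check before
// autofilling a stored card. Illustrative only; Chrome's internal
// integration is not public.

async function confirmUserWithBiometrics(
  credentialId: Uint8Array,
  challenge: Uint8Array // one-time value issued by the server
): Promise<boolean> {
  // Is a user-verifying platform authenticator (fingerprint
  // reader, face unlock, etc.) available on this device?
  const available = await PublicKeyCredential
    .isUserVerifyingPlatformAuthenticatorAvailable();
  if (!available) return false;

  // Ask the authenticator to sign the challenge, requiring user
  // verification; this is what triggers the biometric prompt.
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge,
      allowCredentials: [{ type: "public-key", id: credentialId }],
      userVerification: "required",
      timeout: 60_000,
    },
  });

  // A non-null assertion means the user passed the check; the
  // signature itself would then be verified server-side.
  return assertion !== null;
}
```

The userVerification: "required" option is what forces the fingerprint or face prompt rather than a silent presence check.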

If you’ve ever bought something through the browser on your Android phone, you know that Chrome always asks you to enter your card’s CVC code to ensure that it’s really you, even if the credit card number is stored on your phone. That was always a bit of a hassle, especially when your credit card wasn’t within reach.

Now you can use your phone’s biometric authentication to buy those new sneakers with just your fingerprint, no CVC needed. You can also opt out, as you’re not required to enroll in the new system.

As for the password manager, the update here is a new touch-to-fill feature that shows your saved accounts for a given site in a standard Android dialog. That’s something you’re probably used to from your desktop password manager already, but it’s definitely a major new built-in convenience feature for Chrome, and the more people opt to use password managers, the safer the web will be. The feature is coming to Chrome on Android in the next few weeks, but Google says this “is only the start.”


UK’s NHS COVID-19 app lacks robust legal safeguards against data misuse, warns committee


A UK parliamentary committee that focuses on human rights issues has called for primary legislation to be put in place to ensure that legal protections wrap around the national coronavirus contact tracing app.

The app, called NHS COVID-19, is being fast-tracked for public use, with a test ongoing this week on the Isle of Wight. It’s set to use Bluetooth Low Energy signals to log social interactions between users, in order to automate some contact tracing based on an algorithmic assessment of users’ infection risk.
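
NHSX hasn't published its risk algorithm, but the general shape of BLE-based exposure scoring is well understood: longer and closer encounters (with signal strength standing in for distance) contribute more risk. The TypeScript sketch below illustrates that idea only; the weights, threshold and field names are all hypothetical.

```typescript
// Illustrative sketch of BLE proximity logging and risk scoring.
// All weights and thresholds here are hypothetical, not the
// actual NHS COVID-19 algorithm.

interface Encounter {
  otherDeviceId: string;   // pseudonymous identifier seen over BLE
  durationMinutes: number; // how long the devices stayed in range
  avgRssi: number;         // average signal strength, a rough distance proxy
}

// Closer (stronger signal) and longer encounters score higher.
// RSSI is negative dBm: around -50 is very close, -90 is far away.
function encounterRisk(e: Encounter): number {
  const proximityWeight = Math.min(Math.max((e.avgRssi + 90) / 40, 0), 1);
  return e.durationMinutes * proximityWeight;
}

// Flag a user if their summed exposure to infected contacts
// crosses a threshold that epidemiologists would have to choose.
function shouldNotify(
  log: Encounter[],
  infectedIds: Set<string>,
  threshold = 15
): boolean {
  const totalRisk = log
    .filter((e) => infectedIds.has(e.otherDeviceId))
    .reduce((sum, e) => sum + encounterRisk(e), 0);
  return totalRisk >= threshold;
}
```

In the centralized design NHSX has chosen, a calculation along these lines would run server-side over uploaded proximity logs; in a decentralized design it runs on the handset.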

NHSX has said the app could be ready for launch within a matter of weeks, but the committee says key choices related to the system architecture create huge risks for people’s rights that demand the safeguard of primary legislation.

“Assurances from Ministers about privacy are not enough. The Government has given assurances about protection of privacy so they should have no objection to those assurances being enshrined in law,” said committee chair Harriet Harman MP in a statement.

“The contact tracing app involves unprecedented data gathering. There must be robust legal protection for individuals about what that data will be used for, who will have access to it and how it will be safeguarded from hacking.

“Parliament was able quickly to agree to give the Government sweeping powers. It is perfectly possible for parliament to do the same for legislation to protect privacy.”

NHSX, a digital arm of the country’s National Health Service, is in the process of testing the app, which it has said could be launched nationally within a few weeks.

The government has opted for a system design that will centralize large amounts of social graph data when users who are experiencing COVID-19 symptoms (or who have received a formal diagnosis) choose to upload their proximity logs.

Earlier this week we reported on one of the committee hearings — when it took testimony from NHSX CEO Matthew Gould and the UK’s information commissioner, Elizabeth Denham, among other witnesses.

Warning over a lack of parliamentary scrutiny of what it describes as an unprecedented expansion of state surveillance, the committee report calls for primary legislation to ensure “necessary legal clarity and certainty as to how data gathered could be used, stored and disposed of”.

The committee also wants to see an independent body set up to carry out oversight and guard against ‘mission creep’, a concern that has also been raised by a number of UK privacy and security experts in an open letter late last month.

“A Digital Contact Tracing Human Rights Commissioner should be responsible for oversight and they should be able to deal with complaints from the Public and report to Parliament,” the committee suggests.

Prior to publishing its report, the committee wrote to health minister Matt Hancock raising a full spectrum of concerns, and received a letter in response.

In this letter, dated May 4, Hancock told it: “We do not consider that legislation is necessary in order to build and deliver the contact tracing app. It is consistent with the powers of, and duties imposed on, the Secretary of State at a time of national crisis in the interests of protecting public health.”

The committee’s view is that Hancock’s ‘letter of assurance’ is not enough, given the huge risks attached to the state tracking citizens’ social graph data.

“The current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system. Government’s assurances around data protection and privacy standards will not carry any weight unless the Government is prepared to enshrine these assurances in legislation,” it writes in the report, calling for a bill that it says must include a number of “provisions and protections”.

Among the protections the committee is calling for are limits on who has access to data and for what purpose.

“Data held centrally may not be accessed or processed without specific statutory authorisation, for the purpose of combatting Covid-19 and provided adequate security protections are in place for any systems on which this data may be processed,” it urges.

It also wants legal protections against data reconstruction, whereby different pieces of data are combined “to reconstruct information about an individual”.

The report takes a very strong line — warning that no app should be released without “strong protections and guarantees” on “efficacy and proportionality”.

“Without clear efficacy and benefits of the app, the level of data being collected will not be justifiable and it will therefore fall foul of data protection law and human rights protections,” says the committee.

The report also calls for regular reviews of the app, looking at efficacy, data safety and “how privacy is being protected in the use of any such data”.

It also makes a blanket call for transparency, with the committee writing that the government and health authorities “must at all times be transparent about how the app, and data collected through it, is being used”.

A lack of transparency around the project was another of the concerns raised by the 177 academics who signed the open letter last month.

The government has committed to publishing data protection impact assessments for the app. But the ICO’s Denham still hadn’t had sight of this document as of this Monday.

Another call by the committee is for a time-limit to be attached to any data gathered by or generated via the app. “Any digital contact tracing (and data associated with it) must be permanently deleted when no longer required and in any event may not be kept beyond the duration of the public health emergency,” it writes.

We’ve reached out to the Department of Health and NHSX for comment on the human rights committee’s report.

Let’s go through Matt Hancock’s letter to @HarrietHarman @HumanRightsCtte on the NHSX app and take a closer look at some of these statements 1/ https://t.co/sQe2U8wkiy

— Michael Veale (@mikarv) May 7, 2020

There’s another element to this fast-moving story: yesterday the Financial Times reported that NHSX has inked a new contract with an IT supplier that suggests it might be looking to change the app architecture, moving away from a centralized database to a decentralized system for contact tracing. NHSX has not confirmed any such switch at this point.

Some other countries have reversed course on their choice of app architecture after running into technical challenges related to Bluetooth. Germany also cited the need to ensure public trust in the system as a reason for switching to a decentralized model.
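
To make the architectural distinction concrete, here is a simplified TypeScript sketch of the decentralized pattern used by protocols such as DP-3T and the Apple/Google exposure notification design, where matching happens on the handset and the server never sees the social graph. Key sizes, rotation schedules and the derivation function are illustrative simplifications, not any specific national app's implementation.

```typescript
import { createHmac, randomBytes } from "crypto";

// Simplified sketch of decentralized contact tracing. Each device
// keeps a secret daily key and broadcasts short-lived identifiers
// derived from it; the server only ever sees keys volunteered by
// diagnosed users, never the contact graph.

// A fresh secret key per day. It never leaves the device unless
// the user tests positive and chooses to share it.
function newDailyKey(): Buffer {
  return randomBytes(32);
}

// Rolling identifier broadcast over BLE, rotated every interval
// so a device can't be tracked across the day.
function rollingId(dailyKey: Buffer, intervalIndex: number): Buffer {
  return createHmac("sha256", dailyKey)
    .update(`interval-${intervalIndex}`)
    .digest()
    .subarray(0, 16);
}

// On diagnosis, only daily keys are published. Every phone then
// re-derives the identifiers locally and checks them against its
// own log of observed broadcasts; the match happens on-device.
function matchesLocalLog(
  publishedKeys: Buffer[],
  observedIds: Set<string>
): boolean {
  for (const key of publishedKeys) {
    for (let i = 0; i < 144; i++) { // e.g. 10-minute intervals per day
      if (observedIds.has(rollingId(key, i).toString("hex"))) return true;
    }
  }
  return false;
}
```

By contrast, in the centralized design described earlier the proximity logs themselves are uploaded, which is exactly the social graph data the committee is worried about.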

The human rights committee report highlights a specific app efficacy issue of relevance to the UK, which it points out is also linked to these system architecture choices, noting that: “The Republic of Ireland has elected to use a decentralised app and if a centralised app is in use in Northern Ireland, there are risks that the two systems will not be interoperable which would be most unfortunate.”

Professor Lilian Edwards, a legal expert at Newcastle University, welcomed the committee report. Edwards has co-authored a draft bill proposing a set of safeguards for coronavirus apps, much of which was subsequently taken up by Australia in a legal instrument that wraps safeguards around public health contact information during the coronavirus crisis, and she now also sits as an independent advisor on an ethics committee that has been set up for the NHSX app.

Speaking in a personal capacity she told TechCrunch: “My team and I welcome this.”

But she flagged a couple of omissions in the report. “They have left out two of the recommendations from my bill — one of which I totally expected: that there be no compulsion to carry a phone. Because they will just be assumed within our legal system, but I don’t think it would have hurt to have said it. But ok.

“The second point — which is important — is the point about there not being compulsion to install the app or to display it. And there not being, therefore, discrimination against you if you don’t. Like not being allowed to go to your workplace is an obvious example. Or not being allowed to go to a football game when they reopen. And that’s the key point where the struggle is.”

The conflict, says Edwards, is that on the one hand you could argue there’s no point doing digital contact tracing at all if you can’t make sure people receive notifications that they might be a contact. On the other hand, allowing compulsion “leaves it open to be very discriminatory”, meaning people could abuse the requirement to target and exclude others from a workplace, for example.

“There are people who’ve got perfectly valid reasons to not want to have this on their phone,” Edwards added. “Particularly if it’s centralized rather than decentralized.”

She also noted that the first version of her draft coronavirus safeguards bill had allowed compulsion regarding having the app on the phone, but required it to be balanced by a proportionality analysis, meaning any such compulsion must be “proportionate to a legitimate aim”.

But after Australia opted for zero compulsion in its legal instrument, she said she and her team decided to revise their bill to strike out the provision entirely.

Edwards suggested the human rights committee may not have included this particular provision in their recommendations because parliamentary committees are only able to comment on evidence they receive during an inquiry. “So I don’t think it would have been in their remit to recommend on that,” she noted, adding: “It isn’t actually an indication that they’re not interested in these concepts; it’s just procedure I think.”

She also highlighted the issue of so-called ‘immunity passports’, something the government has reportedly been in discussions with startups about building as part of its digital coronavirus response, but which the committee report also does not touch on.

However, without full clarity on the government’s evolving plans for its digital coronavirus response, and with, inevitably, a high degree of change and flux amid a public health emergency, it’s clearly difficult for committees to interrogate so many fast-moving pieces.

“The select committees have actually done really, really well,” added Edwards. “But it just shows how the ground has shifted so much in a week.”

This report was updated with additional comment.


Apple still has work to do on privacy


There’s no doubt that Apple’s self-polished reputation for privacy and security has taken a bit of a battering recently.

On the security front, Google researchers just disclosed a major flaw in the iPhone, finding a number of malicious websites that could hack into a victim’s device by exploiting a set of previously undisclosed software bugs. When visited, the sites infected iPhones with an implant designed to harvest personal data — such as location, contacts and messages.

As flaws go, it looks like a very bad one. And when security fails so spectacularly, all those shiny privacy promises naturally go straight out the window.

The implant was used to steal location data and files like databases of WhatsApp, Telegram, iMessage. So all the user messages, or emails. Copies of contacts, photos, https://t.co/AmWRpbcIHw pic.twitter.com/vUNQDo9noJ

— Lukasz Olejnik (@lukOlejnik) August 30, 2019

And while that particular cold-sweat-inducing iPhone security snafu has now been patched, it does raise questions about what else might be lurking out there. More broadly, it also tests the generally held assumption that iPhones are superior to Android devices when it comes to security.

Are we really so sure that thesis holds?

But imagine for a second you could unlink security considerations and purely focus on privacy. Wouldn’t Apple have a robust claim there?

On the surface, the notion of Apple having a stronger claim to privacy versus Google — an adtech giant that makes its money by pervasively profiling internet users, whereas Apple sells premium hardware and services (including essentially now ‘privacy as a service‘) — seems a safe (or, well, safer) assumption. Or at least, until iOS security fails spectacularly and leaks users’ privacy anyway. Then of course affected iOS users can just kiss their privacy goodbye. That’s why this is a thought experiment.

But even purely on privacy, Apple is running into problems.

To wit: Siri, its nearly decade-old voice assistant technology, now sits under a penetrating spotlight — having been revealed to contain a not-so-private ‘mechanical turk’ layer of actual humans paid to listen to the stuff people tell it. (Or indeed the personal stuff Siri accidentally records.)


Apple ad focuses on iPhone’s most marketable feature — privacy


Apple is airing a new ad spot in primetime today. Focused on privacy, the spot is visually cued, with no dialog and a simple tagline: Privacy. That’s iPhone.

In a series of humorous vignettes, the message is driven home that sometimes you just want a little privacy. The spot has only one line of text otherwise, and it’s in keeping with Apple’s messaging on privacy over the long and short term. “If privacy matters in your life, it should matter to the phone your life is on.”

The spot will air tonight in primetime in the U.S. and extend through March Madness. It will then air in select other countries.

You’d have to be hiding under a rock not to have noticed Apple positioning privacy as a differentiating factor between itself and other companies. A few years ago, CEO Tim Cook began taking more and more public stances on what the company felt to be your “rights” to privacy on its platform and how that differed from other companies. The undercurrent was that Apple could take this stance because its first-party business relies on a relatively direct relationship with customers who purchase its hardware and, increasingly, its services.

This stands in contrast to the model of other tech giants, like Google or Facebook, which insert an interstitial layer of monetization strategy on top of that relationship, applying personal information about you (in somewhat anonymized fashion) to sell their platforms to advertisers, which in turn can sell to you better.

Turning the ethical high ground into a marketing strategy is not without its pitfalls, though, as Apple has discovered recently with a (now patched) high-profile FaceTime bug that allowed people to turn your phone into a listening device, Facebook’s manipulation of App Store permissions and the revelation that some long-overdue housecleaning was needed in its Enterprise Certificate program.

I did find it interesting that the iconography of the “Private Side” spot very, very closely associates the concepts of privacy and security. They are separate, but interrelated, obviously. This spot says these are one and the same. It’s hard to enforce privacy without security, of course, but in the mind of the public I think there is very little difference between the two.

The App Store itself, of course, still hosts apps from Google and Facebook among thousands of others that use personal data of yours in one form or another. Apple’s argument is that it protects the data you give to your phone aggressively by processing on the device, collecting minimal data, disconnecting that data from the user as much as possible and giving users as transparent a control interface as possible. All true. All far, far better efforts than the competition.

Still, there is room to run, I feel, when it comes to Apple adjudicating what should be considered a societal norm for the use of personal data on its platform. If it’s going to be the absolute arbiter of what flies on the world’s most profitable application marketplace, it might as well use that power to get a little more feisty with the bigcos (and littlecos) that make their living on our data.

I mention the issues Apple has had above not as a dig, though some might be inclined to view Apple integrating privacy with marketing as boldness bordering on hubris. I, personally, think there’s still a major difference between a company that has situational loss of privacy while having a systemic dedication to privacy and, well, most of the rest of the ecosystem, which exists because it operates an “invasion of privacy as a service” business.

Basically, I think stating that privacy is your mission is still supportable, even if you have bugs. But attempting to ignore that you host the data platforms that thrive on invading it is a tasty bit of prestidigitation.

But that might be a little too verbose as a tagline.
