
Every secure messaging app needs a self-destruct button


The growing presence of encrypted communications apps makes a lot of communities safer and stronger. But the possibility of physical device seizure and government coercion is growing as well, which is why every such app should have some kind of self-destruct mode to protect its user and their contacts.

End-to-end encryption like that used by Signal and WhatsApp is great at preventing governments and other malicious actors from accessing your messages while they are in transit. But as with nearly all cybersecurity matters, physical access to the device, the user or both changes things considerably.

For example, take this Hong Kong citizen who was forced to unlock their phone and reveal their followers and other messaging data to police. It’s one thing to do this with a court order to see if, say, a person was secretly cyberstalking someone in violation of a restraining order. It’s quite another to use it as a dragnet for political dissidents.

@telegram @durov an HK citizen who runs a Telegram channel detained by the police was forced to unlock his phone and reveal his channel followers. Could you please add an option such that channel subscribers cannot be seen under extreme circumstances? Much appreciate. https://t.co/tj4UQztuZ2

— Lo Sinofobo (@tnzqo7f9) June 12, 2019

This particular protestor ran a Telegram channel that had a number of followers. But it could just as easily be a Slack room for organizing a protest, or a Facebook group, or anything else. For groups under threat from oppressive government regimes it could be a disaster if the contents or contacts from any of these were revealed to the police.

Just as you should be able to choose exactly what you say to police, you should be able to choose how much your phone can say as well. Secure messaging apps should be the vanguard of this capability.

There are already some dedicated “panic button” type apps, and Apple has thoughtfully developed an “emergency mode” (activated by hitting the power button five times quickly) that disables biometric unlock, requiring the passcode, and will wipe the phone if it is not unlocked within a certain period of time. That’s effective against “Apple pickers” trying to steal a phone, or during border or police stops where you don’t want to show ownership by unlocking the phone with your face.

Those are useful and we need more like them — but secure messaging apps are a special case. So what should they do?

The best-case scenario, where you have all the time in the world and internet access, isn’t really an important one. You can always delete your account and data voluntarily. What needs work is deleting your account under pressure.

The next best-case scenario is that you have perhaps a few seconds or at most a minute to delete or otherwise protect your account. Signal is very good about this: The deletion option is front and center in the options screen, and you don’t have to input any data. WhatsApp and Telegram require you to put in your phone number, which is not ideal — fail to do this correctly and your data is retained.

Signal, left, lets you get on with it. You’ll need to enter your number in WhatsApp (right) and Telegram.

Obviously it’s also important that these apps don’t let users accidentally and irreversibly delete their accounts. But perhaps there’s a middle road whereby you can temporarily lock an account for a preset period, after which it deletes itself if not unlocked manually. Telegram does have self-destructing accounts, but the shortest period you can set before deletion is a month.

What really needs improvement is emergency deletion when your phone is no longer in your control. This could be a case of device seizure by police, or perhaps being forced to unlock the phone after you have been arrested. Whatever the case, there need to be options for a user to delete their account outside the ordinary means.

Here are a few options that could work:

  • Trusted remote deletion: Selected contacts are given the ability via a one-time code or other method to wipe each other’s accounts or chats remotely, no questions asked and no notification created. This would let, for instance, a friend who knows you’ve been arrested remotely remove any sensitive data from your device.
  • Self-destruct timer: Like Telegram’s feature, but better. If you’re going to a protest, or have been “randomly” selected for additional screening or questioning, you can just tell the app to delete itself after a certain duration (as little as a minute, perhaps) or at a certain time of day. Deactivate it any time you like, or stall for the few minutes required for it to trigger.
  • Poison PIN: In addition to a normal unlock PIN, users can set a poison PIN that, when entered, has a variety of user-selectable effects: delete certain apps, clear contacts, send prewritten messages, unlock or temporarily hard-lock the device, and so on. (A rough sketch of how a poison PIN and self-destruct timer might fit together follows this list.)
  • Customizable panic button: Apple’s emergency mode is great, but it would be nice to be able to attach conditions like the poison PIN’s. Sometimes all someone can do is smash that button.
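
To make the poison PIN and timer ideas concrete, here is a minimal sketch in Python. Everything in it is hypothetical — the class and function names are invented, and a real implementation would sit behind the platform’s keystore and secure-delete facilities rather than a plain callback:

```python
import hmac
import threading

class DuressGuard:
    """Toy duress layer: a poison PIN plus an optional self-destruct timer."""

    def __init__(self, normal_pin: str, poison_pin: str, wipe_fn):
        self.normal_pin = normal_pin
        self.poison_pin = poison_pin
        self.wipe_fn = wipe_fn      # e.g. deletes local account data and keys
        self._timer = None

    def check_pin(self, entered: str) -> bool:
        # compare_digest avoids leaking PIN contents via timing differences
        if hmac.compare_digest(entered, self.poison_pin):
            self.wipe_fn()          # wipe silently, then report "unlocked"
            return True
        return hmac.compare_digest(entered, self.normal_pin)

    def arm_self_destruct(self, seconds: float) -> None:
        """Wipe after `seconds` unless disarm() is called first."""
        self.disarm()
        self._timer = threading.Timer(seconds, self.wipe_fn)
        self._timer.daemon = True
        self._timer.start()

    def disarm(self) -> None:
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None
```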

Obviously these open new avenues for calamity and abuse as well, which is why they will need to be explained carefully and perhaps initially hidden in “advanced options” and the like. But overall I think we’ll be safer with them available.

Eventually these roles may be filled by dedicated apps or by the developers of the operating systems on which they run, but it makes sense for the most security-forward app class out there to be the first in the field.


The consumer version of BBM is shutting down on May 31


It might be time to move on from BBM. The consumer version of BlackBerry Messenger will shut down on May 31. Emtek, the Indonesia-based company that partnered with BlackBerry in 2016, just announced the closure. It’s important to note that BBM itself will still exist: BlackBerry today revealed a plan to open the enterprise version of BBM to general consumers.

Starting today, BBM Enterprise will be available through the Google Play Store and eventually from the Apple App Store. The service will be free for one year; after that, it costs $2.49 for six months of service. This version of the software, like the consumer version, still features group chats, voice and video calls, and the ability to edit and retract messages.

As explained by BlackBerry, BBMe features end-to-end encryption:

BBMe can be downloaded on any device that uses the Android, iOS, Windows or macOS operating systems. The sender and recipient each have unique public/private encryption and signing keys. These keys are generated on the device by a FIPS 140-2 certified cryptographic library and are not controlled by BlackBerry. Each message uses a new symmetric key for message encryption. Additionally, TLS encryption between the device and BlackBerry’s infrastructure protects BBMe messages from eavesdropping or manipulation.
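
BlackBerry doesn’t publish BBMe’s protocol at code level, but the per-message pattern it describes — a fresh symmetric key for every message, signed with the sender’s long-term key — looks roughly like the following sketch, using Python’s `cryptography` package. The wrapping of the message key to the recipient’s public key is deliberately omitted, since that detail isn’t public:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()     # sender's long-term signing key

def encrypt_message(plaintext: bytes):
    key = AESGCM.generate_key(bit_length=256)  # new symmetric key per message
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    signature = signing_key.sign(nonce + ciphertext)
    # In a real protocol, `key` would now be encrypted to the recipient's
    # public key; that key-wrapping step is omitted here.
    return key, nonce, ciphertext, signature
```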

BBM is one of the oldest smartphone messaging services. Research in Motion, BlackBerry’s original name, released the messenger in 2005. It quickly became a selling point for BlackBerry devices. BBM wasn’t perfect and occasionally crashed, but it was a robust, feature-filled messaging app when most of the world was still using SMS. With the decline of RIM, and eventually of BlackBerry, BBM fell behind iMessage, WhatsApp and other independent messaging platforms. Emtek’s partnership with BlackBerry was supposed to bring the service into the current age, but some say the consumer version ended up bloated with games, channels and ads. BlackBerry’s BBMe lacks a lot of those extra features, so consumers might find it a better platform for communicating.


Law enforcement needs to protect citizens and their data

Robert Anderson
Contributor

Robert Anderson served for 21 years in the FBI, retiring as executive assistant director of the Criminal, Cyber, Response and Services Branch. He is currently an advisor at The Chertoff Group and the chief executive of Cyber Defense Labs.

Over the past several years, the law enforcement community has grown increasingly concerned about the conduct of digital investigations as technology providers enhance the security protections of their offerings—what some of my former colleagues refer to as “going dark.”

Data once readily accessible to law enforcement is now encrypted, protecting consumers’ data from hackers and criminals. However, these efforts have also had what Android’s security chief called the “unintended side effect” of also making this data inaccessible to law enforcement. Consequently, many in the law enforcement community want the ability to compel providers to allow them to bypass these protections, often citing physical and national security concerns.

I know first-hand the challenges facing law enforcement, but these concerns must be addressed in a broader security context, one that takes into consideration the privacy and security needs of industry and our citizens in addition to those raised by law enforcement.

Perhaps the best example of the law enforcement community’s preferred solution is Australia’s recently passed Assistance and Access Bill, an overly broad law that allows Australian authorities to compel service providers, such as Google and Facebook, to re-engineer their products and bypass encryption protections to allow law enforcement to access customer data.

While the bill includes limited restrictions on law enforcement requests, the vague definitions and concentrated authorities give the Australian government sweeping powers that ultimately undermine the security and privacy of the very citizens they aim to protect. Major tech companies, such as Apple and Facebook, agree and have been working to resist the Australian legislation and a similar bill in the UK.

Image: Bryce Durbin/TechCrunch

Newly created encryption backdoors and work-arounds will become the target of criminals, hackers, and hostile nation states, offering new opportunities for data compromise and attack through the newly created tools and the flawed code that inevitably accompanies some of them. These vulnerabilities undermine providers’ efforts to secure their customers’ data, creating new and powerful vulnerabilities even as companies struggle to address existing ones.

And these vulnerabilities would not only impact private citizens, but governments as well, including services and devices used by the law enforcement and national security communities. This comes amidst government efforts to significantly increase corporate responsibility for the security of customer data through laws such as the EU’s General Data Protection Regulation. Who will consumers, or the government, blame when a government-mandated backdoor is used by hackers to compromise user data? Who will be responsible for the damage?

Companies have a fiduciary responsibility to protect their customers’ data, which not only includes personally identifiable information (PII), but their intellectual property, financial data, and national security secrets.

Worse, the vulnerabilities created under laws such as the Assistance and Access Bill would be subject almost exclusively to the decisions of law enforcement authorities, leaving companies unable to make their own decisions about the security of their products. How can we expect a company to protect customer data when their most fundamental security decisions are out of their hands?


Image: Bryce Durbin/TechCrunch

Thus far law enforcement has chosen to downplay, if not ignore, these concerns—focusing singularly on getting the information they need. This is understandable—a law enforcement officer should use every power available to them to solve a case, just as I did when I served as a State Trooper and as an FBI Special Agent, including when I served as Executive Assistant Director (EAD) overseeing the San Bernardino terror attack case during my final months in 2015.

Decisions regarding these types of sweeping powers should not and cannot be left solely to law enforcement. It is up to the private sector, and our government, to weigh competing security and privacy interests. Our government cannot sacrifice the ability of companies and citizens to properly secure their data and systems in the name of often vague physical and national security concerns, especially when there are other ways to remedy the concerns of law enforcement.

That said, these security responsibilities cut both ways. Recent data breaches demonstrate that many companies have a long way to go to adequately protect their customers’ data. Companies cannot reasonably cry foul over the negative security impacts of proposed law enforcement data access while continuing to neglect and undermine the security of their own users’ data.

Providers and the law enforcement community should be held to robust security standards that ensure the security of our citizens and their data—we need legal restrictions on how government accesses private data and on how private companies collect and use the same data.

There may not be an easy answer to the “going dark” issue, but it is time for all of us, in government and the private sector, to understand that enhanced data security through properly implemented encryption and data use policies is in everyone’s best interest.

The “extraordinary” access sought by law enforcement cannot exist in a vacuum—it will have far-reaching and significant impacts well beyond the narrow confines of a single investigation. It is time for a serious conversation between law enforcement and the private sector to recognize that their security interests are two sides of the same coin.


Zuckerberg wants messages to auto-expire to make Facebook a ‘living room’


On feed-based “broader social networks, where people can accumulate friends or followers until the services feel more public … it feels more like a town square than a more intimate space like a living room,” Facebook CEO Mark Zuckerberg explained in a blog post today. With messaging, groups and ephemeral stories as the fastest growing social features, Zuckerberg laid out why he’s rethinking Facebook as a private living room, where people can be comfortable being themselves without fear of hackers, government spying or embarrassment from old content — and all without letting encryption allow bad actors to hide their crimes.

Perhaps this will just be more lip service in a time of PR crisis for Facebook. But with the business imperative fueled by social networking’s shift away from permanent feed broadcasting, Facebook can espouse the philosophy of privacy while in reality servicing its shareholders and bottom line. It’s this alignment that actually spurs product change. We saw Facebook’s agility with last year’s realization that a misinformation- and hate-plagued platform wouldn’t survive long-term so it had to triple its security and moderation staff. And in 2017, recognizing the threat of Stories, it implemented them across its apps. Now Facebook might finally see the dollar signs within privacy.

The New York Times’ Mike Isaac recently reported that Facebook planned to unify its Facebook, WhatsApp, and Instagram messaging infrastructure to allow cross-app messaging and end-to-end encryption. And Zuckerberg discussed this and the value of ephemerality on the recent earnings call. But now Zuckerberg has roadmapped a clearer slate of changes and policies to turn Facebook into a living room:

  • Facebook will let users opt in to the ability to send or receive messages across Facebook, WhatsApp and Instagram.

  • Facebook wants to expand that interoperability to SMS on Android.

  • Zuckerberg wants to make ephemerality automatic on messaging threads, so chats disappear by default after a month or a year, with users able to control that or put timers on individual messages (a rough sketch of this idea follows the list).

  • Facebook plans to limit how long it retains metadata on messages once it’s no longer needed for spam or safety protections.

  • Facebook will extend end-to-end encryption across its messaging apps, but use metadata and other non-content signals to weed out criminals using privacy to hide their misdeeds.

  • Facebook won’t store data in countries with a bad track record of privacy abuse, such as Russia, even if that means having to shut down or postpone operations in a country.
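
As a toy illustration of the default-ephemerality idea — not Facebook’s implementation, which hasn’t been described at this level of detail — retention could work as a simple TTL check when a thread is read, with per-message timers overriding the thread default:

```python
import time

MONTH = 30 * 24 * 3600   # hypothetical default retention window, in seconds

def purge_expired(messages, ttl=MONTH, now=None):
    """Drop messages older than their retention window.

    Each message is assumed to be a dict with a `sent_at` Unix timestamp;
    an optional per-message `ttl` overrides the thread-level default.
    """
    now = time.time() if now is None else now
    return [m for m in messages if now - m["sent_at"] < m.get("ttl", ttl)]
```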

You can read the full blog post from Zuckerberg below:

A Privacy-Focused Vision for Social Networking

My focus for the last couple of years has been understanding and addressing the biggest challenges facing Facebook. This means taking positions on important issues concerning the future of the internet. In this note, I’ll outline our vision and principles around building a privacy-focused messaging and social networking platform. There’s a lot to do here, and we’re committed to working openly and consulting with experts across society as we develop this.

Over the last 15 years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square. But people increasingly also want to connect privately in the digital equivalent of the living room. As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.

Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication. There are a number of reasons for this. Many people prefer the intimacy of communicating one-on-one or with just a few friends. People are more cautious of having a permanent record of what they’ve shared. And we all expect to be able to do things like payments privately and securely.

Public social networks will continue to be very important in people’s lives — for connecting with everyone you know, discovering new people, ideas and content, and giving people a voice more broadly. People find these valuable every day, and there are still a lot of useful services to build on top of them. But now, with all the ways people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first.

I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform — because frankly we don’t currently have a strong reputation for building privacy protective services, and we’ve historically focused on tools for more open sharing. But we’ve repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.

I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about.

We plan to build this the way we’ve developed WhatsApp: focus on the most fundamental and private use case — messaging — make it as secure as possible, and then build more ways for people to interact on top of that, including calls, video chats, groups, stories, businesses, payments, commerce, and ultimately a platform for many other kinds of private services.

This privacy-focused platform will be built around several principles:

Private interactions. People should have simple, intimate places where they have clear control over who can communicate with them and confidence that no one else can access what they share.

Encryption. People’s private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services.

Reducing permanence. People should be comfortable being themselves, and should not have to worry about what they share coming back to hurt them later. So we won’t keep messages or stories around for longer than necessary to deliver the service or longer than people want them.

Safety. People should expect that we will do everything we can to keep them safe on our services within the limits of what’s possible in an encrypted service.

Interoperability. People should be able to use any of our apps to reach their friends, and they should be able to communicate across networks easily and securely.

Secure data storage. People should expect that we won’t store sensitive data in countries with weak records on human rights like privacy and freedom of expression in order to protect data from being improperly accessed.

Over the next few years, we plan to rebuild more of our services around these ideas. The decisions we’ll face along the way will mean taking positions on important issues concerning the future of the internet. We understand there are a lot of tradeoffs to get right, and we’re committed to consulting with experts and discussing the best way forward. This will take some time, but we’re not going to develop this major change in our direction behind closed doors. We’re going to do this as openly and collaboratively as we can because many of these issues affect different parts of society.

Private Interactions as a Foundation

For a service to feel private, there must never be any doubt about who you are communicating with. We’ve worked hard to build privacy into all our products, including those for public sharing. But one great property of messaging services is that even as your contacts list grows, your individual threads and groups remain private. As your friends evolve over time, messaging services evolve gracefully and remain intimate.

This is different from broader social networks, where people can accumulate friends or followers until the services feel more public. This is well-suited to many important uses — telling all your friends about something, using your voice on important topics, finding communities of people with similar interests, following creators and media, buying and selling things, organizing fundraisers, growing businesses, or many other things that benefit from having everyone you know in one place. Still, when you see all these experiences together, it feels more like a town square than a more intimate space like a living room.

There is an opportunity to build a platform that focuses on all of the ways people want to interact privately. This sense of privacy and intimacy is not just about technical features — it is designed deeply into the feel of the service overall. In WhatsApp, for example, our team is obsessed with creating an intimate environment in every aspect of the product. Even where we’ve built features that allow for broader sharing, it’s still a less public experience. When the team built groups, they put in a size limit to make sure every interaction felt private. When we shipped stories on WhatsApp, we limited public content because we worried it might erode the feeling of privacy to see lots of public content — even if it didn’t actually change who you’re sharing with.

In a few years, I expect future versions of Messenger and WhatsApp to become the main ways people communicate on the Facebook network. We’re focused on making both of these apps faster, simpler, more private and more secure, including with end-to-end encryption. We then plan to add more ways to interact privately with your friends, groups, and businesses. If this evolution is successful, interacting with your friends and family across the Facebook network will become a fundamentally more private experience.

Encryption and Safety

People expect their private communications to be secure and to only be seen by the people they’ve sent them to — not hackers, criminals, over-reaching governments, or even the people operating the services they’re using.

There is a growing awareness that the more entities that have access to your data, the more vulnerabilities there are for someone to misuse it or for a cyber attack to expose it. There is also a growing concern among some that technology may be centralizing power in the hands of governments and companies like ours. And some people worry that our services could access their messages and use them for advertising or in other ways they don’t expect.

End-to-end encryption is an important tool in developing a privacy-focused social network. Encryption is decentralizing — it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information. This is why encryption is an increasingly important part of our online lives, from banking to healthcare services. It’s also why we built end-to-end encryption into WhatsApp after we acquired it.

In the last year, I’ve spoken with dissidents who’ve told me encryption is the reason they are free, or even alive. Governments often make unlawful demands for data, and while we push back and fight these requests in court, there’s always a risk we’ll lose a case — and if the information isn’t encrypted we’d either have to turn over the data or risk our employees being arrested if we failed to comply. This may seem extreme, but we’ve had a case where one of our employees was actually jailed for not providing access to someone’s private information even though we couldn’t access it since it was encrypted.

At the same time, there are real safety concerns to address before we can implement end-to-end encryption across all of our messaging services. Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion. We have a responsibility to work with law enforcement and to help prevent these wherever we can. We are working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages, and we will continue to invest in this work. But we face an inherent tradeoff because we will never find all of the potential harm we do today when our security systems can see the messages themselves.

Finding the right ways to protect both privacy and safety is something societies have historically grappled with. There are still many open questions here and we’ll consult with safety experts, law enforcement and governments on the best ways to implement safety measures. We’ll also need to work together with other platforms to make sure that as an industry we get this right. The more we can create a common approach, the better.

On balance, I believe working towards implementing end-to-end encryption for all private communications is the right thing to do. Messages and calls are some of the most sensitive private conversations people have, and in a world of increasing cyber security threats and heavy-handed government intervention in many countries, people want us to take the extra step to secure their most private data. That seems right to me, as long as we take the time to build the appropriate safety systems that stop bad actors as much as we possibly can within the limits of an encrypted service. We’ve started working on these safety systems building on the work we’ve done in WhatsApp, and we’ll discuss them with experts through 2019 and beyond before fully implementing end-to-end encryption. As we learn more from those experts, we’ll finalize how to roll out these systems.

Reducing Permanence

We increasingly believe it’s important to keep information around for shorter periods of time. People want to know that what they share won’t come back to hurt them later, and reducing the length of time their information is stored and accessible will help.

One challenge in building social tools is the “permanence problem”. As we build up large collections of messages and photos over time, they can become a liability as well as an asset. For example, many people who have been on Facebook for a long time have photos from when they were younger that could be embarrassing. But people also really love keeping a record of their lives. And if all posts on Facebook and Instagram disappeared, people would lose access to a lot of valuable knowledge and experiences others have shared.

I believe there’s an opportunity to set a new standard for private communication platforms — where content automatically expires or is archived over time. Stories already expire after 24 hours unless you archive them, and that gives people the comfort to share more naturally. This philosophy could be extended to all private content.

For example, messages could be deleted after a month or a year by default. This would reduce the risk of your messages resurfacing and embarrassing you later. Of course you’d have the ability to change the timeframe or turn off auto-deletion for your threads if you wanted. And we could also provide an option for you to set individual messages to expire after a few seconds or minutes if you wanted.

It also makes sense to limit the amount of time we store messaging metadata. We use this data to run our spam and safety systems, but we don’t always need to keep it around for a long time. An important part of the solution is to collect less personal data in the first place, which is the way WhatsApp was built from the outset.

Interoperability

People want to be able to choose which service they use to communicate with people. However, today if you want to message people on Facebook you have to use Messenger, on Instagram you have to use Direct, and on WhatsApp you have to use WhatsApp. We want to give people a choice so they can reach their friends across these networks from whichever app they prefer.

We plan to start by making it possible for you to send messages to your contacts using any of our services, and then to extend that interoperability to SMS too. Of course, this would be opt-in and you will be able to keep your accounts separate if you’d like.

There are privacy and security advantages to interoperability. For example, many people use Messenger on Android to send and receive SMS texts. Those texts can’t be end-to-end encrypted because the SMS protocol is not encrypted. With the ability to message across our services, however, you’d be able to send an encrypted message to someone’s phone number in WhatsApp from Messenger.

This could also improve convenience in many experiences where people use Facebook or Instagram as their social network and WhatsApp as their preferred messaging service. For example, lots of people selling items on Marketplace list their phone number so people can message them about buying it. That’s not ideal, because you’re giving strangers your phone number. With interoperability, you’d be able to use WhatsApp to receive messages sent to your Facebook account without sharing your phone number — and the buyer wouldn’t have to worry about whether you prefer to be messaged on one network or the other.

You can imagine many simple experiences — a person discovers a business on Instagram and easily transitions to their preferred messaging app for secure payments and customer support; another person wants to catch up with a friend and can send them a message that goes to their preferred app without having to think about where that person prefers to be reached; or you simply post a story from your day across both Facebook and Instagram and can get all the replies from your friends in one place.

You can already send and receive SMS texts through Messenger on Android today, and we’d like to extend this further in the future, perhaps including the new telecom RCS standard. However, there are several issues we’ll need to work through before this will be possible. First, Apple doesn’t allow apps to interoperate with SMS on their devices, so we’d only be able to do this on Android. Second, we’d need to make sure interoperability doesn’t compromise the expectation of encryption that people already have using WhatsApp. Finally, it would create safety and spam vulnerabilities in an encrypted system to let people send messages from unknown apps where our safety and security systems couldn’t see the patterns of activity.

These are significant challenges and there are many questions here that require further consultation and discussion. But if we can implement this, we can give people more choice to use their preferred service to securely reach the people they want.

Secure Data Storage

People want to know their data is stored securely in places they trust. Looking at the future of the internet and privacy, I believe one of the most important decisions we’ll make is where we’ll build data centers and store people’s sensitive data.

There’s an important difference between providing a service in a country and storing people’s data there. As we build our infrastructure around the world, we’ve chosen not to build data centers in countries that have a track record of violating human rights like privacy or freedom of expression. If we build data centers and store sensitive data in these countries, rather than just caching non-sensitive data, it could make it easier for those governments to take people’s information.

Upholding this principle may mean that our services will get blocked in some countries, or that we won’t be able to enter others anytime soon. That’s a tradeoff we’re willing to make. We do not believe storing people’s data in some countries is a secure enough foundation to build such important internet infrastructure on.

Of course, the best way to protect the most sensitive data is not to store it at all, which is why WhatsApp doesn’t store any encryption keys and we plan to do the same with our other services going forward.

But storing data in more countries also establishes a precedent that emboldens other governments to seek greater access to their citizens’ data and therefore weakens privacy and security protections for people around the world. I think it’s important for the future of the internet and privacy that our industry continues to hold firm against storing people’s data in places where it won’t be secure.

Next Steps

Over the next year and beyond, there are a lot more details and trade-offs to work through related to each of these principles. A lot of this work is in the early stages, and we are committed to consulting with experts, advocates, industry partners, and governments — including law enforcement and regulators — around the world to get these decisions right.

At the same time, working through these principles is only the first step in building out a privacy-focused social platform. Beyond that, significant thought needs to go into all of the services we build on top of that foundation — from how people do payments and financial transactions, to the role of businesses and advertising, to how we can offer a platform for other private services.

But these initial questions are critical to get right. If we do this well, we can create platforms for private sharing that could be even more important to people than the platforms we’ve already built to help people share and connect more openly.

Doing this means taking positions on some of the most important issues facing the future of the internet. As a society, we have an opportunity to set out where we stand, to decide how we value private communications, and who gets to decide how long and where data should be stored.

I believe we should be working towards a world where people can speak privately and live freely knowing that their information will only be seen by who they want to see it and won’t all stick around forever. If we can help move the world in this direction, I will be proud of the difference we’ve made.


Lenovo Watch X was riddled with security bugs, researcher says


Lenovo’s Watch X was widely panned as “absolutely terrible.” As it turns out, so was its security.

The $50 Watch X was one of Lenovo’s cheapest smartwatches. It’s available only in the China market, so anyone who wants one has to buy it directly from the mainland. Lucky for Erez Yalon, head of security research at Checkmarx, an application security testing company, a friend gave him one. But it didn’t take him long to find several vulnerabilities that allowed him to change users’ passwords, hijack accounts and spoof phone calls.

Because the smartwatch wasn’t using any encryption to send data from the app to the server, Yalon said he was able to see his registered email address and password sent in plain text, as well as data about how he was using the watch, like how many steps he was taking.

“The entire API was unencrypted,” said Yalon in an email to TechCrunch. “All data was transferred in plain-text.”

The API that helps power the watch was easily abused, he found, allowing him to reset anyone’s password simply by knowing a person’s username. That could’ve given him access to anyone’s account, he said.
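
To make that flaw class concrete, here is a hypothetical sketch — the endpoint names and fields are invented, not Lenovo’s actual API — contrasting a reset endpoint keyed only on a username with one that demands a single-use token delivered out of band:

```python
from flask import Flask, request, abort

app = Flask(__name__)
users = {"alice": {"password": "hunter2", "reset_token": None}}

# BROKEN: the pattern Yalon describes -- knowing a username is enough.
@app.route("/reset", methods=["POST"])
def reset_broken():
    user = users.get(request.form["username"])
    if user is None:
        abort(404)
    user["password"] = request.form["new_password"]
    return "password changed"

# SAFER: require a single-use token delivered out of band (e.g. by email).
@app.route("/reset-with-token", methods=["POST"])
def reset_with_token():
    user = users.get(request.form["username"])
    token = request.form.get("token")
    if user is None or token is None or token != user["reset_token"]:
        abort(403)
    user["password"] = request.form["new_password"]
    user["reset_token"] = None   # token cannot be replayed
    return "password changed"
```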

Not only that, he found that the watch was sharing his precise geolocation with a server in China. Given the watch’s exclusivity to China, that might not be a red flag to users there. But Yalon said the watch had “already pinpointed my location” before he had even registered his account.

Yalon’s research wasn’t just limited to the leaky API. He found that the Bluetooth-enabled smartwatch could also be manipulated from nearby, by sending crafted Bluetooth requests. Using a small script, he demonstrated how easy it was to spoof a phone call on the watch.

Using a similar malicious Bluetooth command, he could also set the alarm to go off — again and again. “The function allows adding multiple alarms, as often as every minute,” he said.

Lenovo didn’t have much to say about the vulnerabilities, besides confirming their existence.

“The Watch X was designed for the China market and is only available from Lenovo to limited sales channels in China,” said spokesperson Andrew Barron. “Our [security team] team has been working with the [original device manufacturer] that makes the watch to address the vulnerabilities identified by a researcher and all fixes are due to be completed this week.”

Yalon said that encrypting the traffic between the watch, the Android app and its web server would prevent snooping and help reduce manipulation.

“Fixing the API permissions eliminates the ability of malicious users to send commands to the watch, spoof calls, and set alarms,” he said.


Google makes it easier for cheap phones and smart devices to encrypt your data


Encryption is an important part of the whole securing-your-data package, but it’s easy to underestimate the amount of complexity it adds to any service or device. One part of that is the amount of processing encryption takes — an amount that could be impractical on small or low-end devices. Google wants to change that with a highly efficient new method called Adiantum.

Here’s the problem. While encryption is in a way just transforming one block of data reversibly into another, that process is actually pretty complicated. Math needs to be done, data read and written and reread and rewritten and confirmed and hashed.

For a text message that’s not so hard. But if you have to do the same thing as you store or retrieve megabyte after megabyte of data, for instance with images or video, that extra computation adds up quick.

Lots of modern smartphones and other gadgets are equipped with a special chip that performs some of the most common encryption algorithms and processes (namely AES), just like we have GPUs to handle graphics calculations in games and such.

But what about older phones, or cheaper ones, or tiny smart home gadgets that don’t have room for that kind of thing on their boards? Just like they can’t run the latest games, they might not be able to efficiently run the latest cryptographic processes. They can still encrypt things, of course, but it might take too long for certain apps to work, or drain the battery.

Google, clearly interested in keeping cheap phones competitive, is tackling this problem by creating a special encryption method just for low-power phones. They call it Adiantum, and it will be optionally part of Android distributions going forward.

The technical details are all here, but the gist is this. Instead of using AES, it relies on a cipher called ChaCha. This cipher method is highly optimized for basic binary operations, which any processor can execute quickly, though of course it will be outstripped by specialized hardware and drivers. It’s well documented and already in use in lots of places — this isn’t some no-name bargain-bin code. As they show, it performs way better on earlier chipsets like the Cortex-A7.

The Adiantum process doesn’t increase or decrease the size of the payload (for instance by padding it or by appending some header or footer data), meaning the same number of bytes come in as go out. That’s nice when you’re a file system and don’t want to have to set aside too many special blocks for encryption metadata and the like.
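
For a feel of the building block involved, here’s a minimal example of ChaCha20 via Python’s `cryptography` package. Note this is plain ChaCha20, not Adiantum itself — Adiantum composes a ChaCha variant with a single AES invocation and a hash into a tweakable, length-preserving construction suited to disk sectors — but it shows the length-preserving, hardware-independent property in question:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

key = os.urandom(32)     # ChaCha20 uses a 256-bit key
nonce = os.urandom(16)   # this API takes a 16-byte nonce/counter block

encryptor = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
plaintext = b"disk sectors in, same-size ciphertext out"
ciphertext = encryptor.update(plaintext)

# Stream cipher: no padding, no header -- output length equals input length.
assert len(ciphertext) == len(plaintext)
```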

Naturally new encryption techniques are viewed with some skepticism by security professionals, for whom the greatest pleasure in life is to prove one is compromised or unreliable. Adiantum’s engineers say they have “high confidence in its security,” with the assumption (currently reasonable) that its component “primitives” ChaCha and AES are themselves secure. We’ll soon see!

In the meantime don’t expect any instant gains, but future low-power devices may offer better security without having to use more expensive components — you won’t have to do a thing, either.

Oh, and in case you were wondering:

Adiantum is named after the genus of the maidenhair fern, which in the Victorian language of flowers (floriography) represents sincerity and discretion.


Security researchers have busted the encryption in several popular Crucial and Samsung SSDs


Researchers at Radboud University have found critical security flaws in several popular Crucial and Samsung solid state drives (SSDs), which they say can be easily exploited to recover encrypted data without knowing the password.

The researchers, who detailed their findings in a new paper out Monday, reverse engineered the firmware of several drives to find a “pattern of critical issues” across the device makers.

In the case of one drive, the master password used to decrypt the drive’s data was just an empty string, and the protection could be easily exploited by flipping a single bit in the drive’s memory. Another drive could be unlocked with “any password” by crippling the drive’s password validation checks.
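
The specifics vary by drive, but the failure class is easy to sketch. The following is a hypothetical illustration in Python — not actual Crucial or Samsung firmware code — of the difference between a key that merely sits behind a password check (which can be freed by patching out the check) and a key wrapped under a password-derived key (which cannot):

```python
import hashlib
import hmac

# BROKEN pattern: the disk key exists independently of the password; the
# password check is just a gate. Flip that branch -- a single bit in
# firmware -- and the key is released for *any* password.
def unlock_broken(stored_hash, entered, disk_key):
    if hmac.compare_digest(hashlib.sha256(entered).digest(), stored_hash):
        return disk_key
    return None

# SOUND pattern: the disk key is stored wrapped under a key derived *from*
# the password, so without the right password there is nothing to release.
# (XOR here is a stand-in for a real key-wrap such as AES key wrapping.)
def unlock_sound(wrapped_key, entered, salt):
    kek = hashlib.pbkdf2_hmac("sha256", entered, salt, 1_000_000)
    return bytes(a ^ b for a, b in zip(wrapped_key, kek))
```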

That wouldn’t be much of a problem if an affected drive also used software encryption to secure its data. But the researchers found that in the case of Windows computers, often the default policy for BitLocker’s software-based drive encryption is to trust the drive — and therefore rely entirely on a device’s hardware encryption to protect the data. Yet, as the researchers found, if the hardware encryption is buggy, BitLocker isn’t doing much to prevent data theft.

In other words, users “should not rely solely on hardware encryption as offered by SSDs for confidentiality,” the researchers said.

Alan Woodward, a professor at the University of Surrey, said that the greatest risk to users is the drive’s security “failing silently.”

“You might think you’ve done the right thing enabling BitLocker but then a third-party fault undermines your security, but you never know and never would know,” he said.

Matthew Green, a cryptography professor at Johns Hopkins, described the BitLocker flaw in a tweet as “like jumping out of a plane with an umbrella instead of a parachute.”

The researchers said that their findings are not yet finalized — pending a peer review. But the research was made public after disclosing the bugs to the drive makers in April.

Crucial’s MX100, MX200 and MX300 drives, Samsung’s T3 and T5 USB external disks, and Samsung’s 840 EVO and 850 EVO internal SSDs are known to be affected, but the researchers warned that many other drives may also be at risk.

The researchers criticized the device makers’ proprietary and closed-source cryptography that they said — and proved — is “often shown to be much weaker in practice” than their open-source and auditable cryptographic libraries. “Manufacturers that take security seriously should publish their crypto schemes and corresponding code so that security claims can be independently verified,” they wrote.

The researchers recommend using software-based encryption, like the open-source software VeraCrypt.

In an advisory, Samsung also recommended that users install encryption software to prevent any “potential breach of self-encrypting SSDs.” Crucial’s owner Micron is said to have a fix on the way, according to an advisory from the Netherlands’ National Cyber Security Center, though the advisory did not say when.

Micron did not immediately respond to a request for comment.


Facebook’s ex-CSO, Alex Stamos, defends its decision to inject ads in WhatsApp


Alex Stamos, Facebook’s former chief security officer, who left the company this summer to take up a role in academia, has weighed in on what’s sometimes couched as a debate about how to monetize (and thus sustain) commercial end-to-end encrypted messaging platforms, so that the privacy benefits they offer can be spread as widely as possible.

Stamos made the comments via Twitter, where he said he was indirectly responding to the fallout from a Forbes interview with WhatsApp co-founder Brian Acton — in which Acton hit out at his former employer for being greedy in its approach to generating revenue off of the famously anti-ads messaging platform.

Both WhatsApp founders’ exits from Facebook have been blamed on disagreements over monetization. (Jan Koum left some months after Acton.)

In the interview, Acton said he suggested Facebook management apply a simple business model atop WhatsApp, such as metered messaging for all users after a set number of free messages. But management pushed back — with Facebook COO Sheryl Sandberg telling him they needed a monetization method that generates greater revenue “scale”.

And while Stamos has avoided making critical remarks about Acton (unlike some current Facebook staffers), he clearly wants to lend his weight to the notion that some kind of trade-off is necessary for end-to-end encryption to be commercially viable — and thus for the greater good of messaging privacy to prevail — and therefore his tacit support to Facebook and its approach to making money off of a robustly encrypted platform.

Stamos’ own departure from the Facebook mothership was hardly on terms as acrimonious as Acton’s, though he has had his own disagreements with the leadership team — as set out in a memo he sent earlier this year that was obtained by BuzzFeed. So his support for Facebook combining e2e and ads perhaps counts for something, though it isn’t really surprising given the seat he occupied at the company for several years, and his always fierce defence of WhatsApp encryption.

(Another characteristic concern that also surfaces in Stamos’ Twitter thread is the need to keep the technology legal, in the face of government attempts to backdoor encryption, which he says will require “accepting the inevitable downsides of giving people unfettered communications”.)

I don’t want to weigh into the personal side of the WhatsApp vs Facebook fight, as there are people I respect on both sides, but I do want to use this as an opportunity to talk about the future of end-to-end encryption. (1/13)

— Alex Stamos (@alexstamos) September 26, 2018

This summer Facebook confirmed that, from next year, ads will be injected into WhatsApp statuses (aka the app’s Stories clone). So it is indeed bringing ads to the famously anti-ads messaging platform.

For several years the company has also been moving towards positioning WhatsApp as a business messaging platform to connect companies with potential customers — and it says it plans to meter those messages, also from next year.

So there are two strands to its revenue generating playbook atop WhatsApp’s e2e encrypted messaging platform. Both with knock-on impacts on privacy, given Facebook targets ads and marketing content by profiling users by harvesting their personal data.

So while WhatsApp’s e2e encryption means Facebook literally cannot read WhatsApp users’ messages, it is ‘circumventing’ the technology (for ad-targeting purposes) by linking accounts across the different services it owns — using people’s digital identities across its product portfolio (and beyond) as a sort of ‘trojan horse’ to negate the messaging privacy it affords them on WhatsApp.

Facebook is using different technical methods (including the very low-tech method of phone number matching) to link WhatsApp user and Facebook accounts. Once it’s been able to match a Facebook user to a WhatsApp account it can then connect what’s very likely to be a well fleshed out Facebook profile with a WhatsApp account that nonetheless contains messages it can’t read. So it’s both respecting and eroding user privacy.

This approach means Facebook can carry out its ad targeting activities across both messaging platforms (as it will from next year). And do so without having to literally read messages being sent by WhatsApp users.

As trade-offs go, it’s clearly a big one — and one that’s got Facebook into regulatory trouble in Europe.

It is also, at least in Stamos’ view, a trade off that’s worth it for the ‘greater good’ of message content remaining strongly encrypted and therefore unreadable. Even if Facebook now knows pretty much everything about the sender, and can access any unencrypted messages they sent using its other social products.

In his Twitter thread Stamos argues that “if we want that right to be extended to people around the world, that means that E2E encryption needs to be deployed inside of multi-billion user platforms”, which he says means: “We need to find a sustainable business model for professionally-run E2E encrypted communication platforms.”

On the sustainable business model front he argues that two models “currently fit the bill” — either Apple’s iMessage or Facebook-owned WhatsApp. Though he doesn’t go into any detail on why he believes only those two are sustainable.

He does say he’s discounting the Acton-backed alternative, Signal, which now operates via a not-for-profit (the Signal Foundation) — suggesting that rival messaging app is “unlikely to hit 1B users”.

In passing he also throws it out there that Signal is “subsidized, indirectly, by FB ads” — i.e. because Facebook pays a licensing fee for use of the underlying Signal Protocol that powers WhatsApp’s e2e encryption. (So his slightly shade-throwing subtext is that privacy purists are still benefiting from a Facebook sugar daddy.)

Then he gets to the meat of his argument in defence of Facebook-owned (and monetized) WhatsApp — pointing out that Apple’s sustainable business model does not reach every mobile user, given its hardware is priced at a premium. Whereas WhatsApp running on a cheap Android handset ($50, or perhaps even $30 in future) can.

Other encrypted messaging apps can also of course run on Android but presumably Stamos would argue they’re not professionally run.

“I think it is easy to underestimate how radical WhatsApp’s decision to deploy E2E was,” he writes. “Acton and Koum, with Zuck’s blessing, jumped off a bridge with the goal of building a monetization parachute on the way down. FB has a lot of money, so it was a very tall bridge, but it is foolish to expect that FB shareholders are going to subsidize a free text/voice/video global communications network forever. Eventually, WhatsApp is going to need to generate revenue.

“This could come from directly charging for the service, it could come from advertising, it could come from a WeChat-like services play. The first is very hard across countries, the latter two are complicated by E2E.”

“I can’t speak to the various options that have been floated around, or the arguments between WA and FB, but those of us who care about privacy shouldn’t see WhatsApp monetization as something evil,” he adds. “In fact, we should want WA to demonstrate that E2E and revenue are compatible. That’s the only way E2E will become a sustainable feature of massive, non-niche technology platforms.”

Stamos is certainly right that Apple’s iMessage cannot reach every mobile user, given the premium cost of Apple hardware.

Though he elides the important role that secondhand Apple devices play in reducing the barrier to entry to Apple’s pro-privacy technology — a role Apple actively encourages by supporting older devices, and by expanding its services business so that supporting older versions of iOS (and thus secondhand iPhones) remains commercially sustainable.

Stamos’ suggestion that robust encryption is only possible via multi-billion user platforms essentially boils down to a usability argument: mainstream app users will simply not seek encryption out unless it’s plated up for them so seamlessly that they don’t even notice it’s there.

The follow on conclusion is then that only a well-resourced giant like Facebook has the resources to maintain and serve this different tech up to the masses.

There’s certainly substance in that point. But the wider question is whether the privacy trade-offs that Facebook’s monetization of WhatsApp entails — linking Facebook and WhatsApp accounts, and thereby looping in the various less than transparent methods Facebook uses to gather intelligence on web users generally — substantially erode the value of the e2e encryption now bundled with Facebook’s ad-targeting surveillance, turning it into a selling aid for otherwise privacy-eroding practices.

Yes, WhatsApp users’ messages will remain private, thanks to Facebook funding the necessary e2e encryption. But the price users are paying is very likely still their personal privacy.

And at that point the argument really becomes about how much profit a commercial entity should be able to extract off of a product that’s being marketed as securely encrypted and thus ‘pro-privacy’? How much revenue “scale” is reasonable or unreasonable in that scenario?

Other business models are possible, which was Acton’s point. But likely less profitable. And therein lies the rub where Facebook is concerned.

How much money should any company be required to leave on the table, as Acton did when he left Facebook without the rest of his unvested shares, in order to be able to monetize a technology that’s bound up so tightly with notions of privacy?

Acton wanted Facebook to agree to make as much money as it could without users having to pay it with their privacy. But Facebook’s management team said no. That’s why he’s calling them greedy.

Stamos doesn’t engage with that more nuanced point. He just writes: “It is foolish to expect that FB shareholders are going to subsidize a free text/voice/video global communications network forever. Eventually, WhatsApp is going to need to generate revenue” — thereby collapsing the revenue argument into an all or nothing binary without explaining why it has to be that way.


FBI reportedly overestimated inaccessible encrypted phones by thousands


The FBI seems to have been caught fibbing again on the topic of encrypted phones. FBI director Christopher Wray estimated in December that it had almost 7,800 phones from 2017 alone that investigators were unable to access. The real number is likely less than a quarter of that, The Washington Post reports.

Internal records cited by sources put the actual number of encrypted phones at perhaps 1,200, though possibly as many as 2,000, and the FBI told the paper in a statement that its “initial assessment is that programming errors resulted in significant over-counting of mobile devices reported.” Supposedly, having three databases tracking the phones led to devices being counted multiple times.
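
The reported mechanism — three databases each tracking some of the same phones — is the classic duplicate-counting error. A toy sketch in Python, with made-up serial numbers, shows how tallying per database inflates the figure while deduplicating by serial number does not:

```python
# Three hypothetical case databases that each track some of the same phones.
db_a = {"SN-001", "SN-002", "SN-003"}
db_b = {"SN-002", "SN-003", "SN-004"}
db_c = {"SN-003", "SN-004", "SN-005"}

per_database_total = len(db_a) + len(db_b) + len(db_c)  # 9: counts duplicates
unique_devices = len(db_a | db_b | db_c)                # 5: deduplicated

print(per_database_total, unique_devices)
```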

Such a mistake would be so elementary that it’s hard to conceive of how it would be possible. These aren’t court notes, memos or unimportant random pieces of evidence, they’re physical devices with serial numbers and names attached. The idea that no one thought to check for duplicates before giving a number to the director for testimony in Congress suggests either conspiracy or gross incompetence.

The latter seems more likely after a report by the Office of the Inspector General that found the FBI had failed to utilize its own resources to access locked phones, instead suing Apple and then hastily withdrawing the case when its basis (a locked phone from a terror attack) was removed. It seems to have chosen to downplay or ignore its own capabilities in order to pursue the narrative that widespread encryption is dangerous without a backdoor for law enforcement.

An audit is underway at the Bureau to figure out just how many phones it actually has that it can’t access, and hopefully how this all happened.

It is unmistakably among the FBI’s goals to emphasize the problem of devices being fully encrypted and inaccessible to authorities, a trend known as “going dark.” That much it has said publicly, and it is a serious problem for law enforcement. But it seems equally unmistakable that the Bureau is happy to be sloppy, deceptive or both in its advancement of a tailored narrative.


Twitter has an unlaunched ‘Secret’ encrypted messages feature


Buried inside Twitter’s Android app is a “Secret conversation” option that if launched would allow users to send encrypted direct messages. The feature could make Twitter a better home for sensitive communications that often end up on encrypted messaging apps like Signal, Telegram or WhatsApp.

The encrypted DMs option was first spotted inside the Twitter for Android application package (APK) by Jane Manchun Wong. APKs often contain code for unlaunched features that companies are quietly testing or will soon make available. A Twitter spokesperson declined to comment on the record. It’s unclear how long it might be before Twitter officially launches the feature, but at least we know it’s been built.

The appearance of encrypted DMs comes 18 months after whistleblower Edward Snowden asked Twitter CEO Jack Dorsey for the feature, which Dorsey said was “reasonable and something we’ll think about.”

Twitter has gone from “thinking about” the feature to prototyping it. The screenshot above shows the options to learn more about encrypted messaging, start a secret conversation and view both your own and your conversation partner’s encryption keys to verify a secure connection.
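
Twitter hasn’t documented how its key verification would work, but the standard pattern behind such screens is comparing short fingerprints of each party’s public key over a trusted channel. A sketch of that general idea, purely illustrative and not Twitter’s scheme:

```python
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Hex digest grouped into short blocks for side-by-side comparison."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

# Each party compares fingerprints in person or over another trusted
# channel; a match means no man-in-the-middle has swapped the keys.
print(fingerprint(b"\x04" + b"\x01" * 64))
```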

reasonable and something we’ll think about

— jack (@jack) December 14, 2016

Twitter’s DMs have become a powerful way for people to contact strangers without needing their phone number or email address. Whether it’s to send a reporter a scoop, warn someone of a problem, discuss business or just “slide into their DMs” to flirt, Twitter has established one of the most open messaging mediums. But without encryption, those messages are subject to snooping by governments, hackers or Twitter itself.

Twitter has long positioned itself as a facilitator of political discourse and even uprisings. But anyone seriously worried about the consequences of political dissidence, whistleblowing or leaking should be using an app like Signal that offers strong end-to-end encryption. Launching encrypted DMs could win back some of those change-makers and protect those still on Twitter.
