encryption

Security researchers have busted the encryption in several popular Crucial and Samsung SSDs

Posted by | cryptography, disk encryption, encryption, Gadgets, hardware, open source software, Samsung Electronics, Security, solid state drive | No Comments

Researchers at Radboud University have found critical security flaws in several popular Crucial and Samsung solid state drives (SSDs), which they say can be easily exploited to recover encrypted data without knowing the password.

The researchers, who detailed their findings in a new paper out Monday, reverse engineered the firmware of several drives to find a “pattern of critical issues” across the device makers.

In the case of one drive, the master password used to decrypt the drive’s data was just an empty string and could be easily exploited by flipping a single bit in the drive’s memory. Another drive could be unlocked with “any password” by crippling the drive’s password validation checks.
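The underlying design error in both cases is that the disk-encryption key is not derived from the user’s password, so the password check is the only thing standing between an attacker and the data. A minimal Python sketch of the two flaw classes (a hypothetical, simplified model, not actual drive firmware):

```python
# Simplified, hypothetical model of the two flaw classes described above,
# not actual drive firmware. The key point: the data-encryption key (DEK)
# is stored independently of the password, so the check is the only gate.

MASTER_PASSWORD = ""  # flaw 1: master password shipped as an empty string

def unlock(drive, password, validation_enabled=True):
    """Return the DEK if the password check passes, else None."""
    if not validation_enabled:
        # flaw 2: if validation is crippled (e.g. by patching firmware),
        # any password "succeeds"
        return drive["dek"]
    if password == drive["user_password"] or password == MASTER_PASSWORD:
        return drive["dek"]
    return None

drive = {"user_password": "hunter2", "dek": b"\x13\x37" * 16}

assert unlock(drive, "wrong") is None                # normal rejection
assert unlock(drive, "") == drive["dek"]             # empty master password
assert unlock(drive, "x", validation_enabled=False) == drive["dek"]
```

A password-derived key (as software encryption like VeraCrypt uses) would make both bypasses useless, since the DEK would not exist on the drive to hand over.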

That wouldn’t be much of a problem if an affected drive also used software encryption to secure its data. But the researchers found that in the case of Windows computers, often the default policy for BitLocker’s software-based drive encryption is to trust the drive — and therefore rely entirely on a device’s hardware encryption to protect the data. Yet, as the researchers found, if the hardware encryption is buggy, BitLocker isn’t doing much to prevent data theft.

In other words, users “should not rely solely on hardware encryption as offered by SSDs for confidentiality,” the researchers said.

Alan Woodward, a professor at the University of Surrey, said that the greatest risk to users is the drive’s security “failing silently.”

“You might think you’ve done the right thing enabling BitLocker but then a third-party fault undermines your security, but you never know and never would know,” he said.

Matthew Green, a cryptography professor at Johns Hopkins, described the BitLocker flaw in a tweet as “like jumping out of a plane with an umbrella instead of a parachute.”

The researchers said their findings are not yet finalized, pending peer review, but the research was made public after they disclosed the bugs to the drive makers in April.

Crucial’s MX100, MX200 and MX300 drives, Samsung’s T3 and T5 external USB drives, and Samsung’s 840 EVO and 850 EVO internal SSDs are known to be affected, but the researchers warned that many other drives may also be at risk.

The researchers criticized the device makers’ proprietary, closed-source cryptography, which they said — and proved — is “often shown to be much weaker in practice” than open-source, auditable cryptographic implementations. “Manufacturers that take security seriously should publish their crypto schemes and corresponding code so that security claims can be independently verified,” they wrote.

The researchers recommend using software-based encryption, like the open-source software VeraCrypt.

In an advisory, Samsung also recommended that users install encryption software to prevent any “potential breach of self-encrypting SSDs.” Crucial’s owner Micron is said to have a fix on the way, according to an advisory from the Netherlands’ National Cyber Security Center, which did not say when.

Micron did not immediately respond to a request for comment.

Powered by WPeMatico

Facebook’s ex-CSO, Alex Stamos, defends its decision to inject ads in WhatsApp

Posted by | Advertising Tech, Alex Stamos, Android, Apple, Apps, Brian Acton, e2e encryption, encryption, Facebook, Instant Messaging, Jan Koum, privacy, Sheryl Sandberg, signal foundation, Signal Protocol, Social, social media, WhatsApp | No Comments

Alex Stamos, Facebook’s former chief security officer, who left the company this summer to take up a role in academia, has made a contribution to what’s sometimes couched as a debate about how to monetize (and thus sustain) commercial end-to-end encrypted messaging platforms, in order that the privacy benefits they otherwise offer can be spread as widely as possible.

Stamos made the comments via Twitter, where he said he was indirectly responding to the fallout from a Forbes interview with WhatsApp co-founder Brian Acton — in which Acton hit out at his former employer for being greedy in its approach to generating revenue off of the famously anti-ads messaging platform.

Both WhatsApp founders’ exits from Facebook have been blamed on disagreements over monetization. (Jan Koum left some months after Acton.)

In the interview, Acton said he suggested Facebook management apply a simple business model atop WhatsApp, such as metered messaging for all users after a set number of free messages. But management pushed back — with Facebook COO Sheryl Sandberg telling him they needed a monetization method that generates greater revenue “scale”.

And while Stamos has avoided making critical remarks about Acton (unlike some current Facebook staffers), he clearly wants to lend his weight to the notion that some kind of trade-off is necessary for end-to-end encryption to be commercially viable, and thus for the greater good of messaging privacy to prevail. In doing so he lends tacit support to Facebook and its approach to making money off of a robustly encrypted platform.

Stamos’ own departure from the fb mothership was hardly under such acrimonious terms as Acton’s, though he has had his own disagreements with the leadership team — as set out in a memo he sent earlier this year that was obtained by BuzzFeed. So his support for Facebook combining e2e and ads perhaps counts for something, though it isn’t really surprising given the seat he occupied at the company for several years, and his always fierce defence of WhatsApp encryption.

(Another characteristic concern that also surfaces in Stamos’ Twitter thread is the need to keep the technology legal, in the face of government attempts to backdoor encryption, which he says will require “accepting the inevitable downsides of giving people unfettered communications”.)

I don’t want to weigh into the personal side of the WhatsApp vs Facebook fight, as there are people I respect on both sides, but I do want to use this as an opportunity to talk about the future of end-to-end encryption. (1/13)

— Alex Stamos (@alexstamos) September 26, 2018

This summer Facebook confirmed that, from next year, ads will be injected into WhatsApp statuses (aka the app’s Stories clone). So it is indeed bringing ads to the famously anti-ads messaging platform.

For several years the company has also been moving towards positioning WhatsApp as a business messaging platform to connect companies with potential customers — and it says it plans to meter those messages, also from next year.

So there are two strands to its revenue-generating playbook atop WhatsApp’s e2e encrypted messaging platform, both with knock-on impacts on privacy, given that Facebook targets ads and marketing content by profiling users with harvested personal data.

This means that while WhatsApp’s e2e encryption ensures Facebook literally cannot read WhatsApp users’ messages, it is ‘circumventing’ the technology (for ad-targeting purposes) by linking accounts across the different services it owns — using people’s digital identities across its product portfolio (and beyond) as a sort of ‘trojan horse’ to negate the messaging privacy it affords them on WhatsApp.

Facebook is using different technical methods (including the very low-tech method of phone number matching) to link WhatsApp user and Facebook accounts. Once it’s been able to match a Facebook user to a WhatsApp account it can then connect what’s very likely to be a well fleshed out Facebook profile with a WhatsApp account that nonetheless contains messages it can’t read. So it’s both respecting and eroding user privacy.
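That low-tech matching step is easy to picture in code. A hypothetical sketch (the account names and the normalization scheme are invented for illustration, not Facebook’s actual pipeline):

```python
import re

def normalize(number):
    """Reduce a phone number to digits only. A hypothetical normalization;
    a real system would canonicalize to full E.164 form."""
    return re.sub(r"\D", "", number)

# Made-up accounts keyed by the phone number each service holds.
facebook_accounts = {"+1 (555) 010-2000": "fb_user_84"}
whatsapp_accounts = {"15550102000": "wa_user_17"}

# Index normalized numbers -> Facebook account, then link any WhatsApp
# account whose normalized number matches.
fb_index = {normalize(n): uid for n, uid in facebook_accounts.items()}
links = {wa_uid: fb_index[normalize(n)]
         for n, wa_uid in whatsapp_accounts.items()
         if normalize(n) in fb_index}

print(links)  # {'wa_user_17': 'fb_user_84'}
```

Once that join exists, everything known about `fb_user_84` can inform what is shown to `wa_user_17`, without a single message being decrypted.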

This approach means Facebook can carry out its ad targeting activities across both messaging platforms (as it will from next year). And do so without having to literally read messages being sent by WhatsApp users.

As trade-offs go, it’s clearly a big one — and one that’s got Facebook into regulatory trouble in Europe.

It is also, at least in Stamos’ view, a trade-off that’s worth it for the ‘greater good’ of message content remaining strongly encrypted and therefore unreadable. Even if Facebook now knows pretty much everything about the sender, and can access any unencrypted messages they sent using its other social products.

In his Twitter thread Stamos argues that “if we want that right to be extended to people around the world, that means that E2E encryption needs to be deployed inside of multi-billion user platforms”, which he says means: “We need to find a sustainable business model for professionally-run E2E encrypted communication platforms.”

On the sustainable business model front he argues that two models “currently fit the bill” — either Apple’s iMessage or Facebook-owned WhatsApp. Though he doesn’t go into any detail on why he believes only those two are sustainable.

He does say he’s discounting the Acton-backed alternative, Signal, which now operates via a not-for-profit (the Signal Foundation) — suggesting the rival messaging app is “unlikely to hit 1B users”.

In passing he also throws it out there that Signal is “subsidized, indirectly, by FB ads” — i.e. because Facebook pays a licensing fee for use of the underlying Signal Protocol used to power WhatsApp’s e2e encryption. (So his slightly shade-throwing subtext is that privacy purists are still benefiting from a Facebook sugardaddy.)

Then he gets to the meat of his argument in defence of Facebook-owned (and monetized) WhatsApp — pointing out that Apple’s sustainable business model does not reach every mobile user, given its hardware is priced at a premium. Whereas WhatsApp running on a cheap Android handset ($50, or perhaps even $30 in future) can.

Other encrypted messaging apps can also of course run on Android but presumably Stamos would argue they’re not professionally run.

“I think it is easy to underestimate how radical WhatsApp’s decision to deploy E2E was,” he writes. “Acton and Koum, with Zuck’s blessing, jumped off a bridge with the goal of building a monetization parachute on the way down. FB has a lot of money, so it was a very tall bridge, but it is foolish to expect that FB shareholders are going to subsidize a free text/voice/video global communications network forever. Eventually, WhatsApp is going to need to generate revenue.

“This could come from directly charging for the service, it could come from advertising, it could come from a WeChat-like services play. The first is very hard across countries, the latter two are complicated by E2E.”

“I can’t speak to the various options that have been floated around, or the arguments between WA and FB, but those of us who care about privacy shouldn’t see WhatsApp monetization as something evil,” he adds. “In fact, we should want WA to demonstrate that E2E and revenue are compatible. That’s the only way E2E will become a sustainable feature of massive, non-niche technology platforms.”

Stamos is certainly right that Apple’s iMessage cannot reach every mobile user, given the premium cost of Apple hardware.

Though he elides the important role that second-hand Apple devices play in reducing the barrier to entry to Apple’s pro-privacy technology — a role Apple is actively encouraging via support for older devices, and via its expanding services business, which makes supporting older versions of iOS (and thus second-hand iPhones) commercially sustainable too.

Robust encryption only being possible via multi-billion-user platforms essentially boils down to a usability argument by Stamos — the suggestion being that mainstream app users will simply not seek encryption out unless it’s plated up for them in a way they don’t even notice is there.

The follow-on conclusion is that only a giant like Facebook has the resources to maintain this tech and serve it up to the masses.

There’s certainly substance in that point. But the wider question is whether the privacy trade-offs entailed by Facebook’s monetization of WhatsApp (linking Facebook and WhatsApp accounts, and thus looping in the various less-than-transparent data-harvesting methods it uses to gather intelligence on web users generally) substantially erode the value of the e2e encryption now being bundled with Facebook’s ad-targeting people surveillance — and used as a selling aid for otherwise privacy-eroding practices.

Yes WhatsApp users’ messages will remain private, thanks to Facebook funding the necessary e2e encryption. But the price users are having to pay is very likely still their personal privacy.

And at that point the argument really becomes one about how much profit a commercial entity should be able to extract from a product that’s being marketed as securely encrypted and thus ‘pro-privacy’. How much revenue “scale” is reasonable or unreasonable in that scenario?

Other business models are possible, which was Acton’s point. But likely less profitable. And therein lies the rub where Facebook is concerned.

How much money should any company be required to leave on the table, as Acton did when he left Facebook without the rest of his unvested shares, in order to be able to monetize a technology that’s bound up so tightly with notions of privacy?

Acton wanted Facebook to agree to make as much money as it could without users having to pay it with their privacy. But Facebook’s management team said no. That’s why he’s calling them greedy.

Stamos doesn’t engage with that more nuanced point. He just writes: “It is foolish to expect that FB shareholders are going to subsidize a free text/voice/video global communications network forever. Eventually, WhatsApp is going to need to generate revenue” — thereby collapsing the revenue argument into an all-or-nothing binary without explaining why it has to be that way.


FBI reportedly overestimated inaccessible encrypted phones by thousands

Posted by | encryption, FBI, Gadgets, Government, Mobile, privacy, Security | No Comments

The FBI seems to have been caught fibbing again on the topic of encrypted phones. FBI director Christopher Wray estimated in December that it had almost 7,800 phones from 2017 alone that investigators were unable to access. The real number is likely less than a quarter of that, The Washington Post reports.

Internal records cited by sources put the actual number of encrypted phones at perhaps 1,200, though possibly as many as 2,000, and the FBI told the paper in a statement that its “initial assessment is that programming errors resulted in significant over-counting of mobile devices reported.” Apparently, having three databases tracking the phones led to devices being counted multiple times.
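The over-counting mechanism is simple to illustrate: summing per-database totals counts a device once per database it appears in, while a union over device serial numbers counts it once. A toy sketch with made-up serial numbers:

```python
# Three overlapping databases tracking the same physical devices.
# Serial numbers are invented for illustration.
db_a = {"SN001", "SN002", "SN003"}
db_b = {"SN002", "SN003", "SN004"}
db_c = {"SN003", "SN004", "SN005"}

naive_count = len(db_a) + len(db_b) + len(db_c)  # counts duplicates
actual_count = len(db_a | db_b | db_c)           # set union deduplicates

print(naive_count, actual_count)  # 9 5
```

Deduplicating on a stable key like a serial number is the elementary check the next paragraph argues should never have been skipped.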

Such a mistake would be so elementary that it’s hard to conceive of how it would be possible. These aren’t court notes, memos or unimportant random pieces of evidence, they’re physical devices with serial numbers and names attached. The idea that no one thought to check for duplicates before giving a number to the director for testimony in Congress suggests either conspiracy or gross incompetence.

The latter seems more likely after a report by the Office of the Inspector General that found the FBI had failed to utilize its own resources to access locked phones, instead suing Apple and then hastily withdrawing the case when its basis (a locked phone from a terror attack) was removed. It seems to have chosen to downplay or ignore its own capabilities in order to pursue the narrative that widespread encryption is dangerous without a backdoor for law enforcement.

An audit is underway at the Bureau to figure out just how many phones it actually has that it can’t access, and hopefully how this all happened.

It is unmistakably among the FBI’s goals to emphasize the problem of devices being fully encrypted and inaccessible to authorities, a trend known as “going dark.” That much it has said publicly, and it is a serious problem for law enforcement. But it seems equally unmistakable that the Bureau is happy to be sloppy, deceptive or both in its advancement of a tailored narrative.


Twitter has an unlaunched ‘Secret’ encrypted messages feature

Posted by | Apps, encryption, Mobile, privacy, Security, Social, Twitter | No Comments

Buried inside Twitter’s Android app is a “Secret conversation” option that, if launched, would allow users to send encrypted direct messages. The feature could make Twitter a better home for sensitive communications that often end up on encrypted messaging apps like Signal, Telegram or WhatsApp.

The encrypted DMs option was first spotted inside the Twitter for Android application package (APK) by Jane Manchun Wong. APKs often contain code for unlaunched features that companies are quietly testing or will soon make available. A Twitter spokesperson declined to comment on the record. It’s unclear how long it might be before Twitter officially launches the feature, but at least we know it’s been built.

The appearance of encrypted DMs comes 18 months after whistleblower Edward Snowden asked Twitter CEO Jack Dorsey for the feature, which Dorsey said was “reasonable and something we’ll think about.”

Twitter has gone from “thinking about” the feature to prototyping it. The screenshot above shows the options to learn more about encrypted messaging, start a secret conversation and view both your own and your conversation partner’s encryption keys to verify a secure connection.

reasonable and something we’ll think about

— jack (@jack) December 14, 2016

Twitter’s DMs have become a powerful way for people to contact strangers without needing their phone number or email address. Whether it’s to send a reporter a scoop, warn someone of a problem, discuss business or just “slide into their DMs” to flirt, Twitter has established one of the most open messaging mediums. But without encryption, those messages are subject to snooping by governments, hackers or Twitter itself.

Twitter has long positioned itself as a facilitator of political discourse and even uprisings. But anyone seriously worried about the consequences of political dissidence, whistleblowing or leaking should be using an app like Signal that offers strong end-to-end encryption. Launching encrypted DMs could win back some of those change-makers and protect those still on Twitter.


Grindr sends HIV status to third parties, and some personal data unencrypted

Posted by | Apps, encryption, grindr, Health, Mobile, privacy, Security | No Comments

Hot on the heels of last week’s security issues, dating app Grindr is under fire again for inappropriate sharing of HIV status with third parties (not advertisers, as I had written here before) and inadequate security on other personal data transmission. It’s not a good look for a company that says privacy is paramount.

Norwegian research outfit SINTEF analyzed the app’s traffic and found that HIV status, which users can choose to include in their profile, is included in packets sent to Apptimize and Localytics. Users are not informed that this data is being sent.

These aren’t advertising companies but rather services for testing and improving mobile apps — Grindr isn’t selling them this data or anything. The company’s CTO told BuzzFeed News that “the limited information shared with these platforms is done under strict contractual terms that provide for the highest level of confidentiality, data security, and user privacy.” And to the best of my knowledge regulations like HIPAA don’t prevent the company from transmitting medical data provided voluntarily by users to third parties as specified in the privacy policy.

That said, it’s a rather serious breach of trust that something as private as HIV status is being shared in this way, even if it isn’t being done with any kind of ill intention. The laxity with which this extremely important and private information is handled undermines the message of care and consent that Grindr is careful to cultivate.

Update: Grindr’s head of security told Axios that the company will stop sending HIV status data to third parties.

Perhaps more serious from a systemic standpoint, however, is the unencrypted transmission of a great deal of sensitive data.

The SINTEF researchers found that precise GPS position, gender, age, “tribe” (e.g. bear, daddy), intention (e.g. friends, relationship), ethnicity, relationship status, language and device characteristics are sent over HTTP to a variety of advertising companies. A Grindr representative confirmed that location, age, and tribe are “sometimes” sent unencrypted. I’ve asked for clarification on this.
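To see why plain HTTP matters here: every field travels as readable bytes that any on-path observer can parse with a few lines of code. A sketch with an invented URL and field names (not Grindr’s actual API):

```python
# Sketch of why unencrypted HTTP matters: anyone on the network path sees
# the request bytes verbatim. The host and field names below are made up
# for illustration; they are not Grindr's actual API.
from urllib.parse import urlparse, parse_qs

captured = (b"GET /ad?lat=59.91&lon=10.75&age=29&tribe=bear HTTP/1.1\r\n"
            b"Host: ads.example.com\r\n\r\n")

request_line = captured.split(b"\r\n")[0].decode()  # "GET /ad?... HTTP/1.1"
path = request_line.split(" ")[1]
fields = parse_qs(urlparse(path).query)

print(fields)
# {'lat': ['59.91'], 'lon': ['10.75'], 'age': ['29'], 'tribe': ['bear']}
```

Over HTTPS the same observer would see only the destination host and an opaque TLS stream, which is what makes the unencrypted transmission, rather than the sharing itself, the acute danger.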

Not only is this extremely poor security practice, but Grindr appears to have been caught in a lie. The company told me last week when news of another security issue arose that “all information transmitted between a user’s device and our servers is encrypted and communicated in a way that does not reveal your specific location to unknown third parties.”

At the time I asked them about accusations that the app sent some data unencrypted; I never heard back. Fortunately for users, though unfortunately for Grindr, my question was answered by an independent body, and the above statement is evidently false.

It would be one thing to merely share this data with advertisers and other third parties — although it isn’t something many users would choose, presumably they at least consent to it as part of signing up.

But to send this information in the clear presents a material danger to the many gay people around the world who cannot openly identify as such. The details sent unencrypted are potentially enough to identify someone in, say, a coffee shop — and anyone in that coffee shop with a bit of technical knowledge could be monitoring for exactly those details. Identifying incriminating traffic in logs also could be done at the behest of one of the many governments that have outlawed homosexuality.

I’ve reached out to Grindr for comment and expect a statement soon; I’ll update this post as soon as I receive it.

Update: Here is Grindr’s full statement on the sharing of HIV data; notably it does not address the unencrypted transmission of other data.

As a company that serves the LGBTQ community, we understand the sensitivities around HIV status disclosure. Our goal is and always has been to support the health and safety of our users worldwide.

Recently, Grindr’s industry standard use of third party partners including Apptimize and Localytics, two highly-regarded software vendors, to test and validate the way we roll out our platform has drawn concern over the way we share user data.

In an effort to clear any misinformation we feel it necessary to state:

Grindr has never, nor will we ever sell personally identifiable user information – especially information regarding HIV status or last test date – to third parties or advertisers.

As an industry standard practice, Grindr does work with highly-regarded vendors to test and optimize how we roll out our platform. These vendors are under strict contractual terms that provide for the highest level of confidentiality, data security, and user privacy.

When working with these platforms, we restrict information shared except as necessary or appropriate. Sometimes this data may include location data or data from HIV status fields as these are features within Grindr, however, this information is always transmitted securely with encryption, and there are data retention policies in place to further protect our users’ privacy from disclosure.

It’s important to remember that Grindr is a public forum. We give users the option to post information about themselves including HIV status and last test date, and we make it clear in our privacy policy that if you chose to include this information in your profile, the information will also become public. As a result, you should carefully consider what information to include in your profile.

As an industry leader and champion for the LGBTQ community, Grindr, recognizes that a person’s HIV status can be highly stigmatized but after consulting several international health organizations and our Grindr For Equality team, Grindr determined with community feedback it would be beneficial for the health and well-being of our community to give users the option to publish, at their discretion, the user’s HIV Status and their Last Tested Date. It is up to each user to determine what, if anything, to share about themselves in their profile.

The inclusion of HIV status information within our platform is always regarded carefully with our users’ privacy in mind, but like any other mobile app company, we too must operate with industry standard practices to help make sure Grindr continues to improve for our community. We assure everyone that we are always examining our processes around privacy, security and data sharing with third parties, and always looking for additional measures that go above and beyond industry best practices to help maintain our users’ right to privacy.


Inquiry finds FBI sued Apple to unlock phone without considering all options

Posted by | Apple, Apple vs. FBI, encryption, FBI, Gadgets, Government, Mobile, privacy, Security, TC | No Comments

The Office of the Inspector General has issued its report on the circumstances surrounding the FBI’s 2016 lawsuit attempting to force Apple to unlock an iPhone as part of a criminal investigation. While it stops short of saying the FBI was untruthful in its justification of going to court, the report is unsparing of the bureaucracy and clashing political motives that ultimately undermined that justification.

The official narrative, briefly summarized, is that the FBI wanted to get into a locked iPhone allegedly used in the San Bernardino attack in late 2015. Then-director Comey explained on February 9 that the Bureau did not have the capability to unlock the phone, and that as Apple was refusing to help voluntarily, a lawsuit would be filed compelling it to assist.

But then, a month later, a miracle occurred: a third party had come forward with a working method to unlock the phone, and the lawsuit would not be necessary after all.

Though this mooted the court proceedings, which were dropped, it only delayed the inevitable and escalating battle between tech and law enforcement — specifically the “going dark” problem of pervasive encryption. Privacy advocates saw the suit as a transparent (but abortive) attempt to set a precedent greatly expanding the extent to which tech companies would be required to help law enforcement. Apple of course fought tooth and nail.

In 2016 the OIG was contacted by Amy Hess, a former FBI Executive Assistant Director, who basically said that the process wasn’t nearly as clean as the Bureau made it out to be. In the course of its inquiries the Inspector General found that to be the case: though the FBI’s claims were not technically inaccurate or misleading, they proved simply to be incorrect — and it is implied that they may have been allowed to be incorrect in order to further the “going dark” narrative.

The full report is quite readable (if you can mentally juggle the numerous acronyms), but the findings are essentially as follows.

Although Comey stated on February 9 that the FBI did not have the capability to unlock the phone and would seek legal remedy, the inquiry found that the Bureau had not exhausted all the avenues available to it, including some rather obvious ones.

Comey at a hearing in 2017

For instance, one senior engineer was tasked with asking trusted vendors if they had anything that could help — two days after Comey already said the FBI had no options left. Not only that, but there was official friction over whether classified tools generally reserved for national security purposes should be considered for this lesser, though obviously serious, criminal case.

In the first case, it turned out that yes, a vendor did have a solution “90 percent” done, and was happy to finish it up over the next month. How could the director have said that the FBI didn’t have the resources to do this, when it had not even asked its usual outside sources for help?

In the second, it’s still unclear whether there in fact exist classified tools that could have been brought to bear on the device in question. Testimony is conflicting on this point, with some officials saying that there was a “line in the sand” drawn between classified and unclassified tools, and another saying it was just a matter of preference. Regardless, those involved were less than forthcoming even within the Bureau, and even internal leadership was left wondering if there were solutions they hadn’t considered.

Hess, who brought the initial complaint to the OIG, was primarily concerned not that there was confusion in the ranks — it’s a huge organization and communication can be difficult — but that the search for a solution was deliberately allowed to fail in order that the case could act as a precedent advantageous to the FBI and other law enforcement agencies. Comey was known to be very concerned with the “going dark” issue and would likely have pursued such a case with vigor.

So the court case, Hess implied, was the real goal, and the meetings early in 2016 were formalities, nothing more than a paper trail to back up Comey’s statements. When a solution was actually found, because an engineer had taken initiative to ask around, officials hoping for a win in court were dismayed:

She became concerned that the CEAU Chief did not seem to want to find a technical solution, and that perhaps he knew of a solution but remained silent in order to pursue his own agenda of obtaining a favorable court ruling against Apple. According to EAD Hess, the problem with the Farook iPhone encryption was the “poster child” case for the Going Dark challenge.

The CEAU Chief told the OIG that, after the outside vendor came forward, he became frustrated that the case against Apple could no longer go forward, and he vented his frustration to the ROU Chief. He acknowledged that during this conversation between the two, he expressed disappointment that the ROU Chief had engaged an outside vendor to assist with the Farook iPhone, asking the ROU Chief, “Why did you do that for?”

While this doesn’t really imply a pattern of deception, it does suggest a willingness and ability on the part of FBI leadership to manipulate the situation to its advantage. A judge saying the likes of Apple must do everything possible to unlock an iPhone, and all forward ramifications of that, would be a tremendous coup for the Bureau and a major blow to user privacy.

The OIG ultimately recommends that the FBI “improve communication and coordination” so that this type of thing doesn’t happen (and it is reportedly doing so). Ironically, if the FBI had communicated to itself a bit better, the court case likely would have continued under pretenses that only its own leadership would know were false.


Signal expands into the Signal Foundation with $50M from WhatsApp co-founder Brian Acton

Posted by | encryption, Fundings & Exits, Mobile, nonprofit, privacy, Security, signal, signal foundation, TC, WhatsApp | No Comments

Perhaps the most surprising thing I learned about Signal when I spoke with Moxie Marlinspike, the app’s creator, last year at Disrupt, was that it was essentially running on a shoestring budget. A tool used by millions and feared by governments worldwide, barely getting by! But $50M from WhatsApp founder Brian Acton should help secure the app’s future.


Electric car charge-station payment systems may lack basic security measures

Posted by | automotive, chargers, electric cars, encryption, Gadgets, Security, TC, Transportation | No Comments

Just a PSA: If you charge your car regularly at a public charge station, you might want to keep an eye out for fraudulent charges on whatever card you use to pay for it. Researchers have found that some charge stations, specifically those that require a dedicated card, “have not implemented basic security mechanisms” like encryption.


Signal update keeps your address book secret, keeps it safe

Posted by | Developer, encryption, Mobile, privacy, Security, signal, TC | No Comments

No one would use a secure messaging service like Signal if you couldn’t find out who else was on it — but how can you trust Signal and others not to snoop when you submit your contacts for it to check against its list of users? You shouldn’t have to — it should be impossible. That’s the intention of an update to the app that makes contact discovery even more private.


Confide CEO Jon Brod on the White House, bad press, and what’s next for his secure messaging app

Posted by | Confide, encryption, First Round Capital, jon brod, Mobile, privacy, secure messaging, sv angel | No Comments

Thursday night, at a StrictlyVC event in San Francisco, I sat down with Confide cofounder and president Jon Brod to talk with him about his decidedly topsy turvy 2017. Though his three-year-old messaging app was the belle of the ball at the start of the year — Wired, the Washington Post, and Axios were among others to note it was a hit with frustrated White House staffers —…
