General Data Protection Regulation

Europe agrees platform rules to tackle unfair business practices


The European Union’s political institutions have reached agreement over new rules designed to boost transparency around online platform businesses and curb unfair practices to support traders and other businesses that rely on digital intermediaries for discovery and sales.

The European Commission proposed a regulation for fairness and transparency in online platform trading last April. And late yesterday the European Parliament, Council of the EU and Commission reached a political deal on regulating the business environment of platforms, announcing the accord in a press release today.

The political agreement paves the way for adoption and publication of the regulation, likely later this year. The rules will apply 12 months after that point.

Online platform intermediaries such as ecommerce marketplaces and search engines are covered by the new rules if they provide services to businesses that are established in the EU and that offer goods or services to consumers located in the EU.

The Commission estimates there are some 7,000 such platforms and marketplaces which will be covered by the regulation, noting this includes “world giants as well as very small start-ups”.

Under the new rules, sudden and unexpected account suspensions will be banned — with the Commission saying platforms will have to provide “clear reasons” for any termination and also possibilities for appeal.

Terms and conditions must also be “easily available and provided in plain and intelligible language”.

There must also be advance notice of changes — of at least 15 days, with longer notice periods applying for more complex changes.

For search engines the focus is on ranking transparency. And on that front dominant search engine Google has attracted more than its fair share of criticism in Europe from a range of rivals (not all of whom are European).

In 2017, the search giant was also slapped with a $2.7BN antitrust fine related to its price comparison service, Google Shopping. The EC found Google had systematically given prominent placement to its own comparison shopping service while demoting rival services in search results. (Google rejects the findings and is appealing.)

Given the history of criticism of Google’s platform business practices, and the multi-year regulatory tug of war over anti-competitive impacts, the new transparency provisions look intended to make it harder for a dominant search player to use its market power against rivals.

Changing the online marketplace

The importance of legislating for platform fairness was flagged by the Commission’s antitrust chief, Margrethe Vestager, last summer — when she handed Google another very large fine ($5BN) for anti-competitive behavior related to its mobile platform Android.

Vestager said then she wasn’t sure breaking Google up would be an effective competition fix, preferring to push for remedies to support “more players to have a real go”, as her Android decision attempts to do. But she also stressed the importance of “legislation that will ensure that you have transparency and fairness in the business to platform relationship”.

If businesses have legal means to find out why, for example, their traffic has stopped and what they can do to get it back that will “change the marketplace, and it will change the way we are protected as consumers but also as businesses”, she argued.

Just such a change is now in sight thanks to EU political accord on the issue.

The regulation represents the first such rules for online platforms in Europe and — commissioners contend — anywhere in the world.

“Our target is to outlaw some of the most unfair practices and create a benchmark for transparency, at the same time safeguarding the great advantages of online platforms both for consumers and for businesses,” said Andrus Ansip, VP for the EU’s Digital Single Market initiative, in a statement.

Elżbieta Bieńkowska, commissioner for internal market, industry, entrepreneurship, and SMEs, added that the rules are “especially designed with the millions of SMEs in mind”.

“Many of them do not have the bargaining muscle to enter into a dispute with a big platform, but with these new rules they have a new safety net and will no longer worry about being randomly kicked off a platform, or intransparent ranking in search results,” she said in another supporting statement.

In a factsheet about the new rules, the Commission specifies they cover third-party ecommerce market places (e.g. Amazon Marketplace, eBay, Fnac Marketplace, etc.); app stores (e.g. Google Play, Apple App Store, Microsoft Store etc.); social media for business (e.g. Facebook pages, Instagram used by makers/artists etc.); and price comparison tools (e.g. Skyscanner, Google Shopping etc.).

The regulation does not target every online platform. For example, it does not cover online advertising (or b2b ad exchanges), payment services, SEO services or services that do not intermediate direct transactions between businesses and consumers.

The Commission also notes that online retailers which only sell their own brand products and/or do not rely on third-party sellers on their platform — such as brand retailers or supermarkets — are excluded from the regulation.

Where transparency is concerned, the rules require that regulated marketplaces and search engines disclose the main parameters they use to rank goods and services on their site “to help sellers understand how to optimise their presence” — with the Commission saying the aim is to support sellers without allowing gaming of the ranking system.

Some platform business practices will also require mandatory disclosure — such as for platforms that not only provide a marketplace for sellers but sell on their platform themselves, as does Amazon for example.

The ecommerce giant’s use of merchant data remains under scrutiny in the EU. Vestager revealed a preliminary antitrust probe of Amazon last fall — when she said her department was gathering information to “try to get a full picture”. She said her concern is that such dual-role platforms could gain an unfair advantage as a consequence of their access to merchants’ data.

And, again, the incoming transparency rules look intended to shrink that risk — requiring what the Commission couches as exhaustive disclosure of “any advantage” a platform may give to their own products over others.

“They must also disclose what data they collect, and how they use it — and in particular how such data is shared with other business partners they have,” it continues, noting also that: “Where personal data is concerned, the rules of the GDPR [General Data Protection Regulation] apply.”

(GDPR of course places further transparency requirements on platforms by, for example, empowering individuals to request any personal data held on them, as well as the reasons why their information is being processed.)

The platform regulation also includes new avenues for dispute resolution, requiring that platforms set up an internal complaint-handling system to assist business users.

“Only the smallest platforms in terms of head count or turnover will be exempt from this obligation,” the Commission notes. (The exemption limit is set at fewer than 50 staff and less than €10M revenue.)
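As a trivial illustration of that threshold (the numbers are the ones the Commission quotes above; the function name and the idea of encoding it this way are purely illustrative), the exemption test boils down to a two-condition check:

```python
def exempt_from_complaint_handling(staff_count: int, annual_turnover_eur: float) -> bool:
    """Sketch of the small-platform exemption described above: platforms with
    fewer than 50 staff and less than €10M annual turnover are exempt from
    the internal complaint-handling obligation."""
    return staff_count < 50 and annual_turnover_eur < 10_000_000

# Example: a 40-person marketplace with €8M turnover would be exempt.
print(exempt_from_complaint_handling(40, 8_000_000))  # True
```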

It also says: “Platforms will have to provide businesses with more options to resolve a potential problem through mediators. This will help resolve more issues out of court, saving businesses time and money.”

But, at the same time, the new rules allow business associations to take platforms to court to stop any non-compliance — mirroring a provision in the GDPR which also allows for collective enforcement and redress of individual privacy rights (where Member States adopt it).

“This will help overcome fear of retaliation, and lower the cost of court cases for individual businesses, when the new rules are not followed,” the Commission argues.

“In addition, Member States can appoint public authorities with enforcement powers, if they wish, and businesses can turn to those authorities.”

One component of the regulation that appears to be left up to EU Member States is penalties for non-compliance — with no clear regime of fines set out (as there is in GDPR). So it’s not clear whether the platform regulation might end up with rather more bark than bite, at least initially.

“Member States shall need to take measures that are sufficiently dissuasive to ensure that the online intermediation platforms and search engines comply with the requirements in the Regulation,” the Commission writes in a section of its factsheet dealing with how to make sure platforms respect the new rules.

It also points again to the provision allowing business associations or organisations to take action in national courts on behalf of members — saying this offers a legal route to “stop or prohibit non-compliance with one or more of the requirements of the Regulation”. So, er, expect lawsuits.

The Commission says the rules will be subject to review within 18 months after they come into force — in a bid to ensure the regulation keeps pace with fast-paced tech developments.

A dedicated Online Platform Observatory has been established in the EU for the purpose of “monitoring the evolution of the market and the effective implementation of the rules”, it adds.


Is Europe closing in on an antitrust fix for surveillance technologists?


The German Federal Cartel Office’s decision to order Facebook to change how it processes users’ personal data this week is a sign the antitrust tide could at last be turning against platform power.

One European Commission source we spoke to, who was commenting in a personal capacity, described it as “clearly pioneering” and “a big deal”, even without Facebook being fined a dime.

The FCO’s decision instead bans the social network from linking user data across the different platforms it owns unless it gains people’s consent (nor can it make the use of its services contingent on such consent). Facebook is also prohibited from gathering and linking data on users from third party websites, such as via its tracking pixels and social plugins.

The order is not yet in force, and Facebook is appealing, but should it come into force the social network faces being de facto shrunk by having its platforms siloed at the data level.

To comply with the order Facebook would have to ask users to freely consent to being data-mined — which the company does not do at present.

Yes, Facebook could still manipulate the outcome it wants from users but doing so would open it to further challenge under EU data protection law, as its current approach to consent is already being challenged.

The EU’s updated privacy framework, GDPR, requires consent to be specific, informed and freely given. That standard supports challenges to Facebook’s (still fixed) entry ‘price’ to its social services. To play you still have to agree to hand over your personal data so it can sell your attention to advertisers. But legal experts contend that’s neither privacy by design nor default.

The only ‘alternative’ Facebook offers is to tell users they can delete their account. Not that doing so would stop the company from tracking you around the rest of the mainstream web anyway. Facebook’s tracking infrastructure is also embedded across the wider Internet so it profiles non-users too.

EU data protection regulators are still investigating a very large number of consent-related GDPR complaints.

But the German FCO, which said it liaised with privacy authorities during its investigation of Facebook’s data-gathering, has dubbed this type of behavior “exploitative abuse”, having also deemed the social service to hold a monopoly position in the German market.

So there are now two lines of legal attack — antitrust and privacy law — threatening Facebook’s (and indeed other adtech companies’) surveillance-based business model across Europe.

A year ago the German antitrust authority also announced a probe of the online advertising sector, responding to concerns about a lack of transparency in the market. Its work here is by no means done.

Data limits

The lack of a big flashy fine attached to the German FCO’s order against Facebook makes this week’s story less of a major headline than recent European Commission antitrust fines handed to Google — such as the record-breaking $5BN penalty issued last summer for anticompetitive behaviour linked to the Android mobile platform.

But the decision is arguably just as, if not more, significant, because of the structural remedies being imposed on Facebook. These remedies have been likened to an internal break-up of the company — with enforced internal separation of its multiple platform products at the data level.

This of course runs counter to (ad) platform giants’ preferred trajectory, which has long been to tear modesty walls down; pool user data from multiple internal (and indeed external) sources, in defiance of the notion of informed consent; and mine all that personal (and sensitive) stuff to build identity-linked profiles to train algorithms that predict (and, some contend, manipulate) individual behavior.

Because if you can predict what a person is going to do you can choose which advert to serve to increase the chance they’ll click. (Or as Mark Zuckerberg puts it: ‘Senator, we run ads.’)

This means that a regulatory intervention that interferes with an ad tech giant’s ability to pool and process personal data starts to look really interesting. Because a Facebook that can’t join data dots across its sprawling social empire — or indeed across the mainstream web — wouldn’t be such a massive giant in terms of data insights. And nor, therefore, surveillance oversight.

Each of its platforms would be forced to be a more discrete (and, well, discreet) kind of business.

Competing against data-siloed platforms with a common owner — instead of a single interlinked mega-surveillance-network — also starts to sound almost possible. It suggests a playing field that’s reset, if not entirely levelled.

(Whereas, in the case of Android, the European Commission did not order any specific remedies — allowing Google to come up with ‘fixes’ itself; and so to shape the most self-serving ‘fix’ it can think of.)

Meanwhile, just look at where Facebook is now aiming to get to: A technical unification of the backend of its different social products.

Such a merger would collapse even more walls and fully enmesh platforms that started life as entirely separate products before they were folded into Facebook’s empire (also, let’s not forget, via surveillance-informed acquisitions).

Facebook’s plan to unify its products on a single backend platform looks very much like an attempt to throw up technical barriers to antitrust hammers. It’s at least harder to imagine breaking up a company if its multiple, separate products are merged onto one unified backend which functions to cross and combine data streams.

Set against Facebook’s sudden desire to technically unify its full-flush of dominant social networks (Facebook Messenger; Instagram; WhatsApp) is a rising drum-beat of calls for competition-based scrutiny of tech giants.

This has been building for years, as the market power — and even democracy-denting potential — of surveillance capitalism’s data giants has telescoped into view.

Calls to break up tech giants no longer carry a suggestive punch. Regulators are routinely asked whether it’s time. As the European Commission’s competition chief, Margrethe Vestager, was when she handed down Google’s latest massive antitrust fine last summer.

Her response then was that she wasn’t sure breaking Google up is the right answer — preferring to try remedies that might allow competitors to have a go, while also emphasizing the importance of legislating to ensure “transparency and fairness in the business to platform relationship”.

But it’s interesting that the idea of breaking up tech giants now plays so well as political theatre, suggesting that wildly successful consumer technology companies — which have long dined out on shiny convenience-based marketing claims, made ever so saccharine sweet via the lure of ‘free’ services — have lost a big chunk of their populist pull, dogged as they have been by so many scandals.

From terrorist content and hate speech, to election interference, child exploitation, bullying, abuse. There’s also the matter of how they arrange their tax affairs.

The public perception of tech giants has matured as the ‘costs’ of their ‘free’ services have scaled into view. The upstarts have also become the establishment. People see not a new generation of ‘cuddly capitalists’ but another bunch of multinationals; highly polished but remote money-making machines that take rather more than they give back to the societies they feed off.

Google’s trick of naming each Android iteration after a different sweet treat makes for an interesting parallel to the (also now shifting) public perceptions around sugar, following closer attention to health concerns. What does its sickly sweetness mask? And after the sugar tax, we now have politicians calling for a social media levy.

Just this week the deputy leader of the main opposition party in the UK called for setting up a standalone Internet regulator with the power to break up tech monopolies.

Talking about breaking up well-oiled, wealth-concentration machines is being seen as a populist vote winner. And companies that political leaders used to flatter and seek out for PR opportunities find themselves treated as political punchbags; called to attend awkward grillings by hard-grafting committees, or taken to vicious task at the highest profile public podia. (Though some non-democratic heads of state are still keen to press tech giant flesh.)

In Europe, Facebook’s repeat snubs of the UK parliament’s requests last year for Zuckerberg to face policymakers’ questions certainly did not go unnoticed.

Zuckerberg’s empty chair at the DCMS committee has become both a symbol of the company’s failure to accept wider societal responsibility for its products, and an indication of market failure; the CEO so powerful he doesn’t feel answerable to anyone; neither his most vulnerable users nor their elected representatives. Hence UK politicians on both sides of the aisle making political capital by talking about cutting tech giants down to size.

The political fallout from the Cambridge Analytica scandal looks far from done.

Quite how a UK regulator could successfully swing a regulatory hammer to break up a global Internet giant such as Facebook, which is headquartered in the U.S., is another matter. But policymakers have already crossed the Rubicon of public opinion and are relishing talking up having a go.

That represents a sea-change vs the neoliberal consensus that allowed competition regulators to sit on their hands for more than a decade as technology upstarts quietly hoovered up people’s data and bagged rivals, and basically went about transforming themselves from highly scalable startups into market-distorting giants with Internet-scale data-nets to snag users and buy or block competing ideas.

The political spirit looks willing to go there, and now the mechanism for breaking platforms’ distorting hold on markets may also be shaping up.

The traditional antitrust remedy of breaking a company along its business lines still looks unwieldy when faced with the blistering pace of digital technology. The problem is delivering such a fix fast enough that the business hasn’t already reconfigured to route around the reset. 

Commission antitrust decisions on the tech beat have stepped up impressively in pace on Vestager’s watch. Yet it still feels like watching paper pushers wading through treacle to try and catch a sprinter. (And Europe hasn’t gone so far as trying to impose a platform break up.) 

But the German FCO decision against Facebook hints at an alternative way forward for regulating the dominance of digital monopolies: structural remedies that focus on controlling access to data, which can be configured and applied relatively swiftly.

Vestager, whose term as EC competition chief may be coming to its end this year (even if other Commission roles remain in potential and tantalizing contention), has championed this idea herself.

In an interview on BBC Radio 4’s Today program in December she poured cold water on the stock question about breaking tech giants up — saying instead the Commission could look at how larger firms got access to data and resources as a means of limiting their power. Which is exactly what the German FCO has done in its order to Facebook. 

At the same time, Europe’s updated data protection framework has gained the most attention for the size of the financial penalties that can be issued for major compliance breaches. But the regulation also gives data watchdogs the power to limit or ban processing. And that power could similarly be used to reshape a rights-eroding business model or snuff out such business entirely.

#GDPR allows imposing a permanent ban on data processing. This is the nuclear option. Much more severe than any fine you can imagine, in most cases. https://t.co/X772NvU51S

— Lukasz Olejnik (@lukOlejnik) January 28, 2019

The merging of privacy and antitrust concerns is really just a reflection of the complexity of the challenge regulators now face trying to rein in digital monopolies. But they’re tooling up to meet that challenge.

Speaking in an interview with TechCrunch last fall, Europe’s data protection supervisor, Giovanni Buttarelli, told us the bloc’s privacy regulators are moving towards more joint working with antitrust agencies to respond to platform power. “Europe would like to speak with one voice, not only within data protection but by approaching this issue of digital dividend, monopolies in a better way — not per sectors,” he said. “But first joint enforcement and better co-operation is key.”

The German FCO’s decision represents tangible evidence of the kind of regulatory co-operation that could — finally — crack down on tech giants.

Blogging in support of the decision this week, Buttarelli asserted: “It is not necessary for competition authorities to enforce other areas of law; rather they need simply to identify where the most powerful undertakings are setting a bad example and damaging the interests of consumers. Data protection authorities are able to assist in this assessment.”

He also had a prediction of his own for surveillance technologists, warning: “This case is the tip of the iceberg — all companies in the digital information ecosystem that rely on tracking, profiling and targeting should be on notice.”

So perhaps, at long last, the regulators have figured out how to move fast and break things.


This early GDPR adtech strike puts the spotlight on consent


What does consent as a valid legal basis for processing personal data look like under Europe’s updated privacy rules? It may sound like an abstract concern, but for online services that rely on things being done with user data in order to monetize free-to-access content, this is a key question now that the region’s General Data Protection Regulation is firmly fixed in place.

The GDPR is actually clear about consent. But if you haven’t bothered to read the text of the regulation, and instead just go and look at some of the self-styled consent management platforms (CMPs) floating around the web since May 25, you’d probably have trouble guessing it.

Confusing and/or incomplete consent flows aren’t yet extinct, sadly. But it’s fair to say those that don’t offer full opt-in choice are on borrowed time.

Because if your service or app relies on obtaining consent to process EU users’ personal data — as many free at the point-of-use, ad-supported apps do — then the GDPR states consent must be freely given, specific, informed and unambiguous.

That means you can’t bundle multiple uses for personal data under a single opt-in.

Nor can you obfuscate consent behind opaque wording that doesn’t actually specify the thing you’re going to do with the data.

You also have to offer users the choice not to consent. So you cannot pre-tick all the consent boxes that you really wish your users would freely choose — because you have to actually let them do that.
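To make those requirements concrete, here is a minimal sketch (in Python, with hypothetical purpose names) of what an unbundled consent request might look like: each processing purpose is listed separately, nothing is pre-ticked, and the user can decline any or all of them.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ConsentRequest:
    """Each purpose gets its own box; every box starts unticked (False)."""
    purposes: Dict[str, bool] = field(default_factory=lambda: {
        "personalised_ads": False,        # hypothetical purpose names
        "location_analytics": False,
        "sharing_with_ad_partners": False,
    })

    def record_choice(self, purpose: str, granted: bool) -> None:
        # The user makes an affirmative choice per purpose; 'no' is always allowed.
        if purpose not in self.purposes:
            raise KeyError(f"unknown purpose: {purpose}")
        self.purposes[purpose] = granted

    def granted_purposes(self):
        # Only purposes the user explicitly opted into count as consented.
        return [p for p, ok in self.purposes.items() if ok]

# Bundling all of these behind a single "I agree" box, or defaulting any of
# them to True, is exactly what the rules described above rule out.
request = ConsentRequest()
request.record_choice("personalised_ads", True)
print(request.granted_purposes())  # ['personalised_ads']
```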

It’s not rocket science but the pushback from certain quarters of the adtech industry has been as awfully predictable as it’s horribly frustrating.

This has not gone unnoticed by consumers either. Europe’s Internet users have been filing consent-based complaints thick and fast this year. And a lot of what is being claimed as ‘GDPR compliant’ right now likely is not.

So, some six months in, we’re essentially in a holding pattern waiting for the regulatory hammers to come down.

But if you look closely there are some early enforcement actions that show some consent fog is starting to shift.

Yes, we’re still waiting on the outcomes of major consent-related complaints against tech giants. (And stockpile popcorn to watch that space for sure.)

But late last month French data protection watchdog, the CNIL, announced the closure of a formal warning it issued this summer against drive-to-store adtech firm, Fidzup — saying it was satisfied it was now GDPR compliant.

Such a regulatory stamp of approval is obviously rare this early in the new legal regime.

So while Fidzup is no adtech giant its experience still makes an interesting case study — showing how the consent line was being crossed; how, working with CNIL, it was able to fix that; and what being on the right side of the law means for a (relatively) small-scale adtech business that relies on consent to enable a location-based mobile marketing business.

From zero to GDPR hero?

Fidzup’s service works like this: It installs kit inside (or on) partner retailers’ physical stores to detect the presence of user-specific smartphones. At the same time it provides an SDK to mobile developers to track app users’ locations, collecting and sharing the advertising ID and wi-fi ID of users’ smartphones (which, along with location, are judged personal data under GDPR).

Those two elements — detectors in physical stores; and a personal data-gathering SDK in mobile apps — come together to power Fidzup’s retail-focused, location-based ad service which pushes ads to mobile users when they’re near a partner store. The system also enables it to track ad-to-store conversions for its retail partners.
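As a rough sketch of the data flow just described — not Fidzup’s actual implementation; the field names, coordinates and push radius are all assumptions — the SDK side might boil down to collecting a location event and checking it against a partner store’s location:

```python
import math
from dataclasses import dataclass

@dataclass
class LocationEvent:
    advertising_id: str   # judged personal data under GDPR, per the article
    wifi_id: str
    lat: float
    lon: float

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine distance between two points, in metres."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical partner store location and push radius.
PARTNER_STORE = (48.8666, 2.3333)
PUSH_RADIUS_M = 200

def should_push_ad(event: LocationEvent) -> bool:
    # Push a retail ad only when the device is near the partner store.
    return distance_m(event.lat, event.lon, *PARTNER_STORE) <= PUSH_RADIUS_M
```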

The problem Fidzup had, back in July, was that after an audit of its business the CNIL deemed it did not have proper consent to process users’ geolocation data to target them with ads.

Fidzup says it had thought its business was GDPR compliant because it took the view that app publishers were the data processors gathering consent on its behalf; the CNIL warning was a wake up call that this interpretation was incorrect — and that it was responsible for the data processing and so also for collecting consents.

The regulator found that when a smartphone user installed an app containing Fidzup’s SDK they were not informed that their location and mobile device ID data would be used for ad targeting, nor the partners Fidzup was sharing their data with.

CNIL also said users should have been clearly informed before data was collected — so they could choose whether to consent — instead of the information being provided only after the fact of the processing, via general app conditions (or in-store posters), as had been the case.

It also found users had no choice to download the apps without also getting Fidzup’s SDK, with use of such an app automatically resulting in data transmission to partners.

Fidzup had also only been asking users to consent to the processing of their geolocation data for the specific app they had downloaded — not for the targeted ad purposes involving retail partners that are the substance of the firm’s business.

So there was a string of issues. And when Fidzup was hit with the warning the stakes were high, even with no monetary penalty attached. Because unless it could fix the core consent problem, the 2014-founded startup might have faced going out of business. Or having to change its line of business entirely.

Instead it decided to try and fix the consent problem by building a GDPR-compliant CMP — spending around five months liaising with the regulator, and finally getting a green light late last month.

A core piece of the challenge, as co-founder and CEO Olivier Magnan-Saurin tells it, was how to handle multiple partners in this CMP because its business entails passing data along the chain of partners — each new use and partner requiring opt-in consent.

“The first challenge was to design a window and a banner for multiple data buyers,” he tells TechCrunch. “So that’s what we did. The challenge was to have something okay for the CNIL and GDPR in terms of wording, UX etc. And, at the same time, some things that the publisher will allow to and will accept to implement in his source code to display to his users because he doesn’t want to scare them or to lose too much.

“Because they get money from the data that we buy from them. So they wanted to get the maximum money that they can, because it’s very difficult for them to live without the data revenue. So the challenge was to reconcile the need from the CNIL and the GDPR and from the publishers to get something acceptable for everyone.”

As a quick related aside, it’s worth noting that Fidzup does not work with the thousands of partners an ad exchange or demand-side platform most likely would.

Magnan-Saurin tells us its CMP lists 460 partners. So while that’s still a lengthy list to have to put in front of consumers — it’s not, for example, the 32,000 partners of another French adtech firm, Vectaury, which has also recently been on the receiving end of an invalid consent ruling from the CNIL.

In turn, that suggests the ‘Fidzup fix’, if we can call it that, only scales so far; adtech firms that are routinely passing millions of people’s data around thousands of partners look to have much more existential problems under GDPR — as we’ve reported previously re: the Vectaury decision.

No consent without choice

Returning to Fidzup, its fix essentially boils down to actually offering people a choice over each and every data processing purpose, unless it’s strictly necessary for delivering the core app service the consumer was intending to use.

Which also means giving app users the ability to opt out of ads entirely — and not be penalized by losing access to the app’s features.

In short, you can’t bundle consent. So Fidzup’s CMP unbundles all the data purposes and partners to offer users the option to consent or not.

“You can unselect or select each purpose,” says Magnan-Saurin of the now compliant CMP. “And if you want only to send data for, I don’t know, personalized ads but you don’t want to send the data to analyze if you go to a store or not, you can. You can unselect or select each consent. You can also see all the buyers who buy the data. So you can say okay I’m okay to send the data to every buyer but I can also select only a few or none of them.”

“What the CNIL ask is very complicated to read, I think, for the final user,” he continues. “Yes it’s very precise and you can choose everything etc. But it’s very complete and you have to spend some time to read everything. So we were [hoping] for something much shorter… but now okay we have something between the initial asking for the CNIL — which was like a big book — and our consent collection before the warning which was too short with not the right information. But still it’s quite long to read.”

Fidzup’s CNIL-approved, GDPR-compliant consent management platform

“Of course, as a user, I can refuse everything. Say no, I don’t want my data to be collected, I don’t want to send my data. And I have to be able, as a user, to use the app in the same way as if I accept or refuse the data collection,” he adds.

He says the CNIL was very clear on the latter point — telling it that it could not make collection of geolocation data for ad targeting a condition of using the app.

“You have to provide the same service to the user if he accepts or not to share his data,” he emphasizes. “So now the app and the geolocation features [of the app] works also if you refuse to send the data to advertisers.”
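A minimal sketch of that gating logic (hypothetical names, not Fidzup’s code): the in-app geolocation feature works regardless of the user’s choice, and data is only transmitted for a purpose and buyer the user explicitly opted into.

```python
def show_nearby_store_info(location):
    # Stand-in for the app's own geolocation feature, available to everyone.
    print(f"showing nearby stores for {location}")

def send_to_partner(partner, location):
    # Stand-in for transmitting data to an ad buyer.
    print(f"sending {location} to {partner}")

def handle_location_update(user_consents, location, partners):
    """user_consents maps (purpose, partner) pairs to the choices made in the
    CMP; nothing defaults to True."""
    # The feature works whether or not the user consented to any sharing.
    show_nearby_store_info(location)

    for partner in partners:
        # Data leaves the device only for a purpose/partner pair the user opted into.
        if user_consents.get(("ad_targeting", partner), False):
            send_to_partner(partner, location)

# Example: the user accepted ad targeting for one buyer only.
consents = {("ad_targeting", "buyer_a"): True, ("ad_targeting", "buyer_b"): False}
handle_location_update(consents, (48.86, 2.33), ["buyer_a", "buyer_b"])
```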

This is especially interesting in light of the ‘forced consent’ complaints filed against tech giants Facebook and Google earlier this year.

These complaints argue the companies should (but currently do not) offer an opt-out of targeted advertising, because behavioural ads are not strictly necessary for their core services (i.e. social networking, messaging, a smartphone platform etc).

Indeed, data gathering for such non-core service purposes should require an affirmative opt-in under GDPR. (An additional GDPR complaint against Android has also since attacked how consent is gathered, arguing it’s manipulative and deceptive.)

Asked whether, based on his experience working with the CNIL to achieve GDPR compliance, it seems fair that a small adtech firm like Fidzup has had to offer an opt-out when a tech giant like Facebook seemingly doesn’t, Magnan-Saurin tells TechCrunch: “I’m not a lawyer but based on what the CNIL asked us to be in compliance with the GDPR law I’m not sure that what I see on Facebook as a user is 100% GDPR compliant.”

“It’s better than one year ago but [I’m still not sure],” he adds. “Again it’s only my feeling as a user, based on the experience I have with the French CNIL and the GDPR law.”

Facebook of course maintains its approach is 100% GDPR compliant.

Even as data privacy experts aren’t so sure.

One thing is clear: If the tech giant was forced to offer an opt out for data processing for ads it would clearly take a big chunk out of its business — as a sub-set of users would undoubtedly say no to Zuckerberg’s “ads”. (And if European Facebook users got an ads opt out you can bet Americans would very soon and very loudly demand the same, so…)

Bridging the privacy gap

In Fidzup’s case, complying with GDPR has had a major impact on its business because offering a genuine choice means it’s not always able to obtain consent. Magnan-Saurin says there is essentially now a limit on the number of device users advertisers can reach because not everyone opts in for ads.

Although, since it’s been using the new CMP, he says a majority are still opting in (or, at least, this is the case so far) — showing one consent chart report with a ~70:30 opt-in rate, for example.

He expresses the change like this: “No one in the world can say okay I have 100% of the smartphones in my data base because the consent collection is more complete. No one in the world, even Facebook or Google, could say okay, 100% of the smartphones are okay to collect from them geolocation data. That’s a huge change.”

“Before that there was a race to the higher reach. The biggest number of smartphones in your database,” he continues. “Today that’s not the point.”

Now he says the point for adtech businesses with EU users is figuring out how to extrapolate from the percentage of user data they can (legally) collect to the 100% they can’t.

And that’s what Fidzup has been working on this year, developing machine learning algorithms to try to bridge the data gap so it can still offer its retail partners accurate predictions for tracking ad to store conversions.

“We have algorithms based on the few thousand stores that we equip, based on the few hundred mobile advertising campaigns that we have run, and we can understand for a store in London in… sports, fashion, for example, how many visits we can expect from the campaign based on what we can measure with the right consent,” he says. “That’s the first and main change in our market; the quantity of data that we can get in our database.”

“Now the challenge is to be as accurate as we can be without having 100% of real data — with the consent, and the real picture,” he adds. “The accuracy is less… but not that much. We have a very, very high standard of quality on that… So now we can assure the retailers that with our machine learning system they have nearly the same quality as they had before.

“Of course it’s not exactly the same… but it’s very close.”
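The article doesn’t describe Fidzup’s algorithms, but the underlying idea — extrapolating from the consented share of devices to the whole audience — can be illustrated with a deliberately naive ratio estimator (a stand-in, not the company’s machine-learning system):

```python
def estimate_total_store_visits(measured_visits: int, opt_in_rate: float) -> float:
    """Scale visits measured among consented users up to the full audience.
    Assumes consented and non-consented users behave alike -- a strong
    assumption that a real model would try to correct for."""
    if not 0 < opt_in_rate <= 1:
        raise ValueError("opt_in_rate must be in (0, 1]")
    return measured_visits / opt_in_rate

# Example: 700 measured visits at the roughly 70% opt-in rate mentioned above
# suggests about 1,000 visits across the whole audience.
print(estimate_total_store_visits(700, 0.7))  # 1000.0
```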

Having a CMP that’s had regulatory ‘sign-off’, as it were, is something Fidzup is also now hoping to turn into a new line of business.

“The second change is more like an opportunity,” he suggests. “All the work that we have done with CNIL and our publishers we have transferred it to a new product, a CMP, and we offer today to all the publishers who ask to use our consent management platform. So for us it’s a new product — we didn’t have it before. And today we are the only — to my knowledge — the only company and the only CMP validated by the CNIL and GDPR compliant so that’s useful for all the publishers in the world.”

It’s not currently charging publishers to use the CMP but will be seeing whether it can turn it into a paid product early next year.

How then, after months of compliance work, does Fidzup feel about GDPR? Does it believe the regulation is making life harder for startups versus tech giants — as certain lobby groups have claimed, warning the law risks entrenching the dominance of better resourced giants? Or does it see opportunities?

In Magnan-Saurin’s view, six months into GDPR, European startups are at an R&D disadvantage vs tech giants because U.S. companies like Facebook and Google are not (yet) subject to a similarly comprehensive privacy regulation at home — so it’s easier for them to bag up user data for whatever purpose they like.

Though it’s also true that U.S. lawmakers are now paying earnest attention to the privacy policy area at a federal level. (And Google’s CEO faced a number of tough questions from Congress on that front just this week.)

“The fact is Facebook-Google they own like 90% of the revenue in mobile advertising in the world. And they are American. So basically they can do all their research and development on, for example, American users without any GDPR regulation,” he says. “And then apply a pattern of GDPR compliance and apply the new product, the new algorithm, everywhere in the world.

“As a European startup I can’t do that. Because I’m a European. So once I begin the research and development I have to be GDPR compliant so it’s going to be longer for Fidzup to develop the same thing as an American… But now we can see that GDPR might be beginning a ‘world thing’ — and maybe Facebook and Google will apply the GDPR compliance everywhere in the world. Could be. But it’s their own choice. Which means, for the example of the R&D, they could do their own research without applying the law because for now U.S. doesn’t care about the GDPR law, so you’re not outlawed if you do R&D without applying GDPR in the U.S. That’s the main difference.”

He suggests some European startups might relocate R&D efforts outside the region to try to work around the legal complexity around privacy.

“If the law is meant to bring the big players to better compliance with privacy I think — yes, maybe it goes in this way. But the first to suffer is the European companies, and it becomes an asset for the U.S. and maybe the Chinese… companies because they can be quicker in their innovation cycles,” he suggests. “That’s a fact. So what could happen is maybe investors will not invest that much money in Europe than in U.S. or in China on the marketing, advertising data subject topics. Maybe even the French companies will put all the R&D in the U.S. and destroy some jobs in Europe because it’s too complicated to do research on that topics. Could be impacts. We don’t know yet.”

But the fact that GDPR enforcement has — perhaps inevitably — started small, with a small bundle of warnings so far against relative data minnows rather than any swift action against the industry-dominating adtech giants, is being felt as yet another inequality at the startup coalface.

“What’s sure is that the CNIL started to send warnings not to Google or Facebook but to startups. That’s what I can see,” he says. “Because maybe it’s easier to see I’m working on GDPR and everything but the fact is the law is not as complicated for Facebook and Google as it is for the small and European companies.”


Popular avatar app Boomoji exposed millions of users’ contact lists and location data


Popular animated avatar creator app Boomoji, with more than five million users across the world, exposed the personal data of its entire user base after it failed to put passwords on two of its internet-facing databases.

The China-based app developer left the Elasticsearch databases online without passwords — a U.S.-based database for its international customers, and a Hong Kong-based database containing mostly Chinese users’ data in an effort to comply with China’s data security laws, which require Chinese citizens’ data to be located on servers inside the country.

Anyone who knew where to look could access, edit or delete the databases using their web browser. And, because the databases were listed on Shodan, a search engine for exposed devices and databases, they were easily found with a few keywords.
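For context, this class of misconfiguration is easy to self-audit: a secured Elasticsearch node should refuse unauthenticated requests, while an exposed one will happily answer them. A minimal sketch of such a check (the endpoint URL is a placeholder; run it only against infrastructure you own):

```python
import requests

def answers_without_credentials(base_url: str) -> bool:
    """Return True if the Elasticsearch HTTP endpoint responds to an
    unauthenticated request -- the misconfiguration described above."""
    try:
        resp = requests.get(f"{base_url}/_cluster/health", timeout=5)
    except requests.RequestException:
        return False  # unreachable isn't the same as secured, but it isn't open either
    # A locked-down cluster should return 401/403 here, not 200.
    return resp.status_code == 200

# Example (placeholder address):
# print(answers_without_credentials("http://localhost:9200"))
```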

After TechCrunch reached out, Boomoji pulled the two databases offline. “These two accounts were made by us for testing purposes,” said an unnamed Boomoji spokesperson in an email.

But that isn’t true.

The database contained records on all of the company’s iOS and Android users — some 5.3 million users as of this week. Each record contained their username, gender, country and phone type.

Each record also included a user’s unique Boomoji ID, which was linked to other tables in the database. Those other tables included whether — and which — school a user attends, a feature Boomoji touts as a way for users to get in touch with their fellow students. That unique ID was also linked to the precise geolocation of more than 375,000 users who had allowed the app to know their location at any given time.

Worse, the database contained every phone book entry of every user who had allowed the app access to their contacts.

One table had more than 125 million contacts, including their names (as written in a user’s phone book) and their phone numbers. Each record was linked to a Boomoji user’s unique ID, making it relatively easy to know whose contact list belonged to whom.

Even if you didn’t use the app, anyone who has your phone number stored on their device and used the app more than likely uploaded your number to Boomoji’s database. To our knowledge, there’s no way to opt out or have your information deleted.

Given Boomoji’s response, we verified the contents of the database by downloading the app on a dedicated iPhone with a throwaway phone number and a few dummy, but easy-to-search, contact list entries. To find friends, the app matches your contacts with those registered with the app in its database. When we were prompted to allow the app access to our contacts list, the entire dummy contact list was uploaded instantly — and was viewable in the database.

So long as the app was installed and had access to the contacts, new phone numbers would be automatically uploaded.

Yet, none of the data was encrypted. All of the data was stored in plaintext.

Although Boomoji is based in China, it claims to follow California state law, where data protection and privacy rules are some of the strongest in the U.S. We asked Boomoji if it has or plans to inform California’s attorney general of the exposure as required by state law, but the company did not answer.

Given the vast amount of European users’ information in the database, the company may also face penalties under the EU’s General Data Protection Regulation, which can impose fines of up to four percent of the company’s global annual revenue for serious breaches.

But given its China-based presence, it’s not clear what actionable repercussions the company could face.

This is the latest in a series of exposures involving Elasticsearch instances, the popular open source search and database software. In recent weeks, several high-profile data exposures have been reported as a result of companies’ failure to practice basic data security measures — including Urban Massage exposing its own customer database, Mindbody-owned FitMetrix forgetting to put a password on its servers and Voxox, a communications company, which leaked phone numbers and two-factor codes of millions of unsuspecting users.


Got a tip? You can send tips securely over Signal and WhatsApp to +1 646-755–8849. You can also send PGP email with the fingerprint: 4D0E 92F2 E36A EC51 DAAE 5D97 CB8C 15FA EB6C EEA5.


Google faces GDPR complaint over ‘deceptive’ location tracking


A group of European consumer watchdogs has filed a privacy complaint against Google — arguing the company uses manipulative tactics in order to keep tracking web users’ locations for ad-targeting purposes.

The consumer organizations are making the complaint under the EU’s new data protection framework, GDPR, which regulators can use to levy major fines for compliance breaches — of up to 4 percent of a company’s global annual turnover.

Under GDPR, a consent-based legal basis for processing personal data (e.g. person’s location) must be specific, informed and freely given.

In their complaint, the groups, which include Norway’s Consumer Council, argue that Google does not have proper legal basis to track users through “Location History” and “Web & App Activity” — settings which are integrated into all Google accounts, and which, for users of Android-based smartphones, they assert are particularly difficult to avoid.

The Google mobile OS remains the dominant smartphone platform globally, as well as across Europe.

“Google is processing incredibly detailed and extensive personal data without proper legal grounds, and the data has been acquired through manipulation techniques,” said Gro Mette Moen, acting head of the Norwegian Consumer Council’s digital services unit in a statement.

“When we carry our phones, Google is recording where we go, down to which floor we are on and how we are moving. This can be combined with other information about us, such as what we search for, and what websites we visit. Such information can in turn be used for things such as targeted advertising meant to affect us when we are receptive or vulnerable.”

Responding to the complaint, a Google spokesperson sent TechCrunch the following statement:

Location History is turned off by default, and you can edit, delete, or pause it at any time. If it’s on, it helps improve services like predicted traffic on your commute. If you pause it, we make clear that — depending on your individual phone and app settings — we might still collect and use location data to improve your Google experience. We enable you to control location data in other ways too, including in a different Google setting called Web & App Activity, and on your device. We’re constantly working to improve our controls, and we’ll be reading this report closely to see if there are things we can take on board.

Earlier this year the Norwegian watchdog produced a damning report calling out dark pattern design tricks being deployed by Google and Facebook meant to manipulate users by nudging them toward “privacy intrusive options.” It also examined Microsoft’s consent flows, but judged the company to be leaning less heavily on such unfair tactics.

Among the underhand techniques that the Google-targeted GDPR complaint, which draws on the earlier report, calls out are allegations of deceptive click-flow, with the groups noting that a “location history” setting can be enabled during Android set-up without a user being aware of it; key settings being both buried in menus (hidden) and enabled by default; users being presented at the decision point with insufficient and misleading information; repeat nudges to enable location tracking even after a user has previously turned it off; and the bundling of “invasive location tracking” with other unrelated Google services, such as photo sorting by location.

GDPR remains in the early implementation phase — just six months since the regulation came into force across Europe. But a large chunk of the first wave of complaints has been focused on consent, according to Europe’s data protection supervisor, who also told us in October that more than 42,000 complaints had been lodged in total since the regulation came into force.

Where Google is concerned, the location complaint is by no means the only GDPR — or GDPR consent-related — complaint it’s facing.

Another complaint, filed back in May also by a consumer-focused organization, took aim at what it dubbed the use of “forced consent” by Google and Facebook — pointing out that the companies were offering users no choice but to have their personal data processed to make use of certain services, yet the GDPR requires consent to be freely given.


Facebook, Google face first GDPR complaints over ‘forced consent’


After two years coming down the pipe at tech giants, Europe’s new privacy framework, the General Data Protection Regulation (GDPR), is now being applied — and long time Facebook privacy critic, Max Schrems, has wasted no time in filing four complaints relating to (certain) companies’ ‘take it or leave it’ stance when it comes to consent.

The complaints have been filed on behalf of (unnamed) individual users — with one filed against Facebook; one against Facebook-owned Instagram; one against Facebook-owned WhatsApp; and one against Google’s Android.

Schrems argues that the companies are using a strategy of “forced consent” to continue processing the individuals’ personal data — when in fact the law requires that users be given a free choice, unless the processing is strictly necessary for provision of the service. (And, well, Facebook claims its core product is social networking — rather than farming people’s personal data for ad targeting.)

“It’s simple: Anything strictly necessary for a service does not need consent boxes anymore. For everything else users must have a real choice to say ‘yes’ or ‘no’,” Schrems writes in a statement.

“Facebook has even blocked accounts of users who have not given consent,” he adds. “In the end users only had the choice to delete the account or hit the “agree”-button — that’s not a free choice, it more reminds of a North Korean election process.”

We’ve reached out to all the companies involved for comment and will update this story with any response. Update: Facebook has now sent the following statement, attributed to its chief privacy officer, Erin Egan: “We have prepared for the past 18 months to ensure we meet the requirements of the GDPR. We have made our policies clearer, our privacy settings easier to find and introduced better tools for people to access, download, and delete their information. Our work to improve people’s privacy doesn’t stop on May 25th. For example, we’re building Clear History: a way for everyone to see the websites and apps that send us information when you use them, clear this information from your account, and turn off our ability to store it associated with your account going forward.”

Schrems most recently founded a not-for-profit digital rights organization to focus on strategic litigation around the bloc’s updated privacy framework, and the complaints have been filed via this crowdfunded NGO — which is called noyb (aka ‘none of your business’).

As we pointed out in our GDPR explainer, the provision in the regulation allowing for collective enforcement of individuals’ data rights is an important one, with the potential to strengthen the implementation of the law by enabling non-profit organizations such as noyb to file complaints on behalf of individuals — thereby helping to redress the power imbalance between corporate giants and consumer rights.

That said, the GDPR’s collective redress provision is a component that Member States can choose to derogate from, which helps explain why the first four complaints have been filed with data protection agencies in Austria, Belgium, France and Hamburg in Germany — regions that also have data protection agencies with a strong record of defending privacy rights.

Given that the Facebook companies involved in these complaints have their European headquarters in Ireland it’s likely the Irish data protection agency will get involved too. And it’s fair to say that, within Europe, Ireland does not have a strong reputation as a data protection rights champion.

But the GDPR allows for DPAs in different jurisdictions to work together in instances where they have joint concerns and where a service crosses borders — so noyb’s action looks intended to test this element of the new framework too.

Under the penalty structure of GDPR, major violations of the law can attract fines as large as 4% of a company’s global revenue which, in the case of Facebook or Google, implies they could be on the hook for more than a billion euros apiece — if they are deemed to have violated the law, as the complaints argue.
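For a sense of scale, the GDPR cap for the most serious infringements is the higher of €20 million or 4% of worldwide annual turnover; plugging in approximate 2017 revenue figures (rough conversions, for illustration only) bears out the “more than a billion euros apiece” point:

```python
def gdpr_max_fine_eur(global_annual_turnover_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements: the higher of
    20M euros or 4% of worldwide annual turnover."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# Approximate 2017 turnover, very roughly converted to euros for illustration.
print(gdpr_max_fine_eur(35e9))   # Facebook: ~1.4 billion euros
print(gdpr_max_fine_eur(95e9))   # Google/Alphabet: ~3.8 billion euros
```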

That said, given how freshly fixed in place the rules are, some EU regulators may well tread softly on the enforcement front — at least in the first instances, to give companies some benefit of the doubt and/or a chance to make amends to come into compliance if they are deemed to be falling short of the new standards.

However, in instances where companies themselves appear to be attempting to deform the law with a willfully self-serving interpretation of the rules, regulators may feel they need to act swiftly to nip any disingenuousness in the bud.

“We probably will not immediately have billions of penalty payments, but the corporations have intentionally violated the GDPR, so we expect a corresponding penalty under GDPR,” writes Schrems.

Only yesterday, for example, Facebook founder Mark Zuckerberg — speaking in an on stage interview at the VivaTech conference in Paris — claimed his company hasn’t had to make any radical changes to comply with GDPR, and further claimed that a “vast majority” of Facebook users are willingly opting in to targeted advertising via its new consent flow.

“We’ve been rolling out the GDPR flows for a number of weeks now in order to make sure that we were doing this in a good way and that we could take into account everyone’s feedback before the May 25 deadline. And one of the things that I’ve found interesting is that the vast majority of people choose to opt in to make it so that we can use the data from other apps and websites that they’re using to make ads better. Because the reality is if you’re willing to see ads in a service you want them to be relevant and good ads,” said Zuckerberg.

He did not mention that the dominant social network does not offer people a free choice on accepting or declining targeted advertising. The new consent flow Facebook revealed ahead of GDPR only offers the ‘choice’ of quitting Facebook entirely if a person does not want to accept targeted advertising. Which, well, isn’t much of a choice given how powerful the network is. (Additionally, it’s worth pointing out that Facebook continues tracking non-users — so even deleting a Facebook account does not guarantee that Facebook will stop processing your personal data.)

Asked about how Facebook’s business model will be affected by the new rules, Zuckerberg essentially claimed nothing significant will change — “because giving people control of how their data is used has been a core principle of Facebook since the beginning”.

“The GDPR adds some new controls and then there’s some areas that we need to comply with but overall it isn’t such a massive departure from how we’ve approached this in the past,” he claimed. “I mean I don’t want to downplay it — there are strong new rules that we’ve needed to put a bunch of work into making sure that we complied with — but as a whole the philosophy behind this is not completely different from how we’ve approached things.

“In order to be able to give people the tools to connect in all the ways they want and build community a lot of philosophy that is encoded in a regulation like GDPR is really how we’ve thought about all this stuff for a long time. So I don’t want to understate the areas where there are new rules that we’ve had to go and implement but I also don’t want to make it seem like this is a massive departure in how we’ve thought about this stuff.”

Zuckerberg faced a range of tough questions on these points from the EU parliament earlier this week. But he avoided answering them in any meaningful detail.

So EU regulators are essentially facing a first test of their mettle — i.e. whether they are willing to step up and defend the line of the law against big tech’s attempts to reshape it in their business model’s image.

Privacy laws are nothing new in Europe but robust enforcement of them would certainly be a breath of fresh air. And now at least, thanks to GDPR, there’s a penalties structure in place to provide incentives as well as teeth, and spin up a market around strategic litigation — with Schrems and noyb in the vanguard.

Schrems also makes the point that small startups and local companies are less able to use the kind of strong-arm ‘take it or leave it’ tactics that big tech can unilaterally apply to extract ‘consent’, as a consequence of the reach and power of its platforms — arguing there’s an underlying competition concern that GDPR could also help to redress.

“The fight against forced consent ensures that the corporations cannot force users to consent,” he writes. “This is especially important so that monopolies have no advantage over small businesses.”


IoT redux… this time, it’s personal


As we advance toward 2017, a quick analysis of our connected landscape indicates that the IoT is taking a short break from destroying the internet and is now letting Mark Zuckerberg take a shot at it. All joking aside, the process of rapidly restructuring our world via technology — and functioning in the flux — has become our new normal. More of the same is on tap in 2017.
