
UK report blasts Huawei for network security incompetence


The latest report by a UK oversight body set up to evaluate Chinese networking giant Huawei’s approach to security has dialled up pressure on the company, giving a damning assessment of what it describes as “serious and systematic defects” in its software engineering and cyber security competence.

The report nonetheless falls short of calling for an outright ban on Huawei equipment in domestic networks — an option U.S. President Trump continues dangling across the pond.

The report, prepared for the National Security Advisor of the UK by the Huawei Cyber Security Evaluation Centre (HCSEC) Oversight Board, also identifies new “significant technical issues” which it says lead to new risks for UK telecommunications networks using Huawei kit.

The HCSEC was set up by Huawei in 2010, under what the oversight board couches as “a set of arrangements with the UK government”, to provide information to state agencies on its products and strategies in order that security risks could be evaluated.

And last year, under pressure from UK security agencies concerned about technical deficiencies in its products, Huawei pledged to spend $2BN to try to address long-running concerns about its products in the country.

But the report throws doubt on its ability to address UK concerns — with the board writing that it has “not yet seen anything to give it confidence in Huawei’s capacity to successfully complete the elements of its transformation programme that it has proposed as a means of addressing these underlying defects”.

So it sounds like $2BN isn’t going to be nearly enough to fix Huawei’s security problem in just one European country.

The board also writes that it will require “sustained evidence” of better software engineering and cyber security “quality”, verified by HCSEC and the UK’s National Cyber Security Centre (NCSC), if there’s to be any possibility of it reaching a different assessment of the company’s ability to reboot its security credentials.

Another damning assessment contained in the report is that Huawei has made “no material progress” on issues raised by last year’s report.

All the issues identified by the security evaluation process relate to “basic engineering competence and cyber security hygiene”, which the board notes gives rise to vulnerabilities capable of being exploited by “a range of actors”.

It adds that the NCSC does not believe the defects found are a result of Chinese state interference.

This year’s report is the fifth the oversight board has produced since it was established in 2014, and it comes at a time of acute scrutiny for Huawei, as 5G network rollouts ramp up globally — pushing governments to confront head-on the suspicions attached to the Chinese giant and consider whether to trust it with critical next-gen infrastructure.

“The Oversight Board advises that it will be difficult to appropriately risk-manage future products in the context of UK deployments, until the underlying defects in Huawei’s software engineering and cyber security processes are remediated,” the report warns in one of several key conclusions that make very uncomfortable reading for Huawei.

“Overall, the Oversight Board can only provide limited assurance that all risks to UK national security from Huawei’s involvement in the UK’s critical networks can be sufficiently mitigated long-term,” it adds in summary.

Reached for its response to the report, a Huawei UK spokesperson sent us a statement in which it describes the $2BN earmarked for security improvements related to UK products as an “initial budget”.

It writes:

The 2019 OB [oversight board] report details some concerns about Huawei’s software engineering capabilities. We understand these concerns and take them very seriously. The issues identified in the OB report provide vital input for the ongoing transformation of our software engineering capabilities. In November last year Huawei’s Board of Directors issued a resolution to carry out a companywide transformation programme aimed at enhancing our software engineering capabilities, with an initial budget of US$2BN.

A high-level plan for the programme has been developed and we will continue to work with UK operators and the NCSC during its implementation to meet the requirements created as cloud, digitization, and software-defined everything become more prevalent. To ensure the ongoing security of global telecom networks, the industry, regulators, and governments need to work together on higher common standards for cyber security assurance and evaluation.

Seeking to salvage something positive from the report’s savaging, Huawei suggests it demonstrates the continued effectiveness of the HCSEC as a structure for evaluating and mitigating security risk — flagging the board’s description of the oversight arrangements as “arguably the toughest and most rigorous in the world”, which Huawei claims shows at least there hasn’t been any increase in the vulnerability of UK networks since the last report.

Though the report does identify new issues that open up fresh problems — albeit the underlying issues were presumably there last year too, just lying undiscovered.

The board’s withering assessment certainly amps up the pressure on Huawei, which has been aggressively battling U.S.-led suspicion of its kit — claiming in a telecoms conference speech last month that “the U.S. security accusation of our 5G has no evidence”, for instance.

At the same time it has been appealing for the industry to work together to come up with collective processes for evaluating the security and trustworthiness of network kit.

And earlier this month it opened another cyber security transparency center — this time at the heart of Europe in Brussels, where the company has been lobbying policymakers to help establish security standards to foster collective trust. Though there’s little doubt that’s a long game.

Meanwhile, critics of Huawei can now point to impatience rising in the U.K., despite comments by the head of the NCSC, Ciaran Martin, last month — who said then that security agencies believe the risk of using Huawei kit can be managed, suggesting the government won’t push for an outright ban.

The report does not literally overturn that view but it does blast out a very loud and alarming warning about the difficulty for UK operators to “appropriately” risk-manage what’s branded defective and vulnerable Huawei kit. Including flagging the risk of future products — which the board suggests will be increasingly complex to manage. All of which could well just push operators to seek alternatives.

On the mitigation front, the board writes that — “in extremis” — the NCSC could order Huawei to carry out specific fixes for equipment currently installed in the UK. Though it also warns that such a step would be difficult, and could for example require hardware replacement which may not mesh with operators’ “natural” asset management and upgrade cycles, emphasizing it does not offer a sustainable solution to the underlying technical issues.

“Given both the shortfalls in good software engineering and cyber security practice and the currently unknown trajectory of Huawei’s R&D processes through their announced transformation plan, it is highly likely that security risk management of products that are new to the UK or new major releases of software for products currently in the UK will be more difficult,” the board writes in a concluding section discussing the UK national security risk.

“On the basis of the work already carried out by HCSEC, the NCSC considers it highly likely that there would be new software engineering and cyber security issues in products HCSEC has not yet examined.”

It also describes the number and severity of vulnerabilities plus architectural and build issues discovered by a relatively small team in the HCSEC as “a particular concern”.

“If an attacker has knowledge of these vulnerabilities and sufficient access to exploit them, they may be able to affect the operation of the network, in some cases causing it to cease operating correctly,” it warns. “Other impacts could include being able to access user traffic or reconfiguration of the network elements.”

In another section on mitigating risks of using Huawei kit, the board notes that “architectural controls” in place in most UK operators can limit the ability of attackers to exploit any vulnerable network elements not explicitly exposed to the public Internet — adding that such controls, combined with good opsec generally, will “remain critically important in the coming years to manage the residual risks caused by the engineering defects identified”.

In other highlights from the report the board does have some positive things to say, writing that an NCSC technical review of its capabilities showed improvements in 2018, while another independent audit of HCSEC’s ability to operate independently of Huawei HQ once again found “no high or medium priority findings”.

“The audit report identified one low-rated finding, relating to delivery of information and equipment within agreed Service Level Agreements. Ernst & Young concluded that there were no major concerns and the Oversight Board is satisfied that HCSEC is operating in line with the 2010 arrangements between HMG and the company,” it further notes.

Last month the European Commission said it was preparing to step in to ensure a “common approach” across the European Union where 5G network security is concerned — warning of the risk of fragmentation across the single market. Though it has so far steered clear of any bans.

Earlier this week it issued a set of recommendations for Member States, combining legislative and policy measures to assess 5G network security risks and help strengthen preventive measures.

Among the operational measures it suggests Member States take is to complete a national risk assessment of 5G network infrastructures by the end of June 2019, and follow that by updating existing security requirements for network providers — including conditions for ensuring the security of public networks.

“These measures should include reinforced obligations on suppliers and operators to ensure the security of the networks,” it recommends. “The national risk assessments and measures should consider various risk factors, such as technical risks and risks linked to the behaviour of suppliers or operators, including those from third countries. National risk assessments will be a central element towards building a coordinated EU risk assessment.”  

At an EU level the Commission said Member States should share information on network security, saying this “coordinated work should support Member States’ actions at national level and provide guidance to the Commission for possible further steps at EU level” — leaving the door open for further action.

While the EU’s executive body has not pushed for a pan-EU ban on any 5G vendors it did restate Member States’ right to exclude companies from their markets for national security reasons if they fail to comply with their own standards and legal framework.

Powered by WPeMatico

Law enforcement needs to protect citizens and their data

Robert Anderson
Contributor

Robert Anderson served for 21 years in the FBI, retiring as executive assistant director of the Criminal, Cyber, Response and Services Branch. He is currently an advisor at The Chertoff Group and the chief executive of Cyber Defense Labs.

Over the past several years, the law enforcement community has grown increasingly concerned about the conduct of digital investigations as technology providers enhance the security protections of their offerings—what some of my former colleagues refer to as “going dark.”

Data once readily accessible to law enforcement is now encrypted, protecting consumers’ data from hackers and criminals. However, these efforts have also had what Android’s security chief called the “unintended side effect” of also making this data inaccessible to law enforcement. Consequently, many in the law enforcement community want the ability to compel providers to allow them to bypass these protections, often citing physical and national security concerns.
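The mechanics are easy to illustrate. The toy Python sketch below (XOR with a random one-time key, not a real cipher, and not how any actual product works) shows why data encrypted with a key that only the user holds is unreadable to everyone else, provider and investigator alike:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption': applying the same key twice restores the data."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))        # random one-time key, held by the user
wrong_key = secrets.token_bytes(len(message))  # what anyone without the key has: a guess

ciphertext = xor_cipher(message, key)

# Whoever holds the key recovers the plaintext...
assert xor_cipher(ciphertext, key) == message
# ...but a provider that never stored the key can only hand over noise.
assert xor_cipher(ciphertext, wrong_key) != message
```

The point of the sketch is the asymmetry: once the provider designs its product so that only the customer's device holds the key, there is nothing useful left for the provider to disclose, with or without a warrant.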

I know first-hand the challenges facing law enforcement, but these concerns must be addressed in a broader security context, one that takes into consideration the privacy and security needs of industry and our citizens in addition to those raised by law enforcement.

Perhaps the best example of the law enforcement community’s preferred solution is Australia’s recently passed Assistance and Access Bill, an overly broad law that allows Australian authorities to compel service providers, such as Google and Facebook, to re-engineer their products and bypass encryption protections to allow law enforcement to access customer data.

While the bill includes limited restrictions on law enforcement requests, the vague definitions and concentrated authorities give the Australian government sweeping powers that ultimately undermine the security and privacy of the very citizens they aim to protect. Major tech companies, such as Apple and Facebook, agree and have been working to resist the Australian legislation and a similar bill in the UK.

Image: Bryce Durbin/TechCrunch

Newly created encryption backdoors and work-arounds will become the target of criminals, hackers, and hostile nation states, offering new opportunities for data compromise and attack through the newly created tools and the flawed code that inevitably accompanies some of them. These vulnerabilities undermine providers’ efforts to secure their customers’ data, creating new and powerful vulnerabilities even as companies struggle to address existing ones.

And these vulnerabilities would not only impact private citizens, but governments as well, including services and devices used by the law enforcement and national security communities. This comes amidst government efforts to significantly increase corporate responsibility for the security of customer data through laws such as the EU’s General Data Protection Regulation. Who will consumers, or the government, blame when a government-mandated backdoor is used by hackers to compromise user data? Who will be responsible for the damage?

Companies have a fiduciary responsibility to protect their customers’ data, which not only includes personally identifiable information (PII), but their intellectual property, financial data, and national security secrets.

Worse, the vulnerabilities created under laws such as the Assistance and Access Bill would be subject almost exclusively to the decisions of law enforcement authorities, leaving companies unable to make their own decisions about the security of their products. How can we expect a company to protect customer data when their most fundamental security decisions are out of their hands?


Image: Bryce Durbin/TechCrunch

Thus far law enforcement has chosen to downplay, if not ignore, these concerns—focusing singularly on getting the information they need. This is understandable—a law enforcement officer should use every power available to them to solve a case, just as I did when I served as a State Trooper and as an FBI Special Agent, including when I served as Executive Assistant Director (EAD) overseeing the San Bernardino terror attack case during my final months in 2015.

Decisions regarding these types of sweeping powers should not and cannot be left solely to law enforcement. It is up to the private sector, and our government, to weigh competing security and privacy interests. Our government cannot sacrifice the ability of companies and citizens to properly secure their data and systems in the name of often vague physical and national security concerns, especially when there are other ways to remedy the concerns of law enforcement.

That said, these security responsibilities cut both ways. Recent data breaches demonstrate that many companies have a long way to go to adequately protect their customers’ data. Companies cannot reasonably cry foul over the negative security impacts of proposed law enforcement data access while continuing to neglect and undermine the security of their own users’ data.

Providers and the law enforcement community should be held to robust security standards that ensure the security of our citizens and their data—we need legal restrictions on how government accesses private data and on how private companies collect and use the same data.

There may not be an easy answer to the “going dark” issue, but it is time for all of us, in government and the private sector, to understand that enhanced data security through properly implemented encryption and data use policies is in everyone’s best interest.

The “extraordinary” access sought by law enforcement cannot exist in a vacuum—it will have far-reaching and significant impacts well beyond the narrow confines of a single investigation. It is time for a serious conversation between law enforcement and the private sector to recognize that their security interests are two sides of the same coin.


Civil servant who watched porn at work blamed for infecting a US government network with malware


A U.S. government network was infected with malware thanks to one employee’s “extensive history” of watching porn on his work computer, investigators have found.

The audit, carried out by the U.S. Department of the Interior’s inspector general, found that a U.S. Geological Survey (USGS) network at the EROS Center, a satellite imaging facility in South Dakota, was infected after an unnamed employee visited thousands of porn pages that contained malware, which downloaded to his laptop and “exploited the USGS’ network.” Investigators found that many of the porn images were “subsequently saved to an unauthorized USB device and personal Android cell phone,” which was connected to the employee’s government-issued computer.

Investigators found that his Android cell phone “was also infected with malware.”

The findings were made public in a report earlier this month but buried on the U.S. government’s oversight website and went largely unreported.

It’s bad enough in this day and age that a government watchdog has to remind civil servants to not watch porn at work — let alone on their work laptop. The inspector general didn’t say what the employee’s fate was, but ripped into the Department of the Interior’s policies for letting him get that far in the first place.

“We identified two vulnerabilities in the USGS’ IT security posture: web-site access and open USB ports,” the report said.

There is a (slightly) bright side. The EROS Center, which monitors and archives images of the planet’s land surface, doesn’t operate any classified networks, a spokesperson for Interior’s inspector general told TechCrunch in an email, ruling out any significant harm to national security. But the spokesperson wouldn’t say what kind of malware was used — only that, “the malware helps enable data exfiltration and is also associated with ransomware attacks.”

Investigators recommended that USGS enforce a “strong blacklist policy” of known unauthorized websites and “regularly monitor employee web usage history.”
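For illustration only, the matching logic behind such a blacklist might look like the sketch below (the domain names are placeholders; a real deployment would enforce this at the proxy or DNS layer against categorized threat-intelligence feeds rather than a hard-coded set):

```python
from urllib.parse import urlparse

# Hypothetical denylist of known-bad domains (placeholders, not real feeds).
BLOCKED_DOMAINS = {"malware-example.test", "phishing-example.test"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or any parent domain of it, is denylisted."""
    host = (urlparse(url).hostname or "").lower()
    # Match the exact host and any subdomain of a blocked domain.
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

assert is_blocked("http://cdn.malware-example.test/payload.js")
assert not is_blocked("https://www.usgs.gov/")
```

Subdomain matching matters here: blocking only exact hostnames is trivially bypassed by serving the same content from a new subdomain.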

The report also said the agency should lock down its USB drive policy, restricting employees from using removable media on government devices, but it’s not known if the recommendations have yet gone into place. USGS did not return a request for comment.


Smart home makers hoard your data, but won’t say if the police come for it


A decade ago, it was almost inconceivable that nearly every household item could be hooked up to the internet. These days, it’s near impossible to avoid smart home gadgets, and they’re vacuuming up a ton of new data that we’d never normally think about.

Thermostats know the temperature of your house, and smart cameras and sensors know when someone’s walking around your home. Smart assistants know what you’re asking for, and smart doorbells know who’s coming and going. And thanks to the cloud, that data is available to you from anywhere — you can check in on your pets from your phone or make sure your robot vacuum cleaned the house.

Because the data is stored by or accessible to the smart home tech makers, law enforcement and government agencies have increasingly sought data from the companies to solve crimes.

And device makers won’t say if your smart home gadgets have been used to spy on you.

For years, tech companies have published transparency reports — a semi-regular disclosure of the number of demands or requests a company gets from the government for user data. Google was first in 2010. Other tech companies followed in the wake of Edward Snowden’s revelations that the government had enlisted tech companies’ aid in spying on their users. Even telcos, implicated in wiretapping and turning over Americans’ phone records, began to publish their figures to try to rebuild their reputations.

As the smart home revolution began to thrive, police saw new opportunities to obtain data where they hadn’t before. Police sought Echo data from Amazon to help solve a murder. Fitbit data was used to charge a 90-year-old man with the murder of his stepdaughter. And recently, Nest was compelled to turn over surveillance footage that led to gang members pleading guilty to identity theft.

Yet, Nest — a division of Google — is the only major smart home device maker that has published how many data demands it receives.

As first noted by Forbes last week, Nest’s little-known transparency report doesn’t reveal much — only that it’s turned over user data about 300 times since mid-2015 on over 500 Nest users. Nest also said it hasn’t to date received a secret order for user data on national security grounds, such as in cases of investigating terrorism or espionage. Nest’s transparency report is woefully vague compared to some of the more detailed reports by Apple, Google and Microsoft, which break out their data requests by lawful request, by region and often by the kind of data the government demands.

As Forbes said, “a smart home is a surveilled home.” But at what scale?

We asked some of the most well-known smart home makers on the market if they plan to release a transparency report, or disclose the number of demands they receive for data from their smart home devices.

For the most part, we received fairly dismal responses.

What the big four tech giants said

Amazon did not respond to requests for comment when asked if it will break out the number of demands it receives for Echo data, but a spokesperson told me last year that while its reports include Echo data, it would not break out those figures.

Facebook said that its transparency report section will include “any requests related to Portal,” its new hardware screen with a camera and a microphone. Although the device is new, a spokesperson did not say whether the company will break out the hardware figures separately.

Google pointed us to Nest’s transparency report but did not comment on its own efforts in the hardware space — notably its Google Home products.

And Apple said that there’s no need to break out its smart home figures — such as its HomePod — because there would be nothing to report. The company said user requests made to HomePod are given a random identifier that cannot be tied to a person.
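Apple hasn’t published the mechanism, but the general technique, issuing a fresh random identifier for each request rather than reusing a stable account or device ID, can be sketched as follows (a hypothetical illustration, not Apple’s implementation):

```python
import uuid

def new_request_id() -> str:
    """Issue a random identifier for a single voice request.

    Because the ID is generated fresh each time, rather than derived from
    an account or device ID, there is no stable identifier that could
    later be tied back to a person.
    """
    return str(uuid.uuid4())

a, b = new_request_id(), new_request_id()
assert a != b        # no linkable identity across requests
assert len(a) == 36  # standard UUID text form
```

The practical consequence is the one Apple describes: a demand for “all requests from this user” has nothing to match against, because no two requests share an identifier.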

What the smaller but notable smart home players said

August, a smart lock maker, said it “does not currently have a transparency report and we have never received any National Security Letters or orders for user content or non-content information under the Foreign Intelligence Surveillance Act (FISA),” but did not comment on the number of subpoenas, warrants and court orders it receives. “August does comply with all laws and when faced with a court order or warrant, we always analyze the request before responding,” a spokesperson said.

Roomba maker iRobot said it “has not received any demands from governments for customer data,” but wouldn’t say if it planned to issue a transparency report in the future.

Both Arlo, the former Netgear smart home division, and Signify, formerly Philips Lighting, said they do not have transparency reports. Arlo didn’t comment on its future plans, and Signify said it has no plans to publish one. 

Ring, a smart doorbell and security device maker, did not answer our questions on why it doesn’t have a transparency report, but said it “will not release user information without a valid and binding legal demand properly served on us” and that Ring “objects to overbroad or otherwise inappropriate demands as a matter of course.” When pressed, a spokesperson said it plans to release a transparency report in the future, but did not say when.

Spokespeople for Honeywell and Canary — both of which have smart home security products — did not comment by our deadline.

And Samsung, a maker of smart sensors, trackers and internet-connected televisions and other appliances, did not respond to a request for comment.

Only Ecobee, a maker of smart switches and sensors, said it plans to publish its first transparency report “at the end of 2018.” A spokesperson confirmed that, “prior to 2018, Ecobee had not been requested nor required to disclose any data to government entities.”

All in all, that paints a fairly dire picture for anyone worried that the gadgets in their home, when they aren’t working for them, could be helping the government.

As helpful and useful as smart home gadgets can be, few fully understand the breadth of data that the devices collect — even when we’re not using them. Your smart TV may not have a camera to spy on you, but it knows what you’ve watched and when — which police used to secure a conviction of a sex offender. Even data from when a murder suspect pushed the button on his home alarm key fob was enough to help convict someone of murder.

Two years ago, former U.S. director of national intelligence James Clapper said the government was looking at smart home devices as a new foothold for intelligence agencies to conduct surveillance. And it’s only going to become more common as the number of internet-connected devices grows. Gartner said more than 20 billion devices will be connected to the internet by 2020.

The chances that the government is spying on you through the internet-connected camera in your living room or your thermostat are slim — but it’s naive to think that it can’t.

But the smart home makers wouldn’t want you to know that. At least, most of them.


Tortuga Logic raises $2 million to build chip-level security systems


Tortuga Logic has raised $2 million in seed funding from Eclipse Ventures to help in its effort to maintain chip-level system security. Based in Palo Alto, the company plans to use the cash to build products that will find “lurking vulnerabilities” on computer hardware. The founders, Dr. Jason Oberg, Dr. Jonathan Valamehr, Professor Ryan Kastner of UC San Diego, and Professor… Read More


Apple says most vulnerabilities in Wikileaks docs are already patched


Wikileaks today published a trove of documents, allegedly taken from the CIA, that detail the government’s efforts to hack popular devices like iPhones, Android phones, and Samsung smart TVs. But Apple is pushing back against claims that the vulnerabilities the CIA hoarded for its devices were effective.
The documents, if they are indeed legitimate, include charts that detail iOS… Read More


This must be the year of mobile security


If I gave you my phone right now you’d be able to figure out a lot of stuff about me. If I didn’t unlock it you’d see some of the news I read, the apps I use, and even some of the messages I’ve gotten from my friends. You’d be able to see that my friend Rick just wrote “If she gets desperate enough, let me know?” which, if taken out of context, is pretty… Read More


Harvard Report Debunks Claim Surveillance Is “Going Dark”


Since the 2013 Snowden disclosures revealed the extent of government surveillance programs, it’s been a standard claim by intelligence agencies, seeking to justify their push for more powers, that their ability to track suspects using new technologies is under threat because of growing use of end-to-end encryption by technology companies. Read More


Pressure In Congress Grows For GPS Tracking Reform After Supreme Court Passes On Cell Phone Case


Senators and House representatives this week are calling on Congress to act on bills that would limit location tracking and phone surveillance after the Supreme Court decided not to hear a cell phone case earlier this week. The justices on Monday declined to review a federal court’s decision from earlier this year that police do not need a warrant to seize and search cell phone records… Read More
