Category Archives: Encryption

How to Get the Most Out of Your Smartphone’s Encryption – WIRED

You may not think much about encryption day to day, but it's the reason the FBI can't easily get at the data on the iPhones that come into its possession; it also means if someone steals your phone, they won't be able to get anything off it without the PIN code.

In terms of individual apps, it stops anyone from snooping on your WhatsApp and Signal conversations when they're in transit from one device to the other, and that includes anyone who works at WhatsApp or the Signal Foundation. In short, it makes it much, much harder for anyone to get at your photos, messages, documents, and everything else you've got stored on your phone. Here's how to make sure it's working for you.

iPhone Encryption

It was the 2014 release of iOS 8 that encrypted every iPhone back to the 4S by default. Much to the chagrin of various law enforcement agencies, that encryption has only gotten tougher over time.

Everything on an iPhone is locked down as soon as you set a PIN code, a Touch ID fingerprint, or a Face ID face. Your PIN, fingerprint, or face acts as the key to unlock the encryption, which is why you're able to read your messages and view your files as soon as your phone is unlocked.

This is also why you should never leave your phone lying around unlocked if you value the data on it. You can configure the screen lock on your iPhone by going to Face ID & Passcode (or Touch ID & Passcode) in the iOS Settings menu. If you go the passcode route, use at least a six-character alphanumeric code. Anything shorter, or using numbers only, is too easy for forensic devices to brute-force.
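For a rough sense of why length and character variety matter, consider the size of the search space an attacker must cover. The guesses-per-second figure below is a purely illustrative assumption, not a measured forensic-tool benchmark, so treat this as a sketch:

```python
# Compare lock-screen code search spaces. GUESSES_PER_SEC is an assumed,
# illustrative rate; actual iPhones also throttle attempts in hardware
# via the Secure Enclave, so real-world numbers differ.
def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible codes for a given alphabet and length."""
    return alphabet_size ** length

four_digit = keyspace(10, 4)   # 10,000 codes
six_digit = keyspace(10, 6)    # 1,000,000 codes
six_alnum = keyspace(62, 6)    # 56,800,235,584 codes (digits + mixed-case letters)

GUESSES_PER_SEC = 100  # assumption for a tool that bypasses retry limits
for name, n in [("4-digit PIN", four_digit),
                ("6-digit PIN", six_digit),
                ("6-char alphanumeric", six_alnum)]:
    days = n / GUESSES_PER_SEC / 86_400
    print(f"{name}: {n:,} codes, ~{days:,.1f} days to exhaust")
```

Even at that generous rate, a four-digit PIN falls in under two minutes, while the mixed-case alphanumeric space takes years to exhaust, which is the article's point.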

Encryption extends to backups of your iPhone made through Apple's own software too, whether that's on the web in iCloud, or in iTunes or Finder on a connected computer. (Tap your name at the top of the iOS Settings screen, then iCloud and iCloud Backup to set which one you're using.) You can choose to leave local iTunes or Finder backups unencrypted if you want, via the tick box labeled Encrypt local backup on the Summary or General tab.

iCloud backups are encrypted, but Apple can potentially get at them if needed.

However, there's a crucial distinction between data on your iPhone and data in your iCloud backups. While the latter are encrypted and thus protected against hackers, Apple does hold its own key to decrypt them and will pass the data on to law enforcement if forced to. Apple will also use that key to help you regain access to your backup if you get locked out. If that's a concern for you, keep your backups stored locally on a Windows or Mac laptop.

Android Encryption

The encryption picture used to be patchy for Android, but in the past three or four years most new Android smartphones, including the popular Samsung Galaxy and Google Pixel lines, have come with encryption enabled by default. You can check this under Advanced and then Encryption & Credentials on the Security page of Settings.

View original post here:
How to Get the Most Out of Your Smartphone's Encryption - WIRED

Options to End the End to End Encryption Debate – Infosecurity Magazine

It's a long-simmering disagreement that shows no sign of reaching a conclusion: law enforcement wants access to encrypted devices and messaging apps to fight crime. Tech companies say any system that allows for lawful access would instantly be attacked and put legitimate users in danger.

The latest spat between the FBI and Apple, over the locked devices of Mohammed Saeed Alshamrani, who is suspected of killing three people and injuring eight in a shooting spree on a Navy base in Pensacola, Florida on December 6, may have escalated the conflict, but it's unlikely to break the deadlock.

While the debate has been framed as a battle between privacy and security, the reason for the stalemate is that the conversation between law enforcement and tech firms has largely focused on one solution. With tech firms moving to stronger security and end-to-end encryption across messaging apps, the US Justice Department, along with the UK and Australia, has asked companies to build a key or backdoor into the design of their products that would allow law enforcement to unlock the phones of criminal suspects and access data, a move that Facebook says is impossible without weakening the strength of its encryption.

Surprisingly little thought, however, has been given to alternative ways of handling the challenge of thwarting criminals who hide behind encryption, while also preserving the privacy of legitimate users. So what are the alternatives, and is there a possibility that both sides could agree a middle ground?

Facebook has offered its own solution. Anxious to avoid a scenario where unbreakable encryption would effectively become illegal, Facebook says it should still be able to provide some critical location and account information.

This is because end-to-end encryption hides all content, but not all metadata of the conversation taking place. "We are building tools to look for signals and patterns of suspicious activity so that we can stop abusers from reaching potential victims," Facebook's Jay Sullivan told the Judiciary Committee last month.

The big fear, however, is that the 12 million referrals of child sexual abuse currently flagged by tech giants each year would be lost if Facebook implements its plans. Stronger encryption would limit the chances of identifying the abusers and rescuing the victims.

Then there is the argument that Facebook cannot be trusted, with critics pointing to numerous security breaches and the mass collection of users' personal data for financial gain.

Another option, put forward by the Carnegie Endowment for International Peace in a new paper called "Moving the Encryption Policy Conversation Forward," attempts to find some middle ground by separating data at rest from data in motion. It would prevent police from being able to carry out live surveillance of discussions that are in progress, but allow them, with a court-ordered search warrant, to see data at rest on mobile phones. This would include photos and messages already held on suspects' mobile phones, laptops and in cloud storage.

Exploring mobile phone data at rest seems to be the area most likely to kick-start the debate. New York County District Attorney Cyrus Vance is among the supporters of this approach and wants federal legislative action to push it through. His frustration stems from Apple's refusal to provide access to the phone of the San Bernardino shooter following the 2015 massacre.

Even so, many in the computer security community are skeptical, and the approach needs rigorous testing and debate to see if it's viable.

A third option isn't so much a backdoor as an emergency entrance. Here the government, the tech company and a neutral third party, such as a court, would each keep a fragment of a cryptographic key. Authorities would get sanctioned, pre-agreed access to messaging data, a bit like a bank safe deposit box, which can only be opened if both the bank and the customer are present.
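The fragment idea described here is known in cryptography as key splitting. A minimal sketch of the simplest all-or-nothing scheme, where the key is XORed with random fragments so that every party's fragment is required (the function names are illustrative, not taken from any actual proposal):

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, parties: int = 3) -> list[bytes]:
    """Split a key into fragments; ALL fragments are needed to rebuild it."""
    fragments = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    # The final fragment is chosen so that all fragments XOR back to the key.
    fragments.append(reduce(xor_bytes, fragments, key))
    return fragments

def recombine(fragments: list[bytes]) -> bytes:
    return reduce(xor_bytes, fragments)

key = secrets.token_bytes(32)
gov, company, court = split_key(key)      # one fragment per party
assert recombine([gov, company, court]) == key
assert recombine([gov, company]) != key   # any missing fragment yields noise
```

A deployable proposal would more likely use a threshold scheme such as Shamir's secret sharing, which tolerates a lost fragment; the hard part, as the article notes, is operating any of this secretly and at scale.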

According to Andersen Cheng, CEO of Post-Quantum, this option would significantly limit the ability of rogue actors to gain access, because no one authority would hold a master key that unlocks millions of accounts. Any concerns over government control can be allayed because the key management could be hosted by the social media companies, he says.

The only problem, and it's a big one, is that no one appears to have any idea how to create such a thing at scale that will remain secret. Tech companies are likely to rail against any technical steps that would fundamentally weaken communications.

Then there's the current solution. Each year, US police districts give millions of dollars to third-party commercial developers to access data saved to the cloud. As we know from recent scandals, undetectable spyware exploits vulnerabilities in software, allowing the buyer to access a device to read texts, pilfer address books, remotely switch on microphones and track the location of their target. There is no shortage of commercial surveillance companies that offer these services, and police reportedly used similar tools to access the phone of the San Bernardino shooter when Apple wouldn't help.

This kind of technology is playing an increasing part in helping government agencies all over the world prevent and investigate terrorism and crime and save lives: almost 50% of police investigations now involve cloud data.

The controversial Israeli firm NSO Group was involved in the capture of notorious drug lord El Chapo, and recently police in Western Europe said that NSO spyware was helping them track a terror suspect they feared was plotting an attack during Christmas.

Despite this, encrypted devices and messaging platforms continue to complicate crime investigations, not least because critical evidence is often only available on the device itself, not in the cloud. The tools provided by commercial companies can also be expensive, with police claiming that justice is sometimes unattainable for crime victims in areas where police departments do not have the means to decrypt phones.

Campaigners also point to potential abuses and a lack of transparency over the new forms of surveillance being used, and wider adoption of this approach will mean governments have to impose careful controls to prevent misuse and enforce oversight.

Whatever the solution to the current debate over encryption, it's unlikely to perfectly suit everyone. As the Carnegie Endowment report points out, cybersecurity advocates may have to accept some level of increased security risk, just as law enforcement advocates may not be able to access all the data they seek.

The first step, however, is recognizing that, with the lives and safety of so many at stake, lawmakers and tech firms should investigate every option.

Read the original post:
Options to End the End to End Encryption Debate - Infosecurity Magazine

Remember the Clipper chip? NSA’s botched backdoor-for-Feds from 1993 still influences today’s encryption debates – The Register

Enigma More than a quarter century after its introduction, the failed rollout of hardware deliberately backdoored by the NSA is still having an impact on the modern encryption debate.

Known as Clipper, the encryption chipset developed and championed by the US government only lasted a few years, from 1993 to 1996. However, the project remains a cautionary tale for security professionals and some policy-makers. In the latter case, however, the lessons appear to have been forgotten, Matt Blaze, McDevitt Professor of Computer Science and Law at Georgetown University in the US, told the USENIX Enigma security conference today in San Francisco.

In short, Clipper was an effort by the NSA to create a secure encryption system, aimed at telephones and other gear, that could be cracked by investigators if needed. It boiled down to a microchip that contained an 80-bit key burned in during fabrication, with a copy of the key held in escrow for g-men to use with proper clearance. Thus, any data encrypted by the chip could be decrypted as needed by the government. The Diffie-Hellman key exchange algorithm was used to exchange data securely between devices.
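As a refresher on the key exchange the article mentions, Diffie-Hellman lets two parties derive the same shared secret over a public channel: each side combines its own private value with the other's public value. A toy sketch with a deliberately tiny prime; real systems use standardized 2048-bit-plus groups or elliptic curves:

```python
import secrets

P = 2**61 - 1   # a Mersenne prime, far too small for real-world security
G = 5           # public generator, agreed in advance

def dh_keypair():
    private = secrets.randbelow(P - 2) + 1   # secret exponent
    public = pow(G, private, P)              # safe to send in the clear
    return private, public

alice_priv, alice_pub = dh_keypair()
bob_priv, bob_pub = dh_keypair()

# Each side raises the other's public value to its own private exponent.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret   # both derive the same shared key material
```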

Any key escrow mechanism is going to be designed from the same position of ignorance that Clipper was designed with in the 1990s

Not surprisingly, the project met stiff resistance from security and privacy advocates who, even in the early days of the worldwide web, saw the massive risk posed by the chipset: for one thing, if someone outside the US government was able to get hold of the keys or deduce them, Clipper-secured devices would be vulnerable to eavesdropping. The implementation was also buggy and lacking. Some of the people on the Clipper team were so alarmed they secretly briefed opponents of the project, alerting them to insecurities in the design, The Register understands.

Blaze, meanwhile, recounted how Clipper was doomed from the start, in part because of a hardware-based approach that was expensive and inconvenient to implement, and because technical vulnerabilities in the encryption and escrow method would be difficult to fix. Each chip cost about $30 when programmed, we note, and the relatively short keys could be broken by future computers.

In the years following Clipper's unveiling, a period dubbed the "first crypto wars," Blaze said, the chipset was snubbed and faded into obscurity while software-based encryption rose and led to the loosening of government restrictions on its sale and use. It helped that Blaze revealed in 1994 a major vulnerability [PDF] in the design of Clipper's escrow design, sealing its fate.

It is important to note, said Blaze, that the pace of innovation and the unpredictability of how technologies will develop make it incredibly difficult to legislate an approach to encryption and backdoors. In other words, security mechanisms made mandatory today, such as another escrow system, could be broken within a few years, by force or by exploiting flaws, leading to disaster.

This unpredictability in technological development, said Blaze, thus undercuts the entire concept of backdoors and key escrow. The FBI and Trump administration (and the Obama one before that) pushed hard for such a system but need to learn the lessons of history, Blaze opined.

"The FBI is the only organization on Earth complaining that computer security is too good," the Georgetown prof quipped.

"Any key escrow mechanism is going to be designed from the same position of ignorance that Clipper was designed with in the 1990s. We are going to be looking back at those engineering decisions ten years from now as being equally laughably wrong."

Daniel Weitzner, founding director of the MIT Internet Policy Research Initiative, said this problem is not lost on all governments trying to work out new encryption laws and policies in the 21st century. He sees a number of administrations trying to address the issue by bringing developers and telcos in on the process.

"What the legislators hear is a complicated problem that they don't know how to resolve," Weitzner noted. "Moving the debate to experts on one hand gets you down to details, but it is not necessarily easy."

View original post here:
Remember the Clipper chip? NSA's botched backdoor-for-Feds from 1993 still influences today's encryption debates - The Register

Why Public Wi-Fi is a Lot Safer Than You Think – EFF

If you follow security on the Internet, you may have seen articles warning you to "beware of public Wi-Fi networks" in cafes, airports, hotels, and other public places. But now, due to the widespread deployment of HTTPS encryption on most popular websites, advice to avoid public Wi-Fi is mostly out of date and applicable to far fewer people than it once was.

The advice stems from the early days of the Internet, when most communication was not encrypted. At that time, if someone could snoop on your network communications, for instance by sniffing packets from unencrypted Wi-Fi or by being the NSA, they could read your email. They could also steal your passwords or your login cookies and impersonate you on your favorite sites. This was widely accepted as a risk of using the Internet. Sites that used HTTPS on all pages were safe, but such sites were vanishingly rare.

However, starting in 2010 that all changed. Eric Butler released Firesheep, an easy-to-use demonstration of sniffing insecure HTTP to take over people's accounts. Site owners started to take note and realized they needed to implement HTTPS (the more secure, encrypted version of HTTP) for every page on their site. The timing was good: earlier that year, Google had turned on HTTPS by default for all Gmail users and reported that the costs to do so were quite low. Hardware and software had advanced to the point where encrypting web browsing was easy and cheap.

However, practical deployment of HTTPS across the whole web took a long time. One big obstacle was the difficulty for webmasters and site administrators of buying and installing a certificate (a small file required in order to set up HTTPS). EFF helped launch Let's Encrypt, which makes certificates available for free, and we wrote Certbot, the easiest way to get a free certificate from Let's Encrypt and install it.

Meanwhile, lots of site owners were changing their software and HTML in order to make the switch to HTTPS. There's been tremendous progress, and now 92% of web page loads from the United States use HTTPS. In other countries the percentage is somewhat lower, 80% in India, for example, but HTTPS still protects the large majority of pages visited. Sites with logins or sensitive data have been among the first to upgrade, so the vast majority of commercial, social networking, and other popular websites are now protected with HTTPS.

There are still a few small information leaks: HTTPS protects the content of your communications, but not the metadata. So when you visit HTTPS sites, anyone along the communication path, from your ISP to the Internet backbone provider to the site's hosting provider, can see the domain names (e.g. wikipedia.org) and when you visit them. But these parties can't see the pages you visit on those sites (e.g. wikipedia.org/controversial-topic), your login name, or messages you send. They can see the sizes of pages you visit and the sizes of files you download or upload. When you use a public Wi-Fi network, people within range of it could choose to listen in. They'd be able to see that metadata, just as your ISP could see it when you browse at home. If this is an acceptable risk for you, then you shouldn't worry about using public Wi-Fi.
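The split between what observers can and cannot see maps neatly onto the parts of a URL: the hostname leaks to the network (via DNS lookups and the TLS SNI field), while the path and query travel inside the encrypted channel. A small illustration:

```python
from urllib.parse import urlsplit

url = "https://wikipedia.org/controversial-topic?q=secret"
parts = urlsplit(url)

# Visible to on-path observers: the hostname, via DNS and TLS SNI.
visible_to_observer = parts.hostname

# Protected by TLS: path, query string, headers, cookies, and request body.
protected_by_tls = parts.path + ("?" + parts.query if parts.query else "")

print("observer sees:", visible_to_observer)   # wikipedia.org
print("TLS protects: ", protected_by_tls)      # /controversial-topic?q=secret
```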

Similarly, if there is software with known security bugs on your computer or phone, and those bugs are specifically exploitable only on the local network, you might be at somewhat increased risk. The best defense is to always keep your software up-to-date so it has the latest bug fixes.

What about the risk of governments scooping up signals from open public Wi-Fi that has no password? Governments that surveil people on the Internet often do it by listening in on upstream data, at the core routers of broadband providers and mobile phone companies. If that's the case, the same information is commonly visible to the government whether it sniffs the data from the air or from the wires.

In general, using public Wi-Fi is a lot safer than it was in the early days of the Internet. With the widespread adoption of HTTPS, most major websites will be protected by the same encryption regardless of how you connect to them.

There are plenty of things in life to worry about. You can cross public Wi-Fi off your list.

Read the original here:
Why Public Wi-Fi is a Lot Safer Than You Think - EFF

There is no legislation mandating encryption of private information – Kamloops This Week

While the fallout from the LifeLabs privacy breach continues to reverberate in the form of proposed class action lawsuits and patients still trying to determine if their personal medical information was accessed, the Office of the Information and Privacy Commissioner of B.C. has confirmed there is no legislation that mandates private information held by a company be encrypted.

Neither the Freedom of Information and Protection of Privacy Act (FIPPA), which applies to public bodies, nor the Personal Information Protection Act (PIPA), which applies to private organizations, specifically mentions encryption, the office confirmed in an email response to a query from KTW.

Personal information of up to 15 million LifeLabs patients, primarily in B.C. and Ontario, may have been accessed during a cyberattack on the company's computer systems in October. LifeLabs reported it to authorities on Nov. 1, but the breach was not made public until mid-December.

LifeLabs said it retained outside cybersecurity consultants to investigate and assist with restoring the security of its data.

While LifeLabs states on its website that its patient information is encrypted, company CEO Charles Brown told the CBC's Early Edition on Dec. 18 that he did not know if the information hacked was, indeed, encrypted.

Here is the text that can be found on the LifeLabs website: "Our security practices are designed to protect your personal information and prevent unauthorized access. Only authorized employees are permitted to access personal information and only when the access is necessary. Your information is protected using industry best practices, and all information is transmitted over secure, encrypted channels."

Section 30 of the Freedom of Information and Protection of Privacy Act states: "A public body must protect personal information in its custody or under its control by making reasonable security arrangements against such risks as unauthorized access, collection, use, disclosure or disposal."

Section 34 of the Personal Information Protection Act states: "An organization must protect personal information in its custody or under its control by making reasonable security arrangements to prevent unauthorized access, collection, use, disclosure, copying, modification or disposal or similar risks."

Noel Boivin, senior communications officer for the Office of the Information and Privacy Commissioner of B.C., said the department has the authority to issue legally binding orders to ensure organizations comply with those requirements.

"Decisions such as these are made based on the unique facts of each case," Boivin said. "Based on these requirements in both pieces of legislation, our office recommends encryption as a best practice."

The Office of the Information and Privacy Commissioner recommends organizations implement technical safeguards, including ensuring computers and networks are secure from intrusion by using firewalls, intrusion-detection software and antivirus software, and by encrypting personal information.

Boivin noted findings from previous investigation reports call for organizations to encrypt data on personal storage devices.

"Our guidance is that personal information should be encrypted in transit and at rest in order to protect against unauthorized access," said Caitlin Lemiski, the Office of the Information and Privacy Commissioner's director of policy. "The encryption, and key management, should be based on current industry-accepted standards for protecting data and should be reviewed regularly."

LifeLabs has four clinics in Kamloops: two downtown, one in Aberdeen and one in North Kamloops.

According to the company, hackers gained access to the computer system that held customer information from 2016 and earlier that could include names, addresses, email addresses, login user names and passwords, dates of birth, health card numbers and lab test results. The access was accompanied by a ransom demand, which LifeLabs paid.

LifeLabs set up a dedicated phone line and information on its website for those affected by the breach. To find out more, the public should go online to customernotice.lifelabs.com or contact LifeLabs at 1-888-918-0467.

In January 2013, patient information for 16,100 Kamloops-area residents was on a computer hard drive that went missing as it was being transferred by LifeLabs to Burnaby from Kamloops.

See the rest here:
There is no legislation mandating encryption of private information - Kamloops This Week

Apple Watch rewards, iCloud encryption, and WhatsApp hacks on the AppleInsider Podcast – AppleInsider

Feature

By Lester Victor Marks | Friday, January 24, 2020, 05:49 am PT (08:49 am ET)

AppleInsider editor Victor Marks and writer William Gallagher discuss:

We like reader email, so send us your comments and concerns!

The show is available on iTunes and your favorite podcast apps by searching for "AppleInsider." Click here to listen, subscribe, and don't forget to rate our show.

Listen to the embedded SoundCloud feed below:

Sponsors:

Masterclass - Get unlimited access to EVERY MasterClass, and as an AppleInsider listener, you get 15% off the Annual All-Access Pass! Go to masterclass.com/appleinsider.

CLEAR is the absolute best way to get through airport security. It works great with Pre-Check too! Right now, listeners of our show can get their first two months of CLEAR for FREE. Go to clearme.com/appleinsider and use code appleinsider.

Show notes:

Follow our hosts on Twitter: @vmarks and @wgallagher

Feedback and comments are always appreciated. Please contact the AppleInsider podcast at [emailprotected] and follow us on Twitter @appleinsider, plus Facebook and Instagram.

Those interested in sponsoring the show can reach out to us at [emailprotected].

See the rest here:
Apple Watch rewards, iCloud encryption, and WhatsApp hacks on the AppleInsider Podcast - AppleInsider

Apple Wanted the iPhone to Have End-to-End Encryption. Then the FBI Stepped In – Popular Mechanics

Apple had intended to offer customers end-to-end encryption of an entire device's data, which would then be uploaded to iCloud. But then the FBI stepped in and put the kibosh on those plans.

The problem, according to law enforcement: Fully locked-down iPhones could be a roadblock to investigations, like the probe into a Saudi Air Force officer who shot three people dead at a Pensacola, Florida naval base last month.

U.S. Attorney General William Barr publicly asked Apple to unlock the two iPhones the shooter had in his possession. The company eventually did hand over backups from his iCloud account, but the whole ordeal shone a light on the back-and-forth dialogue going on between the U.S. government and tech companies that disagree about whether or not end-to-end encryption should be allowed. Just last month, both Democratic and Republican senators considered legislation to ban end-to-end encryption, using unrecoverable evidence in crimes against children as an example.

Apple had been planning to introduce end-to-end encryption for over two years and had even told the FBI, according to a Reuters report that cited one current and three former Bureau officials, as well as one current and one former Apple employee. Shortly thereafter, the FBI's cybercrime agents and its operational technology division came out as staunchly opposed to those plans, because it would make it impossible for Apple to recover people's messages for use in investigations.

"Legal killed it, for reasons you can imagine," another former Apple employee told Reuters. "They decided they werent going to poke the bear anymore."

In this case, the bear is the government. In 2016, a nearly identical showdown between the FBI and Apple took place after the two parties got into a legal battle over access to an iPhone owned by a suspect in the San Bernardino, California mass shooting.

The nixed encryption plans are a loss for iPhone users because end-to-end encryption is more advanced than today's industry standard for security: basic encryption. Loads of companies use encryption, which basically scrambles the contents of a message or some other snippet of data, rendering it completely useless without the decryption key, which can unshuffle the jargon and restore the original.

Under this framework, a company usually holds the cryptographic encryption key, which means the data isn't truly safe if a government or hacker gets their hands on the key. End-to-end encryption, though, means only the, well, end computer, the one receiving the data, has the encryption key stored. In theory, that person's computer could still be hacked and the encryption key forfeited, but it greatly reduces those odds.

But that limitation on who has access to the encryption key is the very crux of law enforcement's issue with end-to-end encryption: If Apple doesn't have the encryption key to access backups of a person's iPhone on the cloud, then the government can't access that data either.

Still, it's not entirely clear that the government is to blame for this project being killed. It's entirely possible Apple didn't want to have to deal with the headache of its customers accidentally locking themselves out of their own data.

For the rest of the world's smartphone users who rely on the Android operating system, end-to-end encryption is an option. Back in October 2018, Google announced that customers could use a new capability that keeps backed-up data from their phones completely locked down, using a decryption key that's randomly generated on the user's phone and protected by their lock screen PIN, pattern, or passcode.

"By design, this means that no one (including Google) can access a user's backed-up application data without specifically knowing their passcode," the company wrote in a blog post. This end-to-end encryption offering is still available.

Read more here:
Apple Wanted the iPhone to Have End-to-End Encryption. Then the FBI Stepped In - Popular Mechanics

Amazon Engineer Leaked Private Encryption Keys. Outside Analysts Discovered Them in Minutes – Gizmodo

An Amazon Web Services (AWS) engineer last week inadvertently made public almost a gigabyte's worth of sensitive data, including their own personal documents as well as passwords and cryptographic keys to various AWS environments.

While these kinds of leaks are not unusual or special, what is noteworthy here is how quickly the employee's credentials were recovered by a third party, who, to the employee's good fortune perhaps, immediately warned the company.

On the morning of January 13, an AWS employee, identified as a DevOps Cloud Engineer on LinkedIn, committed nearly a gigabyte's worth of data to a personal GitHub repository bearing their own name. Roughly 30 minutes later, Greg Pollock, vice president of product at UpGuard, a California-based security firm, received a notification about a potential leak from a detection engine pointing to the repo.

An analyst began working to verify what specifically had triggered the alert. Around two hours later, Pollock was convinced the data had been committed to the repo inadvertently and might pose a threat to the employee, if not AWS itself. "In reviewing this publicly accessible data, I have come to the conclusion that data stemming from your company, of some level of sensitivity, is present and exposed to the public internet," he told AWS by email.

AWS responded gratefully about four hours later and the repo was suddenly offline.

Since UpGuard's analysts didn't test the credentials themselves, which would have been illegal, it's unclear what precisely they grant access to. An AWS spokesperson told Gizmodo on Wednesday that all of the files were personal in nature and unrelated to the employee's work. No customer data or company systems were exposed, they said.

At least some of the documents in the cache, however, are labeled "Amazon Confidential."

Alongside those documents are AWS and RSA key pairs, some of which are marked "mock" or "test." Others, however, are marked "admin" and "cloud." Another is labeled "rootkey," suggesting it provides privileged control of a system. Other passwords are connected to mail services. And there are numerous auth tokens and API keys for a variety of third-party products.

AWS did not provide Gizmodo with an on-the-record statement.

It is possible that GitHub would have eventually alerted AWS that this data was public. The site itself automatically scans public repositories for credentials issued by a specific list of companies, just as UpGuard was doing. Had GitHub been the one to detect the AWS credentials, it would have, hypothetically, alerted AWS. AWS would have then taken appropriate action, possibly by revoking the keys.

But not all of the credentials leaked by the AWS employee are detected by GitHub, which only looks for specific types of tokens issued by certain companies. The speed with which UpGuard's automated software was able to locate the keys also raises concerns about which other organizations have this capability; surely many of the world's intelligence agencies are among them.
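Scanners like GitHub's and UpGuard's work by pattern-matching known credential formats. Here is a toy version for the well-documented AWS access key ID format (the string "AKIA" followed by 16 uppercase alphanumerics); real scanners cover many provider-specific formats and validate candidate hits before alerting:

```python
import re

# AWS access key IDs follow a fixed, public format, which is what makes
# them detectable with a simple regular expression.
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def scan(text: str) -> list[str]:
    """Return every candidate AWS access key ID found in the text."""
    return AWS_KEY_ID.findall(text)

# AKIAIOSFODNN7EXAMPLE is Amazon's own documented example key ID.
sample = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"\npassword = "hunter2"'
print(scan(sample))   # ['AKIAIOSFODNN7EXAMPLE']
```

Note that a regex only finds credentials whose format it knows, which is exactly why GitHub's scanning would have missed some of the other tokens in this leak.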

GitHub's efforts to identify the leaked credentials its users upload (which began in earnest around five years ago) received scrutiny last year after a study at North Carolina State University (NCSU) unearthed over 100,000 repositories hosting API tokens and keys. (Notably, the researchers examined only 13 percent of all public repositories, which alone included billions of files.)

While Amazon access key IDs and auth tokens were among the data examined by the NCSU researchers, a majority of the leaked credentials were linked to Google services.

GitHub did not respond to a request for comment.

UpGuard says it chose to make the incident known to demonstrate the importance of early detection and underscore that cloud security is not invulnerable to human error.

"Amazon Web Services is the largest provider of public cloud services, claiming about half of the market share," Pollock said. "In 2019, a former Amazon employee allegedly stole over a hundred million credit applications from Capital One, illustrating the scale of potential data loss associated with insider threats at such large and central data processors."

In this case, Pollock added, there's no evidence that the engineer acted maliciously or that any customer data was affected. Rather, he said, the case illustrates the value of rapid data leak detection in preventing small accidents from becoming larger incidents.

Read more here:
Amazon Engineer Leaked Private Encryption Keys. Outside Analysts Discovered Them in Minutes - Gizmodo

Deployed 82nd Airborne unit told to use these encrypted messaging apps on government cell phones – Military Times

A brigade of paratroopers deployed in early January to the Middle East in the wake of mounting tensions with Iran has been asked by its leadership to use two encrypted messaging applications on government cell phones.

The use of the encrypted messaging applications Signal and Wickr by the 82nd Airborne's Task Force Devil underscores the complexity of security and operations for U.S. forces deployed to war zones where adversaries can exploit American communications systems, cell phones and the electromagnetic spectrum.

But it also raises questions as to whether the Department of Defense is scrambling to fill gaps in potential security vulnerabilities for American forces operating overseas by relying on encrypted messaging apps available for anyone to download in the civilian marketplace.

"All official communication on government cell phones within TF Devil has been recommended to use Signal or Wickr encrypted messaging apps," Maj. Richard Foote, a spokesman for the 1st Brigade Combat Team, told Military Times.

"These are the two apps recommended by our leadership, as they are encrypted and free for download and use," Foote said.

Foote added that there are no operational discussions via the apps and that an extra layer of security is provided because users must go through virtual private networks.

However, there are government transparency concerns with the use of encrypted messaging apps like Signal and Wickr, which feature auto-delete functions that erase messages after a set period of time. Electronic communications and text messages sent as part of official government business are part of the public record and should be accessible via a Freedom of Information Act request.

The Department of Defense did not respond to queries from Military Times regarding government record-keeping policies and whether Signal and Wickr have been audited for security flaws by the DoD. Military Times has reached out to the National Security Agency, and has yet to receive a response.


Operational planners and military commanders rely on government cell phones for basic tasks, from scheduling to daily muster, even when deployed overseas.

Foote told Military Times that there is no requirement for extensive use of cell phones for work communication for the deployed 82nd paratroopers.

"If cell phones are used, we have taken the best steps, readily available, to ensure the best security of our transmissions," Foote explained.

"To be clear, the term 'official communication' in this setting refers to coordination of assets, sharing of meeting time changes, etc. There is no operational discussion on these platforms," Foote said.

Adversaries like Iran, which boast robust cyber and electronic warfare capabilities, can glean much information from phone collections and basic text messages. That information could highlight daily patterns on an installation, or sudden shifts and changes in schedules that are potential indications of pending operations.

But Foote explained to Military Times that the 82nds government cell communications include an extra layer of security.

"When official business is being conducted via cell, it is done on the apps over VPN-protected [virtual private network] connections, systems reviewed and recommended by our Communications and Cyber sections," Foote said.

In 2016, Signal received a positive security review when it was audited by the International Association for Cryptologic Research.

"We have found no major flaws in the design," IACR said in its 2016 security audit of Signal.

A former military intelligence operator who has extensive experience working with the special operations community told Military Times that the Signal app was "very secure" with no known bugs.

He explained that the 82nd Airborne's reliance on the app for government cell communications wasn't necessarily an indication that the DoD was behind the curve on protecting cellphone security for deployed troops. The former intelligence operator said he believed the DoD was "just being lazy."

"Unfortunately, those apps are more secure than texting in the clear, which is more or less the alternative. Granted, if a hostile party has access to the handset, that encryption isn't particularly helpful," a former U.S. defense official told Military Times.

The former U.S. defense official, who spoke to Military Times on condition of anonymity because he was not authorized to speak on the record, said the DoD should use commercial applications as long as they are tested and meet security requirements.

"I don't have confidence that DoD could build a unique texting system with proper security protocols that would beat any commercial, off-the-shelf version," the former official said.

With regard to transparency and record-keeping requirements, Foote said he could not confirm at this time whether any personnel have Signal or Wickr settings that allow auto-delete of messages.

Military Times has not been able to confirm if Signal and Wickr have been audited for security flaws and vulnerabilities by the DoD.

Officials from Signal and Wickr did not immediately respond to requests for comment.

Original post:
Deployed 82nd Airborne unit told to use these encrypted messaging apps on government cell phones - Military Times

The FBI doesn’t need Apple to give it a backdoor to encryption, because it already has all the access it needs – Boing Boing

Once again, the FBI is putting pressure on Apple to help them break into the phone of a mass shooter. And once again, Apple has been largely resistant to the effort. Which is good, because a government having control over a private company that gives them secret backdoor access into people's personal technology devices is an authoritarian wet dream waiting to happen.

It also doesn't matter anyway, because as Reuters pointed out this week, Apple already buckled under FBI pressure a few years ago and cancelled its plans to add end-to-end encryption to all iPhone backups in iCloud:

The company said it turned over at least some data for 90% of the requests it received [from the FBI]. It turns over data more often in response to secret U.S. intelligence court directives, which sought content from more than 18,000 accounts in the first half of 2019, the most recently reported six-month period.

But what if the FBI wants access to someone's locked iPhone, and they haven't backed it up to iCloud? They still don't need Apple's help, because as with the San Bernardino shooting, there are plenty of third-party companies that can and will gladly solve the problem in exchange for money.

From OneZero:

Over the past three months, OneZero sent Freedom of Information Act (FOIA) requests to over 50 major police departments, sheriffs, and prosecutors around the country asking for information about their use of phone-cracking technology. Hundreds of documents from these agencies reveal that law enforcement in at least 11 states spent over $4 million in the last decade on devices and software designed to get around passwords and access information stored on phones.

[…]

The documents range from contracts, requests for proposals (RFPs), invoices for payments by law enforcement, quotes from forensic companies, and emails traded between officials discussing vendor approval. They suggest that most law enforcement agencies bought forensic investigation products from a small group of companies that include Cellebrite, Grayshift, Paraben, BlackBag, and MSAB. In addition to selling the software and hardware needed to unlock phones, these companies also charge thousands of dollars each year to upgrade the software in their products. In addition, their customers spend thousands on training sessions to teach personnel in their offices how to use the tools.

And perhaps that's the most frustrating thing about this whole scenario. The US government is always warning us about the authoritarian overreaches of surveillance states like China's, but really, it just wants to replicate them without feeling guilty. Meanwhile, the supposed innovations of free-market enterprise are providing the same opportunities for authoritarian surveillance capitalism, but, ya know, privately owned, so immune to any legal oversight or transparency, because America. Isn't that supposed to be the dream?

Exclusive: Apple dropped plan for encrypting backups after FBI complained [Joseph Menn / Reuters]

Exclusive: U.S. Cops Have Wide Access to Phone Cracking Software, New Documents Reveal [Michael Hayes / OneZero]

Image via the White House

No encrypted iCloud backups for you, citizen!



Here is the original post:
The FBI doesn't need Apple to give it a backdoor to encryption, because it already has all the access it needs - Boing Boing