RCMP wants to use AI to learn passwords in investigations, but experts warn of privacy risks

The RCMP want to use artificial-intelligence technology to obtain passwords so they can decrypt data seized during criminal investigations, prompting cybersecurity and civil-liberties experts to warn that the technology could put Canadians' privacy rights at risk.

Police in Canada are not allowed to compel people to reveal passwords. The force said this week that it is seeking potential partners to build a system powered by artificial intelligence that would ingest material seized during an investigation to figure out passwords for encrypted data.

Law-enforcement officials often say that the criminal use of encryption to obscure activities is an increasing problem as more sophisticated technology becomes available.

"But the courts have recognized personal devices as requiring heightened privacy expectations precisely because we live our lives on these devices," said Brenda McPhail, director of the Canadian Civil Liberties Association's privacy, technology and surveillance program.

Compelling people to decrypt devices would be unconstitutional, she said, adding that using technology to do so would amount to a workaround with significant implications for privacy rights. "The scanning technology the RCMP seeks could examine the content of these devices so deeply that a police force could gain knowledge of who we are and what we do. So the privacy invasion is crystal clear," Dr. McPhail said.

The decryption technology sought by the RCMP might not fall under Ottawa's directive on automated decision-making, which requires federal departments to assess the risk of using technology such as predictive models and artificial intelligence.

Yuan Stevens, the policy lead at Ryerson University's technology, cybersecurity and democracy program, said the RCMP's proposed system might be eligible for exemption because it provides an internal service for government, not an external one.

The federal Privacy Commissioner found earlier this year that the Department of National Defence skirted risk-assessment rules in a diversity recruitment campaign. "There's a pattern of agencies wanting to use AI and not wanting to be subject to oversight, and that might be problematic," Ms. Stevens said.

"Canadians have little reason to trust that this technology won't be used in ways that violate our Charter rights and fundamental freedoms, including the rights to privacy and anonymity online," she said.

The request for proposals seeks AI technology that would process a person's known passwords, web history and documents to determine potential passwords for encrypted data.
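The tender does not spell out how such a system would work, but the technique it describes resembles personalized password guessing: combining and mutating tokens already known about a person to build a targeted list of candidates. The short Python sketch below is purely illustrative, a minimal assumption about that general approach rather than the RCMP's actual design; the token list, mutation rules and suffixes are invented for the example.

```python
# Hypothetical sketch of personalized password guessing: build candidate
# passwords from tokens already known about a person (names, pets, dates).
# All rules and example tokens here are illustrative assumptions.
from itertools import permutations

def candidate_passwords(known_tokens):
    """Yield simple candidate passwords derived from known personal tokens."""
    suffixes = ("", "!", "123", "2020", "2021")   # common human password habits
    variants = set()
    for token in known_tokens:
        variants.update({token, token.lower(), token.capitalize(), token[::-1]})
    for v in variants:                            # single tokens plus suffixes
        for s in suffixes:
            yield v + s
    for a, b in permutations(variants, 2):        # two-token combinations
        yield a + b

if __name__ == "__main__":
    for guess in candidate_passwords(["rover", "maple"]):
        print(guess)
```

A real system would presumably rank guesses with a learned model rather than enumerate them exhaustively, but the underlying idea is the same: seized personal data narrows an otherwise unsearchable password space.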

The RCMP did not respond to specific questions from The Globe and Mail about the tender's details, the frequency with which the force encounters encrypted data in investigations, or the potential that the proposed system could be misused by malicious actors.

In an e-mailed statement, Sergeant Caroline Duval cautioned that, for now, the RCMP is only seeking to do research and development around decryption, and would evaluate any legal and privacy implications.

"The use of any investigative tools by the RCMP is governed by the Canadian Charter of Rights and Freedoms and is subject to appropriate judicial processes," Sgt. Duval said.

The increasing sophistication of encryption tools is an arms race, and police forces need tools to counter criminals in that race, said David Shipley, the chief executive officer of Beauceron Security in Fredericton. "Decently strong encryption is stupid simple to use and it's stupid hard to break," he said.
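His point about ease of use is simple to illustrate. The snippet below is a minimal sketch assuming the widely used third-party Python cryptography package (an assumption of this example, not anything referenced in the tender): authenticated encryption takes a handful of lines, and without the key the ciphertext is effectively unrecoverable.

```python
# Illustration of how simple strong encryption is to use, assuming the
# third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # random symmetric key
box = Fernet(key)

ciphertext = box.encrypt(b"case notes")   # authenticated encryption in one call
assert box.decrypt(ciphertext) == b"case notes"
print("Without the key, the ciphertext reveals nothing useful:", ciphertext[:24])
```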

If a police force targets data in individual cases where a warrant has been issued, Mr. Shipley said, this is "a better way, frankly, than giving them backdoor access to every system," which criminals will absolutely abuse to weaken security for everybody.

Such technology comes with the potential for misuse. There is a history of North American law-enforcement officials and technology-company employees using their professional tools to access information about partners or other individuals they personally know. In some cases, they have broken internal protocols or laws, and endangered the individuals they've sought information about.

The RCMP has also used controversial AI-powered technology before. This past June, the federal Privacy Commissioner found that the force's use of Clearview AI's facial-recognition technology violated Canada's Privacy Act, after the RCMP had earlier told the commissioner it hadn't used Clearview.

The force was subsequently accused by NDP MP Charlie Angus of lying to the public. In response, the RCMP said at the time that it was not initially aware that one of its divisions had used Clearview, but was working to fix issues.

The RCMP's new request for proposals says that winners of the AI development contract could receive as much as $1-million. The tender is part of the government's Innovative Solutions Canada challenge, which encourages Canadian entrepreneurs to develop technology for which the government would be the first of many potential customers.

But even if government and law-enforcement officials use such tools within a set of rigid rules, they can also be co-opted by malicious actors. The U.S. National Security Agency developed EternalBlue, a tool for exploiting Microsoft software, last decade; the exploit was later used by nefarious hacking groups.

Mr. Shipley, who said he generally supports the RCMP's need to strengthen its decryption powers in the face of tech-savvy criminals, warned that such a tool could be used by malicious outsiders. "Can the RCMP really keep something that powerful safe?" he asked.

Such risks need to be weighed when developing this kind of technology, said Christopher Parsons, senior research associate at the University of Toronto's Citizen Lab, which studies digital threats to society. "It's understandable that a law-enforcement organization would want these sorts of powers, but what's not apparent is the need for them."

Dr. Parsons has extensively researched Ottawa's relationship with encryption. Until 2019, he said, the Liberal government largely took a pro-encryption position. Then, the federal government began what he called an irresponsible shift toward supporting weakened encryption, including for the sake of cracking down on child exploitation, despite the broader risks.

He's also found that governments and police forces often describe vague threats of encryption-enabled criminality without supporting evidence. "There actually is pretty rarely statistical or demonstrable evidence there is a problem," he said.

If governments work to develop technologies that exploit vulnerabilities in people's devices, it could create a slippery slope, he said. "The same vulnerability that affects my iPhone that can be targeted by this is the same vulnerability that affects the Prime Minister's," Dr. Parsons said.
