
Bitcoin rise could leave carbon footprint the size of London’s – The Guardian

The surge in bitcoin's price since the start of 2021 could result in the cryptocurrency having a carbon footprint the same as that of London, according to research.

Alex de Vries, a Dutch economist, created the Bitcoin Energy Consumption Index, one of the first systematic attempts to estimate the energy use of the bitcoin network. By late 2017 he estimated the network used 30 terawatt hours (TWh) a year, the same as the whole of the Republic of Ireland.

Now De Vries estimates the network uses more than twice and possibly three times as much energy: between 78TWh and 101TWh, or about the same as Norway.

Bitcoin is a 'cryptocurrency': a decentralised, tradeable digital asset. The lack of oversight by any central authority is one of its attractions.

Invented in 2008, bitcoin is stored in a digital wallet, and transactions are recorded in a public ledger known as the bitcoin blockchain, which prevents the digital currency from being double-spent.

Cryptocurrencies can be used to send transactions between two parties via the use of private and public keys. These transfers can be done with minimal processing cost, allowing users to avoid the fees charged by traditional financial institutions - as well as the oversight and regulation that entails.

This means it has attracted a range of backers, from libertarian monetarists who enjoy the idea of a currency with no inflation and no central bank, to drug dealers who like the fact that it is hard (but not impossible) to trace a bitcoin transaction back to a physical person.

The exchange rate has been volatile, making it a risky investment. In January 2021 the UK's Financial Conduct Authority warned consumers they should be prepared to lose all their money if they invest in schemes promising high returns from digital currencies such as bitcoin.

In practice it has been far more important for the dark economy than it has for most legitimate uses. In January 2021 it hit a record high of above $40,000, as a growing number of investors backed it as an alternative to other assets during the Covid crisis.

Bitcoin has been criticised for the vast amounts of energy the system consumes and its associated carbon footprint. New bitcoins are created by mining coins, which is done by using computers to carry out complex calculations. The more bitcoins there are, the longer it takes to mine new coins, and the more electricity is used in the process.

Roughly 60% of the cost of bitcoin mining is the price of the electricity used, De Vries estimates. The more money miners get per bitcoin, the more they will be able to spend on mining it.

However, energy use often lags behind swings in the currency due to the time it takes for bitcoin miners to acquire new hardware. De Vries writes that energy use is likely to increase substantially in the short term as a result of the currency's recent price rises, as new and established miners invest in more hardware.

By January this year the price of a bitcoin had reached $42,000. At this rate, miners would earn just over $15bn annually.

"With 60% of this income going to pay for electricity, at a price of $0.05 per kWh [kilowatt hour], the total network could consume up to 184TWh per year," estimates De Vries.

That energy use is about the same as the 200TWh consumed by every datacentre for every other digital industry globally. The size of bitcoin's electrical footprint means the carbon emissions are substantial.

The paper cites an assumption of 480-500g of carbon dioxide produced for every kWh consumed. "A total energy consumption of 184TWh would result in a carbon footprint of 90.2m metric tons of CO2," De Vries writes in the journal Joule, which is roughly comparable to the carbon emissions produced by the metropolitan area of London.
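
For readers who want to trace the arithmetic behind these figures, here is a minimal Python sketch that reproduces the back-of-the-envelope estimate from the numbers quoted above (annual miner revenue, the 60% electricity share, $0.05 per kWh, and 480-500g of CO2 per kWh); the exact outputs depend on rounding assumptions not spelled out in the article.

```python
# Back-of-the-envelope reproduction of the estimate described above.
# All inputs are the figures quoted in the article; rounding is approximate.

annual_revenue_usd = 15e9        # miners earn "just over $15bn annually" at ~$42,000/BTC
electricity_share = 0.60         # ~60% of mining costs go to electricity
electricity_price_usd_kwh = 0.05 # $0.05 per kWh

# Upper-bound energy consumption implied by those numbers
energy_kwh = annual_revenue_usd * electricity_share / electricity_price_usd_kwh
energy_twh = energy_kwh / 1e9    # 1 TWh = 1e9 kWh
print(f"Implied network consumption: ~{energy_twh:.0f} TWh/year")  # ~180 TWh (article: up to 184 TWh)

# Carbon footprint at 480-500 g CO2 per kWh, using the article's 184 TWh figure
carbon_intensity_kg_kwh = 0.49   # midpoint of 480-500 g/kWh
footprint_mt = 184e9 * carbon_intensity_kg_kwh / 1e9  # million metric tons of CO2
print(f"Carbon footprint: ~{footprint_mt:.1f} Mt CO2/year")        # ~90.2 Mt, comparable to London
```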

As well as the carbon emissions of the bitcoin network, which have been widely debated as a result of the recent surge in interest, De Vries highlights other impacts of the protocol's growth.

As well as consuming electricity, for instance, bitcoin miners need access to powerful computers, preferably including specialist chips created for mining. To produce 1m such computers, the largest provider, Bitmain, would have to use a month's capacity of one of only two chip fabricators in the world capable of producing such high-power silicon, potentially crowding out demand from other sectors such as AI, transportation and home electronics.

In order to limit the growing footprint of the sector, De Vries suggests policymakers could follow the lead of regions that have put pressure on bitcoin miners, such as Quebec in Canada, where a moratorium on new mining operations has been imposed, or Iran, which decided to confiscate mining equipment as the country suffered from outages blamed on cryptocurrency mining.

"Although bitcoin might be a decentralised currency, many aspects of the ecosystem surrounding it are not," he writes. Large-scale miners can easily be targeted with higher electricity rates, moratoria or, in the most extreme case, confiscation of the equipment used.

Moreover, the supply chain of specialised bitcoin mining devices is concentrated among only a handful of companies. Manufacturers like Bitmain can be burdened with additional taxes like tobacco companies or be limited in their access to chip production.

Read the original here:
Bitcoin rise could leave carbon footprint the size of London's - The Guardian

The Bitcoin boom: The future of the company balance sheet – Cointelegraph

Bitcoin has seen unparalleled growth in early 2021, reaching highs of over $58,000, almost triple its peak of the 2017-2018 boom. We are entering an era where institutions are starting to turn to Bitcoin (BTC), as many countries worldwide have been printing unprecedented amounts of money to service mounting debt. And to make matters worse, they are also facing the risk of unmanageable inflation. This perfect storm of macro conditions means institutions like pension funds, hedge funds, as well as high-net-worth individuals with trillions of dollars in combined value are starting to pay attention and learn about Bitcoin for the first time.

Unlike the 2017 bull run, this current run is driven less by hype and more by Bitcoin being accepted in the traditional financial world as a scarce asset class. Enterprise and institutional adoption of crypto assets has been the driving theme of 2021, with Tesla investing $1.5 billion in Bitcoin, one of the most prominent examples of corporate adoption to date.

Related: Tesla, Bitcoin and the crypto space: The show Musk go on? Experts answer

Additionally, large institutions are recognizing the importance of Bitcoin as a store of value, with many adding millions of dollars of the asset to their balance sheets, including Goldman Sachs, Standard Chartered, Square, BlackRock, Fidelity Investments, MicroStrategy and more.

Related: Will PayPal's crypto integration bring crypto to the masses? Experts answer

But the crypto landscape needs to change to truly allow Bitcoin to move into the traditional world. Institutions can't use private keys that can easily be lost, transact with long strings of letters and numbers, or store funds on exchanges with high counterparty risk.

New crypto regulation in the U.S. is making it easier and more acceptable to hold cryptocurrencies by providing more certainty across jurisdictions. Just last month in the U.S., the Office of the Comptroller of the Currency provided much-needed regulatory certainty regarding crypto activities. Brian Brooks, acting comptroller of the currency, stated that access to blockchains, such as Bitcoin or Ethereum, the holding of coins from these rails directly or on behalf of clients, and the running of nodes for a public blockchain is permitted. In other words, this allows banks to get actively involved, a huge step in the direction of improving the comfort level of institutions interested in holding crypto.

We are also seeing more developments in terms of the custody and management of digital assets, which allows even more institutional and corporate players to enter the space. Goldman Sachs recently issued a request for information to explore the bank's digital asset custody plans, part of a broader strategy in entering the stablecoin market. While the details aren't yet firm, these seismic moves by key institutions are fueling the fire.

While these institutions have huge teams to manage and oversee their new crypto holdings, smaller companies have also started to experiment with adding Bitcoin and other cryptocurrencies to their balance sheet. As companies, big and small, start to hold crypto, it is becoming increasingly clear that the next generation of companies will act more like investors holding and balancing funds in multiple asset classes.

This includes companies for which crypto and blockchain is not their core business, reshaping businesses' very value proposition: Everyone is now a fund whose returns may be decoupled from their core business offering. Small companies that may have only been holding cash are now investors concerned about their liquidity. In the emerging world of decentralized finance, the sky's the limit to how complex asset management can become; you can buy and sell derivative products, engage in lending and much more.

Related: Where does the future of DeFi belong: Ethereum or Bitcoin? Experts answer

I envision a future where all companies hold crypto on their balance sheet, and every company is an investor, whether that is their core business offering or not. But this future is dependent on both user experience and regulation. Some companies and institutions holding crypto are willing to take risks by figuring out their own operational and financial security measures to manage their crypto, while for others, this is a non-starter. The traditional world will require custody solutions, a traditional UX for transactions, crypto wealth management and more.

For smaller companies starting to dip their toes into holding crypto, my advice is to keep it simple without getting too distracted by all the crypto volatility and noise. The current crypto rally brings great excitement and opportunity for growth, but companies need to do what makes sense for them. Keeping a basic index approach to corporate crypto treasury management (for example, holding 5% of funds in Bitcoin and 95% in cash and equivalents, and rebalancing when the price increases or decreases) allows you to gain exposure to the market while being smart with cash and runway.
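
To make the index approach concrete, here is a minimal Python sketch of a 5%/95% rebalancing rule of the kind described above; the function name, tolerance band and numbers are illustrative assumptions, not a recommendation from the author.

```python
# Illustrative sketch of a simple corporate treasury rebalancing rule:
# target 5% of funds in Bitcoin, 95% in cash and equivalents, and rebalance
# whenever price moves push the allocation away from the target.

def rebalance(btc_units: float, btc_price: float, cash: float,
              target_btc_share: float = 0.05, band: float = 0.01):
    """Return (btc_units, cash) after rebalancing back to the target share.

    Rebalances only when the BTC share drifts more than `band` from target,
    so small fluctuations don't trigger constant trading.
    """
    total = btc_units * btc_price + cash
    current_share = btc_units * btc_price / total
    if abs(current_share - target_btc_share) <= band:
        return btc_units, cash  # within tolerance, do nothing

    target_btc_value = total * target_btc_share
    new_btc_units = target_btc_value / btc_price
    new_cash = total - target_btc_value
    return new_btc_units, new_cash

# Hypothetical example: a treasury bought ~5% BTC at $40k, BTC rallies to $58k, then rebalance.
btc, cash = rebalance(btc_units=50_000 / 40_000, btc_price=58_000, cash=950_000)
print(round(btc, 4), round(cash, 2))  # BTC trimmed back to ~5% of the (now larger) total
```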

Overall, as institutions start to get serious about Bitcoin and the combination of regulation and user experience helps to make crypto a more accessible and accepted asset class, the traditional world of financial management will evolve.

This article does not contain investment advice or recommendations. Every investment and trading move involves risk, and readers should conduct their own research when making a decision.

The views, thoughts and opinions expressed here are the author's alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.

Arianne Flemming is the chief operating officer of Informal Systems, a research-and-development institution focused on distributed systems and protocols. She has extensive experience in financial organization and operational leadership within the blockchain space, having helped design and execute long-term financial and operational strategies.

Original post:
The Bitcoin boom: The future of the company balance sheet - Cointelegraph

Bitcoin snaps five-day rally after flirting with record high – Mint

Bitcoin declined for the first time in six trading sessions alongside a broader easing of risk-on sentiment in financial markets.

The world's largest cryptocurrency slumped as much as 4.4% Friday before trimming some losses to trade at $55,600 at 8:58 a.m. in New York. It hit an all-time high of $58,350 on Feb. 21. The Bloomberg Galaxy Crypto Index, tracking Bitcoin, Ether and three other cryptocurrencies, slipped as much as 4.9%.

"The bears' last stand is the $57,800 level, and it looks like we might be seeing that battle play out before the week is over," said Matt Blom, global head of sales trading at EQUOS. "On the downside, continued selling above $57,000 will see us slip back towards $56,620 and potentially $55,000. Any move below here will be supported by dip-buying bulls and dip-buying bears alike."

The digital asset is up nearly tenfold in the past year as optimism over rising institutional demand pushed prices to record highs. While some say that Bitcoin is a stimulus-fueled bubble likely to burst, industry participants argue that institutional adoption will prevent Bitcoin from plummeting from its highs as was witnessed in 2017-2018.

Read more from the original source:
Bitcoin snaps five-day rally after flirting with record high - Mint

Quantum computing: Honeywell just quadrupled the power of its computer – ZDNet

The System Model H1, a ten-qubit quantum computer, has reached a quantum volume of 512.

Honeywell's quantum scientists have quadrupled the capabilities of the company's quantum computer, with the device achieving record levels of performance less than a year after the first generation of the system was released.

The System Model H1, a ten-qubit quantum computer, effectively reached a quantum volume of 512, four times as much as was attained in the previous tweak of the system, which saw the H1 reach a quantum volume of 128.

Released commercially last June (at the time, as the System Model H0), the H1 makes use of trapped ions, unlike IBM's and Google's devices, which are built with superconducting qubits. Honeywell's new record is eight times as much as was achieved with the System Model H0, which launched with a quantum volume of 64.

Quantum volume is a concept that IBM developed in 2017 as a way of measuring various aspects of a quantum computer's performance; in simple terms, the higher the quantum volume, the higher the potential for resolving real-world problems across industry and research. Designed to be independent of the architecture of any given quantum computer, quantum volume can measure any system that runs quantum circuits.
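
One point worth noting, not spelled out in the article but part of IBM's standard definition, is that quantum volume is reported as a power of two: a quantum volume of 2^m means the machine reliably ran "square" test circuits of m qubits and m layers of gates. Under that assumption, the sketch below converts the figures quoted in this article back to effective circuit sizes.

```python
# Quantum volume (QV) is conventionally 2**m, where m is the largest square
# circuit (m qubits wide, m gate layers deep) the machine passes reliably.
# This assumes IBM's standard definition; the article only quotes the QV numbers.
from math import log2

systems = {
    "Honeywell System Model H0 (launch)": 64,
    "IBM 27-qubit client-deployed system": 64,
    "Honeywell System Model H1 (previous)": 128,
    "Honeywell System Model H1 (latest)": 512,
}

for name, qv in systems.items():
    m = int(log2(qv))
    print(f"{name}: QV {qv} -> passes {m}-qubit, depth-{m} test circuits")
```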

For example, one measurement that is indicative of a quantum computer's capabilities is qubit fidelity, which is critical to understanding how well a device can implement quantum code. According to Honeywell, the average single-qubit gate fidelity in the latest version of the H1 was 99.991%.

The final number that determines quantum volume is an aggregate of many other measurements and tests of a single quantum system's operations: they include the number of physical qubits in the quantum computer, but also the device's error rate, and connectivity, which reflects the extent to which qubits can be fully connected to each other within the device.

This is why it is possible for a quantum system to reach a high quantum volume, even with few qubits. Despite having only ten qubits, for instance, Honeywell's System Model H1 performs well when it comes to error rates and connectivity, which has earned the device a top spot for its overall capabilities. In comparison, last year IBM's 27-qubit client-deployed system achieved a quantum volume of 64.

The new milestone, therefore, has prompted Honeywell's president of quantum solutions Tony Uttley to describe the System Model H1 as "the highest performing quantum computing system in the world."

Honeywell has made no secret of its strategy, which consists of focusing on qubit fidelity and connectedness, before attempting to scale up the number of qubits. "When you hear about fidelity and error, that's about the quality of the quantum operation," Uttley told ZDNet. "It's about knowing how often you get the right answer when you run these quantum algorithms."

"We have taken one approach that is very unique when it comes to how to get the most out of these near-term systems," he continued. "Nobody is talking about millions of qubits right now we're talking about tens of qubits. To get the most out of these tens of qubits, you have to have super-high fidelity, fully-connected and highly-controlled systems. That's our approach."

SEE: The EU wants to build its first quantum computer. That plan might not be ambitious enough

Making these highly reliable systems available to Honeywell's customers now enables businesses to test and trial with small-scale applications while waiting for the company to design and build new generations of more capable quantum computers, according to Uttley.

Honeywell recently introduced the first subscription-based plan for the usage of the H1, which grants paying customers monthly access to the machine.

With only ten qubits, there is little that the device can achieve beyond proofs of concept, designed to be implemented at full scale once a larger computer is available; but high-profile customers are nevertheless flocking to Honeywell's services.

J.P. Morgan Chase, for example, is investigating how the company's quantum computer might improve operations in banking; and BMW is piloting the use of Honeywell's hardware to optimize supply chains for car manufacturing.

Read this article:
Quantum computing: Honeywell just quadrupled the power of its computer - ZDNet

After merger, College Park startup IonQ plans to go public with $2 billion valuation – The Diamondback

IonQ, a quantum computing startup born in College Park, announced Monday that it would likely soon become the first publicly traded company to specialize in commercialized quantum computing.

The company plans to file paperwork with the Securities and Exchange Commission in the next week, which will allow it to go public on the New York Stock Exchange through an acquisition deal that would set the valuation of the combined entity at nearly $2 billion.

"The ability to become a public company gives us access to a huge capital base, and that will allow us to spend more time building our systems, deploying them for useful application," said Chris Monroe, IonQ's founder and a physics professor at the University of Maryland. "We can start to do our own research and development ... We can do more risky things."

Monroe and co-founder Junsang Kim formed IonQ with the goal of taking quantum computing into the market. They initially received $2 million in seed funding from New Enterprise Associates, giving them a license to lab technology from the University of Maryland and Duke University. From there, they were able to raise tens of millions of dollars in funding from companies like Samsung and Mubadala, and partnered with Amazon Web Services and Microsoft.

[Gov. Hogan names College Park quantum computing company one of top state start-ups]

The company going public was made possible by a planned merger with a blank-check firm, dMY Technology Group Inc. III.

If it goes through, the merger will result in over $650 million in gross proceeds, including $350 million from private investors, according to a press release from IonQ. Combined with the $84 million the company has raised in venture capital funding, the deal would place IonQs total earnings at about $734 million.

The transition to quantum computing is unprecedented, Monroe said, and it will allow people to solve problems that a regular computer often cant.

Some problems like optimizing a fleet of trucks or discovering medicines have too many variables to solve with regular computing. But at the quantum level, more information can be handled, Monroe said, making it radically different from todays computing.

University President Darryll Pines, formerly the dean of the engineering school, explained that classical computing uses a stream of electrical pulses called bits, which represent 1s and 0s, to store information. However, on the quantum scale, subatomic particles known as qubits are used to store information, greatly increasing the speed of computing.
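
To make the contrast concrete (a standard textbook point, not something Pines is quoted on): a register of n classical bits is in exactly one of its 2^n states at any moment, while describing a general n-qubit state requires tracking amplitudes for all 2^n states at once, which is where the potential speed-up comes from. A small sketch:

```python
# A classical n-bit register holds one of 2**n values at a time; a general
# n-qubit state is described by 2**n complex amplitudes simultaneously.
# This illustrates scaling only; it is not how IonQ's hardware is programmed.
for n in (1, 10, 32, 64):
    print(f"{n:>3} bits -> one of {2**n:,} values; "
          f"{n:>3} qubits -> {2**n:,} amplitudes to track classically")
```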

IonQ's approach to researching quantum computing has been rooted in university-led research. Quantum physics has strange rules that aren't always accepted in the engineering world, Monroe said, so many of these laws have become the domain of research at universities and national laboratories.

And this university especially, with its proximity to Washington, D.C., has one of the biggest communities of quantum scientists, Monroe said.

"We have students and postdocs and all kinds of researchers on Maryland's campus studying the field, and at IonQ, we've hired many of them," Monroe said. "And that's a huge advantage for us."

As a company with about 60 employees, some of whom attended this university, IonQ has become a pioneer in quantum computing. In October, Peter Chapman, IonQ's CEO and president, announced the company's newest 32-qubit computer, the most powerful quantum computer on the market.

And in November, Maryland Gov. Larry Hogan named IonQ one of the state's top 20 startup companies.

[Women of color in UMD community are making it as entrepreneurs despite challenges]

The biggest advantage for IonQ has been its technology, Monroe said. Companies like IBM, Google or Microsoft use silicon to build their computers, but IonQ uses individual atoms, which, unlike silicon, float over a chip in a vacuum chamber.

That technology has been perfected at this university, Monroe said, and IonQ has a concrete plan over the next five years to manufacture quantum computer modules and wire them together.

By 2030, 20 percent of global organizations, whether in the public or private sector, are expected to budget for quantum-computing projects, according to Gartner Inc., a global research and advisory firm. That number is up from less than 1 percent in 2018, according to Gartner.

Niccolo de Masi, CEO of dMY, said in IonQ's press release that he expects the quantum computing industry to grow immensely in the next ten years, with a market opportunity of approximately $65 billion by 2030.

Pines expressed his excitement at seeing a university startup make strides in computing.

"We're happy for building the ecosystem from science, to translation, to startup, to possibly developing a product and adding value to society and growing jobs in the state of Maryland," Pines said.

See the article here:
After merger, College Park startup IonQ plans to go public with $2 billion valuation - The Diamondback

Quantum computing company D-Wave Systems secures $40M in government funding – IT World Canada

Burnaby, B.C.-based D-Wave Systems is getting $40 million from the federal government to help advance its efforts in the development of quantum computing.

The funding comes from Ottawa's Strategic Innovation Fund to support a $120 million project to advance D-Wave's hardware and software.

"Quantum will help us quickly solve problems that would have otherwise taken decades," François-Philippe Champagne, Minister of Innovation, Science and Industry, told reporters during a virtual press briefing for the announcement.

In a separate release, he added that the funding will "help place Canada at the forefront of quantum technology development, and will create new jobs and opportunities to help Canadians and advance the economy."

A brief history (so far) of quantum computing [PART 1]

A brief history (so far) of quantum computing [PART 3]

A brief history (so far) of quantum computing [PART 2]

D-Wave is the first company to offer a commercially available quantum computer but is still only in the early stages of building a sustainable business after 20 years of development and more than USD $300 million in funds raised.

D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer last year, replacing Vern Brownell. The company also experienced other changes at the top of the corporate ladder and has parted ways with long-time board members.

Jim Love, Chief Content Officer, IT World Canada

See the original post here:
Quantum computing company D-Wave Systems secures $40M in government funding - IT World Canada

Europe moves to exclude neighbors from its quantum and space research – Science Magazine

A department overseen by European Union research commissioner Mariya Gabriel wants to safeguard strategic research by barring non-EU researchers.

By Nicholas Wallace, Mar. 11, 2021, 4:25 PM

In a sign of growing national tensions over the control of strategic research, the European Commission is trying to block countries outside the European Union from participating in quantum computing and space projects under Horizon Europe, its new research funding program.

The proposed calls, which must still be approved by delegates from the 27 EU member states in the coming weeks, would shut out researchers in countries accustomed to full access to European research programs, including Switzerland, the United Kingdom, and Israel. European Economic Area (EEA) countries Norway, Liechtenstein, and Iceland would be barred from space research calls while remaining eligible for quantum computing projects.

Research advocates see the proposed restrictions as self-defeating for all parties, including the European Union. "It would be a classic lose-lose, with researchers in all countries having to work harder, and spend more, to make progress in these fields," says Vivienne Stern, director of UK Universities International. The unexpected news has upset some leaders of existing collaborations and left them scrambling to find out whether they will need to exclude partners, or even drop out themselves, if they want their projects to be eligible for further funding. "It is really a pity because we have a tight and fruitful relationship with our partners in the U.K.," says Sandro Mengali, director of the Italian research nonprofit Consorzio C.R.E.O. and coordinator of an EU-funded project developing heat shields for spacecraft.

In 2018, when the European Commission first announced plans for the €85 billion, 7-year Horizon Europe program, it said it would be "open to the world." Switzerland, Israel, the EEA nations, and other countries have long paid to associate with EU funding programs like Horizon Europe, giving their researchers the right to apply for grants, just like those in EU member states. After leaving the European Union, the United Kingdom struck a deal in December 2020 to join Horizon Europe, which put out its first grant calls last month through the European Research Council.

But more recently, "strategic autonomy" and "technological sovereignty" have become watchwords among policymakers in Brussels, who argue the European Union should domestically produce components of key technologies, such as quantum computers and space technology. Those views influenced the Commission's research policy department, overseen by EU research commissioner Mariya Gabriel, which drafted the calls and their eligibility rules, first revealed by Science|Business. The draft says the restrictions are necessary to "safeguard the Union's strategic assets, interests, autonomy, or security."

"It's a bit of a contradiction," says a Swiss government official who asked to remain anonymous because of the sensitivity of forthcoming discussions. "You want to open the program to the world and work with the best. But the core group of associated countries with whom you're used to working, suddenly you exclude them and force them to work with the competitors." The official says the Commission gave no warnings the proposal was coming but believes the combination of Brexit and the COVID-19 crisis, in which Europe has struggled to secure access to vaccines, masks, and other equipment, may have further spurred Europe to guard its technologies. Negotiations on Swiss membership in Horizon Europe have not begun, but the country intends to join.

The restrictions affect €170 million in funding that could be available in the next few months. The affected areas include quantum computing, quantum communications, satellite communications, space transport, launchers, and "space technologies for European non-dependence and competitiveness." Projects relating to the Copernicus Earth-observation system and the Galileo satellite navigation programs would remain largely open to associated countries.

Shutting out the associated countries would be a "lost opportunity" and could slow progress in quantum computing, says Lieven Vandersypen, a quantum nanoscientist at the Delft University of Technology. "To me, it doesn't make sense." Vandersypen contributes to an EU-funded project that is investigating how to create the basic bits of a quantum computer from cheap and readily available silicon. The project includes U.K. and Swiss researchers at University College London and the University of Basel. "They are in there for a good reason," Vandersypen says. "They bring in really valuable expertise." With a few years left on the grant, the project isn't in any immediate danger. But the exclusions are bad for long-term planning, Vandersypen says.

Non-EU researchers working on a €150 million European quantum flagship initiative set up in 2018 are also upset by the sudden reversal and wonder about their future status. "We discuss with our partners in Europe, they ask us, 'Can you join?' And we don't know, that's probably the worst thing," says Hugo Zbinden, a quantum physicist at the University of Geneva and coordinator of one of these flagship projects, QRANGE, which is investigating how a quantum random number generator can be used to improve encryption.

The restrictions are not yet set in stone; national delegates could reject the draft calls and ask the Commission to open them up. But member states accepted the legal basis for the restrictions last year, when they agreed to the Horizon Europe legislation. "Of course, you hope that we will be in," Zbinden says. "For the time being, we are waiting for some news."

More here:
Europe moves to exclude neighbors from its quantum and space research - Science Magazine

After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing – The Globe and Mail

One of Canada's most heavily financed technology development companies, quantum computer maker D-Wave Systems Inc., has secured a $40-million financial contribution from the federal government.

The funding, through Ottawa's Strategic Innovation Fund, follows a year of reset expectations for D-Wave, a leader in the global race to develop computers whose chips draw their power by harnessing natural properties of subatomic particles to perform complex calculations faster than conventional computers.

Burnaby, B.C.-based D-Wave is the first company to offer a commercially available quantum computer, but after 20-plus years of development and more than US$300-million in funds raised, it is still in the early stages of building a sustainable business.

Last year D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer, replacing Vern Brownell, to step up commercialization efforts. The company also parted ways with other top executives and long-time board members.

Mr. Baratz, who led Sun Microsystems Inc.'s effort in the 1990s to transform Java from a nascent programming language into the internet's main software-writing platform, directed D-Wave to stop selling its shed-sized computers, which listed for US$15-million and had just a handful of customers including NASA, Google, Lockheed Martin and the U.S. Los Alamos National Laboratory.

Instead, D-Wave has focused on selling online access to the technology and expanded its software applications, which Mr. Baratz had started developing after joining as chief product officer in 2017. Customers including Volkswagen and biotechnology startups have used D-Wave's technology to find answers to dense optimization problems, such as improving traffic flows in big cities, identifying proteins that could become breakthrough drugs and improving the efficiency of painting operations on vehicle production assembly lines.

D-Wave also completed a costly US$40-million refinancing last year that wiped out most of the value of some long-time investors, including the U.S. Central Intelligence Agency's venture capital arm, Amazon CEO Jeff Bezos and fund giant Fidelity Investments. The capital restructuring cut D-Wave's valuation to less than US$170-million, down from US$450-million, The Globe reported in October. Investors that ponied up, including Public Sector Pension Investment Board, D-Wave's top shareholder, BDC Capital and Goldman Sachs, maintained their relative stakes, limiting their writedowns.

"Over the years [D-Wave has] had to raise money and more money and more money ... and as such you end up getting diluted over time because every third quarter it seems like you run out of the $50-million that you raised," Kevin Rendino, CEO and portfolio manager of D-Wave investor 180 Degree Capital Corp., told his investors last November. "D-Wave has been a source of bitter disappointment for all of us."

Meanwhile, D-Wave faces years and tens of millions of dollars more in costs to continue developing its core technology. The government aid will support a $120-million project to advance D-Wave's hardware and software and "will help place Canada at the forefront of quantum technology development, and will create new jobs and opportunities to help Canadians and advance the economy," François-Philippe Champagne, Minister of Innovation, Science and Industry, said in a release.

During a press conference to discuss the funding, the minister was asked if the government would review potential takeovers of quantum computing companies, as the U.S. government is considering doing. Mr. Champagne provided a non-committal response, saying, "I'm sure you would expect us to be eyes wide open when it comes to whatever we would need to take in terms of steps to protect ... [intellectual property] that has been developed in Canada."

"We're always out there looking at how we can improve to make sure that new technologies and inventions and improvements and IP that has been developed in Canada stays in Canada."

D-Wave faces a slew of competitors including Google, Microsoft, Intel, IBM and Honeywell that are also trying to build the first quantum machine that can outperform classical or conventional computers. In addition, a new class of startups, including Toronto's Xanadu Quantum Technologies Inc. and College Park, Md.-based IonQ Inc., believe they can build quantum chips that don't have to be supercooled to function, as D-Wave's system and others in development do. IonQ said this week it would go public through a special purpose acquisition company to become the first publicly traded quantum computing-focused company.

Mr. Baratz said in an emailed statement that since D-Wave's launch last September of its latest quantum chip and expanded efforts to sell online access to its computers, "we've been encouraged by the positive customer response to the value delivered by a quantum system designed for practical, in-production business-scale applications. We're eager to see even more developers, academics, and companies leverage it to solve larger, more complex problems."

View post:
After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing - The Globe and Mail

Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? – Analytics India Magazine

As per OpenAI data, the amount of computational power needed to train large AI models has grown massively, doubling every three and a half months since 2012. GPT-3, which requires 3.14E23 FLOPS of computing for training, is a good case in point.
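
To put the doubling claim in perspective, here is a minimal sketch of the growth rate it implies, using only the 3.5-month figure quoted above; the multi-year span chosen below is an assumption for illustration.

```python
# Implied growth when training compute doubles every 3.5 months.
doubling_period_months = 3.5
per_year = 2 ** (12 / doubling_period_months)
print(f"Roughly {per_year:.0f}x more compute per year")   # ~11x per year

# Cumulative growth over, say, 8 years under the same trend (illustrative assumption)
years = 8
print(f"~{2 ** (12 * years / doubling_period_months):.2e}x over {years} years")
```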

Typically, to carry out high-performance computing tasks, conventional AI chips are equipped with transistors that work with electrons. Although they perform a wide array of complex, high-performance tasks, energy consumption and engineering glitches pose a challenge. Thus, the growing need for computing power has set researchers on a quest to find a way to boost these chips' power without increasing energy consumption.

And that's when experts turned to photons, the particles of light, which can substitute for electrons in AI chips to reduce heat, leading to a massive reduction in energy consumption and a dramatic upgrade in processor speed.

While electronic chips perform calculations by reducing information to a series of 1s and 0s, photonic chips split and mix beams of light within tiny channels to carry out the same tasks. Unlike general-purpose AI chips, photonic chips are designed to perform only a certain kind of mathematical calculation, one that is critical for running large AI models.
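
The article does not name that calculation; in most photonic AI accelerators it is matrix-vector multiplication, the operation that dominates neural-network inference. A minimal NumPy sketch of that operation, under that assumption (conceptually, the optics encode the weight matrix and the light encodes the input vector):

```python
# Matrix-vector multiplication: the workload photonic AI chips typically target.
# (Assumption: the article does not name the operation; this is the standard case.)
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 784))   # layer weights (conceptually encoded in the optics)
x = rng.standard_normal(784)          # input activations (conceptually encoded in light)

y = W @ x                             # one layer's worth of multiply-accumulates
print(y.shape)                        # (512,)
```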

Lightmatter, an MIT-backed startup, last year developed an AI chip Envise that leverages photons (light particles) to perform computing tasks.

Light travels faster than electrons. The concept of using light to carry out heavy computing tasks (aka photonic computing or optical computing) dates back to the 1980s, when Nokia Bell Labs, an American industrial research and scientific development company, tried to develop a light-based processor. However, due to the impracticality of creating a working optical transistor, the concept didn't take off.

We experience optical technology in cameras, CDs, and even in Blu-ray discs. But these photons are usually converted into electrons for use in chips. Four decades later, photonic computing gained momentum when IBM and researchers from the universities of Oxford and Muenster developed a system that uses light instead of electricity to perform several AI model-based computations.

Alongside, Lightmatter's new AI chip has created a buzz in the industry. According to the company website, Envise can run the largest neural networks at three times the inferences per second of the Nvidia DGX-A100, with seven times the inferences per second per watt on BERT-Base with the SQuAD dataset.

Japan-based NTT has also been developing an optical computer believed to outpace quantum computing in solving optimisation problems. Last year, Chinese quantum physicist Chao-Yang Lu also announced light-based quantum computing.

Other companies like US-based Honeywell and IonQ have also been working around the issue by using trapped ions.

Such developments have led the experts to believe photonics computing will gain ground once the big tech companies throw their weight behind it and understand the importance of using light for their AI chips.

On the other hand, like any other remarkable technology, photonic computing also comes with certain challenges. Despite their lower energy consumption, photonic chips are considered less accurate and precise than electron-based chips. Much of this can be attributed to their analogue calculations, which nevertheless make them well suited to running pre-trained models and deep neural networks.

On the design side, silicon-based computer chips do not integrate well with photons, which limits their usage in computing.

The cost issues and environmental impact of digital chips might set the stage for photonics computing to rise as a substitute. With startups like Lightmatter and giants like IBM committing resources to this computing paradigm, AI might get a photonic boost.

Read the rest here:
Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? - Analytics India Magazine

Working at the intersection of data science and public policy | Penn Today – Penn Today

One of the ideas you discuss in the book is algorithmic fairness. Could you explain this concept and its importance in the context of public policy analytics?

Structural inequality and racism is the foundation of American governance and planning. Race and class dictate who gets access to resources; they define where one lives, where children go to school, access to health care, upward mobility, and beyond.

If resource allocation has historically been driven by inequality, why should we assume that a fancy new algorithm will be any different? This theme is present throughout the book. Those reading for context get several in-depth anecdotes about how inequality is baked into government data. Those reading to learn the code get new methods for opening the algorithmic black box, testing whether a solution further exacerbates disparate impact across race and class.

In the end, I develop a framework called algorithmic governance, helping policymakers and community stakeholders understand how to trade off algorithmic utility against fairness.
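
As one concrete example of the kind of check described above (the book itself works in R and develops its own methods; this is a hedged Python sketch of a generic disparate-impact ratio, not the author's code), an algorithm's selection rates can be compared across race or class groups:

```python
# Generic disparate-impact check: compare the rate at which an algorithm
# selects (or flags) members of each group. A ratio near 1.0 indicates parity.
from collections import defaultdict

def selection_rates(groups, decisions):
    """groups: list of group labels; decisions: list of 0/1 algorithm outputs."""
    totals, selected = defaultdict(int), defaultdict(int)
    for g, d in zip(groups, decisions):
        totals[g] += 1
        selected[g] += d
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest."""
    return min(rates.values()) / max(rates.values())

# Toy example with hypothetical data
groups = ["A", "A", "A", "B", "B", "B", "B"]
decisions = [1, 1, 0, 1, 0, 0, 0]
rates = selection_rates(groups, decisions)
print({g: round(r, 2) for g, r in rates.items()},
      round(disparate_impact_ratio(rates), 2))  # {'A': 0.67, 'B': 0.25} 0.38
```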

From your perspective, what are the biggest challenges in integrating tools from data science with traditional planning practices?

Planning students learn a lot about policy but very little about program design and service delivery. Once a legislature passes a $50 million line item to further a policy, it is up to a government agency to develop a program that can intervene with the affected population, allocating that $50 million in $500, $1,000 or $5,000 increments.

As I show in the book, data science combined with governments' vast administrative data is good at identifying at-risk populations. But doing so is meaningless unless a well-designed program is in place to deliver services. Thus, the biggest challenge is not teaching planners how to code data science but how to consider algorithms more broadly in the context of service delivery. The book provides a framework for this by comparing an algorithmic approach to service delivery to the business-as-usual approach.

Has COVID-19 changed the way that governments think about data science? If so, how?

Absolutely. Speaking of service delivery, data science can help governments allocate limited resources. The COVID-19 pandemic is marked entirely by limited resources: from testing, PPE, and vaccines to toilet paper, home exercise equipment, and blow-up pools (the latter was a serious issue for my 7-year-old this past summer).

Government failed at planning for the allocation of testing, PPE, and vaccines. We learned that it is not enough for government to invest in a vaccine; it must also plan for how to allocate vaccines equitably to populations at greatest risk. This is exactly what we teach in Penn's MUSA Program, and I was disappointed at how governments at all levels failed to ensure that the limited supply of vaccine aligned with demand.

We see this supply/demand mismatch show up time and again in government, from disaster response to the provision of health and human services. I truly believe that data can unlock new value here, but, again, if government is uninterested in thinking critically about service delivery and logistics, then the data is merely a sideshow.

What do you hope people gain by reading this book?

There is no equivalent book currently on the market. If you are an aspiring social data scientist, this book will teach you how to code spatial analysis, data visualization, and machine learning in R, a statistical programming language. It will help you build solutions to address some of today's most complex problems.

If you are a policymaker looking to adopt data and algorithms into government, this book provides a framework for developing powerful algorithmic planning tools, while also ensuring that they will not disenfranchise certain protected classes and neighborhoods.

See the original post here:

Working at the intersection of data science and public policy | Penn Today - Penn Today
