Category Archives: Quantum Computer

New Technique May Be Capable of Creating Qubits From Silicon Carbide Wafer – Tom’s Hardware

Researchers from the University of Chicago have discovered a technique that might be able to produce qubits from defects in commercially available silicon carbide wafers. This could represent a scalable way of creating qubits using off-the-shelf tools.

As published in Science and reported on by IEEE Spectrum, the researchers bought a silicon carbide wafer and shot an electron beam at it. This created defects that behaved as a single electron spin that could be manipulated electrically, optically (with lasers), and magnetically. Basically, the defects act as room-temperature cages for electrons. The electron's spin, an inherent property of the particle, could then be used as a qubit. Individual electron spins can hold their information for up to 1 millisecond.

"Our approach is to see if we can leverage the trillion dollars or so of American industry thats building todays nanoelectronics and see if we can pivot that technology," David Awschalom, one of the researchers and a professor of molecular engineering at the University of Chicago, said, according to IEEE.

However, the researchers' work is still at an early stage. They do not have a working quantum computer yet, or even a provable qubit.

The work seems similar to one of Intel's two approaches to building quantum computers, which makes use of spin qubits manufactured on Intel's standard 300mm CMOS wafer lines. The difference seems to be that Intel uses silicon wafers instead of silicon carbide. Intel announced it was testing such a chip in 2018. In December, Intel also created a control chip for its quantum chips.

Continue reading here:
New Technique May Be Capable of Creating Qubits From Silicon Carbide Wafer - Tom's Hardware

How to verify that quantum chips are computing correctly – MIT News

In a step toward practical quantum computing, researchers from MIT, Google, and elsewhere have designed a system that can verify when quantum chips have accurately performed complex computations that classical computers can't.

Quantum chips perform computations using quantum bits, called qubits, that can represent the two states corresponding to classical binary bits (a 0 or 1) or a quantum superposition of both states simultaneously. The unique superposition state can enable quantum computers to solve problems that are practically impossible for classical computers, potentially spurring breakthroughs in material design, drug discovery, and machine learning, among other applications.
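For readers who want the notation: the standard way to write a single qubit's state is as a weighted combination of the two basis states. A minimal formulation (textbook convention, not from the article):

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

Measurement yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$, and a register of $n$ qubits is described by $2^n$ such amplitudes, which is where the classical simulation cost discussed below comes from.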

Full-scale quantum computers will require millions of qubits, which isn't yet feasible. In the past few years, researchers have started developing Noisy Intermediate Scale Quantum (NISQ) chips, which contain around 50 to 100 qubits. That's just enough to demonstrate quantum advantage, meaning the NISQ chip can solve certain problems that are intractable for classical computers. Verifying that the chips performed operations as expected, however, can be very inefficient. The chips' outputs can look entirely random, so it takes a long time to simulate steps to determine if everything went according to plan.
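That simulation cost is easy to quantify. A back-of-the-envelope sketch in Python (my own illustration; the 16 bytes per amplitude assumes double-precision complex numbers):

```python
# Memory needed to hold a full n-qubit statevector at complex128
# (16 bytes per amplitude). Brute-force verification by simulation
# has to manipulate arrays of this size.
for n in (30, 40, 50, 53):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits: {amplitudes:.2e} amplitudes, ~{gib:,.0f} GiB")
```

At 53 qubits, the size of Google's Sycamore chip, a full statevector already needs on the order of 100 million GiB, which is why researchers fall back on slow sampling-based checks.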

In a paper published today in Nature Physics, the researchers describe a novel protocol to efficiently verify that an NISQ chip has performed all the right quantum operations. They validated their protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip.

"As rapid advances in industry and academia bring us to the cusp of quantum machines that can outperform classical machines, the task of quantum verification becomes time critical," says first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE). "Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting."

Joining Carolan on the paper are researchers from EECS and RLE at MIT, as well as from the Google Quantum AI Laboratory, Elenion Technologies, Lightmatter, and Zapata Computing.

Divide and conquer

The researchers' work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.

At the core of the new protocol, called "Variational Quantum Unsampling," lies a divide-and-conquer approach, Carolan says, that breaks the output quantum state into chunks. "Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up and tackle it in a more efficient way," Carolan says.

For this, the researchers took inspiration from neural networks, which solve problems through many layers of computation, to build a novel quantum neural network (QNN), where each layer represents a set of quantum operations.

To run the QNN, they used traditional silicon fabrication techniques to build a 2-by-5-millimeter NISQ chip with more than 170 control parameters, tunable circuit components that make manipulating the photon path easier. Pairs of photons are generated at specific wavelengths from an external component and injected into the chip. The photons travel through the chip's phase shifters, which change the path of the photons, interfering with each other. This produces a random quantum output state, which represents what would happen during computation. The output is measured by an array of external photodetector sensors.

That output is sent to the QNN. The first layer uses complex optimization techniques to dig through the noisy output to pinpoint the signature of a single photon among all those scrambled together. Then, it unscrambles that single photon from the group to identify what circuit operations return it to its known input state. Those operations should match exactly the circuit's specific design for the task. All subsequent layers do the same computation, removing from the equation any previously unscrambled photons, until all photons are unscrambled.

As an example, say the input state of qubits fed into the processor was all zeroes. The NISQ chip executes a bunch of operations on the qubits to generate a massive, seemingly randomly changing number as output. (An output number will constantly be changing as it's in a quantum superposition.) The QNN selects chunks of that massive number. Then, layer by layer, it determines which operations revert each qubit back down to its input state of zero. If any operations are different from the original planned operations, then something has gone awry. Researchers can inspect any mismatches between the expected and recovered input states, and use that information to tweak the circuit design.
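The published protocol is tailored to photonic circuits, but the underlying idea, classically optimizing an inverse circuit until the output returns to the known input, can be sketched on one simulated qubit. A toy sketch assuming numpy and scipy (my own construction, not the authors' code; it optimizes both layers jointly, whereas the paper peels layers off one at a time):

```python
import numpy as np
from scipy.optimize import minimize

# Single-qubit rotation gates standing in for circuit "layers".
def rx(t):
    return np.array([[np.cos(t / 2), -1j * np.sin(t / 2)],
                     [-1j * np.sin(t / 2), np.cos(t / 2)]])

def rz(t):
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]])

ket0 = np.array([1, 0], dtype=complex)
hidden = [0.7, -1.3]                            # the chip's unknown parameters
output = rz(hidden[1]) @ rx(hidden[0]) @ ket0   # observed output state

def infidelity(theta):
    # Apply the guessed inverse layers in reverse order; if the guess is
    # right, the state returns to |0> and the infidelity vanishes.
    back = rx(-theta[0]) @ rz(-theta[1]) @ output
    return 1 - abs(back[0]) ** 2

res = minimize(infidelity, x0=[0.1, 0.1], method="Nelder-Mead")
print("recovered parameters:", res.x)   # ~[0.7, -1.3], up to equivalent angles
print("residual infidelity:", res.fun)  # ~0 once the inverse is found
```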

Boson unsampling

In experiments, the team successfully ran a popular computational task used to demonstrate quantum advantage, called boson sampling, which is usually performed on photonic chips. In this exercise, phase shifters and other optical components will manipulate and convert a set of input photons into a different quantum superposition of output photons. Ultimately, the task is to calculate the probability that a certain input state will match a certain output state. That will essentially be a sample from some probability distribution.

But it's nearly impossible for classical computers to compute those samples, due to the unpredictable behavior of photons. It's been theorized that NISQ chips can compute them fairly quickly. Until now, however, there's been no way to verify that quickly and easily, because of the complexity involved with the NISQ operations and the task itself.
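More concretely, for a collision-free input/output pattern the transition amplitude is proportional to the permanent of a submatrix of the interferometer's unitary, and no efficient classical algorithm for permanents is known. A small numpy illustration of that textbook relation (my own sketch, not the paper's code):

```python
import numpy as np
from itertools import permutations

def permanent(M):
    # Naive O(n! * n) permanent; even the best exact method known
    # (Ryser's formula) still costs O(2^n * n), which is the root of
    # boson sampling's classical hardness.
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

rng = np.random.default_rng(0)
# A random 5-mode interferometer unitary (QR of a random complex matrix).
U = np.linalg.qr(rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5)))[0]
# Two photons enter modes 0 and 1; the probability of detecting them in
# modes 2 and 4 is |permanent of the corresponding 2x2 submatrix|^2.
sub = U[np.ix_([0, 1], [2, 4])]
print("P(detect in modes 2 and 4) =", abs(permanent(sub)) ** 2)
```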

"The very same properties which give these chips quantum computational power makes them nearly impossible to verify," Carolan says.

In experiments, the researchers were able to unsample two photons that had run through the boson sampling problem on their custom NISQ chip, in a fraction of the time it would take traditional verification approaches.

"This is an excellent paper that employs a nonlinear quantum neural network to learn the unknown unitary operation performed by a black box," says Stefano Pirandola, a professor of computer science who specializes in quantum technologies at the University of York. "It is clear that this scheme could be very useful to verify the actual gates that are performed by a quantum circuit [for example] by a NISQ processor. From this point of view, the scheme serves as an important benchmarking tool for future quantum engineers. The idea was remarkably implemented on a photonic quantum chip."

While the method was designed for quantum verification purposes, it could also help capture useful physical properties, Carolan says. For instance, certain molecules when excited will vibrate, then emit photons based on these vibrations. By injecting these photons into a photonic chip, Carolan says, the unscrambling technique could be used to discover information about the quantum dynamics of those molecules to aid in bioengineering molecular design. It could also be used to unscramble photons carrying quantum information that have accumulated noise by passing through turbulent spaces or materials.

"The dream is to apply this to interesting problems in the physical world," Carolan says.

Read the rest here:
How to verify that quantum chips are computing correctly - MIT News

The hunt for the ‘angel particle’ continues – Big Think

A theoretical class of particles called Majorana fermions remains a mystery. In 2017, scientists believed they had uncovered evidence for the existence of Majorana fermions. Unfortunately, recent research shows that their findings were actually due to a faulty experimental device, bringing researchers back to the drawing board in the search for the exotic particles.

The Standard Model of particle physics is currently our best means of explaining the fundamental forces of the universe. It classifies the various elementary particles, like photons, the Higgs boson, and the various quarks and leptons. Broadly, its particles are divided into two classes: bosons, like the photon and Higgs, and fermions, which comprise the quarks and leptons.

There are a few major differences between these types of particles. One, for instance, is that fermions have antiparticles, while bosons do not. There can be an anti-electron (i.e., a positron), but there's no such thing as an antiphoton. Fermions also can't occupy the same quantum state; for instance, electrons orbiting an atom's nucleus can't both occupy the same orbital level and spin in the same direction, but two electrons can hang out in the same orbital and spin in opposite directions, because this represents a different quantum state. Bosons, on the other hand, don't have this problem.

But back in 1937, a physicist named Ettore Majorana discovered that a different, unusual kind of fermion could exist: the so-called Majorana fermion.

All the fermions in the Standard Model are referred to as Dirac fermions. Where they and Majorana fermions differ is that the Majorana fermion would be its own antiparticle. Because of this quirk, the Majorana fermion has been nicknamed the "angel particle" after the Dan Brown novel "Angels and Demons," whose plot involved a matter/anti-matter bomb.

Until 2017, however, there remained no definitive experimental evidence for Majorana fermions. But during that year, physicists constructed a complicated experimental device involving a superconductor, a topological insulator (which conducts electricity along its edges but not through its center), and a magnet. The researchers observed that in addition to electrons flowing along the edge of the topological insulator, this device also showed signs of producing Majorana quasiparticles.

Quasiparticles are an important tool that physicists use when searching for evidence of "real" particles. They aren't the real thing themselves, but they can be thought of as disturbances in a medium that represent a real particle. You can think of them like bubbles in a Coca-Cola: a bubble itself isn't an independent object, but rather a phenomenon that emerges from the interaction between the carbon dioxide and the Coca-Cola. If we were to say there was some hypothetical "bubble particle" that really existed, we could measure the "quasi"-bubbles in a Coca-Cola to learn more about its characteristics and provide evidence for this imaginary particle's existence.

By observing quasiparticles with properties that matched theoretical predictions of Majorana fermions, the researchers believed that they had found a smoking gun that proved these peculiar particles really existed.

Regrettably, recent research showed that this finding was in error. The device that the 2017 researchers used was only supposed to generate signs of Majorana quasiparticles when exposed to a precise magnetic field. But researchers from Penn State and the University of Würzburg found that these signs emerged whenever a superconductor and topological insulator were combined, regardless of the magnetic field. The superconductor, it turns out, acted as an electrical short in this system, resulting in a measurement that looked right but was really just a false alarm. Since the magnetic field wasn't contributing to this signal, the measurements didn't match theory.

"This is an excellent illustration of how science should work," said one of the researchers. "Extraordinary claims of discovery need to be carefully examined and reproduced. All of our postdocs and students worked really hard to make sure they carried out very rigorous tests of the past claims. We are also making sure that all of our data and methods are shared transparently with the community so that our results can be critically evaluated by interested colleagues."

Majorana fermions are predicted to appear in devices where a superconductor is affixed on top of a topological insulator (also referred to as a quantum anomalous Hall insulator [QAH]; left panel). Experiments performed at Penn State and the University of Würzburg in Germany show that the small superconductor strip used in the proposed device creates an electrical short, preventing the detection of Majoranas (right panel).

Cui-zu Chang, Penn State

Beyond the intrinsic value of better understanding the nature of our universe, Majorana fermions could be put to serious practical use. They could lead to the development of what's known as a topological quantum computer.

A regular quantum computer is prone to decoherence, essentially the loss of quantum information to the environment. But Majorana fermions have a unique property when applied in quantum computers. Two of these fermions can together store a single qubit (the quantum computer's equivalent of a bit) of information, as opposed to a regular quantum computer, where a single qubit of information is stored in a single quantum particle. Thus, if environmental noise disturbs one Majorana fermion, its partner would still store the information, preventing decoherence.

To make this a reality, researchers are still persistently searching for the angel particle. As promising as the 2017 research appeared, it looks like the hunt continues.

Read more:
The hunt for the 'angel particle' continues - Big Think

Google's Quantum Supremacy will mark the End of the Bitcoin in 2020 – The Coin Republic

Ritika Sharma Monday, 13 January 2020, 03:49 EST Modified date: Monday, 13 January 2020, 05:00 EST

Whenever quantum computing hits the headlines, it leaves not just Bitcoin holders but every cryptocurrency holder worried about the uncertainty around their holdings.

It is widely believed that Bitcoin's underlying technology, the blockchain, is immutable, meaning records cannot be altered without authority over the corresponding encryption keys.

With quantum computers, however, it may become possible to break a blockchain's cryptographic codes. Quantum computing could hit the most significant features of blockchain, unalterable data and security, making it vulnerable.

Google announced in late 2019 that it had achieved quantum supremacy, which many read as a threat to Bitcoin, and to blockchain generally, since quantum computing targets exactly those key features of inalterability and security.

China later joined Google in the quantum supremacy race, announcing its own work on quantum technology. With this, some worry the year 2020 might witness the end of the crypto era.

How can Quantum computing break the Blockchain?

The reason behind this fear is quite genuine and straightforward: Bitcoin, like any cryptocurrency, depends on cryptography, namely hash functions and asymmetric (public-key) cryptography, whose security rests on limits to available computing power. The hash function produces an effectively random number for each block.

The results obtained by this process are effortless to verify but challenging to find. Quantum computing, with its powerful algorithmic capabilities, is precisely the enemy of this kind of asymmetry.
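That asymmetry is easy to demonstrate with an ordinary hash function. A minimal Python sketch (a toy proof-of-work search, not Bitcoin's actual mining code):

```python
import hashlib
from itertools import count

# Finding an input whose SHA-256 digest starts with k zero hex digits
# takes about 16**k attempts on average; verifying a candidate takes a
# single hash. This "hard to find, trivial to check" asymmetry is what
# block hashing relies on.
target = "0000"
for nonce in count():
    digest = hashlib.sha256(f"block-data:{nonce}".encode()).hexdigest()
    if digest.startswith(target):
        print(f"found nonce {nonce} -> {digest}")
        break
```

It is worth noting that quantum computers attack the two primitives differently: Grover's algorithm only quadratically speeds up this kind of brute-force hash search, while Shor's algorithm threatens the public-key signatures outright.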

Quantum computing uses subatomic particles that can exist in more than one state at a time. This feature makes quantum computers faster, on certain tasks, than the technology we use today.

Claims that quantum computers work "100 million times faster" than current systems trace back to specific benchmarks: Google reported that its processor finished, in a matter of minutes, a sampling task it estimated would take a state-of-the-art supercomputer 10,000 years. That speed-up applies to carefully chosen problems, not to any complex mathematical equation.

With such computational power, quantum computers could invert the one-way functions on which much of today's encryption depends, rendering one-way encryption obsolete.

The risk to blockchains is greater if such machines fall into the wrong hands: hackers with a quantum computer could attack the cryptocurrency ledger and take complete control of the blockchain.

Will Google's Quantum computing wipe out your Bitcoins?

Google's quantum supremacy is an advantage over traditional computers on one carefully chosen problem; it is not yet general-purpose quantum technology. It was bluntly presented as "quantum supremacy," though it is just a step in the quantum computing space.

Even if Google's quantum computer demonstrates computing power on specific problems that far exceeds the best-performing supercomputers, the results of this research do not mean much for Bitcoin. They are nowhere near what could be called breaking Bitcoin or the blockchain.

Although Google's quantum supremacy does not pose any immediate threat to Bitcoin, many people in the space remain stressed about the quantum threat. Many analysts note that Shor's algorithm could in principle crack private keys, but again, there is a long way to go before it could break Bitcoin's blockchain.

According to some researchers, a quantum computer with around 4,000 (error-corrected) qubits could break Bitcoin's cryptography. Google's quantum computer has only 53 physical qubits, which cannot cause any harm to the blockchain, and it is worth mentioning that the higher the qubit count, the more difficult the machine becomes to build.
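For context, Shor's algorithm itself only supplies one step, period finding; the rest is classical number theory. The pipeline can be illustrated on a toy number, with brute force standing in for the quantum step (an illustrative sketch, not an implementation of the quantum algorithm):

```python
from math import gcd

def shor_classical_core(N, a):
    # Find the period r of f(x) = a^x mod N by brute force. This is the
    # step a quantum computer performs exponentially faster; everything
    # after it is ordinary classical post-processing.
    r, y = 1, a % N
    while y != 1:
        y = (y * a) % N
        r += 1
    if r % 2 == 0:
        p = gcd(a ** (r // 2) - 1, N)
        q = gcd(a ** (r // 2) + 1, N)
        if 1 < p < N:
            return p, q
    return None

print(shor_classical_core(15, 7))   # -> (3, 5)
```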

Satoshi Nakamoto's Proposed solution to beat Quantum Supremacy

Satoshi was a true visionary; the things we are concerned about today had already been answered by him. In 2010, Satoshi Nakamoto responded to a question about quantum computers posed by the user llama on the Bitcointalk forum.

He replied that if Bitcoin were suddenly cracked, the signature scheme would be destroyed; but if it were broken gradually, the system would still have time to convert to a stronger function and re-sign all assets. Andreas Antonopoulos, the author of Mastering Bitcoin, offered a cruder answer to the same question: if the quantum computer comes, we will upgrade.

The quantum supremacy threat isn't new to the crypto world. Many cryptocurrency projects, such as Ethereum and various quantum-focused chains, have worked on making their blockchains quantum-resistant, and experts in the cryptocurrency space also advocate developing quantum-resistant encryption to ensure the security of funds.

Until a far more powerful quantum processor becomes an actual threat, Bitcoin and its developers still have time to secure it. Even so, with the continuous development of quantum technology and of chips with more qubits, a sword of Damocles will keep hanging over the head of cryptocurrency.

Follow this link:
Google's Quantum Supremacy will mark the End of the Bitcoin in 2020 - The Coin Republic

Bleeding edge information technology developments – IT World Canada

What are some bleeding-edge information technology developments that a forward-thinking CIO should keep an eye on?

Here are a few emerging technologies that have caught my attention. These are likely to have an increasing impact on the world of business in the future. Consider which ones you should follow a little more closely.

A recent advance achieved by a Google team indicates that quantum computing technology is making progress out of the lab and closing in on practical business applications. Quantum computing is not likely to change routine business transaction processing or data analytics applications. However, quantum computing is likely to dramatically change computationally intense applications required for:

Since most businesses can benefit from at least a few of these applications, quantum computing is worth evaluating. For a more detailed discussion of specific applications in various topic areas, please read: Applying Paradigm-Shifting Quantum Computers to Real-World Issues.

Machine learning is the science of computers acting without software developers writing detailed code to handle every case in the data that the software will encounter. Machine learning software develops its own algorithms that discover knowledge from specific data and the software's prior experience. Machine learning is based on statistical concepts and computational principles.

The leading cloud computing infrastructure providers offer machine learning routines that are quite easy to integrate into machine learning applications. These routines greatly reduce the expertise barriers that have slowed machine learning adoption at many businesses.
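As a concrete instance of "no hand-written rules," here is a minimal supervised-learning example in Python using scikit-learn (one widely available open-source library; the cloud providers' managed routines follow the same fit-and-score pattern):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# The model induces its own decision rules from labeled examples,
# rather than from case-by-case logic written by a developer.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```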

Selected business applications of machine learning include:

For summary descriptions of specific applications, please read: 10 Companies Using Machine Learning in Cool Ways.

Distributed ledger technology is often called blockchain. It enables new business and trust models. A distributed ledger enables all parties in a business community to see agreed information about all transactions, not just their own. That visibility builds trust within the community.
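That shared trust rests on a simple mechanism: each block commits to the hash of its predecessor, so rewriting any past record is immediately visible to every party. A minimal Python sketch of the idea (illustrative only, not any production ledger):

```python
import hashlib
import json

def block_hash(block):
    # Hash the canonical JSON form of a block.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a two-block chain; each block stores its predecessor's hash.
chain, prev = [], "0" * 64
for tx in ("alice->bob:5", "bob->carol:2"):
    block = {"tx": tx, "prev": prev}
    prev = block_hash(block)
    chain.append(block)

chain[0]["tx"] = "alice->bob:500"                  # tamper with history
print(block_hash(chain[0]) == chain[1]["prev"])    # False: tampering detected
```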

Bitcoin, a cryptocurrency, is the most widely known example application of blockchain.

Distributed ledger technology has great potential to revolutionize the way governments, institutions, and corporations interact with each other and with their clients or customers. Selected business applications of distributed ledger technology include:

For descriptions of industry-specific distributed ledger applications, please read: 17 Blockchain Applications That Are Transforming Society.

The Industrial Internet of Things (IIoT) is a major advance on Supervisory Control and Data Acquisition (SCADA). SCADA, in many forms, has been used for decades to safely operate major industrial facilities including oil refineries, petrochemical plants, electrical power generation stations, and assembly lines of all kinds.

IIoT is a major advance over relatively expensive SCADA. IIoT relies on dramatically cheaper components, including sensors, network bandwidth, storage, and computing resources. As a result, IIoT is feasible in many smaller facilities and offers a huge increase in data points for larger facilities. Business examples where IIoT delivers considerable value include production plants, trucks, cars, jet engines, elevators, and weather buoys.

The aggressive implementation of IIoT can:

For summary descriptions of specific IIoT applications, please read: The Top 20 Industrial IoT Applications.

RISC-V is an open-source hardware instruction set architecture (ISA) for CPU microprocessors that is growing in importance. It's based on established reduced instruction set computer (RISC) principles. The open-source aspect of the RISC-V ISA is a significant change compared to the proprietary ISA designs of the dominant computer chip designers, Intel and Arm.

RISC-V offers a way around paying ISA royalties for CPU microprocessors to either of the incumbents. The royalties may not be significant for chips used in expensive servers or smartphones, but they are significant for the cheap chips required in large numbers to implement the IIoT applications listed above.

For an expanded discussion of RISC-V, please read: A new blueprint for microprocessors challenges the industrys giants.

What bleeding edge information technology developments would you add to this list? Let us know in the comments below.

Go here to read the rest:
Bleeding edge information technology developments - IT World Canada

Jeffrey Epstein scandal: MIT professor put on leave, he ‘failed to inform’ college that sex offender made donations – CNBC

Jeffrey Epstein in 2004.

Rick Friedman | Corbis News | Getty Images

The Massachusetts Institute of Technology said Friday that it had placed one of its tenured professors on paid administrative leave after finding that he "purposefully failed to inform MIT" that convicted sex offender Jeffrey Epstein was the source of two donations in 2012 to support the professor's research, and that the professor got a $60,000 personal gift from Epstein.

A scathing report released by MIT also found that the decision by three administrators to accept donations from Epstein, who pleaded guilty to sex crimes in Florida in 2008, one of which involved a minor girl, "was the result of collective and serious errors in judgment that resulted in serious damage to the MIT community."

The report noted that even as its findings have been made public, "MIT is still without a clear and comprehensive gift policy or a process to properly vet donors." However, the university has begun to develop such a process.

Epstein, a former friend of Presidents Donald Trump and Bill Clinton, donated $850,000 to MIT from 2002 through 2017 in 10 separate gifts, the report said.

That was $50,000 more than the amount MIT had previously reported receiving from Epstein.

"The earliest gift was $100,000 given in 2002 to support the research of the late Professor Marvin Minsky, who died in 2016," MIT said as it released the report, which comes after four months of investigation of Epstein's ties to MIT conducted by the law firm Goodwin Procter.

"The remaining nine donations, all made after Epstein's 2008 conviction, included $525,000 to the Media Lab and $225,000 to" mechanical engineering professor Seth Lloyd, the report said.

The report also found that, "Unbeknownst to any members of MIT's senior leadership ... Epstein visited MIT nine times between 2013 and 2017."

"The fact-finding reveals that these visits and all post-conviction gifts from Epstein were driven by either former Media Lab director Joi Ito or professor of mechanical engineering Seth Lloyd, and not by the MIT administration or the Office of Resource Development."

Ito resigned last year after revelations about Epstein's donations to the Media Lab.

Lloyd received two donations of $50,000 in 2012, and the remaining $125,000 in 2017, according to the report.

"Epstein viewed the 2012 gifts as a trial balloon to test MIT's willingness to accept donations following his conviction" in Florida, MIT said.

"Professor Lloyd knew that donations from Epstein would be controversial and that MIT might reject them," MIT said.

"We conclude that, in concert with Epstein, he purposefully decided not to alert the Institute to Epstein's criminal record, choosing instead to allow mid-level administrators to process the donations without any formal discussion or diligence concerning Epstein."

Seth Lloyd is a professor of mechanical engineering and physics at the Massachusetts Institute of Technology.

Photo: Dmitry Rozhkov | Wikipedia CC

Lloyd was put on paid leave after it was found that he "purposefully failed to" tell MIT that Epstein was the source of the two earliest donations to him.

The report also found that Lloyd had "received a personal gift of $60,000 from Epstein in 2005 or 2006, which he acknowledged was deposited into a personal bank account and not reported to MIT," the university said in a press statement.

Lloyd is an influential thinker in the field of quantum mechanical engineering.

Educated at Harvard College and Cambridge University in England, Lloyd was the first person to propose a "technologically feasible design for a quantum computer," according to his resume. His 2006 book, "Programming the Universe," argues that the universe is a giant quantum computer calculating its own evolution.

University President L. Rafael Reif had not been aware that MIT was accepting donations from Epstein, who killed himself in a Manhattan jail in August after being arrested the prior month on federal child sex trafficking charges, according to the report.

"But the review finds that three MIT vice presidents learned of Epstein's donations to the MIT Media Lab, and his status as a convicted sex offender, in 2013," the university said in a prepared statement.

"In the absence of any MIT policy regarding controversial gifts, Epstein's subsequent gifts to the Institute were approved under an informal framework developed by the three administrators, R. Gregory Morgan, Jeffrey Newton, and Israel Ruiz."

"Since MIT had no policy or processes for handling controversial donors in place at the time, the decision to accept Epstein's post-conviction donations cannot be judged to be a policy violation," the report said.

"But it is clear that the decision was the result of collective and significant errors in judgment that resulted in serious damage to the MIT community."

Reif, in a letter addressed to the university's community, said, "Today's findings present disturbing new information about Jeffrey Epstein's connections with individuals at MIT: how extensive those ties were and how long they continued. This includes the decision by a lab director to bring this Level 3 sex offender to campus repeatedly."

"That it was possible for Epstein to have so many opportunities to interact with members of our community is distressing and unacceptable; I cannot imagine how painful it must be for survivors of sexual assault and abuse," Reif said.

"Clearly, we must establish policy guardrails to prevent this from happening again."

Read the original here:
Jeffrey Epstein scandal: MIT professor put on leave, he 'failed to inform' college that sex offender made donations - CNBC

The teenager that’s at CES to network – Yahoo Singapore News

It's not that Alishba Imran isn't impressed by her tour of Zappos HQ, the Disneyland of corporate campuses, with its "zapponians" who earn "zollars" and play "zing zong" on breaks. But she might not see herself working at a big corporation like this.

Her goal is to be "influential." She describes herself as a blockchain and machine learning developer and researcher and sees her future in health care and finance infrastructure. She chats about fractional ownership and the direction of 5G as well as stoicism and first principles. She is 16 years old.

"I'm really interested in meaningful work, and I think the best way to do that is in a startup," Imran says. Eventually she wants to run her own company, perhaps serving the developing world. College may or may not be part of the equation, at least not before a gap year.

This is Imran's first time in Las Vegas, first CES and somewhere between her 10th and 15th tech conference in the past year, she thinks. She flew here from the suburbs of Toronto -- where her parents moved from Pakistan when Imran was six -- to receive an award as a "Young Innovator to Watch" from Living In Digital Times. Many of the winners turn out to be children of immigrants, says LIDT founder Robin Raskin, and this year's teenagers also developed screening for respiratory diseases and an app for sexual-assault survivors.

Imran's project HonestBlocks uses blockchain to track health care supply chains and prevent counterfeit medicine. She came to Vegas with her chaperone Navid Nathoo, co-founder of the Knowledge Society, a program for talented kids, which Imran credits with giving her a direction beyond the pointlessness of school clubs and grades.

In school, she even started a nonprofit for girls in STEM but soon realized she didn't want to be standing on stages instead of solving problems. "I didn't feel my work was making a real impact," she says. "Doing events will only go so far." She tells me it'd be good to write this article about what she was like then versus now.

"People are impressed by that?" she says, quite seriously. "The only cool thing is the quantum computer."

On Wednesday, Imran roams the convention center, an ocean of middle-aged men. Her prime goal is to network. "That's the only utility in conferences," she says. There's a concentration of experts on emerging tech here, and she has a list of relationships to build, with Google, Uber and IBM.

Also, she's psyched to see the quantum computer, a regular CES appearance. "It's gonna be an iconic picture," she says.

First, the networking. She meets a blockchain expert at IBM, gives her standard intro and listens with arms folded, legs crossed, standing upright. "You definitely know your stuff," says the man. "How old are you?"

She tells him. He tells her to apply to IBM.

"You definitely know your stuff," says the man. "How old are you?"

Off again, through the Las Vegas Convention Center. Imran walks past Sharp's 90-inch transparent TV, attendees buzzing all around.

"People are impressed by that?" she says, quite seriously. "The only cool thing is the quantum computer," she adds, more in jest this time.

Eventually, she finds the glistening chandelier of a machine. She takes the photos and texts Nathoo her excitement. She talks to a quantum consultant and notes down the name of a paper about lithium-sulphur batteries. A man appears -- several earrings, mildly spiked hair -- and says he helped assemble the quantum computer.

"Can you explain how it works?" Imran asks.

"What do you want to do first, qubits or cooling?" he says.

She replies: "Let's do qubits."

Visit link:
The teenager that's at CES to network - Yahoo Singapore News

AI, ML and quantum computing to cement position in 2020: Alibaba's Jeff Zhang – Tech Observer

From the emergence of cognitive intelligence, in-memory computing, fault-tolerant quantum computing, and new materials-based semiconductor devices, to faster growth of the industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and AI technologies that protect data privacy, more technology advancements and breakthroughs are expected to gain momentum and generate big impacts on our daily life.

"We are in an era of rapid technology development. In particular, technologies such as cloud computing, artificial intelligence, blockchain, and data intelligence are expected to accelerate the pace of the digital economy," said Jeff Zhang, Head of Alibaba DAMO Academy and President of Alibaba Cloud Intelligence.

The following are highlights from the Alibaba DAMO Academy predictions for the top 10 trends in the tech community for this year:

Artificial intelligence has reached or surpassed humans in areas of perceptual intelligence such as speech-to-text, natural language processing, and video understanding, but in the field of cognitive intelligence, which requires external knowledge, logical reasoning, or domain migration, it is still in its infancy. Cognitive intelligence will draw inspiration from cognitive psychology, brain science, and human social history, combined with techniques such as cross-domain knowledge graphs, causality inference, and continuous learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These will enable machines to understand and utilize knowledge, achieving key breakthroughs from perceptual intelligence to cognitive intelligence.

In the Von Neumann architecture, memory and processor are separate, and computation requires data to be moved back and forth. With the rapid development of data-driven AI algorithms in recent years, it has come to a point where the hardware becomes the bottleneck in the exploration of more advanced algorithms. In the Processing-in-Memory (PIM) architecture, in contrast to the Von Neumann architecture, memory and processor are fused together, and computations are performed where data is stored, with minimal data movement. As such, computation parallelism and power efficiency can be significantly improved. We believe the innovations in PIM architecture are the tickets to next-generation AI.

In 2020, 5G, the rapid development of IoT devices, cloud computing, and edge computing will accelerate the fusion of information systems, communication systems, and industrial control systems. Through the advanced Industrial IoT, manufacturing companies can automate machines, in-factory logistics, and production scheduling, as a way to realize C2B smart manufacturing. In addition, interconnected industrial systems can adjust and coordinate the production capacity of both upstream and downstream vendors. Ultimately, this will significantly increase manufacturers' productivity and profitability. For manufacturers whose production goods are valued at hundreds of trillions of RMB, a productivity increase of 5-10% would mean additional trillions of RMB.

Traditional single-agent intelligence cannot meet the real-time perception and decision needs of large-scale intelligent devices. The development of collaborative sensing technology for the Internet of Things and of 5G communication technology will enable collaboration among multiple agents: machines will cooperate and compete with one another to complete target tasks. The group intelligence brought by the cooperation of multiple intelligent agents will further amplify the value of intelligent systems: large-scale intelligent traffic-light dispatching will allow dynamic, real-time adjustment, warehouse robots will work together to sort cargo more efficiently, driverless cars will perceive the overall traffic conditions on the road, and unmanned aerial vehicle (UAV) swarms will cover last-mile delivery more efficiently.

The traditional model of chip design cannot efficiently respond to the fast-evolving, fragmented, and customized needs of chip production. Open-source SoC chip design based on RISC-V, high-level hardware description languages, and IP-based modular chip design methods have accelerated the development of agile design methods and the ecosystem of open-source chips. In addition, the modular design method based on chiplets uses advanced packaging to combine chiplets with different functions, making it possible to quickly customize and deliver chips that meet the specific requirements of different applications.

BaaS (Blockchain-as-a-Service) will further reduce the barriers to entry for enterprise blockchain applications. A variety of hardware chips embedded with core algorithms, deployed at the edge and in the cloud and designed specifically for blockchain, will also emerge, allowing assets in the physical world to be mapped to assets on the blockchain, further expanding the boundaries of the Internet of Value and realizing multi-chain interconnection. In the future, a large number of innovative blockchain application scenarios featuring multi-dimensional collaboration across different industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.

In 2019, the race toward Quantum Supremacy brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosts overall confidence in superconducting quantum computing as a route to a large-scale quantum computer. In 2020, the field of quantum computing will receive increasing investment, which comes with enhanced competition. The field is also expected to experience a speed-up in industrialization and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realization of fault-tolerant quantum computing and the demonstration of quantum advantage on real-world problems. Either is a great challenge given present knowledge. Quantum computing is entering a critical period.

Under the pressure of both Moore's Law and the explosive demand for computing power and storage, it is difficult for classic silicon-based transistors to sustain the development of the semiconductor industry. To date, major semiconductor manufacturers still have no clear answer or option for chips beyond 3nm. New materials will enable new logic, storage, and interconnection devices through new physical mechanisms, driving continuous innovation in the semiconductor industry. For example, topological insulators and two-dimensional superconducting materials that can achieve lossless transport of electrons and spin can become the basis for new high-performance logic and interconnect devices, while new magnetic materials and new resistive switching materials can realize high-performance magnetic memory such as SOT-MRAM and resistive memory.

The compliance costs demanded by the recent data protection laws and regulations related to data transfer are getting higher than ever before. In light of this, there has been growing interest in using AI technologies to protect data privacy. The essence is to enable the data user to compute a function over input data from different data providers while keeping those data private. Such AI technologies promise to solve the problems of data silos and lack of trust in today's data-sharing practices, and will truly unleash the value of data in the foreseeable future.

With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure and gradually evolved into the center of all IT technology innovation. The cloud has a close relationship with almost all IT technologies, including new chips, new databases, self-driving adaptive networks, big data, AI, IoT, blockchain, quantum computing, and so forth. Meanwhile, it creates new technologies, such as serverless computing, cloud-native software architecture, software-hardware integrated design, and intelligent automated operation. Cloud computing is redefining every aspect of IT, making new IT technologies more accessible to the public. The cloud has become the backbone of the entire digital economy.

The rest is here:
AI, ML and quantum computing to cement position in 2020: Alibaba's Jeff Zhang - Tech Observer

Perspective: End Of An Era | WNIJ and WNIU – WNIJ and WNIU

David Gunkel's "Perspective" (January 8, 2020).

The holiday shopping is over and everyone is busy playing with their new toys. But what was remarkable about Christmas 2019 might have been the conspicuous absence of such toys.

Previous holiday seasons saw the introduction of impressive technological wonders -- tablet computers, the iPhone, the Nintendo Wii and the Xbox. But this year, there was no stand-out, got-to-have technological object.

On the one hand, this may actually be a good thing. The amount of waste generated by discarded consumer electronics is a massive global problem that we are not even close to managing responsibly. On the other hand, however, this may be an indication of the beginning of the end of an era -- the era of Moore's Law.

In 1965, Gordon Moore, who would go on to co-found Intel, predicted that the number of transistors on a microchip would double roughly every two years, meaning that computer chip performance would develop at an almost exponential rate. But even Moore knew there was a physical limit to this dramatic escalation in computing power, and we are beginning to see it top out. That may be one reason why there were no new, got-to-have technological gizmos and gadgets this holiday season.
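The arithmetic behind the prediction is simple compounding. A quick Python illustration, using the Intel 4004's 2,300 transistors (1971) as a baseline (my own example, not from the commentary):

```python
# Moore's Law in its popular form: transistor counts double every two years.
base_year, base_count = 1971, 2_300   # Intel 4004, for illustration
for year in (1971, 1991, 2011, 2021):
    doublings = (year - base_year) / 2
    print(year, f"~{base_count * 2 ** doublings:,.0f} transistors")
```

Fifty years of doubling predicts tens of billions of transistors per chip, roughly where flagship processors sit today, which is also why each further doubling is getting so hard.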

Sure, quantum computing is already being positioned as the next big thing. But it will be years, if not decades, before it finds its way into consumer products. So for now, do not ask Santa to fill your stocking with a brand-new quantum device. It will, for now at least, continue to be lumps of increasingly disappointing silicon.

I'm David Gunkel, and that's my perspective.

Original post:
Perspective: End Of An Era | WNIJ and WNIU - WNIJ and WNIU

Volkswagen carried out the world’s first pilot project for traffic optimization with a quantum computer – Quantaneo, the Quantum Computing Source

In Lisbon, Volkswagen carried out the world's first pilot project for traffic optimization with a quantum computer. MAN buses of the public transport provider CARRIS were equipped with a system developed in-house. This system uses a quantum computer and calculates the individually fastest route for each of the participating buses in almost real time. In this way, traffic jams could be detected and avoided.
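Volkswagen's system reportedly ran on a D-Wave quantum annealer, and annealers minimize QUBO (quadratic unconstrained binary optimization) objectives. Here is a toy of the route-assignment idea, brute-forced classically in Python (my own illustrative numbers and formulation, not Volkswagen's actual model):

```python
import numpy as np
from itertools import product

# Two buses, two candidate routes each: x[i] = 0 -> bus i takes route A,
# x[i] = 1 -> route B. Minimize individual travel times plus a penalty
# when both buses pile onto the same route (a stand-in for congestion).
cost = np.array([[3.0, 4.0],    # bus 0: time on route A, route B
                 [3.5, 2.5]])   # bus 1
PENALTY = 5.0                   # congestion penalty for sharing a route

best = min(
    ((cost[0, x[0]] + cost[1, x[1]] + (PENALTY if x[0] == x[1] else 0.0), x)
     for x in product((0, 1), repeat=2)),
    key=lambda pair: pair[0],
)
print("total time:", best[0], "assignment:", best[1])  # buses split across A/B
```

A real deployment replaces the brute-force loop with the annealer's hardware minimization, which is what makes the problem tractable at fleet scale.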

In the future, Volkswagen plans to develop its traffic optimization system to market maturity. For this reason, the Volkswagen developers have designed the system so that it can generally be applied to any city and to vehicle fleets of any size. Further pilot projects for cities in Germany and other European countries are already being considered. Volkswagen believes that such a traffic optimization system could be offered to public transport companies, taxi companies or fleet operators.

Originally posted here:
Volkswagen carried out the world's first pilot project for traffic optimization with a quantum computer - Quantaneo, the Quantum Computing Source