Category Archives: Quantum Computing
Atom Computing Reports Advance in Scaling Up Neutral Atom Qubit Arrays – HPCwire
The scale-up challenge facing quantum computing (QC) is daunting and varied. It's commonly held that 1 million qubits (or more) will be needed to deliver practical fault-tolerant QC. It's also a varied challenge because today there are many qubit types (superconducting, trapped ion, neutral atom, photonic, topological, quantum dots, etc.), each facing very different scale-up approaches. For example, in solid-state systems part of the challenge typically involves efficiently interconnecting stationary qubits, while neutral atom- and trapped ion-based systems can require actually moving the qubits around.
Recently, neutral atom specialist Atom Computing reported developing techniques that now permit it to fully load an array of 1225 sites with its qubits: neutral atoms. Atom posted a paper in late January on the work, explaining some of the technical details. Atom first reported creating a 1225-site array with 1180 occupied sites last fall, claiming to be the first to break the 1,000-qubit threshold.
The new paper provides more detail on Atom's approach to scaling up its arrays, which the company says is a significant step on its path to effective scaling and error correction.
"In this protocol, the tweezers provide microscopic rearrangement of atoms, while the cavity-enhanced lattices enable the creation of large numbers of deep optical potentials that allow for rapid low-loss imaging of atoms. We apply this protocol to demonstrate deterministic filling (99% per-site occupancy) of 1225-site arrays. Because the reservoir is repeatedly filled with fresh atoms, the array can be maintained in a filled state indefinitely. We anticipate that this protocol will be compatible with mid-circuit reloading, which will be a key capability for running large-scale error-corrected quantum computations whose durations exceed the lifetime of a single atom in the system," write the researchers.
Currently the range of qubit counts on quantum processors varies from single digits to hundreds. In December, IBM, whose QCs use superconducting qubits, introduced a 1,121-qubit Condor QPU (not yet available) and a smaller 133-qubit Heron QPU that's optimized for combining multiple QPUs into larger quantum systems. QuEra, another neutral atom-based QC developer, has a 256-qubit device (Aquila), and in January it showcased a roadmap that QuEra says will lead to 10,000 physical qubits and 100 logical qubits in the 2026 time-frame.
The race is on to scale up both physical qubit counts and logical qubit counts (each logical qubit comprising many physical qubits to implement error correction).
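To see why a logical qubit needs many physical qubits, a classical toy model helps: the repetition code, where redundancy plus majority voting suppresses errors. The sketch below is a simplified classical analogue with made-up error rates; real quantum error correction (e.g. surface codes) must also handle phase errors and cannot simply copy quantum states.

```python
import random

def encode(bit, n=3):
    """Encode one logical bit as n redundant physical copies (repetition code)."""
    return [bit] * n

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit despite some physical errors."""
    return int(sum(bits) > len(bits) / 2)

def logical_error_rate(p, n=3, trials=100_000):
    """Estimate how often the decoded logical bit is wrong."""
    errors = 0
    for _ in range(trials):
        if decode(apply_noise(encode(0, n), p)) != 0:
            errors += 1
    return errors / trials

random.seed(0)
r3 = logical_error_rate(0.05, n=3)
r7 = logical_error_rate(0.05, n=7)
print(r3, r7)  # both well below the physical error rate of 0.05
```

More physical bits per logical bit means a lower logical error rate, which is the tradeoff driving the million-qubit estimates above.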
Here's a brief description from the paper:
"Typically, tweezer rearrangement is performed by stochastically loading up to a single atom into each trap within an array, imaging the atoms to determine trap occupancy, and then rearranging atoms within the array to create a deterministically occupied sub-array. Crucially, the number of atoms contained in the final array with this approach is no greater than the number initially loaded. Further, because the initial loading is stochastic, the number of sites in the array must generally be substantially larger than the desired final sub-array (though under certain conditions, near-deterministic loading can be achieved)."
"Recently, repeated loading of a buffer array from an optical dipole trap reservoir has demonstrated that one can decouple the filling of a six-site target array from a single loading of a cold reservoir. In this work, we extend this concept to repeated loading of a reservoir array, from which we create a deterministically filled target array (typically 99% occupancy) of over 1200 171Yb atoms in 1225 sites. This is made possible by combining optical tweezer arrays with a cavity-enhanced optical lattice to provide both microscopic control and the large number of deep traps required for rapid, high-fidelity, low-loss imaging of large numbers of atoms."
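The stochastic-loading bottleneck the paper describes can be sketched with a toy Monte Carlo estimate. The ~50% per-site fill probability and trial counts below are illustrative assumptions, not the paper's numbers; the point is that one-shot stochastic loading forces the initial array to be much larger than the target sub-array.

```python
import random

def stochastic_load(n_sites, p_fill=0.5):
    """One-shot loading: each tweezer traps at most one atom with
    probability p_fill (roughly what collisional blockade gives)."""
    return sum(random.random() < p_fill for _ in range(n_sites))

def sites_needed(target, p_fill=0.5, confidence_trials=1000):
    """Smallest array size that loaded at least `target` atoms in
    every one of `confidence_trials` simulated attempts."""
    n = target
    while True:
        if all(stochastic_load(n, p_fill) >= target
               for _ in range(confidence_trials)):
            return n
        n += max(1, n // 20)  # grow the array ~5% and try again

random.seed(1)
needed = sites_needed(100)
print(needed)  # far more than 100 sites for a reliable one-shot fill
```

Iteratively refilling a reservoir and topping up the target array, as the paper does, sidesteps this overhead: the target can approach full occupancy regardless of the per-shot loading probability.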
The paper is best read in full.
Link to the Atom Computing paper, "Iterative assembly of 171Yb atom arrays in cavity-enhanced optical lattices": https://arxiv.org/abs/2401.16177
Top image: Atom Computing's neutral atom-based quantum system. Source: Atom Computing
Continued here:
Atom Computing Reports Advance in Scaling Up Neutral Atom Qubit Arrays - HPCwire
Researchers take a different approach with measurement-based quantum computing – Phys.org
The race to develop quantum computers has really heated up over the past few years. State-of-the-art systems can now run simple algorithms using dozens of qubits, or quantum bits, which are the building blocks of quantum computers.
Much of this success has been achieved in so-called gate-based quantum computers. These computers use physical components, most notably superconducting circuits, to host and control the qubits. This approach is quite similar to conventional, device-based classical computers. The two computing architectures are thus relatively compatible and could be used together. Furthermore, future quantum computers could be fabricated by harnessing the technologies used to fabricate conventional computers.
But the Optical Quantum Computing Research Team at the RIKEN Center for Quantum Computing has been taking a very different approach. Instead of optimizing gate-based quantum computers, Atsushi Sakaguchi, Jun-ichi Yoshikawa and Team Leader Akira Furusawa have been developing measurement-based quantum computing.
Measurement-based quantum computers process information in a complex quantum state known as a cluster state, which consists of three (or more) qubits linked together by a non-classical phenomenon called entanglement. Entanglement is when the properties of two or more quantum particles remain linked, even when separated by vast distances.
Measurement-based quantum computers work by making a measurement on the first qubit in the cluster state. The outcome of this measurement determines what measurement to perform on the second entangled qubit, a process called feedforward. This then determines how to measure the third. In this way, any quantum gate or circuit can be implemented through the appropriate choice of the series of measurements.
Measurement-based schemes are very efficient when used on optical quantum computers, since it's easy to entangle a large number of quantum states in an optical system. This makes a measurement-based quantum computer potentially more scalable than a gate-based quantum computer. For the latter, qubits need to be precisely fabricated and tuned for uniformity and physically connected to each other. These issues are automatically solved by using a measurement-based optical quantum computer.
Importantly, measurement-based quantum computation offers programmability in optical systems. "We can change the operation by just changing the measurement," says Sakaguchi. "This is much easier than changing the hardware, which is what gate-based optical systems require."
But feedforward is essential. "Feedforward is a control methodology in which we feed the measurement results to a different part of the system as a form of control," explains Sakaguchi. "In measurement-based quantum computation, feedforward is used to compensate for the inherent randomness in quantum measurements. Without feedforward operations, measurement-based quantum computation becomes probabilistic, while practical quantum computing will need to be deterministic."
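A minimal numerical sketch of one measurement-based step with (linear) feedforward, using NumPy: entangle the input qubit with a |+> ancilla via a CZ gate, measure qubit 1 in a rotated basis, and apply an X correction conditioned on the random outcome. This is a standard textbook construction, not the RIKEN team's optical implementation, and it illustrates only the linear feedforward that precedes the nonlinear kind discussed below.

```python
import numpy as np

rng = np.random.default_rng(0)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])

def Rz(theta):
    return np.diag([1, np.exp(1j * theta)])

def mbqc_step(psi, theta):
    """One MBQC step on a 2-qubit cluster: entangle input with |+>,
    measure qubit 1 in a rotated basis, return (outcome, qubit-2 state)."""
    plus = np.array([1, 1]) / np.sqrt(2)
    state = np.kron(psi, plus)            # qubit 1 = input, qubit 2 = ancilla
    CZ = np.diag([1, 1, 1, -1])
    state = CZ @ state
    # Measurement basis on qubit 1: (|0> +/- e^{i*theta}|1>)/sqrt(2)
    b0 = np.array([1, np.exp(1j * theta)]) / np.sqrt(2)
    b1 = np.array([1, -np.exp(1j * theta)]) / np.sqrt(2)
    amp = state.reshape(2, 2)             # rows: qubit 1, cols: qubit 2
    out0 = b0.conj() @ amp                # unnormalized qubit-2 state, m=0
    out1 = b1.conj() @ amp
    p0 = np.linalg.norm(out0) ** 2
    m = int(rng.random() >= p0)           # sample the random outcome
    post = out1 if m else out0
    return m, post / np.linalg.norm(post)

# Without feedforward the output is X^m * H * Rz(-theta)|psi>, i.e. random.
# The X^m correction (feedforward) makes the computation deterministic.
theta = 0.7
psi = np.array([0.6, 0.8])
m, post = mbqc_step(psi, theta)
corrected = np.linalg.matrix_power(X, m) @ post
target = H @ Rz(-theta) @ psi
overlap = abs(np.vdot(target, corrected))
print(m, round(overlap, 6))  # overlap 1.0 regardless of the outcome m
```

Whichever outcome m is sampled, the corrected state matches the intended gate output, which is exactly the determinism feedforward buys.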
The Optical Quantum Computing Research Team and their co-workers, from The University of Tokyo, Palacký University in the Czech Republic, the Australian National University and the University of New South Wales in Australia, have now demonstrated a more advanced form of feedforward: nonlinear feedforward. Nonlinear feedforward is required to implement the full range of potential gates in optics-based quantum computers. The findings are published in the journal Nature Communications.
"We've now experimentally demonstrated nonlinear quadrature measurement using a new nonlinear feedforward technology," explains Sakaguchi. "This type of measurement had previously been a barrier to realizing universal quantum operations in optical measurement-based quantum computation."
Optical computers
Optical quantum computers use qubits made of wave packets of light. At other institutions, some of the current RIKEN team had previously constructed the large optical cluster states needed for measurement-based quantum computation. Linear feedforward has also been achieved to construct simple gate operations, but more advanced gates need nonlinear feedforward.
A theory for practical implementation of nonlinear quadrature measurement was proposed in 2016. But this approach presented two major practical difficulties: generating a special ancillary state (which the team achieved in 2021) and performing a nonlinear feedforward operation.
The team overcame the latter challenge with complex optics, special electro-optic materials and ultrafast electronics. To do this they exploited digital memories, in which the desired nonlinear functions were precomputed and recorded in the memory. "After the measurement, we transformed the optical signal into an electrical one," explains Sakaguchi. "In linear feedforward, we just amplify or attenuate that signal, but we needed to do much more complex processing for nonlinear feedforward."
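The precomputed-memory idea can be sketched in a few lines: digitize the measured signal, use the resulting code as an index into a lookup table holding the precomputed nonlinear function, and emit the stored value instead of evaluating the function at runtime. Everything below (the tanh response, 10-bit resolution, voltage range) is a hypothetical stand-in for the team's actual electronics.

```python
import numpy as np

def build_lut(f, v_min, v_max, n_bits=8):
    """Precompute nonlinear function f over the ADC input range,
    one entry per digitized code (the 'digital memory')."""
    codes = np.arange(2 ** n_bits)
    volts = v_min + codes * (v_max - v_min) / (2 ** n_bits - 1)
    return f(volts)

def feedforward(v_measured, lut, v_min, v_max):
    """Digitize the measured voltage and read the precomputed response:
    no runtime function evaluation, just a fast memory lookup."""
    n = len(lut)
    code = int(round((v_measured - v_min) / (v_max - v_min) * (n - 1)))
    code = max(0, min(n - 1, code))       # clamp to the ADC range
    return lut[code]

# Hypothetical nonlinear response: feed forward tanh of the signal.
lut = build_lut(np.tanh, -1.0, 1.0, n_bits=10)
val = feedforward(0.5, lut, -1.0, 1.0)
print(round(val, 3))  # ~ tanh(0.5), to within the table's resolution
```

Trading memory for computation this way is what makes the nonlinear processing fast enough to keep up with the optical signal.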
The key advantages of this nonlinear feedforward technique are its speed and flexibility. The process needs to be fast enough that the output can be synchronized with the optical quantum state.
"Now that we have shown that we can perform nonlinear feedforward, we want to apply it to actual measurement-based quantum computation and quantum error correction using our previously developed system," says Sakaguchi. "And we hope to be able to further increase the speed of our nonlinear feedforward for high-speed optical quantum computation."
"But the key message is that, although superconducting circuit-based approaches may be more popular, optical systems are a promising candidate for quantum-computer hardware," he adds.
More information: Atsushi Sakaguchi et al, Nonlinear feedforward enabling quantum computation, Nature Communications (2023). DOI: 10.1038/s41467-023-39195-w
Here is the original post:
Researchers take a different approach with measurement-based quantum computing - Phys.org
IBM demonstrates useful Quantum computing within 133-qubit Heron, announces entry into Quantum-centric … – Tom’s Hardware
At its Quantum Summit 2023, IBM took the stage with an interesting spirit: one of almost awe at having things go their way. But the quantum of today, the one that's changing IBM's roadmap so deeply on the back of breakthrough upon breakthrough, was hard enough to consolidate. As IBM sees it, the future of quantum computing will hardly be more permissive. IBM announced cutting-edge devices at the event, including the 133-qubit Heron Quantum Processing Unit (QPU), the company's first utility-scale quantum processor, and the self-contained Quantum System Two, a quantum-specific supercomputing architecture. And further improvements to these cutting-edge devices are ultimately required.
Each breakthrough that afterward becomes obsolete is another accelerating bump against what we might call quantum's "plateau of understanding." We've already crested this plateau with semiconductors, so much so that the latest CPUs and GPUs are reaching practical, fundamental design limits where quantum effects start ruining our math. Conquering the plateau means that utility and understanding are now sufficient for research and development to be somewhat self-sustainable, at least for a Moore's-law-esque while.
IBM's Quantum Summit serves as a bookend of sorts for the company's cultural and operational execution, and its 2023 edition showcased an energized company that feels like it's opening the doors toward a "quantum-centric supercomputing era." That vision is built on the company's new Quantum Processing Unit, Heron, which showcases scalable quantum utility at a 133-qubit count and already offers things beyond what any feasible classical system could ever do. Breakthroughs and a revised understanding of its own roadmap have led IBM to present its quantum vision in two different roadmaps, prioritizing scalability in tandem with useful, minimum-quality computation rather than monolithic, hard-to-validate, high-complexity products.
IBM's announced new plateau for quantum computing packs in two particular breakthroughs that occurred in 2023. One relates to a groundbreaking noise-reduction algorithm (Zero Noise Extrapolation, or ZNE), which we covered back in July: basically a system through which you can compensate for noise. For instance, if you know a pitcher tends to throw more to the left, you can compensate for that up to a point. There will always be a moment where you correct too much or cede ground to other disruptions (such as the opponent exploiting the overexposed right side of the court). This is where the concept of qubit quality comes into play: the higher the quality of your qubits, the more predictable both their results and their disruptions, and the better you know their operational constraints, the more useful work you can extract from them.
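The idea behind ZNE can be shown with a toy model: deliberately run the circuit at amplified noise levels (e.g. by gate folding), then extrapolate the measured expectation values back to the zero-noise limit. The exponential decay model and the numbers below are illustrative assumptions, not IBM's implementation.

```python
import numpy as np

def noisy_expectation(ideal, noise_scale, decay=0.05):
    """Toy noise model: depolarizing-style noise shrinks the ideal
    expectation value exponentially with the (scaled) noise strength."""
    return ideal * np.exp(-decay * noise_scale)

# Measure at amplified noise levels (scale 1 = the hardware as-is).
ideal = 0.8
scales = np.array([1.0, 2.0, 3.0])
measured = noisy_expectation(ideal, scales)

# Richardson-style extrapolation: fit a polynomial in the noise scale
# and evaluate it at zero noise.
coeffs = np.polyfit(scales, measured, deg=2)
zne_estimate = np.polyval(coeffs, 0.0)

print(round(measured[0], 4), round(zne_estimate, 4))
# the raw scale-1 result underestimates 0.8; the extrapolation is far closer
```

The noisy device is never actually noise-free; ZNE trades extra (deliberately degraded) circuit runs for a better estimate of the noiseless answer.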
The other breakthrough relates to an algorithmic improvement of epic proportions and was first posted to arXiv on August 15, 2023. Titled "High-threshold and low-overhead fault-tolerant quantum memory," the paper showcases algorithmic ways to reduce qubit requirements for certain quantum calculations by a factor of ten. When what used to cost 1,000 qubits and a complex logic gate architecture sees a tenfold cost reduction, it's likely you'd prefer to end up with 133-qubit-sized chips, chips that crush problems previously meant for 1,000-qubit machines.
Enter IBM's Heron Quantum Processing Unit (QPU) and the era of useful, quantum-centric supercomputing.
The two-part breakthroughs of error mitigation (through the ZNE technique) and algorithmic performance (alongside qubit gate architecture improvements) allow IBM to now consider reaching 1 billion operationally useful quantum gates by 2033. It just so happens, by an amazing coincidence (one born of research effort and human ingenuity), that we only need to keep 133 qubits relatively happy within their own environment to extract useful quantum computing from them: computing that we wouldn't classically be able to get anywhere else.
The Development and Innovation roadmaps showcase how IBM is thinking about its superconducting qubits: as we've already learned to do with semiconductors, mapping out the hardware-level improvements alongside the scalability-level ones. Because as we've seen through our supercomputing efforts, there's no such thing as a truly monolithic approach: every piece of supercomputing is (necessarily) efficiently distributed across thousands of individual accelerators. Your CPU performs better by knitting together and orchestrating several different cores, registers, and execution units. Even Cerebras' Wafer Scale Engine scales further outside its wafer-level computing unit. No accelerator so far, no unit of computation, has proven powerful enough that we don't need to unlock more of its power by increasing its area or computing density. Our brains and learning ability seem to provide the only known exception.
IBM's modular approach, and its focus on introducing more robust intra-QPU and inter-QPU communication for this year's Heron, shows it's aware of the rope it's walking between quality and scalability. The thousands of hardware and scientist hours behind developing the tunable couplers, one of the signature Heron design elements, which allow parallel execution across different QPUs, are another sign. Pushing one lever harder means other systems have to be able to keep up; IBM also plans to steadily improve its internal and external coupling technology (already developed with scalability in mind for Heron) through further iterations, such as Flamingo's planned four versions, which still only scale up to 156 qubits per QPU.
Considering how you're solving scalability problems and the qubit quality x density x ease-of-testing equation, the ticks (the density increases that don't sacrifice quality and are feasible from a testing and productization standpoint) may be harder to unlock. But if one side of development is scalability, the other relates to the quality of whatever you're actually scaling: in this case, IBM's superconducting qubits themselves. Heron itself saw a substantial rearrangement of its internal qubit architecture to improve gate design, accessibility, and quantum processing volumes, not unlike an Intel tock. The planned iterative improvements to Flamingo's design seem to confirm this.
There's a sweet spot for the quantum computing algorithms of today: it seems that algorithms that fit roughly within a 60-gate depth are complex enough to allow for useful quantum computing. Perhaps thinking about Intel's NetBurst architecture with its Pentium 4 CPUs is appropriate here: too deep an instruction pipeline is counterproductive, after a point. Branch mispredictions are terrible across computing, be it classical or quantum. And quantum computing as we still have it in our Noisy Intermediate-Scale Quantum (NISQ) era is vulnerable to a more varied disturbance field than semiconductors (there are world overclocking records where we chill our processors to sub-zero temperatures and pump them with above-standard volts, after all). But perhaps that comparable quantum vulnerability is understandable, given how we're essentially manipulating the essential units of existence, atoms and even subatomic particles, into becoming useful to us.
Useful quantum computing doesn't simply correlate with an increasing number of available in-package qubits (consider announcements of 1,000-qubit products based on neutral atom technology, for instance). But useful quantum computing is always stretched thin throughout its limits, and if it isn't bumping against one fundamental limit (qubit count), it's bumping against another (instability at higher qubit counts), or contending with issues of entanglement coherence and longevity, entanglement distance and capability, correctness of the results, and still other elements. Some of these scalability issues can be visualized within the same framework of efficient data transit between different distributed computing units, such as cores in a given CPU architecture, which can themselves be addressed in a number of ways, such as hardware-based information processing and routing techniques (AMD's Infinity Fabric comes to mind, as does Nvidia's NVLink).
This feature of quantum computing already being useful at the 133-qubit scale is also part of the reason why IBM keeps prioritizing quantum computing challenges around useful algorithms occupying a 100 by 100 gate grid. That quantum is already useful beyond classical computing, even in gate grids comparably small to what we can achieve with transistors, points to how different these two computational worlds are.
Then there are also the matters of error mitigation and error correction, of extracting ground-truth-level answers to the questions we want our quantum computer to solve. There are also limitations in our way of utilizing quantum interference to collapse a quantum computation at just the right moment, so that we know we will obtain from it the result we want, or at least something close enough to correct that we can then offset any noise (non-useful computational results, or the spread of values between the correct answer and the not-yet-culled wrong ones) through a clever, groundbreaking algorithm.
The above are just some of the elements currently limiting how useful qubits can truly be and how those qubits can be manipulated into useful, algorithm-running computation units. This is usually referred to as a qubit's quality, and we can see how it both does and doesn't relate to the sheer number of qubits available. But since many useful computations can already be achieved with 133-qubit-wide Quantum Processing Units (there's a reason IBM settled on a mere 6-qubit increase from Eagle to Heron, and only scales up to 156 qubits with Flamingo), the company is setting out to keep this optimal qubit width for a number of years of continuous redesigns. IBM will focus on making correct results easier to extract from Heron-sized QPUs by increasing the coherence, stability, and accuracy of these 133 qubits while surmounting the arguably harder challenge of distributed, highly parallel quantum computing. It's a one-two punch again, and one that comes from the bump in speed at climbing ever-higher stretches of the quantum computing plateau.
But there is an admission that it's a barrier IBM still wants to punch through: it's much better to pair 200 units of a 156-qubit QPU (that of Flamingo) than of a 127-qubit one such as Eagle, so long as efficiency and accuracy remain high. Oliver Dial says that Condor, "the 1,000-qubit product," is running locally, up to a point. It was meant to be the thousand-qubit processor, and was a part of the roadmap for this year's Quantum Summit as much as the actual focus, Heron, but it's ultimately not a direction the company thinks is currently feasible.
IBM did manage to yield all 1,000 Josephson junctions within its experimental Condor chip, the thousand-qubit halo product that will never see the light of day as a product. It's running within the labs, and IBM can show that Condor yielded computationally useful qubits. One issue is that at that qubit depth, testing such a device becomes immensely expensive and time-consuming. At a basic level, it's harder and more costly to guarantee the quality of a thousand qubits, with their increasingly complex possibility field of interactions and interconnections, than to assure the same requirements in a 133-qubit Heron. Even IBM only means to test around a quarter of the in-lab Condor QPU's area, confirming that the qubit connections are working.
But Heron? Heron is made for quick verification that it's working to spec: that it's providing accurate results, or at least computationally useful results that can then be corrected through ZNE and other techniques. That means you can get useful work out of it already, while it is also a much better time-to-market product in virtually all areas that matter. Heron is what IBM considers the basic unit of quantum computation: good enough and stable enough to outpace classical systems in specific workloads. But that is quantum computing, and that is its niche.
Heron is IBM's entrance into the mass-access era of Quantum Processing Units. Next year's Flamingo builds further into the inter-QPU coupling architecture so that further parallelization can be achieved. The idea is to scale at a base, post-classical utility level and maintain that as a minimum quality baseline. Only at that point will IBM scale density and unlock the appropriate jump in computing capability, when that can be achieved in a similarly productive way and scalability is near-perfect at maintaining quantum usefulness.
There's simply never been the need to churn out hundreds of QPUs yet; the utility wasn't there. The Canaries, Falcons, and Eagles of IBM's past roadmap were never meant to usher in an age of scaled manufacturing. They were prototypes, scientific instruments, explorations; proofs of concept on the road toward useful quantum computing. We didn't know where usefulness would start to appear. But now we do, because we've reached it.
Heron is the design IBM feels best answers that newly created need for a quantum computing chip that actually is at the forefront of human computing capability, one that can offer what no classical computing system can (in some specific areas). One that can slice through specific-but-deeper layers of our Universe. That's what IBM means when it calls this new stage the quantum-centric supercomputing one.
Classical systems will never cease to be necessary, both of themselves and in the way they structure our current reality, systems, and society. They also function as a layer that allows quantum computing itself to happen, be it by carrying and storing its intermediate results or by knitting together the final informational state mapping out the correct answer quantum computing provides, one quality step at a time. The quantum-centric bit merely refers to how quantum computing will be the core contributor to developments in fields such as materials science, more advanced physics, chemistry, superconductivity, and basically every domain where our classical systems were already presenting a duller and duller edge with which to improve our understanding of their limits.
However, given IBM's approach and its choice of transmon superconducting qubits, a certain difficulty lies in commercializing local installations. Quantum System Two, as the company is naming its new, almost wholesale quantum computing system, has been shown working with different QPU installations (both Heron and Eagle). When asked whether scaling Quantum System Two and similar self-contained products would be a bottleneck to technological adoption, IBM's CTO Oliver Dial said that it was definitely a difficult problem to solve, but that he was confident in the company's ability to further reduce costs and complexity in time, considering how successful IBM had already proven in that regard. For now, it's easier for IBM's quantum usefulness to be unlocked at a distance, through the cloud and its quantum computing framework, Qiskit, than it is to achieve it by running local installations.
Qiskit is the preferred medium through which users can actually deploy IBM's quantum computing products in research efforts, just like you could rent X Nvidia A100s of processing power through Amazon Web Services or even a simple Xbox Series X console through Microsoft's xCloud service. On the day of IBM's Quantum Summit, that freedom also meant access to the useful quantum circuits within IBM-deployed Heron QPUs. And it's much easier to scale access by serving QPUs through the cloud than by delivering a box of supercooled transmon qubits ready to be plugged in and played with.
That's one devil of IBM's superconducting-qubit approach: not many players have the will, funding, or expertise to put a supercooled chamber into local operation and build the required infrastructure around it. These are complex mechanisms housing kilometers of wiring, another focus of IBM's development and tinkering, culminating in last year's flexible ribbon solution, which drastically simplified connections to and from QPUs.
Quantum computing is a uniquely complex problem, and democratized access to hundreds or thousands of mass-produced Herons in IBM's refrigerator-laden fields will ultimately only require, well, a stable internet connection. Logistics are what they are, and IBM's Quantum Summit also took the necessary steps to address some needs within its Qiskit Runtime platform by introducing its official 1.0 version. Food for thought is realizing that the era of useful quantum computing seems to coincide with the beginning of the era of quantum computing as a service as well. That was fast.
The era of useful, mass-producible, mass-access quantum computing is what IBM is promising. But now there's the matter of scale. And there's the matter of how cost-effective it is to install a Quantum System Two or Five or Ten compared to another qubit approach, be it topological approaches to quantum computing, oxygen-vacancy-based qubits, ion traps, or others that are an entire architecture away from IBM's approach, such as fluxonium qubits. It's likely that a number of qubit technologies will still make it into the mass-production stage, and even then, we can rest assured that everywhere on the road of human ingenuity lie failed experiments, like Intel's recently decapitated Itanium or AMD's out-of-time approach to x86 computing in Bulldozer.
It's hard to see where the future of quantum takes us, and it's hard to say whether it looks exactly like IBM's roadmap, the same roadmap whose running changes we also discussed here. Yet all roadmaps are a permanently drying painting, both for IBM itself and for the technology space at large. Breakthroughs seem to be happening daily on each side of the fence, and it's a fact of science that the earlier the questions we ask, the more potential there is. The promising qubit technologies of today will have to answer actual interrogations on performance, usefulness, ease and cost of manipulation, quality, and scalability in ways that now need to be at least as good as what IBM is proposing with its transmon-based superconducting qubits, its Herons and scalable Flamingos, and its (still unproven, but hinted at) ability to eventually mass-produce useful numbers of useful Quantum Processing Units such as Heron. All of that even as we remain in this noisy, intermediate-scale quantum (NISQ) era.
It's no wonder that Oliver Dial looked and talked so energetically during our interview: IBM has already achieved quantum usefulness and has started to answer the two most important questions, quality and scalability, through its Development and Innovation roadmaps. And it did so through the collaboration of an incredible team of scientists, delivering results years before expected, Dial happily conceded. In 2023, IBM unlocked useful quantum computing within a 127-qubit Quantum Processing Unit, Eagle, and walked the process of perfecting it toward the revamped Heron chip. That's an incredible feat in and of itself, and it is what allows us to even discuss issues of scalability at this point. It's the reason a roadmap has to shift to accommodate it, and in this quantum computing world, that's a great follow-up question to have.
Perhaps the best question now is: how many things can we improve with a useful Heron QPU? How many locked doors have sprung ajar?
See more here:
IBM demonstrates useful Quantum computing within 133-qubit Heron, announces entry into Quantum-centric ... - Tom's Hardware
Research group launches Japan’s third quantum computer at Osaka University – Fujitsu
Center for Quantum Information and Quantum Biology at Osaka University, RIKEN, National Institute of Information and Communications Technology, e-trees.Japan, Inc., Fujitsu Limited, NTT Corporation, QuEL, Inc., QunaSys Inc., Systems Engineering Consultants Co.,LTD.
Tokyo and Osaka, December 20, 2023
A consortium of joint research partners including the Center for Quantum Information and Quantum Biology at Osaka University, RIKEN, the Advanced Semiconductor Research Center at the National Institute of Advanced Industrial Science and Technology (AIST), the Superconducting ICT Laboratory at the National Institute of Information and Communications Technology (NICT), Amazon Web Services, e-trees.Japan, Inc., Fujitsu Limited, NTT Corporation (NTT), QuEL, Inc., QunaSys Inc., and Systems Engineering Consultants Co.,LTD. (SEC) today announced the successful development of Japan's third superconducting quantum computer (1) installed at Osaka University. Starting December 22, 2023, the partners will provide users in Japan access to the newly developed computer via the cloud, enabling researchers to execute quantum algorithms (2), improve and verify the operation of software, and explore use cases remotely.
The newly developed superconducting quantum computer uses a 64-qubit chip provided by RIKEN, which leverages the same design as the chip in RIKEN's first superconducting quantum computer, which was unveiled to users in Japan as a cloud service for non-commercial use on March 27, 2023 (3).
For the new quantum computer, the research team sourced more domestically manufactured components (excluding the refrigerator). The research team confirmed that the new quantum computer, including its components, provides sufficient performance and will utilize the computer as a test bed for components made in Japan.
Moving forward, the research group will operate the new computer while improving its software and other systems for usage including the processing of heavy workloads on the cloud. The research team anticipates that the computer will drive further progress in the fields of machine learning and the development of practical quantum algorithms, enable the exploration of new use cases in material development and drug discovery, and contribute to the solution of optimization problems to mitigate environmental impact.
The joint research group comprises: Dr. Masahiro Kitagawa (Professor, Graduate School of Engineering Science, Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Makoto Negoro (Associate Professor, Vice Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Yasunobu Nakamura (Director of the RIKEN Center for Quantum Computing (RQC)), Dr. Katsuya Kikuchi (Group Leader of the 3D Integration System Group of the Device Technology Research Institute at AIST), Dr. Hirotaka Terai (Executive Researcher at the Superconductive ICT Device Laboratory at the Kobe Frontier Research Center of the Advanced ICT Research Institute of NICT), Dr. Yoshitaka Haribara (Senior Startup Machine Learning and Quantum Solutions Architect, Amazon Web Services), Dr. Takefumi Miyoshi (Director of e-trees.Japan, Inc., Specially Appointed Associate Professor, Center for Quantum Information and Quantum Biology at Osaka University, CTO of QuEL, Inc.), Dr. Shintaro Sato (Head of Quantum Laboratory, Fujitsu Research, Fujitsu Limited), Dr. Yuuki Tokunaga (Distinguished Researcher at NTT Computer & Data Science Laboratories), Yosuke Ito (CEO of QuEL, Inc.), Keita Kanno (CTO of QunaSys Inc.), and Ryo Uchida (Chief Technologist of Systems Engineering Consultants Co.,LTD. (SEC)).
This research was supported by grants from:
The Center for Quantum Information and Quantum Biology consists of six research groups: Quantum Computing, Transdisciplinary Quantum Science, Quantum Information Devices, Quantum Communication and Security, Quantum Sensing, and Quantum Biology. QIQB promotes transdisciplinary research between each of these research groups and with other academic fields. The Center is an international research hub for quantum innovation by actively promoting international academic exchange and collaboration across borders. QIQB seeks to play a key role in nurturing future quantum leaders and specialists through education and training. For more information: https://qiqb.osaka-u.ac.jp/en/
RIKEN is Japan's largest research institute for basic and applied research. Over 2500 papers by RIKEN researchers are published every year in leading scientific and technology journals covering a broad spectrum of disciplines including physics, chemistry, biology, engineering, and medical science. RIKEN's research environment and strong emphasis on interdisciplinary collaboration and globalization has earned a worldwide reputation for scientific excellence.
Website: http://www.riken.jp/en/
Facebook: http://www.facebook.com/RIKENHQ
X (formerly Twitter): @riken_en
As the only public research institution of Japan that specializes in the field of information and communications technology, the National Institute of Information and Communications Technology (NICT) promotes ICT R&D, from the foundational to the implementation, while collaborating with universities, industry, and domestic and overseas research institutions. NICT will be advancing R&D in the fields of advanced electromagnetic wave technology, innovative networks, cybersecurity, universal communication, and frontier science. Furthermore, NICT will also actively be promoting R&D in four strategic research fields (Beyond 5G, AI, quantum ICT, and cybersecurity), which are essential cutting-edge technologies for next-generation ICT infrastructure for the early realization of Society 5.0. For more information, visit https://www.nict.go.jp/en/.
e-trees.Japan, Inc. provides solutions and implements applications with powerful hardware and flexible software combinations. Our key concept is Keep It Simple and Smart. We primarily address R&D items, such as FPGA and network (with network protocol stack for FPGAs implemented by ourselves), embedded systems, and systems with low power consumption or renewable energy. Find out more: https://e-trees.jp/en/.
Fujitsu's purpose is to make the world more sustainable by building trust in society through innovation. As the digital transformation partner of choice for customers in over 100 countries, our 124,000 employees work to resolve some of the greatest challenges facing humanity. Our range of services and solutions draw on five key technologies: Computing, Networks, AI, Data & Security, and Converging Technologies, which we bring together to deliver sustainability transformation. Fujitsu Limited (TSE:6702) reported consolidated revenues of 3.7 trillion yen (US$28 billion) for the fiscal year ended March 31, 2023 and remains the top digital services company in Japan by market share. Find out more: http://www.fujitsu.com.
NTT contributes to a sustainable society through the power of innovation. We are a leading global technology company providing services to consumers and business as a mobile operator, infrastructure, networks, applications, and consulting provider. Our offerings include digital business consulting, managed application services, workplace and cloud solutions, data center and edge computing, all supported by our deep global industry expertise. We are over $97B in revenue and 330,000 employees, with $3.6B in annual R&D investments. Our operations span across 80+ countries and regions, allowing us to serve clients in over 190 of them. We serve over 75% of Fortune Global 100 companies, thousands of other enterprise and government clients and millions of consumers.
NTT Service Innovation Laboratory Group, Public Relations: nttrd-pr@ml.ntt.com
QuEL, Inc. is an Osaka University-affiliated startup established in 2021. We are a team of experienced researchers and engineers with various backgrounds, strongly supporting quantum computing researchers by supplying novel qubit controllers. Find out more: https://quel-inc.com/
QunaSys is a Japanese startup engaged in advancing algorithms in chemistry to drive real-world applications of quantum technology. Our primary focus is on leveraging quantum computing potential by collaborating on research with industry leaders and fostering a community-driven approach within the QPARC industry network. Our flagship innovation, QURI, represents a user-friendly quantum computational web software. This platform allows users without specialized quantum algorithm expertise to engage in quantum calculations seamlessly. Additionally, our QURI Parts act as essential building blocks, aiding in the assembly of quantum algorithms into efficient Python code. For more information, visit us at https://qunasys.com/en
Systems Engineering Consultants (SEC) is a software development company specialized in real-time technology, contributing to the safety and development of society. We offer real-time software in four different business fields: mobile networking, internet technology, public infrastructure, and space, robotics and advanced technologies. Find out more: https://www.sec.co.jp/en/.
RIKEN
RIKEN Global Communications
X (formerly Twitter): @riken_en
Tel: +81-(0)48-462-1225
E-mail: pr@riken.jp
Fujitsu Limited
Public and Investor Relations Division
Inquiries
All company or product names mentioned herein are trademarks or registered trademarks of their respective owners. Information provided in this press release is accurate at time of publication and is subject to change without advance notice.
See original here:
Research group launches Japan's third quantum computer at Osaka University - Fujitsu
Japan’s Third Superconducting Quantum Computer Installed at Osaka University – The Quantum Insider
Insider Brief
PRESS RELEASE: A consortium of joint research partners announced the successful development of Japan's third superconducting quantum computer installed at Osaka University.
The partnership includes the Center for Quantum Information and Quantum Biology at Osaka University, RIKEN, the Advanced Semiconductor Research Center at the National Institute of Advanced Industrial Science and Technology (AIST), the Superconducting ICT Laboratory at the National Institute of Information and Communications Technology (NICT), Amazon Web Services, e-trees.Japan, Inc., Fujitsu Limited, NTT Corporation (NTT), QuEL, Inc., QunaSys Inc., and Systems Engineering Consultants Co.,LTD. (SEC).
Starting December 22, 2023, the partners will provide users in Japan access to the newly developed computer via the cloud, enabling researchers to execute quantum algorithms, improve and verify the operation of software, and explore use cases remotely.
The newly developed superconducting quantum computer uses a 64-qubit chip provided by RIKEN, which leverages the same design as the chip in RIKEN's first superconducting quantum computer, which was unveiled to users in Japan as a cloud service for non-commercial use on March 27, 2023.
For the new quantum computer, the research team sourced more domestically manufactured components (excluding the refrigerator). The research team confirmed that the new quantum computer, including its components, provides sufficient performance and will utilize the computer as a test bed for components made in Japan.
Moving forward, the research group will operate the new computer while improving its software and other systems for usage including the processing of heavy workloads on the cloud. The research team anticipates that the computer will drive further progress in the fields of machine learning and the development of practical quantum algorithms, enable the exploration of new use cases in material development and drug discovery, and contribute to the solution of optimization problems to mitigate environmental impact.
The joint research group comprises: Dr. Masahiro Kitagawa (Professor, Graduate School of Engineering Science, Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Makoto Negoro (Associate Professor, Vice Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Yasunobu Nakamura (Director of the RIKEN Center for Quantum Computing (RQC)), Dr. Katsuya Kikuchi (Group Leader of the 3D Integration System Group of the Device Technology Research Institute at AIST), Dr. Hirotaka Terai (Executive Researcher at the Superconductive ICT Device Laboratory at the Kobe Frontier Research Center of the Advanced ICT Research Institute of NICT), Dr. Yoshitaka Haribara (Senior Startup Machine Learning and Quantum Solutions Architect, Amazon Web Services), Dr. Takefumi Miyoshi (Director of e-trees.Japan, Inc., Specially Appointed Associate Professor, Center for Quantum Information and Quantum Biology at Osaka University, CTO of QuEL, Inc.), Dr. Shintaro Sato (Head of Quantum Laboratory, Fujitsu Research, Fujitsu Limited), Dr. Yuuki Tokunaga (Distinguished Researcher at NTT Computer & Data Science Laboratories), Yosuke Ito (CEO of QuEL, Inc.), Keita Kanno (CTO of QunaSys Inc.), and Ryo Uchida (Chief Technologist of Systems Engineering Consultants Co.,LTD. (SEC)).
For more information: Center for Quantum Information and Quantum Biology, Osaka University
This research was supported by grants from:
Read more from the original source:
Japan's Third Superconducting Quantum Computer Installed at Osaka University - The Quantum Insider
COO of Quantum Computing Inc., Bill McGann, Presents a New Quantum Intelligence Platform Leveraging Reservoir … – The Quantum Insider
Quantum Computing Inc. (QCi), a company based out of Leesburg, Virginia, that specializes in quantum hardware and software, offers practical quantum computing solutions for business applications.
Just recently in a video, the companys COO, Bill McGann, introduced a unique quantum intelligence platform using reservoir computers. This initiative aims to augment current artificial intelligence (AI) systems rather than replace them, focusing on a hardware-centric approach for edge deployment. This distinguishes it from the common software-centric strategies, highlighting its potential to enhance AI with advanced hardware innovations.
In the short presentation, McGann discusses QCi's platform, mentioning that reservoir computers are just the beginning. He emphasized that the plan is to gradually integrate quantum information into these platforms. The eventual goal is to develop a reservoir quantum computer, representing the next generation of their technology. McGann noted that this development is currently underway.
"Our contribution is not going to replace what artificial intelligence does today," said McGann. "But it's really good to enhance it."
McGann clarified that their focus isn't on replicating existing technologies like GPT. Instead, they aim to enhance current systems. He highlighted the uniqueness of their approach to quantum intelligence, machine learning (ML) and AI, and noted their commitment to hardware-based solutions.
"So everybody that does reservoir computing today, for the most part, we haven't found anybody that does it our way. Everyone's using, like, you know, software," he said.
To conclude, McGann explained their process as cloud-based, involving data input into a reservoir to generate output. This output is then used in training and applied to tests. Unlike other approaches, they implement this process in hardware, allowing for intelligence deployment at the edge, which is crucial for some applications. Therefore, their strategy for the quantum budgets platform is centred around a hardware solution.
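As background on the technique McGann describes (feed data into a fixed reservoir, then train only on the reservoir's output), the classical version is the echo state network. The sketch below is purely illustrative and is not QCi's hardware design: a random recurrent "reservoir" with fixed weights transforms an input sequence, and only a linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random input and reservoir weights -- never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the spectral radius below 1 so past inputs fade ("echo" property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave.
u = np.sin(np.linspace(0, 20, 500))
washout = 50  # discard initial transient states
X = run_reservoir(u[:-1])[washout:]
y = u[1:][washout:]

# Only this linear readout is trained, via least squares.
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)
mse = np.mean((X @ W_out - y) ** 2)
```

A hardware reservoir replaces the simulated recurrent network with a physical dynamical system, which is what makes the edge deployment McGann mentions attractive; the readout training stays the same.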
This strategy, which integrates quantum information for edge deployment, may set QCi apart in the future, one where AI and ML will be all-important.
Featured image: Credit: QCi
Quantum AI Brings the Power of Quantum Computing to the Public – GlobeNewswire
Luton, Dec. 20, 2023 (GLOBE NEWSWIRE) -- Quantum AI is set to bring the power of quantum computing to the public and has already reached a stunning quantum volume (QV) score of 14,082 in a year since its inception.
Quantum AI Ltd. was conceived by Finlay and Qaiser Sajjad during their time as students at MIT. They were inspired by the exclusive use of new-age technology by the elites on Wall Street. Recognising the transformative power of this technology, they were determined to make its potential accessible to all. Thus, the platform was born, and it has evolved and flourished in just a short time.
Often, everyday traders have limited access to such advanced tools.
"We are fueled by the belief that the power of quantum computing should not be confined to the financial giants but should be available to empower amateur traders as well," asserted the founders of the platform. Since its launch in 2022, they have worked to achieve this vision and have become a significant force in the industry.
The platform combines the power of the technology with the strength of artificial intelligence. By using these latest technologies, including machine learning, algorithms that are more than just lines of code have been created. They harness the potential of quantum mechanics and deep learning to analyse live data in unique ways.
"Our quantum system leverages quantum superposition and coherence, providing a quantum advantage through sophisticated simulation and annealing techniques," added the founders.
Quantum AI has shown exceptional results in a brief period. It has received overwhelmingly positive reviews from customers, highlighting the enhanced speed and accuracy of trading. The transformative and groundbreaking impact the platform has had on trading is evident in its growth to 330,000 active members. Notably, it has nearly 898 million lines of code and an impressive quantum volume score of 14,082. The performance on this benchmark that IBM established is a massive testament to the impact Quantum AI has had in a short span of time.
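For context on the benchmark mentioned here: IBM's quantum volume protocol conventionally reports scores of the form 2^n, where n is the largest "square" circuit (n qubits, n layers of random two-qubit gates) the machine can run with heavy-output probability above 2/3. The sketch below illustrates only that convention; it is not a statement about how this platform's score was measured.

```python
import math

def quantum_volume(n: int) -> int:
    """IBM's convention: passing the heavy-output test on n-qubit,
    n-layer square circuits yields a quantum volume of 2**n."""
    return 2 ** n

def implied_width(qv: float) -> float:
    """Back out the square-circuit width a reported QV score implies."""
    return math.log2(qv)

# Passing at width 7 gives QV 128, and vice versa.
assert quantum_volume(7) == 128
assert implied_width(128) == 7.0
```

Because scores under this convention are powers of two, a reported value can be sanity-checked by asking whether its base-2 logarithm is an integer.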
According to the founders, they have bigger plans on the horizon to take the power of the technology to the public. Quantum AI is growing its team of experts and expanding its operations in Australia and Canada. Its goal of democratising the power of technology is well on its way to being realised. With trading being the first thing they cracked to pay the bills, the main focus has turned to aviation, haulage and even e-commerce.
To learn more about the platform and understand the transformative power of the technology for traders, one can visit https://quantumai.co/.
About Quantum AI
With the aim of democratising the power and potential of quantum computing, the company was founded by Finlay and Qaiser Sajjad during their time at MIT. Since its establishment, it has grown to over 330,000 active members and 18 full-time employees, alongside winning the trust of its customers.
###
Media Contact
Quantum AI
PR Manager: Nadia El-Masri
Email: nadia.el.masri@quantumai.co
Address: Quantum AI Ltd, 35 John Street, Luton, United Kingdom, LU1 2JE
Phone: +442035970878
URL: https://quantumai.co/
More:
Quantum AI Brings the Power of Quantum Computing to the Public - GlobeNewswire
Year of covers: Tech and sport, quantum advances and Gen AI – Technology Magazine
From groundbreaking breakthroughs in AI and quantum computing to the continued evolution of augmented and virtual reality, 2023 has witnessed a surge of innovation that is poised to revolutionise our world.
AI continues to evolve at an astonishing pace, with advancements in natural language processing (NLP) enabling more natural and intuitive human-computer interactions. Computer vision, another key AI domain, has made strides in image and video analysis, leading to improved object detection, facial recognition, and medical imaging capabilities. AI is also making significant contributions in drug discovery, medical diagnosis, and self-driving car development, further demonstrating its transformative potential.
The immersive worlds of augmented reality (AR) and virtual reality (VR) have taken significant steps forward, blurring the lines between the physical and digital realms. AR applications are becoming increasingly prevalent in gaming, education, and training, enhancing real-world experiences with digital overlays. VR, meanwhile, is gaining momentum in entertainment, healthcare, and remote collaboration, offering users immersive and interactive experiences.
Quantum computing, still in its early stages, holds immense promise for solving problems that are intractable for classical computers. Researchers are making progress in building and optimizing quantum computers, paving the way for breakthroughs in fields like materials science, drug discovery and AI.
All of these topics and more have featured in our magazine over the past 12 months, and the trends we have witnessed are likely to accelerate in the years to come. As 2023 comes to a close, join us for a review of Technology Magazine's covers from 2023.
Read the original post:
Year of covers: Tech and sport, quantum advances and Gen AI - Technology Magazine
Dell expects quantum computing and generative AI to link in 2024 – SiliconRepublic.com
Dell predicts that generative AI will move from theory to practice, powerful PCs will unlock more AI advancements and zero trust will become central to cybersecurity practices.
Dell Technologies Ireland MD Catherine Doyle has shared predictions on how technological advancements will impact businesses in 2024.
While AI was certainly one of the hottest topics of 2023, Doyle believes generative AI will be the centre of business focus and that 2024 will be all about putting AI into action.
"The first step on the AI journey should be to organise and structure data, which will help avoid AI sprawl," Doyle said.
Last year, the company made five predictions for 2023 that included advancements in AI, quantum computing and a greater role for technology in employee satisfaction.
Dell believes these advanced technologies will remain a central focus next year, but also believes zero-trust technologies and a growing focus on sustainability in IT will help organisations to innovate, enhance productivity and remain secure.
Breakthrough technologies will help Irish businesses to navigate new challenges that may emerge in the coming months and to innovate at speed, Doyle said.
Here are five predictions Dell has shared on how technology will advance in 2024.
Doyle said that 2023 saw various creative ideas on how generative AI will transform businesses, but claimed that there are very few real-world examples of generative AI in action.
"As we enter 2024, [generative] AI projects will start to be business ready, with visible productivity gains becoming evident," Doyle said. "An increasing number of Irish businesses will adopt AI and scale it across their organisations."
Earlier this year, a State of IT report from Salesforce suggested that 91pc of Irish IT leaders believe generative AI will have a prominent role in their organisations in the near future. However, there also appeared to be feelings of caution, as 53pc of leaders said they were concerned about the ethics of generative AI.
Doyle said a recent report from Dell also suggests many Irish businesses are looking to integrate generative AI.
"According to our latest GenAI Pulse survey, nearly half of IT leaders expect to see meaningful results from [generative] AI initiatives within six months to a year," Doyle said. "One of the first steps that leaders can take to ensure successful AI adoption in the year ahead is to organise and structure data within their business."
Quantum computing has had some interesting developments this year, with some leaders claiming that the sector is moving at breakneck speed as interest grows among customers, investors and governments.
Doyle said that the growing demand for data will present an opportunity in the near future for quantum computing and generative AI to become closely linked.
"With the global explosion of data and AI, there will be an increasing need for organisations in Ireland to put in place the computing power to manage it effectively," Doyle said. "Quantum computing will begin to address this and bring about a massive leap in the computing power that is required to unlock AI innovation.
"If we were surprised by the [generative] AI advancements of the last year, leaders should expect to see a bigger jump forward when quantum computing becomes intertwined in the near future."
Doyle also predicts that the promise of powerful digital assistants will become a reality next year, as PCs and AI technology become more powerful.
"Over the next 12 months the PC experience will shift from searching to prompting, from reading to understanding and from editing to directing," Doyle said. "This will result in the emergence of a two-way human-machine partnership in workplaces across Ireland."
Also, as AI becomes a key part of laptops and devices, it will unlock improved privacy and security while also advancing sustainable design.
Zero trust is a modern cybersecurity strategy whose principle is captured by its name: never trust, always verify. Doyle believes 2024 will see zero-trust cybersecurity evolve from a concept to a real technology.
"Adopting a zero-trust approach helps organisations build a more resilient and responsive security infrastructure while ultimately lessening the impact of cyberattacks," Doyle said. "As the benefits of zero-trust technology become evident, it is expected that zero trust will become the norm in a wider range of industries in Ireland."
Some experts argue that technology is one of the key ways companies and the wider community can achieve their sustainability goals. Luis Neves, the CEO of Global Enabling Sustainability Initiative, recently told SiliconRepublic.com that digital technologies will play a vital role in tackling the climate crisis and broader sustainability issues.
Doyle predicts that the role played by technology in advancing sustainability will grow in 2024.
"With larger companies being required by the EU to disclose their environmental, social and governance (ESG) performance and activity, business and IT leaders in Ireland will increasingly rely on technology to track their climate data and reduce emissions," Doyle said.
"Our latest Digital Pulse Survey revealed that almost half of companies are looking at upgrading technology in the coming year to cut rising energy costs and drive sustainable innovation."
Original post:
Dell expects quantum computing and generative AI to link in 2024 - SiliconRepublic.com
Innovation Experts Make Their Pitch for Quantum Technology, Unleashing Excitement, Urgency – The Quantum Insider
Insider Brief
PRESS RELEASE, Connecticut, December 18, 2023: If Connecticut is afforded the opportunity to develop the most advanced technology in the world, technology that would save lives, create jobs, strengthen the economy, and revolutionize many industries, how could it not vigorously pursue it?
That was the question that Rajeeb Hazra, an international quantum-computing expert and the keynote speaker at Thursday's "Imagining a Quantum Future" event, posed to the audience of more than 150 academic, government, and industry leaders.
"What gets me fired up is a future where the entire world benefits from novel developments," Hazra says. "We could see cures for chronic diseases. Cancer and autism, and many other diseases, could be a thing of the past.
"We could create batteries that last a lifetime, so that you'd never have to stop to charge an electric vehicle," he continued. "Quantum technology could solve problems that humanity, despite its best efforts, has not been able to solve."
Participants at the event learned how quantum technologies will be particularly valuable to Connecticut industries, including aerospace and advanced manufacturing, healthcare, and insurance and financial services. Leaders from each of those industries were present and helped lead breakout session discussions.
"I think Connecticut, if you look at the recipe, has all of the ingredients in the right proportion to succeed," says Hazra, who is the president and CEO of Quantinuum, an international firm. "Collectively I see people changing their question from 'Why Connecticut?' to 'Why not Connecticut?'"
Hazra also emphasized the urgency to move forward quickly.
"We have to be smart because the quantum future is closer than we think," he says. "There is not an infinite window to make these decisions."
Connecticut Innovation Officer: This is Our Moment!
The effort to make Connecticut a national quantum technology hub is spearheaded by Quantum CT, a collaboration of academic, business, and government entities.
UConn's President Radenka Maric, Vice President for Research, Innovation & Entrepreneurship Pamir Alpay, and Yale University's Vice Provost for Research Michael Crair have all taken visible leadership positions on the initiative.
"This effort was initiated by Dr. Maric some 20 months ago, because she recognized the vast potential of quantum technologies," Alpay said. "Since then, the two universities have been working hand-in-hand to advance Connecticut's position as a center of excellence in quantum."
In May, the National Science Foundation awarded Quantum CT a $1 million planning grant to build a network of partners in science, education, industry, government, and the broader community as a whole. At the end of 2024, the team plans to submit a phase-two proposal for a $160 million award to make Connecticut the hub for the nation's quantum technology.
"This is our moment!" says Dan O'Keefe, the commissioner designate of the Department of Economic and Community Development and Connecticut's chief innovation officer. Connecticut has the brainpower and the expertise to advance complicated projects, including those involving quantum technologies, he says. The state has recovered from the pandemic and is poised for something big. "Now we need everyone rowing in the same direction," he says.
Technology Could Revolutionize Almost Every Industry
Imagining a Quantum Future was the first event of its kind put on in support of Quantum CT. It was organized jointly by Launc[H], the driving force behind Hartford's innovation ecosystem; the City of Hartford, the event's lead sponsor; and UConn and Yale. Hosting it at the School of Business facility in Hartford was also strategic.
"Quantum technologies could revolutionize business in virtually every industry," says David Souder, senior associate dean at the School of Business. "We are committed to developing talent that is prepared to address these complex challenges."
Dr. Jeffrey Hines, UConn Health's Chief Diversity Officer, says quantum technology has the ability to benefit many previously marginalized communities through workforce development, education, manufacturing and more. He believes that efforts being made to develop comprehensive workforce development and education strategies through Quantum CT offer tremendous opportunities to advance equity and inclusion in our state.
Hartford Mayor Luke Bronin, who is completing his eighth and final year in office, says that quantum technology has enormous potential to transform the world and Connecticut's industries. He says he is excited about the synergy between Yale and UConn.
"It's a powerful partnership that marshals all the strengths of academics and industry to accelerate this work and position Connecticut at the forefront," he says.
Hartford Mayor-Elect Arunan Arulampalam also expressed his enthusiasm and support for advancing Hartford's position as a center of innovation. "We're reclaiming the soul of this city!" he said, citing the fact that the first electric vehicle was developed and built in Hartford a century before the emergence of Tesla.
Crair, from Yale, says that the collegiate partnership has been mutually beneficial and will help the institutions recruit and retain the best educators, who seek an environment of innovation and robust discovery.
Alpay, his UConn counterpart, says he is excited about the economic development opportunities that Quantum CT can generate, as well as its focus on developing new strategies to educate the next-generation workforce.
This is the technology of the future and we recognize the importance and opportunities of leading from the inception, he says.
Maric says she is pleased with the progress that the team has made so far, and remains excited for what is to come.
"UConn is honored to join Yale as leaders in the effort to make Connecticut America's accelerator by transforming a diverse, compact region into an economic development powerhouse using quantum tech," Maric says. "Our event, Imagining a Quantum Future in Hartford, generated tremendous excitement and helped to strengthen the partnerships that we will need to succeed."
See the original post here:
Innovation Experts Make Their Pitch for Quantum Technology, Unleashing Excitement, Urgency - The Quantum Insider