Category Archives: Quantum Computer
Quantum computer company IonQ makes Wall Street debut – Financial Times
- Quantum computing company IonQ goes public via SPAC Fast Company
- IonQ Becomes First Publicly Traded, Pure-Play Quantum Computing Company HPCwire
- IONQ Stock: 7 Things to Know as Quantum Computing Firm IonQ Starts Trading InvestorPlace
- UMD president: Quantum physics will revolutionize the DMV region | COMMENTARY Baltimore Sun
See the original post:
Quantum computer company IonQ makes Wall Street debut - Financial Times
‘Quantum computer algorithms are linear algebra, probabilities. This is not something that we do a good job of teaching our kids’ – The Register
Let's say, for the sake of argument, that quantum computers will exist in some useful fashion in the not too distant future.
And if that is the case, fundamental changes will be needed in education, supply chains, and national policies for us to use the machines to solve complex problems, panelists said at a forum hosted by the R Street Institute this week.
"We need ... to prepare people to think about computation in a fundamentally different way," said Chris Fall, senior advisor at the Center for Strategic and International Studies, during the discussion.
On conventional computers, information is encoded in strings of 0s and 1s, while in quantum computers, information is encoded in quantum bits, or qubits, that have a value of 0, 1, or a superposition of both states. In theory, this allows quantum computers to store much more information than a classical machine and process it in less time. There are limitations: qubits are unstable and prone to error despite efforts to address that, and they may hit a wall if unprotected from background radiation. Encryption-breaking quantum computers are forever 15 years away.
Sorry, yes, we're assuming they will eventually work.
Google, D-Wave, IBM, Intel, Microsoft, Honeywell, and so on, are building qubits in different ways. Their goal is to build fault-tolerant machines that can run super-fast calculations by tempering qubit behavior and correcting errors introduced from the environment.
"The routine manipulation of the properties of single atoms in people's devices, devices, cars that is going to change everything. We don't have a full understanding of how that's going to happen." Fall said.
Starting now, education needs to be better for people to take advantage of the quantum processing breakthroughs as the hardware journey matures, the panelists said. Problem solving and algorithms will look very different in areas like finance and science, for example.
"The language of quantum algorithms are linear algebra and probabilities. This is not something that we do a good job of teaching our kids from a very early stage. That is kind of where we need to get started now," Fall said.
The gist of the discussion: quantum computing is a different problem-solving system that calculates in a fundamentally different way from conventional computers.
Governments will need to drive change if quantum computing is a matter of national interest and public need, said Scott Friedman, a senior policy advisor of the House Homeland Security Committee.
Global legislation to protect semiconductor supply chains, like the CHIPS for America Act and Europe's Chips Act, needs to factor in quantum computing infrastructure, panelists said.
Most cryogenic refrigerators for quantum computers are made in Europe, and the United States needs to work with allies to secure those supply chains, said Allison Schwartz, global government relations and public affairs leader at quantum computer maker D-Wave Systems.
The government also needs to facilitate collaboration and bridge a gap between educators, developers, and scientists involved in algorithms and developing hardware, the panelists said.
The US introduced legislation called QUEST (Quantum User Expansion for Science and Technology) to increase access to quantum hardware and resources for research and further education. The National Quantum Initiative Act (NQI) was signed into law in 2018 to supercharge quantum computing development and research, but activity around these efforts has stalled.
"The advisory committee for the NQI hasn't met in a while ... on the executive branch side. An easy next step to bring more focus in this area would be to convene that again and get broader input from the community," said Kate Weber, policy lead for quantum, robotics, and fundamental research at Google, which hopes to build a fault-tolerant computer by 2030.
The moderator, R Street Institute senior fellow Miles Taylor, raised the idea of quantum computers creating sentient beings, much like the machines in the Terminator movies.
"I don't know if we're going to have a sentient computer," CSIS's Fall said, adding, "we're learning to manipulate single atoms at ... industrial scale. That's not a laboratory project. It'll change the world."
See the original post:
'Quantum computer algorithms are linear algebra, probabilities. This is not something that we do a good job of teaching our kids' - The Register
Fujitsu and Osaka University Deepen Collaborative Research and Development for Fault-Tolerant Quantum Computers – HPCwire
TOKYO, Oct. 1, 2021 – Osaka University and Fujitsu Limited today announced the establishment of the Fujitsu Quantum Computing Joint Research Division as a collaborative research division at the Center for Quantum Information and Quantum Biology (hereinafter QIQB) of Osaka University.
The newly established research division will focus on the development of foundational technologies for fault-tolerant quantum computers, which are able to perform accurate calculations while correcting errors that occur in quantum bits (qubits). These efforts will draw on the respective strengths of the two partners, combining QIQB's advanced quantum error correction and quantum software technologies with Fujitsu's applied knowledge in computing and quantum technologies.
More specifically, QIQB and Fujitsu aim to develop quantum software for fault-tolerant quantum computers with up to several thousand qubits as well as technologies to verify its error correcting operations.
Going forward, the two partners will strengthen their cooperation in R&D towards the realization of fault-tolerant quantum computing technologies to innovate solutions to complex societal problems through quantum technology.
Background
Quantum computers, which make use of the principles of quantum mechanics including quantum superposition states and quantum entanglement (1), offer the potential to one day revolutionize computing, significantly exceeding the capabilities of conventional computing technologies to perform high-speed calculations.
Fault-tolerant quantum computing, capable of accurate and large-scale high-speed calculations using error correction codes, may become a key technology especially in the fields of drug discovery and finance, which require a technology able to solve complex and large-scale problems at high speed.
In March 2020, Osaka University established QIQB to promote quantum information and quantum biology research across a wide range of fields, including quantum computing, quantum information fusion, quantum information devices, quantum communications and security, quantum measurement and sensing, and quantum biology.
QIQB has also been chosen as the main center for quantum software research in the field of quantum technology of the COI-NEXT program (2) of the Japan Science and Technology Agency (JST) and thus plays an important role in Japan's strategy for quantum technology innovation.
Cooperating with domestic and overseas research institutes, Fujitsu has been engaged in full-scale research and development of quantum computing since 2020, aiming to further improve the performance of computing technologies.
Leveraging its quantum-inspired computing (3) solution Digital Annealer, which is designed to solve large-scale combinatorial optimization problems, Fujitsu is providing customers with solutions in various fields like drug discovery and logistics.
In October 2020, Fujitsu started collaborative research (4) with Osaka University on quantum error correction. The establishment of the Fujitsu Quantum Computing Joint Research Division will further strengthen R&D in fault-tolerant quantum computer systems.
Outline of the Joint Research
Name: Fujitsu Quantum Computing Joint Research Division
Location: Center for Quantum Information and Quantum Biology (QIQB), International Advanced Research Institute (IARI), Osaka University (Toyonaka City, Osaka Prefecture)
Research Period: October 1, 2021 to March 31, 2024
Research Contents: R&D of Quantum Software for fault-tolerant quantum computers
*Assuming a quantum computer with a scale of several thousand qubits, the joint division will research and develop an error correction algorithm able to restore the original information from faulty qubits, as well as technologies to evaluate the performance of this algorithm.
*In order to perform quantum computation using logical qubits (5) generated through quantum error correction codes, the joint division will focus on the R&D and implementation of a set of software solutions covering everything from program input to result output. With regard to future practical applications of this technology, the division will furthermore verify the operation of these solutions in a virtual machine environment to evaluate how the effects of noise add up.
Roles and Responsibilities
Osaka University
Fujitsu
Future Plans
In order to contribute to the further development of quantum computing science and technology, Osaka University and Fujitsu will strengthen their cooperation with a variety of research institutions and companies. By putting the results of this joint research into practice, the partners aim to contribute to an early practical application of quantum computing with the potential to drive innovation and create a sustainable society.
Osaka University and Fujitsu will also collaborate with related industries and academia to support the training of new human resources in the field of quantum technology.
All company or product names mentioned herein are trademarks or registered trademarks of their respective owners. Information provided in this press release is accurate at time of publication and is subject to change without advance notice.
About Osaka University
Osaka University was founded in 1931 as one of the seven imperial universities of Japan and is now one of Japan's leading comprehensive universities with a broad disciplinary spectrum. This strength is coupled with a singular drive for innovation that extends throughout the scientific process, from fundamental research to the creation of applied technology with positive economic impacts. Its commitment to innovation has been recognized in Japan and around the world, being named Japan's most innovative university in 2015 (Reuters 2015 Top 100) and one of the most innovative institutions in the world in 2017 (Innovative Universities and the Nature Index Innovation 2017). Now, Osaka University is leveraging its role as a Designated National University Corporation selected by the Ministry of Education, Culture, Sports, Science and Technology to contribute to innovation for human welfare, sustainable development of society, and social transformation. Website: https://resou.osaka-u.ac.jp/en.
About Fujitsu
Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Approximately 126,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited reported consolidated revenues of 3.6 trillion yen (US$34 billion) for the fiscal year ended March 31, 2021. For more information, please see www.fujitsu.com.
Source: Fujitsu
The rest is here:
Fujitsu and Osaka University Deepen Collaborative Research and Development for Fault-Tolerant Quantum Computers - HPCwire
Fermilab on the trail for a new building block of matter and quantum computing power – Medill Reports Chicago – Medill Reports: Chicago
By Sarah Anderson and Yuliya Klochan, Medill Reports
Researchers transported a gigantic electromagnetic ring from Brookhaven National Laboratory on Long Island to Fermilab near Chicago eight years ago in the search for a new building block of matter. While it wasn't the secret spaceship bystanders thought it was, it did allow scientists to explore fundamental questions about our universe.
The ring was needed to confirm an experimental result that had intrigued particle physicists for 20 years. The subject of the experiment was the muon, one of the 17 fundamental particles of nature. The muon has the same negative charge as an electron, but the mass of about 200 electrons. Muons behave like tiny spinning tops that generate their own magnetic field.
In 2001, scientists at Brookhaven National Laboratory measured the frequency at which muons rotated in an external magnetic field. This rotation frequency is used to calculate a g factor, a scaling constant that relates the magnetic strength and rotational momentum of the muon. The g factor is important because it can indicate the presence of other particles that influence the muon's interaction with the applied magnetic force.
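For readers who want to see the arithmetic, the quantity these experiments track is the "anomalous" precession frequency, proportional to a_mu = (g - 2) / 2. A rough back-of-the-envelope sketch (ours, using the storage ring's published 1.45-tesla field, not the collaboration's analysis code):

```python
import math

# Anomalous spin precession: omega_a = a_mu * e * B / m  (simplified, SI units)
e = 1.602176634e-19     # elementary charge, C
m_mu = 1.883531627e-28  # muon mass, kg
B = 1.45                # storage-ring magnetic field, tesla
a_mu = 0.00116592       # muon anomaly, a_mu = (g - 2) / 2

omega_a = a_mu * e * B / m_mu  # rad/s
print(f"~{omega_a / (2 * math.pi) / 1e3:.0f} kHz")  # ~229 kHz anomalous precession
```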
The researchers observed that the experimental rotation frequency produced a g factor greater than the value predicted by the standard theoretical model of physics. The Standard Model accounts for all the known fundamental particles and forces of nature, so the Brookhaven result hinted at the existence of undiscovered particles or forces.
"If these two numbers don't agree with each other, it's the space in the middle where the new physics can lie," said Chris Polly, a senior scientist for the muon experiment at Fermilab.
Fermilab combined its muon-generating particle accelerator with Brookhaven's electromagnetic ring to repeat Brookhaven's initial experiment on a much larger scale. They again observed that the measured rotation frequency did not align with the theoretical g factor, suggesting that the Standard Model may need to be overhauled. There is only a 1-in-40,000 probability that the discrepancy arose by chance, providing further evidence of new physical forces or particles in the universe.
"Maybe there's monsters lurking out there that we haven't even imagined yet," Polly said.
As experimental physicists at Fermilab work to replicate this result, theoretical physicists across the world are using simulations to scrutinize their theoretical models. And they need powerful computers to do so.
Although it's not yet ready to be used for the muon experiment, researchers at Fermilab are also working to develop technology for quantum computers, which can solve such complex problems exponentially faster than standard computers.
Think of it this way. If someone gave you a list of locations and told you they had stashed a pile of cash at one of them, you would have no choice but to search one location, and then the next, and so on until you found it. Standard computers are subject to this same limitation. Just as you can only be in one place at a time, the system can only occupy one of two defined states (represented by the ones and zeroes you see in computer hacking movies) at a given moment.
But what if you could search many locations at the same time? That's essentially what a quantum computer does. Its system can occupy multiple superimposed quantum states simultaneously, allowing the computer to consider many possible solutions to a problem at once.
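To put numbers on the analogy (our illustration, not Fermilab's): a classical search checks locations one at a time, needing about N/2 looks on average, while Grover's quantum search algorithm finds the target in roughly the square root of N quantum queries.

```python
import math
import random

def classical_search(locations, target):
    """Check one location at a time, counting the looks."""
    for looks, loc in enumerate(locations, start=1):
        if loc == target:
            return looks
    return None

N = 1_000_000
cash_at = random.randrange(N)                # where the cash is hidden
looks = classical_search(range(N), cash_at)

print(f"classical: {looks} looks (average ~ N/2 = {N // 2})")
print(f"Grover: ~{math.ceil(math.pi / 4 * math.sqrt(N))} quantum queries")
```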
"It actually is extraordinarily valuable in terms of being able to traverse through the entire computation space much more rapidly than a traditional computer," said Akshay Murthy, a postdoctoral research associate at Fermilab.
Murthy and his colleagues are researching computer technology called superconducting qubits (quantum bits) that use electromagnetic radiation to access higher-energy quantum states. Specifically, they are working to prolong the qubits' coherence time, the amount of time that the system can live in the quantum space and perform calculations. Right now, we're getting poofed out of the "everywhere at once" mode before we can find the cash. In fact, the coherence times of qubits need to be 1,000 to 1 million times longer before they can be used for quantum computing.
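A toy model of why those factors of 1,000 matter, with made-up numbers: if coherence decays roughly as exp(-t/T), the usable fraction of the quantum state collapses quickly once a circuit runs longer than the coherence time T.

```python
import math

T = 100e-6          # assumed coherence time: 100 microseconds (illustrative)
gate_time = 50e-9   # assumed time per gate: 50 nanoseconds (illustrative)

for n_gates in (100, 1_000, 10_000):
    t = n_gates * gate_time
    survival = math.exp(-t / T)  # crude exponential-decay model of coherence
    print(f"{n_gates:>6} gates -> {survival:6.1%} coherence remaining")
```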
To extend coherence times, the team is examining the qubits under a powerful microscope and analyzing the chemical composition of their surfaces to look for any defects that might cause occupation of the quantum states to come crashing down prematurely. They are also exploring modifications that could be made to the external environment, such as shielding the qubit in a freezing cold chamber to prevent temperature fluctuations that might destabilize the system.
"This technology is truly transformational if we're able to deliver on its promises," Murthy said.
Sarah Anderson is a health, environment and science reporter at Medill and a Ph.D. chemist. Follow her on Twitter @seanderson63. Yuliya Klochan is a health, environment and science reporter at Medill.
View original post here:
Fermilab on the trail for a new building block of matter and quantum computing power - Medill Reports Chicago - Medill Reports: Chicago
A Boulder Company Is Leading the Next Technology Revolution – 5280 – 5280 | The Denver Magazine
Photo courtesy of ColdQuanta
ColdQuanta is ready to take the next step in quantum computing.
Although still in their infancy, quantum computers are already big business, with IBM, Microsoft, Google, and state actors like China cumulatively investing billions to develop the superfast number crunchers. But ColdQuanta, a relatively tiny Boulder firm, may beat them all to a major milestone later this year: releasing a 100-qubit quantum computer.
That would be a big step toward quantum advantage (QA), the point at which these machines will be able to compute in seconds certain kinds of useful problems that would take traditional supercomputers thousands of years to solve. How? Where your laptop must try each possible solution in turn to find the answer, quantum computers can test solutions simultaneously. To do this, they swap bits for qubits made of atoms or subatomic particles chilled to just above absolute zero, where the laws of physics get freaky. While a bit can only be a one or a zero, heads or tails, qubits can be both heads and tails at once.
ColdQuanta's advantage lies in how it chills those atoms. Unlike many of its competitors, who use bulky liquid helium refrigeration, ColdQuanta cools its atoms with lasers and traps them in a sleek glass prism. The technique is so effective, says Paul Lipman, ColdQuanta's president of quantum computing, that it may only take a few more years to reach the hundreds, or even thousands, of qubits necessary to achieve QA. Once it's realized, QA could accelerate scientific discovery, from modeling new cancer drugs on a molecular level to mapping the state of the universe seconds after the Big Bang.
This article appeared in the October 2021 issue of 5280.
Nicholas writes and edits the Compass, Adventure, and Culture sections of 5280 and writes for 5280.com.
Continue reading here:
A Boulder Company Is Leading the Next Technology Revolution - 5280 - 5280 | The Denver Magazine
Judith Olson, Senior Physicist at ColdQuanta, Named Next Generation Leader of the Year at Women in IT Awards – HPCwire
BOULDER, Colo., Oct. 1, 2021 – ColdQuanta, the leader in Cold Atom Quantum Technology, announced that Judith Olson, Head of the Atomic Clock Division and Senior Physicist, was named Next Generation Leader of the Year by the Women in IT Awards held in New York on September 21, 2021.
According to the Women in IT Awards, "Our 2021 winner demonstrated leadership that has been instrumental in guiding the team to reach new milestones, including taking existing techniques from the research and testing phase into the field. In doing so, she was able to secure funding and awards to further develop this capability. Judith is breaking down barriers in her space, she serves as a thought leader in a male-dominated discipline, her work is respected, and she speaks on the subject matter to global audiences."
Olson spearheads ColdQuanta's Atomic Clock Division. Atomic clocks are a key piece of technology that reduces reliance on GPS, which is vulnerable to security threats and loss of signal. Under Judith's leadership, ColdQuanta will deliver atomic clocks that enable new capabilities in positioning and communications for use in industries such as aerospace and defense.
"Judith's work continues to be of vital importance to ColdQuanta and the industry at large," said Scott Faris, CEO of ColdQuanta. "Her leadership has been instrumental in guiding the team to reach new milestones, including taking existing atomic and optical techniques from the laboratory into the field."
The Women in IT Awards also recognized Olson for her mentorship, stating, "She is also a mentor and has created a community via monthly events that she facilitates and is actively involved in community tutoring and STEM outreach events. She understands the power of diversity within companies, especially in leadership ranks. She is showing young women what is possible in the world of science by breaking down stereotypes of how a leader should act, feel, speak and look in the workplace. She has most certainly laid the groundwork for other diverse candidates entering the organization and this field of expertise."
About ColdQuanta
ColdQuanta is the leader in Cold Atom Quantum Technology, the most scalable, versatile, and commercially viable application of quantum. The company operates three lines of business: Quantum Computing, Devices and Machines, and Quantum Research-as-a-Service. The Quantum Computing division is developing the launch of Hilbert 1.0, a cloud-based 100-qubit quantum computer. The Devices and Machines division provides products for quantum computing companies and quantum lab environments. Quantum Research-as-a-Service supports the government and enterprises in developing quantum inertial sensing, radio frequency receivers, and networking technologies, including high-precision clock prototypes. ColdQuanta is based in Boulder, CO, with offices in Madison, Wisconsin and Oxford, UK. Find out more at www.coldquanta.com.
Source: ColdQuanta
Quantum Computing in Manufacturing Market Still Has Room To Grow: International Business Machines, D-Wave Systems, Microsoft – Digital Journal
The latest report available at Advance Market Analytics, Quantum Computing in Manufacturing Market, provides pin-point analysis of changing competitive dynamics and a forward-looking perspective on the different factors driving or restraining industry growth.
The global Quantum Computing in Manufacturing market study compiles major statistical evidence for the industry and guides readers through the obstacles surrounding the market. It reports a comprehensive set of factors, including global distribution, manufacturers, market size, and the market factors that affect global contributions. The study also turns its attention to an in-depth competitive landscape: defined growth opportunities, market share by product type and application, the key companies responsible for production, and the strategies they employ.
Key players in the global Quantum Computing in Manufacturing market:
International Business Machines (United States), D-Wave Systems (Canada), Microsoft (United States), Amazon (United States), Rigetti Computing (United States), Google (United States), Intel (United States), Honeywell International (United States), Quantum Circuits (United States), QC Ware (United States), Atom Computing, Inc. (United States), Xanadu Quantum Technologies Inc. (Canada), Zapata Computing, Inc. (United States), Strangeworks, Inc (United States)
Free Sample Report + All Related Graphs & Charts @ https://www.advancemarketanalytics.com/sample-report/179263-global-quantum-computing-in-manufacturing-market
Quantum computing is a computing technique that uses collective properties of quantum states, chiefly superposition and entanglement, to perform computation; machines that execute such quantum computations are called quantum computers. Quantum computing harnesses the phenomena of quantum mechanics to deliver a huge leap forward in computation to solve certain problems. It is an area of study focused on the development of computer-based technologies centered on the principles of quantum theory.
To further progress into the quantum age, various projects are in the works to take computing to the next level. After forming a consortium in December, EU stakeholders launched an effort on 12 February 2021 to supercharge quantum processor production.
What's Trending in the Market?
Integration With Advanced Technologies
What are the Market Drivers?
Rising Disposable Income
The Global Quantum Computing in Manufacturing Market segments and Market Data Break Down are illuminated below:
by Application (Simulation & Testing, Financial Modeling, Artificial Intelligence & Machine Learning, Cybersecurity & Cryptography, Other), Component (Quantum Computing Devices, Quantum Computing Software, Quantum Computing Services)
The study encompasses a variety of analytical resources, such as SWOT analysis and Porter's Five Forces analysis, coupled with primary and secondary research methodologies. It covers all the bases surrounding the Quantum Computing in Manufacturing industry as it explores the competitive nature of the market, complete with a regional analysis.
Have Any Questions Regarding the Global Quantum Computing in Manufacturing Market Report? Ask Our Expert @ https://www.advancemarketanalytics.com/enquiry-before-buy/179263-global-quantum-computing-in-manufacturing-market
The Quantum Computing in Manufacturing industry report further exhibits a pattern of analyzing previous data gathered from reliable sources and sets a precedent growth trajectory for the market. The report also covers comprehensive market revenue streams along with growth patterns, local reforms, and COVID impact analysis, with a focused approach on market trends and the overall growth of the market.
Moreover, the Quantum Computing in Manufacturing report describes the market division based on various parameters and attributes that are based on geographical distribution, product types, applications, etc. The market segmentation clarifies further regional distribution for the Quantum Computing in Manufacturing market, business trends, potential revenue sources, and upcoming market opportunities.
The Quantum Computing in Manufacturing market study further highlights the segmentation of the industry on a global scale. The report focuses on the regions of LATAM, North America, Europe, Asia, and the Rest of the World in terms of developing market trends, preferred marketing channels, investment feasibility, long-term investments, and business environmental analysis. The report also investigates product capacity, product price, profit streams, supply-to-demand ratio, production and market growth rate, and a projected growth forecast.
Read Detailed Index of full Research Study at @https://www.advancemarketanalytics.com/reports/179263-global-quantum-computing-in-manufacturing-market
In addition, the Quantum Computing in Manufacturing market study covers several factors such as market status, key market trends, growth forecast, and growth opportunities. Furthermore, we analyze the challenges faced by the Quantum Computing in Manufacturing market on a global and regional basis. The study also encompasses a number of opportunities and emerging trends, considered in terms of their impact on the global scale in acquiring a majority of the market share.
Some Points from the Table of Contents:
Chapter One: Report Overview
Chapter Two: Global Market Growth Trends
Chapter Three: Value Chain of Quantum Computing in Manufacturing Market
Chapter Four: Players Profiles
Chapter Five: Global Quantum Computing in Manufacturing Market Analysis by Regions
Chapter Six: North America Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Seven: Europe Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Eight: Asia-Pacific Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Nine: Middle East and Africa Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Ten: South America Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Eleven: Global Quantum Computing in Manufacturing Market Segment by Types
Chapter Twelve: Global Quantum Computing in Manufacturing Market Segment by Applications
Buy This Exclusive Research Here: https://www.advancemarketanalytics.com/buy-now?format=1&report=179263
Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, like North America, West Europe or Southeast Asia.
Contact Us:
Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road, Edison, NJ, New Jersey, USA 08837
Phone: +1 (206) 317 1218
Email: [email protected]
Continue reading here:
Quantum Computing in Manufacturing Market Still Has Room To Grow: International Business Machines, D-Wave Systems, Microsoft - Digital Journal
The coevolution of particle physics and computing – Symmetry magazine
In the mid-twentieth century, particle physicists were peering deeper into the history and makeup of the universe than ever before. Over time, their calculations became too complex to fit on a blackboard, or to farm out to armies of human computers doing calculations by hand.
To deal with this, they developed some of the world's earliest electronic computers.
Physics has played an important role in the history of computing. The transistor, the switch that controls the flow of electrical signals within a computer, was invented by a group of physicists at Bell Labs. The incredible computational demands of particle physics and astrophysics experiments have consistently pushed the boundaries of what is possible. They have encouraged the development of new technologies to handle tasks from dealing with avalanches of data to simulating interactions on the scales of both the cosmos and the quantum realm.
But this influence doesn't just go one way. Computing plays an essential role in particle physics and astrophysics as well. As computing has grown increasingly more sophisticated, its own progress has enabled new scientific discoveries and breakthroughs.
Illustration by Sandbox Studio, Chicago with Ariel Davis
In 1973, scientists at Fermi National Accelerator Laboratory in Illinois got their first big mainframe computer: a 7-year-old hand-me-down from Lawrence Berkeley National Laboratory. Called the CDC 6600, it weighed about 6 tons. Over the next five years, Fermilab added five more large mainframe computers to its collection.
Then came the completion of the Tevatron, at the time the world's highest-energy particle accelerator, which would provide the particle beams for numerous experiments at the lab. By the mid-1990s, two four-story particle detectors would begin selecting, storing and analyzing data from millions of particle collisions at the Tevatron per second. Called the Collider Detector at Fermilab and the DZero detector, these new experiments threatened to overpower the lab's computational abilities.
In December of 1983, a committee of physicists and computer scientists released a 103-page report highlighting the urgent need for an upgrading of the laboratory's computer facilities. The report said the lab should continue the process of catching up in terms of computing ability, and that this should remain the laboratory's top computing priority for the next few years.
Instead of simply buying more large computers (which were incredibly expensive), the committee suggested a new approach: They recommended increasing computational power by distributing the burden over clusters or farms of hundreds of smaller computers.
Thanks to Intel's 1971 development of a new commercially available microprocessor the size of a domino, computers were shrinking. Fermilab was one of the first national labs to try the concept of clustering these smaller computers together, treating each particle collision as a computationally independent event that could be analyzed on its own processor.
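The key property is that the workload is what programmers now call embarrassingly parallel: each collision event can be analyzed with no knowledge of the others. A minimal modern sketch of the same idea, with dummy event data standing in for real detector output:

```python
from multiprocessing import Pool

def analyze(event):
    """Stand-in for per-collision reconstruction; each event is independent."""
    return sum(hit ** 2 for hit in event)  # dummy computation

if __name__ == "__main__":
    events = [[i, i + 1, i + 2] for i in range(1_000)]  # fake collision events
    with Pool() as pool:                     # one worker per CPU core
        results = pool.map(analyze, events)  # farm events out, cluster-style
    print(f"analyzed {len(results)} independent events in parallel")
```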
Like many new ideas in science, it wasnt accepted without some pushback.
Joel Butler, a physicist at Fermilab who was on the computing committee, recalls, "There was a big fight about whether this was a good idea or a bad idea."
"A lot of people were enchanted with the big computers," he says. "They were impressive-looking and reliable, and people knew how to use them. And then along came this swarm of little tiny devices, packaged in breadbox-sized enclosures."
The computers were unfamiliar, and the companies building them weren't well established. On top of that, it wasn't clear how well the clustering strategy would work.
As for Butler? "I raised my hand [at a meeting] and said, 'Good idea,' and suddenly my entire career shifted from building detectors and beamlines to doing computing," he chuckles.
Not long afterward, innovation that sparked for the benefit of particle physics enabled another leap in computing. In 1989, Tim Berners-Lee, a computer scientist at CERN, launched the World Wide Web to help CERN physicists share data with research collaborators all over the world.
To be clear, Berners-Lee didn't create the internet; that was already underway in the form of the ARPANET, developed by the US Department of Defense. But the ARPANET connected only a few hundred computers, and it was difficult to share information across machines with different operating systems.
The web Berners-Lee created was an application that ran on the internet, like email, and started as a collection of documents connected by hyperlinks. To get around the problem of accessing files between different types of computers, he developed HTML (HyperText Markup Language), a programming language that formatted and displayed files in a web browser independent of the local computer's operating system.
Berners-Lee also developed the first web browser, allowing users to access files stored on the first web server (Berners-Lee's computer at CERN). He implemented the concept of a URL (Uniform Resource Locator), specifying how and where to access desired web pages.
What started out as an internal project to help particle physicists share data within their institution fundamentally changed not just computing, but how most people experience the digital world today.
Back at Fermilab, cluster computing wound up working well for handling the Tevatron data. Eventually, it became industry standard for tech giants like Google and Amazon.
Over the next decade, other US national laboratories adopted the idea, too. SLAC National Accelerator Laboratory, then called Stanford Linear Accelerator Center, transitioned from big mainframes to clusters of smaller computers to prepare for its own extremely data-hungry experiment, BaBar. Both SLAC and Fermilab were also early adopters of Berners-Lee's web server. The labs set up the first two websites in the United States, paving the way for this innovation to spread across the continent.
In 1989, in recognition of the growing importance of computing in physics, Fermilab Director John Peoples elevated the computing department to a full-fledged division. The head of a division reports directly to the lab director, making it easier to get resources and set priorities. Physicist Tom Nash formed the new Computing Division, along with Butler and two other scientists, Irwin Gaines and Victoria White. Butler led the division from 1994 to 1998.
These computational systems worked well for particle physicists for a long time, says Berkeley Lab astrophysicist Peter Nugent. That is, until Moore's Law started grinding to a halt.
Moore's Law is the idea that the number of transistors in a circuit will double, making computers faster and cheaper, every two years. The term was first coined in the mid-1970s, and the trend reliably proceeded for decades. But now, computer manufacturers are starting to hit the physical limit of how many tiny transistors they can cram onto a single microchip.
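The trend is easy to state as arithmetic: a doubling every two years compounds dramatically, which is also why hitting the atomic limit matters so much. A quick illustration starting from Intel's first microprocessor:

```python
# Moore's Law as compounding: transistor count doubles roughly every 2 years.
n0 = 2_300  # transistors on Intel's 4004 (1971)
for years in (10, 20, 40, 50):
    n = n0 * 2 ** (years / 2)
    print(f"after {years} years: ~{n:,.0f} transistors")
# The naive 50-year extrapolation reaches tens of billions of transistors,
# roughly where real chips are today, and where atomic limits begin to bite.
```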
Because of this, says Nugent, particle physicists have been looking to take advantage of high-performance computing instead.
Nugent says high-performance computing is "something more than a cluster, or a cloud-computing environment that you could get from Google or AWS, or at your local university."
"What it typically means," he says, "is that you have high-speed networking between computational nodes, allowing them to share information with each other very, very quickly. When you are computing on up to hundreds of thousands of nodes simultaneously, it massively speeds up the process."
On a single traditional computer, he says, 100 million CPU hours translates to more than 11,000 years of continuous calculations. But for scientists using a high-performance computing facility at Berkeley Lab, Argonne National Laboratory or Oak Ridge National Laboratory, 100 million hours is a typical, large allocation for one year at these facilities.
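That 11,000-year figure checks out with simple arithmetic:

```python
cpu_hours = 100_000_000
hours_per_year = 24 * 365
print(f"{cpu_hours / hours_per_year:,.0f} years")  # ~11,415 years single-core
```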
For more than a decade, supercomputers like these have been providing theorists with the computing power to solve with high precision equations in quantum chromodynamics, enabling them to make predictions about the strong forces binding quarks into the building blocks of matter.
And although astrophysicists have always relied on high-performance computing for simulating the birth of stars or modeling the evolution of the cosmos, Nugent says they are now using it for their data analysis as well.
This includes rapid image-processing computations that have enabled the observations of several supernovae, including SN 2011fe, captured just after it began. "We found it just a few hours after it exploded, all because we were able to run these pipelines so efficiently and quickly," Nugent says.
According to Berkeley Lab physicist Paolo Calafiura, particle physicists also use high-performance computing for simulations, for modeling not the evolution of the cosmos but rather what happens inside a particle detector. "Detector simulation is significantly the most computing-intensive problem that we have," he says.
Scientists need to evaluate multiple possibilities for what can happen when particles collide. To properly correct for detector effects when analyzing particle detector experiments, they need to simulate more data than they collect. "If you collect 1 billion collision events a year," Calafiura says, "you want to simulate 10 billion collision events."
Calafiura says that right now, he's more worried about finding a way to store all of the simulated and actual detector data than he is about producing it, but he knows that won't last.
"When does physics push computing?" he says. "When computing is not good enough. ... We see that in five years, computers will not be powerful enough for our problems, so we are pushing hard with some radically new ideas, and lots of detailed optimization work."
That's why the Department of Energy's Exascale Computing Project aims to build, in the next few years, computers capable of performing a quintillion (that is, a billion billion) operations per second. The new computers will be 1000 times faster than the current fastest computers.
The exascale computers will also be used for other applications ranging from precision medicine to climate modeling to national security.
Innovations in computer hardware have enabled astrophysicists to push the kinds of simulations and analyses they can do. For example, Nugent says, the introduction of graphics processing units has sped up astrophysicists' ability to do calculations used in machine learning, leading to an explosive growth of machine learning in astrophysics.
With machine learning, which uses algorithms and statistics to identify patterns in data, astrophysicists can simulate entire universes in microseconds.
Machine learning has been important in particle physics as well, says Fermilab scientist Nhan Tran. "[Physicists] have very high-dimensional data, very complex data," he says. "Machine learning is an optimal way to find interesting structures in that data."
The same way a computer can be trained to tell the difference between cats and dogs in pictures, it can learn how to identify particles from physics datasets, distinguishing between things like pions and photons.
Tran says using computation this way can accelerate discovery. "As physicists, we've been able to learn a lot about particle physics and nature using non-machine-learning algorithms," he says. "But machine learning can drastically accelerate and augment that process, and potentially provide deeper insight into the data."
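The cats-and-dogs analogy maps directly onto off-the-shelf classification tools. A hedged sketch with synthetic features standing in for detector variables (real analyses use far richer inputs, and often deep neural networks rather than this simple classifier):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2_000
# Fake "shower shape" features: photons and pions overlap but differ on average.
photons = rng.normal(loc=[0.9, 0.2], scale=0.15, size=(n, 2))
pions = rng.normal(loc=[0.6, 0.5], scale=0.15, size=(n, 2))

X = np.vstack([photons, pions])
y = np.array([0] * n + [1] * n)  # 0 = photon, 1 = pion

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```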
And while teams of researchers are busy building exascale computers, others are hard at work trying to build another type of supercomputer: the quantum computer.
Remember Moore's Law? Previously, engineers were able to make computer chips faster by shrinking the size of electrical circuits, reducing the amount of time it takes for electrical signals to travel. "Now our technology is so good that literally the distance between transistors is the size of an atom," Tran says. "So we can't keep scaling down the technology and expect the same gains we've seen in the past."
To get around this, some researchers are redefining how computation works at a fundamental level. Like, really fundamental.
The basic unit of data in a classical computer is called a bit, which can hold one of two values: 1, if it has an electrical signal, or 0, if it has none. But in quantum computing, data is stored in quantum systemsthings like electrons, which have either up or down spins, or photons, which are polarized either vertically or horizontally. These data units are called qubits.
Here's where it gets weird. Through a quantum property called superposition, qubits have more than just two possible states. An electron can be up, down, or in a variety of stages in between.
What does this mean for computing? A collection of three classical bits can exist in only one of eight possible configurations: 000, 001, 010, 100, 011, 110, 101 or 111. But through superposition, three qubits can be in all eight of these configurations at once. A quantum computer can use that information to tackle problems that are impossible to solve with a classical computer.
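A state-vector sketch of that counting argument: three qubits are described by eight complex amplitudes, and one Hadamard gate per qubit puts equal weight on all eight classical configurations at once. Illustrative code, not tied to any particular machine:

```python
import itertools
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H3 = np.kron(np.kron(H, H), H)  # a Hadamard on each of the 3 qubits

state = np.zeros(8)
state[0] = 1.0                  # start in the classical configuration 000
state = H3 @ state              # now a superposition of all 8 configurations

for bits, amp in zip(itertools.product("01", repeat=3), state):
    print("".join(bits), f"probability {amp ** 2:.3f}")  # 1/8 each
```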
Fermilab scientist Aaron Chou likens quantum problem-solving to throwing a pebble into a pond: the ripples move through the water in every possible direction, simultaneously exploring all of the possible things they might encounter.
In contrast, a classical computer can only move in one direction at a time.
But this makes quantum computers faster than classical computers only when it comes to solving certain types of problems. "It's not like you can take any classical algorithm and put it on a quantum computer and make it better," says University of California, Santa Barbara physicist John Martinis, who helped build Google's quantum computer.
Although quantum computers work in a fundamentally different way than classical computers, designing and building them wouldn't be possible without traditional computing laying the foundation, Martinis says. "We're really piggybacking on a lot of the technology of the last 50 years or more."
The kinds of problems that are well suited to quantum computing are intrinsically quantum mechanical in nature, says Chou.
For instance, Martinis says, consider quantum chemistry. Solving quantum chemistry problems with classical computers is so difficult, he says, that 10 to 15% of the world's supercomputer usage is currently dedicated to the task. Quantum chemistry problems are hard for the very reason why a quantum computer is powerful: because to complete them, you have to consider all the different quantum-mechanical states of all the individual atoms involved.
Because making better quantum computers would be so useful in physics research, and because building them requires skills and knowledge that physicists possess, physicists are ramping up their quantum efforts. In the United States, the National Quantum Initiative Act of 2018 called for the National Institute of Standards and Technology, the National Science Foundation and the Department of Energy to support programs, centers and consortia devoted to quantum information science.
In the early days of computational physics, the line between who was a particle physicist and who was a computer scientist could be fuzzy. Physicists used commercially available microprocessors to build custom computers for experiments. They also wrote much of their own software, ranging from printer drivers to the software that coordinated the analysis between the clustered computers.
Nowadays, roles have somewhat shifted. Most physicists use commercially available devices and software, allowing them to focus more on the physics, Butler says. But some people, like Anshu Dubey, work right at the intersection of the two fields. Dubey is a computational scientist at Argonne National Laboratory who works with computational physicists.
When a physicist needs to computationally interpret or model a phenomenon, sometimes they will sign up a student or postdoc in their research group for a programming course or two and then ask them to write the code to do the job. Although these codes are mathematically complex, Dubey says, they aren't logically complex, making them relatively easy to write.
A simulation of a single physical phenomenon can be neatly packaged within fairly straightforward code. "But the real world doesn't want to cooperate with you in terms of its modularity and encapsularity," she says.
Multiple forces are always at play, so to accurately model real-world complexity, you have to use more complex software, ideally software that doesn't become impossible to maintain as it gets updated over time. "All of a sudden," says Dubey, "you start to require people who are creative in their own right, in terms of being able to architect software."
That's where people like Dubey come in. At Argonne, Dubey develops software that researchers use to model complex multi-physics systems, incorporating processes like fluid dynamics, radiation transfer and nuclear burning.
Hiring computer scientists for research projects in physics and other fields of science can be a challenge, Dubey says. Most funding agencies specify that research money can be used for hiring students and postdocs, but not for paying for software development or hiring dedicated engineers. "There is no viable career path in academia for people whose careers are like mine," she says.
In an ideal world, universities would establish endowed positions for a team of research software engineers in physics departments with a nontrivial amount of computational research, Dubey says. These engineers would write reliable, well-architected code, and their institutional knowledge would stay with a team.
Physics and computing have been closely intertwined for decades. However the two develop, toward new analyses using artificial intelligence, for example, or toward the creation of better and better quantum computers, it seems they will remain on this path together.
Go here to see the original:
The coevolution of particle physics and computing - Symmetry magazine
The Fourth Industrial Revolution (4IR) Takeover: IoT and Quantum-Resistant Blockchains Are Setting the Trend – FinanceFeeds
The 21st century ushered in a new era following the debut of the internet and Web 2.0 applications. Today, most people have interacted with search engines such as Google and social media platforms, including Facebook and Twitter. While the internet was a hallmark debut, more technological innovations have come up, marking the fourth industrial revolution (4IR).
This new line of technologies features the likes of Artificial Intelligence (AI), blockchain, and the Internet of Things (IoT). Despite their value proposition, there have been arguments that some of these technologies might soon replace most human roles in today's industries. According to a report by PWC, it is likely that close to 30% of jobs will be automated by the mid-2030s.
On the brighter side, however, these emerging technologies are proving to have a significant value proposition. IoT is connecting more devices than ever before, while AI is being used to improve machine learning across various industries, including healthcare.
As for blockchain, the distributed ledger technology has paved the way for decentralized markets, featuring digital assets such as Bitcoin and upcoming niches like Decentralized Finance (DeFi) and Non-fungible Tokens (NFTs).
While machines may take some time to replace human roles, it is better to prepare for what's to come by jumping into the right technologies. The fourth industrial revolution is already setting the stage for this shift through the aforementioned technologies.
In the field of IoT, there have been a lot of developments, with the invention of smart homes and cities. However, most people are still not aware of how to become part of these growing networks. Thanks to the value proposition of combining IoT and blockchain, it is now almost seamless to participate in this growing ecosystem through projects such as Minima Global.
The Minima global initiative is one of the IoT-oriented projects that leverage blockchain technology to introduce a decentralized ultra-lean protocol that can fit on an IoT or mobile device. Essentially, anyone across the globe can run a full constructing and validating IoT node from their mobile devices.
This initiative by Minima seeks to create a more decentralized IoT network where value is transferred within a censorship-resistant environment. In doing so, Minima is optimistic about shaping the future of IoT networks by building a scalable ecosystem.
Similar to IoT innovations, there have been significant developments in blockchain ecosystems. Nonetheless, several shortcomings currently face this burgeoning niche, including the threat of quantum computers. At the core, most blockchain projects rely on cryptography to encrypt or decrypt information through a combination of complex and sophisticated algorithms.
With quantum computing gaining popularity, qubit-based computers will likely crack the binary algorithms run on classical computers. To get a better picture: a classical computer utilizes bits in the form of transistors, so each bit exists in one of two binary states (0 or 1). Quantum computers, on the other hand, leverage qubits, which can take either of the binary states, or both simultaneously in a superposition state.
This threat of quantum computing is now forcing innovators in the blockchain industry to prepare for the future. As a result, some upcoming initiatives, such as the QANplatform, have introduced a quantum-resistant hybrid blockchain platform. The project seeks to build a futuristic blockchain network that will survive the threat of quantum computers while allowing stakeholders to build decentralized projects, including DApps and DeFi applications.
Unlike most existing blockchain networks, QANplatform is built on a post-quantum cryptographic algorithm implemented in the Rust programming language. In addition, the platform leverages a Proof-of-Randomness (PoR) algorithm, positioning it as one of the greenest blockchain networks. This is one of the few blockchain projects that have taken the lead in preparing for a quantum computing world.
As the adage goes, change is often inevitable; likewise, the adoption of modern-day technologies such as blockchain and IoT is becoming an expensive affair to ignore. While some of these innovations might replace human roles, their general value proposition is far greater than the projected replacements. If anything, blockchain and IoT have created more opportunities for people globally to become part of the futuristic world.
That said, it would be better for stakeholders to invest more in research and development to advance the potential of the 4IR. Currently, initiatives under these lines of technologies have garnered the support of tech industry veterans and other notable players such as financial institutions and governmental agencies. However, there is still a long way to go, given the pace of innovation and disrupting technologies like quantum computing.
Though still early to predict how the 4IR will shape the world, it is clear that the featured technologies are finding fundamental niches. Blockchain, in particular, has set the platform for a decentralized monetary ecosystem. Meanwhile, IoT innovations are connecting more devices, changing the narrative that only human beings can efficiently exchange information. As both technologies become widely adopted, chances are higher that innovators will integrate them to continue improving the state of existing ecosystems.
Here is the original post:
The Fourth Industrial Revolution (4IR) Takeover: IoT and Quantum-Resistant Blockchains Are Setting the Trend - FinanceFeeds
3 Quantum Computing Stocks to Buy for Their Promising Healthcare Potential – InvestorPlace
Quantum computing stocks are gaining traction as this once-nascent industry is fast evolving. Wall Street is paying increased attention to the segment as companies move from the experimental research phase to developing commercially feasible computers that can solve the worlds most complex problems and revolutionize businesses in many industries. Thus, quantum computing stocks have become a hot item.
Overall, quantum computers promise computational power far beyond today's ordinary computers for certain problems; one widely cited benchmark claimed a speedup of 100 million times on a specific task. They can process exponentially more information with each additional quantum bit, or qubit.
From advances in machine learning to healthcare, artificial intelligence (AI) and advanced cybersecurity capabilities, quantum computers are expected to have a significant impact across a wide range of industries. Therefore, I want to introduce three quantum computing stocks to invest in the rest of this year.
Multiple countries are already involved in the quantum computing race, and the global quantum computing market, valued at USD 487.4 million in 2021, is expected to reach USD 3,728.4 million by 2030, a CAGR of 25.40% over the 2021-2030 forecast period.
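The quoted figures are internally consistent: compounding USD 487.4 million at 25.40% per year over the nine years from 2021 to 2030 reproduces the 2030 estimate to within rounding.

```python
start, cagr, years = 487.4, 0.2540, 2030 - 2021
print(f"{start * (1 + cagr) ** years:.1f}")  # ~3737 USD million vs. reported 3728.4
```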
So, with that information, lets take a look at three of the top quantum computing stocks on the market right now.
Now, lets dive in and take a closer look at each one.
52-Week Range: $31.76 – $52.51
Dividend Yield: 0.43%
Expense Ratio: 0.40% per year
We start our discussion with an exchange-traded fund (ETF), namely the Defiance Quantum ETF. It invests in global businesses that are leading the technology and applications behind quantum computing, cloud platforms, machine learning, as well as other advanced computing technologies.
QTUM, which has 71 holdings, tracks the returns of the BlueStar Quantum Computing and Machine Learning Index. The fund was first listed in September 2018.
In terms of subsectors, we see Quantum Computing Technology (35.56%), followed by Machine Learning Services (21.44%), AI Chips (17.67%), GPU & Other Hardware (13.07%) and Big Data & Cloud Computing (9.39%). Close to 60% of the companies are U.S.-based. Others come from Japan (12.64%), the Netherlands (8.39%), Taiwan (4.11%) among others.
Leading names in the roster are Analog Devices (NASDAQ:ADI), Ambarella (NASDAQ:AMBA), Advanced Micro Devices (NASDAQ:AMD), Synaptics (NASDAQ:SYNA), and Splunk (NASDAQ:SPLK). The top 10 stocks comprise close to 20% of net assets of $132.4 million.
Year-to-date, QTUM is up more than 25% and hit a record high in recent days. As the funds holdings show, there are not many pure-play quantum computing stocks. Instead, a large number of tech names are increasing their focus on the quantum realm. Despite the recent run-up in price, such names in the quantum computing space are likely to create many more quarters of shareholder value. Potential investors could consider buying the dips.
52-week Range: $105.92 – $152.84
Dividend Yield: 4.8%
In June, International Business Machines revealed Europe's first quantum computer in Germany. According to IBM, the Q System One is now Europe's most powerful quantum computer. In this race, IBM is not alone: other tech giants, including Google (NASDAQ:GOOG, NASDAQ:GOOGL), Amazon (NASDAQ:AMZN) and Honeywell (NASDAQ:HON), are also investing heavily in the quantum computing world.
IBM generates revenue from five segments: cloud and cognitive software, global business services, global technology services, systems, and global financing. While global technology services has the highest share of the top line at about 35%, cloud and cognitive software is the most lucrative business, with a pre-tax margin of more than 25%.
The company announced second-quarter financial figures at the end of July. Revenue was $18.7 billion, implying 3% year-over-year (YOY) growth. Net income of $1.3 billion meant a decline of 3% YOY. Diluted non-GAAP earnings per share (EPS) was $2.33; a year ago, it had been $2.18. Meanwhile, net cash from operating activities stood at $17.7 billion.
Management believes quantum computing will play a key role in healthcare as it could enable a range of disruptive use cases for providers and health plans by accelerating diagnoses, personalizing medicine, and optimizing pricing. Quantum-enhanced machine learning algorithms are particularly relevant to the sector.
On the results, CFO James Kavanaugh said, "We expanded operating margins and grew profit dollars in the quarter, providing a key contribution to our cash performance." The company expects to grow revenue for fiscal year 2021 and anticipates free cash flow of $11 billion to $12 billion in 2021.
So far this year, IBM stock has returned just over 9.3%, and it hit a multi-year high in June. Since then, though, the shares have come under pressure, and the price-sales (P/S) ratio stands at 1.66. Potential investors could regard the recent decline in price as an opportunity to buy for the long run.
52-week Range: $196.25 – $305.84
Dividend Yield: 0.76%
Our last stock is the tech giant Microsoft, which generates revenue from three segments: Productivity and Business Processes (such as Office 365 and LinkedIn), Intelligent Cloud (Azure, Premier Support Services, and Consulting Services) and More Personal Computing (Windows Commercial, Bing, and Xbox).
Microsoft's fiscal year ends on June 30. At the end of July, the company issued Q4 2021 results. Revenue was $46.2 billion, up 21% YOY. Additionally, net income grew 47% YOY to $16.5 billion. Diluted EPS was $2.17 for the fourth quarter, up 49% from a year ago. The company also ended its fiscal year with $14.2 billion in cash and equivalents.
Following the announcement, CFO Amy Hood said, "As we closed out the fiscal year, our sales teams and partners delivered a strong quarter with over 20% top- and bottom-line growth, highlighted by commercial bookings growth of 30% year over year."
For the next quarter, Microsoft shared its segment revenue guidance. In the Productivity and Business Processes segment, the company expects revenue between $14.5 billion and $14.75 billion. For Intelligent Cloud, Microsoft anticipates revenue between $16.4 billion and $16.65 billion.
Microsoft highlights, "From breakthroughs in physics and nanomaterials to seamless integration with Microsoft Azure, Microsoft is leading the way to scalable, accessible quantum computing." For example, analysts have been pointing out how Microsoft's quantum technology could influence the power industry, healthcare privacy, and personalized medicine.
So far in 2021, MSFT stock is up more than 33% and reached a record high in late August. Moreover, the stock is trading at 13.38 times current sales. Therefore, interested readers could consider investing in the shares for the long-term around current levels.
On the date of publication, Tezcan Gecgil did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Tezcan Gecgil has worked in investment management for over two decades in the U.S. and U.K. In addition to formal higher education in the field, she has also completed all 3 levels of the Chartered Market Technician (CMT) examination. Her passion is options trading based on technical analysis of fundamentally strong companies. She especially enjoys setting up weekly covered calls for income generation.
Read this article:
3 Quantum Computing Stocks to Buy for Their Promising Healthcare Potential - InvestorPlace