
Automated Data Science and Machine Learning Platforms Market 2021: Potential growth and attractive valuation make it a long-term investment | Know the…

Automated Data Science and Machine Learning Platforms Market Report Coverage: Key Growth Factors & Challenges, Segmentation & Regional Outlook, Top Industry Trends & Opportunities, Competition Analysis, COVID-19 Impact Analysis & Projected Recovery, and Market Sizing & Forecast.

A detailed report on the global Automated Data Science and Machine Learning Platforms market provides complete information on the current market situation and offers robust insights into the potential size, volume, and dynamics of the market during the forecast period, 2021-2027. The research study offers a complete analysis of critical aspects of the global Automated Data Science and Machine Learning Platforms market, including competition, segmentation, geographical progress, manufacturing cost analysis, and price structure. We have provided CAGR, value, volume, sales, production, revenue, and other estimations for both the global and regional markets.

Major key players profiled in the report include: Palantir, MathWorks, Alteryx, SAS, Databricks, TIBCO Software, Dataiku, H2O.ai, IBM, Microsoft, Google, KNIME, DataRobot, RapidMiner, Anaconda, Domino, Altair, and more.

Download a free sample PDF, including the full TOC and tables: https://www.marketinforeports.com/Market-Reports/Request-Sample/312848

Don't miss the trading opportunities in the Automated Data Science and Machine Learning Platforms market. Talk to our analyst and gain key industry insights that will help your business grow.

The regional study of the global Automated Data Science and Machine Learning Platforms market explains how different regions and country-level markets are making developments. Furthermore, it gives a statistical representation of their progress during the course of the forecast period. Our analysts have used advanced Primary and Secondary Research methodologies to compile the research study on the global Automated Data Science and Machine Learning Platforms market.

Market segment by type: Cloud-based, On-premises

Market segment by application: Small and Medium Enterprises (SMEs), Large Enterprises

Competitive Landscape: The competitive landscape section explains the competition in the Automated Data Science and Machine Learning Platforms market, taking into consideration price, revenue, sales, and market share by company, as well as market concentration rate, competitive situations and trends, and the market shares of top companies. Strategies employed by key vendors, such as investment strategies, marketing strategies, and product development plans, are also included in the report. The research integrates data regarding the producers' product ranges, top product applications, and product specifications.

Get a 40% extra discount if your company is listed among the above key players: https://www.marketinforeports.com/Market-Reports/Request_discount/312848

The authors of the report have analyzed both developing and developed regions considered for the research and analysis of the global Automated Data Science and Machine Learning Platforms market. The regional analysis section of the report provides an extensive research study on different regional and country-wise Automated Data Science and Machine Learning Platforms industry to help players plan effective expansion strategies.

Regions Covered in the Global Automated Data Science and Machine Learning Platforms Market: the Middle East and Africa (GCC countries and Egypt); North America (the United States, Mexico, and Canada); South America (Brazil, etc.); Europe (Turkey, Germany, Russia, the UK, Italy, France, etc.); Asia-Pacific (Vietnam, China, Malaysia, Japan, the Philippines, Korea, Thailand, India, Indonesia, and Australia)

Years Considered to Estimate the Market Size: History Year: 2015-2019; Base Year: 2019; Estimated Year: 2021; Forecast Year: 2021-2026

Table of Contents: Global Automated Data Science and Machine Learning Platforms Market Research Report 2021-2026

Chapter 1: Automated Data Science and Machine Learning Platforms Market Overview
Chapter 2: Global Economic Impact on Industry
Chapter 3: Global Market Competition by Manufacturers
Chapter 4: Global Production, Revenue (Value) by Region
Chapter 5: Global Supply (Production), Consumption, Export, Import by Regions
Chapter 6: Global Production, Revenue (Value), Price Trend by Type
Chapter 7: Global Market Analysis by Application
Chapter 8: Manufacturing Cost Analysis
Chapter 9: Industrial Chain, Sourcing Strategy and Downstream Buyers
Chapter 10: Marketing Strategy Analysis, Distributors/Traders
Chapter 11: Market Effect Factors Analysis
Chapter 12: Global Automated Data Science and Machine Learning Platforms Market Forecast

To learn more about the report, visit: https://www.marketinforeports.com/Market-Reports/312848/Automated-Data-Science-and-Machine-Learning-Platforms-market

What market dynamics does this report cover? The report shares key insights on:

It helps companies make strategic decisions.

Does this report provide customization? Customization helps organizations gain insight into specific market segments and areas of interest. Therefore, Market Info Reports provides customized report information according to business needs for strategic calls.

Get customization of the report: https://www.marketinforeports.com/Market-Reports/Request-Customization/312848/Automated-Data-Science-and-Machine-Learning-Platforms-market

Why Choose Market Info Reports? Market Info Reports delivers strategic market research reports, industry analysis, statistical surveys, and forecast data on products and services, markets, and companies. Our clientele ranges from global business leaders, government organizations, SMEs, individuals, and start-ups to top management consulting firms and universities. Our library of 600,000+ reports targets high-growth emerging markets in the USA, Europe, the Middle East, Africa, and Asia Pacific, covering industries such as IT, telecom, chemicals, semiconductors, healthcare, pharmaceuticals, energy and power, manufacturing, automotive and transportation, and food and beverages. This large collection of insightful reports assists clients in staying ahead of the competition. We help in business decision-making on aspects such as market entry strategies, market sizing, market share analysis, sales and revenue, technology trends, competitive analysis, product portfolio, and application analysis.

Contact Us:
Market Info Reports
17224 S. Figueroa Street, Gardena, California (CA) 90248, United States
Call: +1 915 229 3004 (US), +44 7452 242832 (UK)
Website: http://www.marketinforeports.com


Second Generation of Black Knight’s Rapid Analytics Platform Significantly Expands Data Marketplace and Team Collaboration Tools; Further Streamlines…

Supports critical business needs by streamlining workflows, significantly increasing the number of available datasets and providing easier access to Black Knight's robust data marketplace

- The Black Knight Rapid Analytics Platform (RAP) is a unique, cloud-based data marketplace and decision-science studio that allows users to directly access diverse data assets and develop analytics strategies within a single solution

- The latest version of RAP delivers a streamlined workflow experience, more intuitive navigation and an interactive data marketplace where users can easily view and explore the platform's wide variety of available data and analytics

- The datasets recently added to RAP's vast repository include Black Knight's Collateral Analytics solutions, which provide a unique combination of the company's top-rated automated valuation models and comprehensive property and market data

- The new features also improve collaboration within an organization, include pre-built workspaces and reports to address common industry use cases, and enable seamless interaction with Black Knight's RAP support team

JACKSONVILLE, Fla., April 6, 2021 /PRNewswire/ -- Today, Black Knight, Inc. (NYSE:BKI) announced the release of the second generation of its Rapid Analytics Platform (RAP), which includes a powerful new design that delivers a streamlined workflow experience for users. RAP also now includes several additional datasets that clients can leverage to address a variety of critical business needs.


RAP is a unique, cloud-based data marketplace and decision-science studio that allows users to directly access Black Knight's massive, diverse data assets and create custom analytics within a single solution. Users can seamlessly source Black Knight data managed on the platform, connect to other data sources, execute queries, create advanced analytics and train machine-learning models.

"RAP had already changed the landscape for mortgage and housing-related data science by bringing together more primary-sourced data and advanced analytics than any platform currently available," said Ben Graboske, president of Black Knight's Data & Analytics division. "With this second iteration, we've significantly enhanced the user and workflow experience and increased the number of datasets available, while simultaneously boosting the power available to users."


RAP is used by forward-looking mortgage, real estate, and capital markets professionals for portfolio retention strategy; equity analysis and valuation; prepayment and default analytics; pre- and post-bid due diligence; performance benchmarking; and much more.

In addition to providing access to many of the industry's deepest and most granular datasets available, RAP also offers a growing catalogue of "out-of-the-box" analytics. Designed with transparency in mind, this exposed code allows users to get started quickly with a deep understanding of how the analytics have been developed. Users can also choose to build their own analytics, or they can leverage Black Knight's highly experienced professionals to develop and deliver customized analytics.

Additionally, workspaces have been added to help users create custom views of the RAP resources relevant to a particular data science strategy.

The datasets recently added to RAP's vast repository include Black Knight's Collateral Analytics solutions, which provide a unique combination of the company's top-rated automated valuation models and comprehensive property and market data. RAP now also offers daily mortgage loan rate-lock data from Black Knight's leading product and pricing engine, Optimal Blue PPE, as well as daily forbearance, payment, and delinquency data and other Black Knight datasets.

"Our focus with RAP has always been to deliver enhancements and innovations that help clients gain the critical insights they need from within a powerful, unified interface," Graboske continued. "The enhancements in this version, including the new and diverse datasets available via the interactive data marketplace, are key to this goal, and will keep growing with time, so RAP can continue transforming how organizations leverage data and analytics."

About Black Knight Black Knight, Inc. (NYSE: BKI) is an award-winning software, data and analytics company that drives innovation in the mortgage lending and servicing and real estate industries, as well as the capital and secondary markets. Businesses leverage our robust, integrated solutions across the entire homeownership life cycle to help retain existing customers, gain new customers, mitigate risk, and operate more effectively.

Our clients rely on our proven, comprehensive, scalable products and our unwavering commitment to delivering superior client support to achieve their strategic goals and better serve their customers. For more information on Black Knight, please visit http://www.blackknightinc.com.


SOURCE Black Knight, Inc.


Quantum Physics to Disrupt Geospatial Industry over the Coming Decade – GIM International


5 Questions to Hansjörg Kutterer, DVW

April 1, 2021

Innovative developments based on quantum physics will lead to further disruption of our professional field over the coming decade, predicts Hansjörg Kutterer who, besides being president of DVW, is also a professor of geodetic Earth system science. 'GIM International' asked him five questions relating to the challenges and opportunities in the geospatial industry, now and in the future.

2020 was an extraordinary year. How has the COVID-19 pandemic changed the way the industry operates, and which other factors are influencing the geospatial business?

The pandemic was and is extremely influential on our professional life. At very short notice, we had to considerably change our approaches from on-site and immediate to remote and fully virtual settings. Fortunately, we could benefit from the ongoing digital transformation. The existing digital infrastructure and established procedures based on digital communication and collaboration tools could be used in order to overcome obstacles caused by the pandemic. Thus, it was possible to provide effective substitutes in the given situation, such as digital meetings, digital conferences or digital teaching. Nevertheless, both technical capacities and personal capabilities needed rapid upgrades. Actually, the accelerated digitalization is both an opportunity and an obligation for the geospatial business, as work can generally be continued on a digital basis but very often relies on digital geospatial data.

Which new technologies do you foresee becoming important to your work?

This is going to be the decade of continuous Earth observation based on a sustainably maintained infrastructure and a comprehensive open-data policy. The European Copernicus system may serve as an example. Rapidly increasing amounts of heterogeneous geospatial data are obtained within very short time spans. These new opportunities are accompanied by the strong need for effective data management using integrated research data infrastructures, for example. Moreover, advanced data processing is required which comprises things like deep learning techniques. I also expect that innovative developments based on quantum physics will lead to further disruption of our professional field over the coming decade. Quantum sensors such as optical clocks will provide accurate height differences over large distances, and quantum computers will further speed up time-consuming computations.

Is the surveying profession able to attract enough qualified personnel?

The number of qualified personnel is becoming increasingly crucial for the further development of the surveying profession. Despite the broad appeal of our professional field and the high number of vacancies, there is still a lack of public visibility and thus limited awareness among potential candidates. For this reason, there have been various activities in Germany over the years aimed at reaching and attracting more young people to the industry. For example, the Instagram campaign #weltvermesserer was launched in 2021 by a consortium consisting of all national stakeholders, including the private sector, administration, science, and all relevant professional organizations. Both the expected impact of this campaign and the increasing interdisciplinary nature of our professional community will provide a good basis for tackling this sizeable challenge successfully.

What is your policy on crowdsourcing and open data?

Due to my academic role and my volunteer position within DVW, my answer is twofold. Open data policies are mandatory for a more comprehensive scientific, administrative or private exploitation of existing and newly incoming data. This definitely refers to all stakeholders who rely on geospatial data. Data generated and used in science and education must be open and available through efficient digital data infrastructures. Sustainable open-data initiatives and programmes are highly appreciated. Crowdsourcing offers the opportunity to collect data that is either outside the scope of public agencies or could offer an alternative to existing administrative data that is only available with a licence. The DVW organization encourages any initiative that advances the fields of geodesy, geoinformation and land management.

In terms of meeting your goals, what is the biggest challenge for your organization in the next five years?

As a university professor I am very aware of the increasing need of the professional community for enhanced capabilities in the digital transformation, in smart and integrated systems, in the widespread application of our contributions, and in interdisciplinary work. This needs to be further implemented in the curricula over the coming years, including effective digital settings and dedicated competence-oriented techniques. Actually, this is also linked to DVW's activities, albeit from the perspective of a non-profit organization. As DVW, we offer professional expertise, conferences, post-graduate training, highly skilled working groups, and last but not least an attractive networking platform for our members, essentially based on volunteering. This needs to be sustainably maintained and further developed.


The mystery of the muon's magnetism – Symmetry Magazine

Modern physics is full of the sort of twisty, puzzle-within-a-puzzle plots you'd find in a classic detective story: Both physicists and detectives must carefully separate important clues from unrelated information. Both physicists and detectives must sometimes push beyond the obvious explanation to fully reveal what's going on.

And for both physicists and detectives, momentous discoveries can hinge upon Sherlock Holmes-level deductions based on evidence that is easy to overlook. Case in point: the Muon g-2 experiment currently underway at the US Department of Energy's Fermi National Accelerator Laboratory.

The current Muon g-2 (pronounced "g minus two") experiment is actually a sequel, an experiment designed to reexamine a slight discrepancy between theory and the results from an earlier experiment at Brookhaven National Laboratory, which was also called Muon g-2.

The discrepancy could be a sign that new physics is afoot. Scientists want to know whether the measurement holds up or if it's nothing but a red herring.

The Fermilab Muon g-2 collaboration has announced it will present its first result on April 7. Until then, let's unpack the facts of the case.

Illustration by Sandbox Studio, Chicago with Steve Shanabruch

All spinning, charged objects, including muons and their better-known particle siblings, electrons, generate their own magnetic fields. The strength of a particle's magnetic field is referred to as its magnetic moment or its g-factor. (That's what the "g" part of g-2 refers to.)

To understand the -2 part of g-2, we have to travel a bit back in time.

Spectroscopy experiments in the 1920s (before the discovery of muons in 1936) revealed that the electron has an intrinsic spin and a magnetic moment. The value of that magnetic moment, g, was found experimentally to be 2. As for why that was the valuethat mystery was soon solved using the new but fast-growing field of quantum mechanics.

In 1928, physicist Paul Dirac, building upon the work of Llewelyn Thomas and others, produced a now-famous equation that combined quantum mechanics and special relativity to accurately describe the motion and electromagnetic interactions of electrons and all other particles with the same spin quantum number. The Dirac equation, which incorporated spin as a fundamental part of the theory, predicted that g should be equal to 2, exactly what scientists had measured at the time.

But as experiments became more precise in the 1940s, new evidence came to light that reopened the case and led to surprising new insights about the quantum realm.


The electron, it turned out, had a little bit of extra magnetism that Dirac's equation didn't account for. That extra magnetism, mathematically expressed as g-2 (or the amount that g differs from Dirac's prediction), is known as the anomalous magnetic moment. For a while, scientists didn't know what caused it.

If this were a murder mystery, the anomalous magnetic moment would be sort of like an extra fingerprint of unknown provenance on a knife used to stab a victim: a small but suspicious detail that warrants further investigation and could unveil a whole new dimension of the story.

Physicist Julian Schwinger explained the anomaly in 1947 by theorizing that the electron could emit and then reabsorb a virtual photon. The fleeting interaction would slightly boost the electron's internal magnetism by a tenth of a percent, the amount needed to bring the predicted value into line with the experimental evidence. But the photon isn't the only accomplice.
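Schwinger's one-loop correction is simple enough to check on the back of an envelope: it works out to the fine-structure constant divided by 2π. A minimal Python sketch (the constant here is a rounded approximation, not the experiment's input):

```python
import math

# Schwinger's one-loop QED correction to the electron's magnetic moment:
# a_e = (g - 2) / 2 = alpha / (2 * pi), where alpha is the fine-structure constant.
ALPHA = 1 / 137.035999  # fine-structure constant (approximate)

a_e = ALPHA / (2 * math.pi)
print(f"Schwinger term a_e = {a_e:.7f}")  # ~0.00116, i.e. about a tenth of a percent
```

The value, about 0.00116, is exactly the "tenth of a percent" boost described above.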

Over time, researchers discovered that there was an extensive network of virtual particles constantly popping in and out of existence from the quantum vacuum. That's what had been messing with the electron's little spinning magnet.

The anomalous magnetic moment represents the simultaneous combined influence of every possible effect of those ephemeral quantum conspirators on the electron. Some interactions are more likely to occur, or are more strongly felt than others, and they therefore make a larger contribution. But every particle and force in the Standard Model takes part.

The theoretical models that describe these virtual interactions have been quite successful in describing the magnetism of electrons. For the electron's g-2, theoretical calculations are now in such close agreement with the experimental value that it's like measuring the circumference of the Earth with an accuracy smaller than the width of a single human hair.

All of the evidence points to quantum mischief perpetrated by known particles causing any magnetic anomalies. Case closed, right?

Not quite. It's now time to hear the muon's side of the story.


Early measurements of the muon's anomalous magnetic moment at Columbia University in the 1950s and at the European physics laboratory CERN in the 1960s and 1970s agreed well with theoretical predictions. The measurements' uncertainty shrank from 2% in 1961 to 0.0007% in 1979. It looked as if the same conspiracy of particles that affected the electron's g-2 was responsible for the magnetic moment of the muon as well.

But then, in 2001, the Brookhaven Muon g-2 experiment turned up something strange. The experiment was designed to increase the precision of the CERN measurements and look at the weak force's contribution to the anomaly. It succeeded in shrinking the error bars to half a part per million. But it also showed a tiny discrepancy, less than 3 parts per million, between the new measurement and the theoretical value. This time, theorists couldn't come up with a way to recalculate their models to explain it. Nothing in the Standard Model could account for the difference.

It was the physics mystery equivalent of a single hair found at a crime scene with DNA that didn't seem to match anyone connected to the case. The question was, and still is, whether the presence of the hair is just a coincidence, or whether it is actually an important clue.

Physicists are now re-examining this "hair" at Fermilab, with support from the DOE Office of Science, the National Science Foundation and several international agencies in Italy, the UK, the EU, China, Korea and Germany.

In the new Muon g-2 experiment, a beam of muons, their spins all pointing the same direction, is shot into a type of accelerator called a storage ring. The ring's strong magnetic field keeps the muons on a well-defined circular path. If g were exactly 2, then the muons' spins would follow their momentum exactly. But, because of the anomalous magnetic moment, the muons have a slight additional wobble in the rotation of their spins.

When a muon decays into an electron and two neutrinos, the electron tends to shoot off in the direction that the muon's spin was pointing. Detectors on the inside of the ring pick up a portion of the electrons flung by muons experiencing the wobble. Recording the numbers and energies of the electrons they detect over time will tell researchers how much the muon's spin has rotated.
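The wobble has a characteristic rate set by the anomalous magnetic moment itself: the spin outpaces the momentum at the anomalous precession frequency ω_a = a_μ·(e/m_μ)·B. A rough back-of-the-envelope sketch (this is an illustration, not the collaboration's analysis code; the ~1.45-tesla field is the ring's nominal value):

```python
import math

# Anomalous precession frequency of the muon spin in the storage ring:
# omega_a = a_mu * (e / m_mu) * B
E_CHARGE = 1.602176634e-19   # elementary charge, coulombs
M_MUON = 1.883531627e-28     # muon mass, kilograms
A_MU = 0.00116592            # anomalous magnetic moment (g-2)/2, approximate
B_FIELD = 1.45               # storage-ring magnetic field, tesla (nominal)

omega_a = A_MU * (E_CHARGE / M_MUON) * B_FIELD  # radians per second
print(f"anomalous precession: {omega_a / (2 * math.pi) / 1e3:.0f} kHz")
```

The result, a bit over 200 kHz, is the slow "wobble" frequency that the detectors extract from the decay-electron counts.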

Using the same magnet from the Brookhaven experiment with significantly better instrumentation, plus a more intense beam of muons produced by Fermilab's accelerator complex, researchers are collecting 21 times more data to achieve four times greater precision.

The experiment may confirm the existence of the discrepancy; it may find no discrepancy at all, pointing to a problem with the Brookhaven result; or it may find something in between, leaving the case unsolved.


There's reason to believe something is going on that the Standard Model hasn't told us about.

The Standard Model is a remarkably consistent explanation for pretty much everything that goes on in the subatomic world. But there are still a number of unsolved mysteries in physics that it doesn't address.

Dark matter, for instance, makes up about 27% of the universe. And yet, scientists still have no idea what it's made of. None of the known particles seems to fit the bill. The Standard Model also can't explain the mass of the Higgs boson, which is surprisingly small. If the Fermilab Muon g-2 experiment determines that something beyond the Standard Model, for example an unknown particle, is measurably messing with the muon's magnetic moment, it may point researchers in the right direction to close another one of these open files.

A confirmed discrepancy won't actually provide DNA-level details about what particle or force is making its presence known, but it will help narrow down the ranges of mass and interaction strength in which future experiments are most likely to find something new. Even if the discrepancy fades, the data will still be useful for deciding where to look.

It might be that a shadowy quantum figure lurking beyond the Standard Model is too well hidden for current technology to detect. But if it's not, physicists will leave no stone unturned and no speck of evidence un-analyzed until they crack the case.


6 Quantum Computing Stocks to Invest in This Decade – Investment U

Classical computers have served us well, and they will continue to do so, but breakthroughs in quantum physics are opening up new doors. That's why I'm sharing my favorite quantum computing stocks today.

It's still in the early stages and could take a while to pay off. But the list of companies below gives you some great investing opportunities. You'll find big companies shaking up the technology world. They're not resting on their laurels.

I'll highlight some research from each company and what excites me most. But first, it'd be good to get a better understanding of this up-and-coming technology. The potential is huge.

Moore's Law states that the number of transistors in an integrated circuit doubles about every two years. This exponential trend has led to massive advancements in our world. In fact, it's impacted every industry and all of our lives.

To help put Moore's Law in perspective, a new iPhone is millions of times faster than the Apollo spacecraft when it comes to computing. Computers have become exponentially faster. This trend might be coming to an end, though.
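That "millions of times" figure follows directly from the compounding arithmetic of a two-year doubling period. A quick illustration in Python (the 50-year window is an illustrative assumption, roughly Apollo-era chips to today):

```python
# Moore's Law as compound growth: doubling every `doubling_period` years
# gives a factor of 2**(years / doubling_period).
def moores_factor(years: float, doubling_period: float = 2.0) -> float:
    """Transistor-count growth factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

# 50 years of doubling every two years -> 2**25, i.e. tens of millions.
print(f"{moores_factor(50):,.0f}x")  # 33,554,432x
```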

Gordon Moore and other forecasters expect that Moore's Law will end around 2025. It's the result of an intricate set of physics problems. And quantum computing might be the next big step forward.

Many technology companies see the potential and are investing lots of money. If their research pays off, the top quantum computing stocks could hand shareholders huge returns.

These new computers have the ability to store more information thanks to what's called superposition. Unlike traditional computers that use bits with only ones and zeros, quantum computers take it to the next level: they can use ones, zeros, and superpositions of ones and zeros. This opens the door to solving tasks that have long been thought impossible for classical computers.
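The idea of superposition can be sketched in a few lines of NumPy (a toy illustration, not a real quantum SDK): a qubit's state is a vector of complex amplitudes over |0> and |1>, and the squared magnitudes give the measurement probabilities.

```python
import numpy as np

# Basis states of a single qubit as complex amplitude vectors.
zero = np.array([1, 0], dtype=complex)

# A Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of measuring 0 or 1
```

Until it is measured, the state `psi` genuinely carries both possibilities at once, which is what lets n qubits represent 2^n amplitudes simultaneously.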

This list of quantum computing companies includes some of the largest companies in the world. They have proven business models and the resources to push quantum computing forward.

As a result, this list provides some cash flow safety for investors while providing exposure to new technologies. So let's take a look at some of their top quantum research and projects.

IBM was one of the first big movers in quantum research. Already, it has deployed 28 quantum computers. That's the largest fleet of commercial devices, and IBM has a road map to scale systems to 1,000 qubits and beyond.

The IBM Quantum Network is currently working with more than 100 partners. These partners are in many different industries and are developing real-world commercial applications. IBM also offers free access to quantum computing.

IBM is scaling these technologies and making them more accessible. This is vital for further adoption and innovation. The strategy is working, and IBM will continue to be one of the top quantum computing stocks over the coming decades.

Alphabet is one of the top quantum computing stocks to buy. Back in 2019, the company claimed quantum supremacy for the first time when its advanced computer surpassed the performance of conventional devices.

Alphabet's Sycamore quantum processor performed a specific task in 200 seconds that would have taken the world's best supercomputer 10,000 years. That's a huge milestone, and the company is continuing to advance with quantum physics.

Google AI Quantum is making big strides as well. It's developing new quantum processors and algorithms to help solve a wide range of problems. It's also open sourcing some of its framework to spur innovation.

Intel is a semiconductor giant that's developing many cutting-edge technologies. It has been making quantum processors in Oregon. Furthermore, the company hopes to reach production-level quantum computing within 10 years.

Intel is on its third generation of quantum processors with 49 qubits. The company has a unique approach, advancing a technology known as spin qubits in silicon. Intel believes it has a scaling advantage over superconducting qubits.

This easily makes Intel one of the top quantum computing stocks. Buying into this company gives investors exposure to many cutting-edge technologies.

Similar to IBM, Microsoft takes a comprehensive approach to quantum computing. It's working on all the technologies required to scale commercial applications.

Microsoft is advancing all layers of its computing stack. This includes the controls, software and development tools. Microsoft also created the Azure Quantum open cloud ecosystem. This helps speed up innovation.

In addition, the tech giant is making great advancements with topological qubits. These provide performance gains over conventional qubits: they increase stability and reduce the overall number of qubits needed. It's promising technology that should reward shareholders down the road.

Amazon Quantum Solutions Lab is helping businesses identify opportunities. Amazons experts are working with clients to better understand quantum computing. This helps them build new algorithms and solutions.

Amazon now offers quantum computing on Amazon Web Services through Amazon Braket. This service provides access to D-Wave hardware. D-Wave is a leading quantum computing company based in Canada. It's not publicly traded, though.

Overall, Amazon is continuing to disrupt many industries. And advancing quantum computing should help drive its innovation even further.

This quantum stock is the smallest on the list and gives direct exposure to quantum computing. That makes it a higher-risk opportunity, but its risk-to-reward setup looks good for long-term investors.

Quantum Computing offers cloud-based, ready-to-run software. It's focused on creating services that don't require quantum expertise or training to use. This approach is opening the doors for more businesses to leverage the new technologies.

This company is also focusing on real-world problems such as logistics optimization, cybersecurity and drug discovery. To accomplish this, its partnering with hardware companies such as D-Wave.

The quantum computing companies above are mostly indirect plays. Their other established businesses provide the capital required to innovate. This is vital, as quantum computing is still an up-and-coming industry.

It might take a decade or more to really play out. And investing early in these technologies can lead to large returns for patient investors. Quantum breakthroughs are compounding and creating new opportunities.

Whether you buy into these top quantum computing stocks or not, we'll all benefit from the innovation. If you want to stay on the cutting edge of tech investing, consider signing up for Profit Trends. It's a free e-letter that's packed with useful research and tech investing opportunities. Also, here are some other tech IPOs that you might want to consider.

Read the original post:

6 Quantum Computing Stocks to Invest in This Decade - Investment U


Can science explain the mystery of consciousness? – The Irish Times

In the second part of a series on the science of consciousness, Seán Duke features those who believe the human brain works more like a quantum computer.

The mystery of consciousness, according to Roger Penrose, the 89-year-old winner of the 2020 Nobel Prize in physics, will only be solved when an understanding is found for how brain structures can harness the properties of quantum mechanics to make it possible.

Penrose, emeritus professor of mathematics at the University of Oxford and a collaborator of the late Stephen Hawking, won the Nobel for his work on the nature of black holes and has been interested in consciousness since he was a Cambridge graduate student. He has authored many books on consciousness, most notably The Emperor's New Mind (1989), and believes it to be so complex that it cannot be explained by our current understanding of physics and biology.

As a young mathematician, Penrose believed, and still does today, that something is true not because it is derived from rules or axioms, but because it's possible to see that it's true. The ultimate truth in mathematics, he reasoned, cannot therefore be proven by following algorithms: sets of calculations performed to instruction.

It followed, Penrose deduced, that the truth of how consciousness operates in the brain may not be provable by algorithms or thinking of the brain as a computer. This idea set off a life-long quest to understand the mysterious processes governing consciousness going on in our heads, which, Penrose says, remain beyond our existing understanding of physics, mathematics, biology or computers.

After The Emperor's New Mind was published, Penrose received a letter from Stuart Hameroff, professor of anaesthesiology at the University of Arizona, who also had a long interest in understanding consciousness. In the letter, Hameroff described tiny structures in the brain called microtubules, which he believed were capable of generating consciousness by tapping into the quantum world.

Hameroff, who has worked as an anaesthesiologist for 45 years, believes anaesthesia may work through specifically targeting consciousness through its action on the neural microtubules. After writing the letter, he met Penrose in 1992, and over the next two years they developed radical ideas about consciousness which ran counter to the thinking of most neuroscientists, and still do.

Penrose and Hameroff believe that the human brain works more like a quantum computer than any classical computer. This is because future quantum computers will be designed to harness the ability of quantum particles to exist in multiple locations, states and positions all at once. These quantum effects arise in the microtubules, they suggest, which then act as the brain's link to the quantum world.

The microtubules were structures that Hameroff had studied since his graduate student days. They interested him initially, he recalls, because of their role in cancer. The microtubules are crucial to cell division, splitting chromosomes perfectly in two. If microtubules did not function, then chromosomes could be divided unevenly into three or four parts, not two, he says, thus triggering cancer.

The central role that the microtubules play in cell division led Hameroff to speculate that they were controlled by some form of natural computing. In his book Ultimate Computing (1987), he argues that microtubules have sufficient computational power to produce thought. He also argues that the microtubules (the tiny structures which give the cell its shape and act like a scaffold) are the most basic units of information processing in the brain, not the neurons.

The fact that microtubules are found in animals, plants and even single-celled amoebae, says Hameroff, means that consciousness is probably widespread and exists at many levels. The way microtubules work to produce consciousness, he says, can be thought of as being similar to how a conductor directs the sounds produced by individual musicians and orchestrates them into a coherent functioning orchestra.

Consciousness will be a different experience in humans compared to amoebae, says Hameroff. A single-celled organism might have proto-consciousness: consciousness without memory, without context, isolated, not connected with anything else, and occurring at low intensity. There wouldn't be any sense of self, memory or meaning, but there would be some glimmer of feeling or awareness.

Penrose agreed with Hameroff that the microtubules could possibly maintain the quantum coherence needed for complex thought, and a collaboration began that continues today. Consciousness, the two believed, was a non-algorithmic, quantum process that could only be understood by a theory that linked the brain to quantum mechanics.

This led Penrose and Hameroff to develop a theory called orchestrated objective reduction, or Orch OR. This proposed that areas of the brain where consciousness occurs must be structured so that they can hold innumerable quantum possibilities all at once, per the rules of quantum mechanics, while permitting the controlled reduction of such endless possibilities without destroying the quantum system.

The microtubules were, both agreed, the best currently known structures in the brain where quantum processes could take place in a stable way and be harnessed to generate our conscious experience. They agreed that consciousness might ultimately be found in many locations across the brain, not just confined to the microtubules.

According to Hameroff, the presence of pyramid-shaped cells containing microtubules organised to run in two directions, rather than in parallel, which is more usual, is the difference between the parts of the brain where consciousness happens and the unconscious brain. It's notable, he says, that these pyramidal cells are not present in the cerebellum, an area considered to be unconscious.

One of the main criticisms of the Penrose-Hameroff quantum-based theory of consciousness is that there is no way to measure whether quantum processes are happening in the microtubules or any other parts of the brain. Penrose accepts such criticism but believes such measurements will become possible over the long term.

Hameroff already has plans to test whether quantum states exist inside microtubules. If he can prove this, his next step will be to see if such states disappear under anaesthesia. If they do, he says, it strengthens the theory that microtubules host conscious thought.

Brain scanning techniques like PET and MRI have become very powerful but are of little or no use in consciousness studies, says Penrose. They can, he notes, monitor blood flow and where activity is happening in the brain, but they can't say whether that activity involves conscious thought. For that, something else is required.

One way to measure thought, some scientists believe, is by observing brainwaves. For example, some evidence suggests that brainwaves, oscillating at about 40 Hertz, can be correlated with consciousness.

Penrose and Hameroff would like to find evidence for quantum brain oscillations in the microtubules but have no tools yet to achieve this.

This is a long-term project, which I don't see resolving for many years, says Penrose who, given his age, would like to see things moving faster. I feel pretty sure that we haven't really understood fully how biological systems are organised and how they may be taking advantage of the subtle effects of [quantum] physics.

The big difficulty with trying to measure quantum processes in the brain, Penrose points out, is that such effects are destroyed when they are observed or brought into contact with the outside world. It is going to be very hard to have direct access to consciousness, as to observe it, currently, would be to destroy it.

More:

Can science explain the mystery of consciousness? - The Irish Times


‘Spacekime theory’ could speed up research and heal the rift in physics – Big Think

We take for granted the western concept of linear time. In ancient Greece, time was cyclical and if the Big Bounce theory is true, they were right. In Buddhism, there is only the eternal now. Both the past and the future are illusions. Meanwhile, the Amondawa people of the Amazon, a group that first made contact with the outside world in 1986, have no abstract concept of time. While we think we know time pretty well, some scientists believe our linear model hobbles scientific progress. We're missing whole dimensions of time, in this view, and our limited perception could be the last obstacle to a sweeping theory of everything.

Theoretical physicist Itzhak Bars of the University of Southern California in Los Angeles is the most famous scientist with such a hypothesis, known as two-time physics. Here, time is 2D, visualized as a curved plane interwoven into the fabric of the "normal" dimensions: up-down, left-right, and backward-forward. While the hypothesis is over a decade old, Bars isn't the only scientist with such an idea. But what's different with spacekime theory is that it uses a data analytics approach, rather than a physics one. And while it posits that there are at least two dimensions of time, it allows for up to five.

In the spacekime model, spacetime is 5D. Besides the dimensions we normally encounter, the extra ones are so infinitesimally small that we never notice them. This relates to the Kaluza-Klein theory developed in the early 20th century, which stated that there might be an extra, microscopic dimension of space. In this view, space would be curved like the surface of Earth. And like Earth, those who travel the entire distance would, eventually, loop back to their place of origin.

Kaluza-Klein theory unified electromagnetism and gravity, but wasn't accepted at the time, although it did help in the search for quantum gravity. The concept of additional dimensions was revived in the 1990s with Paul Wesson's Space-Time-Matter Consortium. Today, proponents of superstring theory say there may be as many as 10 different dimensions, including nine of space and one of time.

Spacekime theory was developed by two data scientists. Dr. Ivo Dinov is the University of Michigan's SOCR director, as well as a professor of Health Behavior and Biological Sciences, and Computational Medicine and Bioinformatics. SOCR stands for Statistics Online Computational Resource. Dr. Dinov is an expert in "mathematical modeling, statistical analysis, computational processing, scientific visualization of large datasets (Big Data) and predictive health analytics." His research has focused on mathematical modeling, statistical inference, and biomedical computing.

His colleague, Dr. Milen Velchev Velev, is an associate professor at the Prof. Dr. A. Zlatarov University in Bulgaria. He studies relativistic mechanics in multiple time dimensions, and his interests include "applied mathematics, special and general relativity, quantum mechanics, cosmology, philosophy of science, the nature of space and time, chaos theory, mathematical economics, and micro- and macroeconomics."

Drs. Dinov and Velev began developing spacekime theory around four or five years ago, while working with big data in the healthcare field. "We started looking at data that intrinsically has a temporal dimension to it," Dr. Dinov told me during a video chat. "It's called longitudinal or time-varying data, longitudinal time variance; it has many, many names. This is data that varies with time. In biomedicine, this is the de facto standard data. All big health data is characterized by space, time, phenotypes, genotypes, clinical assessments, and so forth."

"We started asking big questions," Dinov said. "Why are our models not really fitting too well? Why do we need so many observations? And then, we started playing around with time. We started digging and experimenting with various things. And then we realized two important facts.

"Number one, if we use what's called color-coded representations of the complex plane, we can define spacekime, or higher dimensional spacetime, in such a way that it agrees with the common observations that we make in (the longitudinal time series in) ordinary spacetime. That agreement was very important to us, because it basically says, yes, the higher dimensional theory does not contradict our common observations.

"The second realization was that, since this extra dimension of time is imperceptible, we needed to approximate, model, or estimate, one of the unobservable time characteristics, which we call the kime phase. After about a year, we discovered that there is a mathematically elegant tool called the Laplace Transform that allows us to analytically represent time series data as kime-surfaces. Turns out, the spacekime mathematical manifold is a natural, higher dimensional extension of classical Minkowski, four-dimensional spacetime."
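The Laplace Transform construction Dinov describes can be sketched numerically. The snippet below is an illustrative approximation only (the function names are mine, not taken from the spacekime work): it evaluates a sampled signal's Laplace transform on a grid of complex values s = σ + iω, turning a 1-D time series into a 2-D complex-valued surface.

```python
import cmath

def laplace_transform(samples, dt, s):
    """Left-Riemann approximation of F(s) = integral of f(t) * e^(-s*t) dt
    for a uniformly sampled signal f with time step dt."""
    return sum(f * cmath.exp(-s * (k * dt)) * dt
               for k, f in enumerate(samples))

def kime_surface(samples, dt, sigmas, omegas):
    """Evaluate F on a grid of complex s = sigma + i*omega, yielding a
    2-D 'surface' representation of a 1-D time series."""
    return [[laplace_transform(samples, dt, complex(sg, om))
             for om in omegas]
            for sg in sigmas]
```

As a sanity check, for f(t) = e^(-t) the exact transform is 1/(s+1), so the numerical value at s = 1 should be close to 0.5; accuracy depends on the sampling step and the truncation horizon.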

Our understanding of the world is becoming more complex. As a result, we have big data to contend with. How do we find new ways to analyze, interpret and visualize such data? Dinov believes spacekime theory can help in some pretty impressive ways. "The result of this multidimensional manifold generalization is that you can make scientific inferences using smaller data samples. This requires that you have a good model or prior knowledge about the phase distribution," he said. "For instance, we can use spacekime process representation to better understand the development or pathogenesis to model the distributions of certain diseases.

"Suppose we are evaluating fMRIs of Alzheimer's disease subjects. Assume we know the kime phase distribution for another cohort of patients suffering from amyotrophic lateral sclerosis, Lou Gehrig's disease. The ALS kime-phase distribution could be used for evaluating the Alzheimer's patients," and many other neurodegenerative populations. Dinov also thinks spacekime analytics could help improve political polling, increase our understanding of complex financial and environmental events, and even illuminate the inner workings of the human brain, all without having to take the huge samples required today to make accurate models or predictions. Spacekime theory even offers opportunities to design novel AI analytical techniques. But it goes beyond that.

Spacekime theory can help us make headway on some of the most pernicious inconsistencies in physics, such as Heisenberg's uncertainty principle and the seemingly irreconcilable rift between quantum physics and general relativity, what's known as "the problem of time."

Dinov wrote that the "approach relies on extending the notions of time, events, particles, and wave functions to complex-time (kime), complex-events (kevents), data, and inference-functions." Basically, working with two points of time allows you to make inferences on a radius of points associated with a certain event. With Heisenberg's uncertainty principle, according to this model, since time is a plane, a certain particle would be in one position or phase, time-wise, in terms of velocity, and another phase, in terms of position.

This idea of hidden dimensions of time is a little like Plato's allegory of the cave or how an X-ray signifies what's underneath, but doesn't convey a 3D image. From a data science perspective, it all comes down to utility. Dinov believes that if we can calculate the true phase dispersion of complex phenomena, we can better understand and control them.

Drs. Dinov and Velev's book on spacekime theory comes out this August. It's called "Data Science: Time Complexity, Inferential Uncertainty, and Spacekime Analytics".


View original post here:

'Spacekime theory' could speed up research and heal the rift in physics - Big Think


Where did the antimatter go? – The Express Tribune

KARACHI:

In the words of one researcher, science is not meant to cure us of mystery. Instead, he argues, it is supposed to reinvent and reinvigorate.

For us lay people, science remains a mystery sans reinvention. For reasons mundane and existential, most of us shy away from asking the fundamental questions. Perhaps it is due to our own shyness, even fear, that many of us hold such awe for those who dare go intellectually where the rest of us are unwilling or incapable of going.

Speaking of awe and mysteries, there are a handful of places around the world that evoke both while capturing our collective imagination. If I mention the European Organisation for Nuclear Research, it may or may not mean anything to most of us. But, if I use the acronym CERN, many may find their thoughts transported to the world of Dan Brown, of scientific intrigue and where the impossible becomes possible.

Speaking still of mysteries, and of CERN, there is the question of antimatter, a material many of us may confuse with the similar sounding yet wildly different concepts of dark matter and dark energy. Beyond our own mysterious understanding of it, lie the antimatter mysteries that the scientists who study it reinvent and reinvigorate in their attempts to understand it.

As luck, or perhaps fate, would have it, one of them is Pakistani. Who better to demystify both what we know at the moment about antimatter and what working at CERN entails.

Demystifying antimatter

For Muhammad Sameed, a life's yearning for understanding has led to a dream come true. The 32-year-old Islamabad native is among a mere handful of Pakistanis who would think themselves lucky for a chance to work at CERN. What is more, in Sameed's case, he is a physicist involved in studying antimatter particles at one of the world's premier physics research organisations.

At CERN, Sameed is part of the ALPHA experiment, an acronym that stands for the Antihydrogen Laser Physics Apparatus. Just recently, the ALPHA collaboration succeeded in cooling down antihydrogen particles, the simplest form of atomic antimatter, with laser light.

Speaking with The Express Tribune, the young scientist began by admitting that the wider scientific community had perhaps contributed to some of the public misunderstandings about antimatter. I think it is us physicists' fault for giving such similar names to such different concepts, he said when asked about the difference between antimatter, dark matter and dark energy.

Taking his own crack at remedying that, he explained: Before we explain antimatter, it is important to remember what matter is at the subatomic level.

Most of us learn about atoms and how they are made of electrons, protons and neutrons in school. But that is where the general awareness ends, said Sameed. If you look deeper, while electrons are fundamental particles (they belong to a family of particles called leptons), protons and neutrons are not. Those two are made up of two more kinds of fundamental particles: the quarks, and the gluons which bind them. He added that all matter that surrounds us and that we can interact with is made of particles from two families, namely quarks and leptons. Both families consist of six kinds of particles each.
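The two families Sameed mentions can be laid out as a simple reference table. This is a plain summary of the Standard Model's matter particles, not anything specific to the ALPHA experiment; each particle also has an antimatter counterpart.

```python
# The two families of fundamental matter particles, six members each.
QUARKS = ("up", "down", "charm", "strange", "top", "bottom")
LEPTONS = ("electron", "electron neutrino",
           "muon", "muon neutrino",
           "tau", "tau neutrino")

# Protons and neutrons are not fundamental: each is three quarks
# bound together by gluons.
PROTON = ("up", "up", "down")
NEUTRON = ("up", "down", "down")
```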

According to Sameed, we have known all of this since the early part of the previous century, when quantum mechanics was developed. Where does antimatter come in? It was first articulated in a theoretical study by physicist Paul Dirac, he shared.

Dirac, while solving a quantum mechanics equation, arrived at two solutions, one positive and the other negative. The positive one corresponded to the electron. Dirac initially disregarded the negative solution, but later used it to hypothesise the existence of antielectrons, Sameed explained. He made that prediction in 1928, and just four years later, an American experiment actually discovered it.

How was the discovery made, you wonder? We have all these particles from outer space that pass through our planet, said Sameed. If we apply a magnetic field to them, we can determine which direction these particles turn in. If electrons turn to one side, particles with the opposite charge would turn in the other direction.
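The charge-sign test Sameed describes follows from the magnetic part of the Lorentz force, F = q(v × B): flipping the sign of the charge flips the direction of the force, so electrons and antielectrons curve opposite ways in the same field. A minimal sketch (the units and values here are illustrative):

```python
def lorentz_force(q, v, B):
    """Magnetic part of the Lorentz force, F = q * (v x B),
    with velocity v and field B as 3-component tuples."""
    vx, vy, vz = v
    bx, by, bz = B
    return (q * (vy * bz - vz * by),
            q * (vz * bx - vx * bz),
            q * (vx * by - vy * bx))
```

For a particle moving along +x through a field along +z, a positive charge is pushed toward -y and a negative charge toward +y, which is how the antielectron's track was first distinguished from the electron's.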

The physicist shared that, since the discovery of the antielectron almost 90 years ago, scientists have discovered an antimatter counterpart to each regular matter particle we know of. The story we physicists should be telling people is that not only is antimatter real, but that these particles are found in nature, he said. The real question is this: we know from equations and experiments that when matter is produced, in a lab or after the Big Bang, an equal amount of antimatter is produced. So how is it that regular matter became so dominant in our universe and why is there so little antimatter occurring in nature?

According to Sameed, all research into antimatter at CERN and other organisations is focused on this question: What happened? Where did all the antimatter go? One proposed explanation, he shared, is that antimatter has some as yet unknown property that converts it into regular matter in unequal amounts. So by producing and trapping antimatter in a lab, we test it for various properties and whether those can explain what happened to most antimatter in nature. This has been the focus of research for the last 30 to 40 years.

Cooling with lasers

Explaining the recent ALPHA experiment with laser cooling, Sameed began by explaining the choice of antihydrogen. Hydrogen is the simplest atom we know of, with just one proton and one electron. Antihydrogen, similarly, is the simplest antiatom, he said.

You take an antiproton, get an antielectron to orbit it, and you should have an antihydrogen atom. But this is easier said than done, he explained. The main challenge with producing antihydrogen or any other antimatter particle is that if an antimatter particle comes in contact with a regular matter particle, both are annihilated. So in order to capture antimatter particles, you need to create a perfect vacuum to ensure they don't come in contact with matter particles.

Sameed added that the challenge isn't just limited to creating vacuum either. You need to make sure that the container being used is designed in a way to ensure antimatter particles don't come in contact with its walls. This is done using electromagnetic fields.

Explaining how scientists study antimatter, Sameed began by explaining how regular matter particles would be studied. Take a regular hydrogen atom which is in what we call a ground state or normal state. If we shine a laser with a specific energy level onto that simple atom, its electron can jump into an excited state. He said that scientists have known about the effects of lasers on hydrogen atoms for a long time. We know what frequencies can excite it. For our experiment, we thought to test the same on anti-hydrogen atoms. We wondered if it would react differently to regular hydrogen due to differences in energy levels or other properties. Perhaps our findings could help unravel some of the mystery around why there is so little antimatter in the universe?

According to Sameed, the effects of lasers on antihydrogen were first tested in 2017. We shined a laser with the same frequency as the one that excites electrons in a regular hydrogen atom on to an antihydrogen atom. The results suggest the effect on antihydrogen was more or less the same, he said. But one side effect that we uncovered at the time, and this had been predicted before the experiment, was that laser light can cool particles.

Normally we use lasers to heat things, but if you shine a laser beam on an atom that is coming towards it, it has an effect of slowing down the atom, he added. So our current experiment was the first time we tested this laser cooling principle on antimatter.
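The slowing effect can be put in rough numbers. Each absorbed photon transfers a momentum of h/λ against the atom's motion; for hydrogen and antihydrogen the relevant transition is the 121.6 nm Lyman-alpha line. A back-of-envelope sketch, with constants in SI units:

```python
PLANCK = 6.62607015e-34         # Planck constant, J*s
ANTIHYDROGEN_MASS = 1.6735e-27  # kg, same mass as ordinary hydrogen
LYMAN_ALPHA = 121.6e-9          # m, the 1s-2p transition wavelength

def recoil_velocity_change(wavelength_m, mass_kg):
    """Velocity change per absorbed photon: dv = (h / wavelength) / m."""
    return PLANCK / (mass_kg * wavelength_m)

dv = recoil_velocity_change(LYMAN_ALPHA, ANTIHYDROGEN_MASS)
# each absorbed Lyman-alpha photon changes the atom's speed by roughly 3 m/s
```

Repeated absorption of photons coming head-on therefore chips away at the atom's velocity a few metres per second at a time, which is why many absorption cycles are needed to cool the trapped anti-atoms.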

Sameed further revealed that the next antimatter experiment being developed aims to study how it behaves under the influence of gravity from regular matter. We know how gravitational forces work between regular matter. Our equations suggest the same interaction would be true between two objects made of antimatter. But we want to find out what happens in terms of gravity when there is an interaction between matter and antimatter. At the moment, we don't even have any strong theories to predict what will happen.

Easier said than done

When Sameed explains the experiment, one may get the false impression that it is as easy as pointing a laser towards antimatter. But nothing could be further from the truth.

For starters, the laser we use is not the one used in the laser pointers most people know of. Ours is an ultraviolet laser, which is invisible to the naked eye and has a much higher energy level. It is not available commercially and is very difficult to manufacture, so we have to develop it in-house at CERN, he said. The laser in question is also absorbed by air particles, Sameed added. Not only must the beam travel through vacuum, it must be produced and aimed in vacuum as well.

According to Sameed, firing a laser at antihydrogen is a very different challenge than firing it at regular hydrogen. For normal hydrogen, we can shine the laser from any angle. But for antihydrogen, because the entire container is surrounded by special magnets to keep it from touching the walls, there is a very small access point for the laser itself.

Commercial possibilities

Any innovative technology can open up opportunities beyond what those who developed it are sometimes able to appreciate. Speaking on this aspect, Sameed said: We scientists develop such innovative solutions to satisfy pure curiosity and answer the fundamental questions about physics. But all research produces technological byproducts and sooner or later, they trickle down to R&D companies and eventually to wider society.

Asked if he could foresee any antimatter applications used beyond research, Sameed said there was one example already in medical imaging, even if it was difficult to predict wider uses. The PET or Positron Emission Tomography scanner. The positron is an anti-matter particle. It is essentially an anti-electron.

Beyond that, CERN in general is responsible for developing a wide-range of technologies. For instance, to trap anti-matter particles, we need strong magnetic fields. To produce those, we have to develop superconducting magnets which have various commercial uses as well. For instance, such magnets are used in hospitals in MRI scanners. A lot of technologies used in modern medicine were first developed in research centres like CERN, Sameed revealed.

Even on the software side, applications developed to track near-instant physical phenomena have been co-opted by some financial trading companies, which want to leverage the ability to process information in microseconds.

Sameed added that all CERN research is open-source and accessible to anyone in the world. You can contact our knowledge transfer centre and gain access to our research to use in reasonable ways.

The value of research

According to Sameed, the side effects of research for research's sake are undeniable. We can see it over the last few centuries. The countries and regions that invested in fundamental research and technology are the ones that are now global powers, he said. But we don't even need to look at it philosophically. Just look at the Internet.

Sameed pointed out that the World Wide Web was originally developed at CERN in 1989 as a means to share information between physicists instantaneously. The intention at the time was only to aid research. But it was made open source, and the effect of that simple choice in reshaping life can be seen today.

Back to the future

How does one get to work at CERN? For Sameed, the yearning to become a scientist was sparked by one movie most of us watched and loved growing up. I was five or six when I watched Back to the Future. I don't think I understood much about the movie at the time, but I remember finding the character of Doc Brown fascinating, he said. I decided then that I would at least try to be a scientist when I grew up.

In terms of background, Sameed admits he had an ordinary middle-class upbringing. But I am lucky that my parents tried to provide me the best education possible, he said. Still, CERN was beyond his dreams until the time he completed his A-Levels.

For Sameed, the path to a career in science opened with his elder brother's academic pursuits. He went to the US on a scholarship to study chemical and biological engineering. That made me realise that I too could get an in-depth education in physics abroad. Like him, I applied for a scholarship and was able to study physics at Cornell University. That really set the stage for me.

The next chapter began when Sameed came across the example of another Pakistani who went on a summer internship to CERN. He made me aware that there was a programme international students could apply for. I applied, with no hope of getting in, and originally I was rejected. But, some time later, they contacted me again and told me that I was a better fit for another department, and here I am.

According to Sameed, he initially believed his recruiters at CERN had made an error. But once I got here, I realised that I was no different from other students, from Europe or US or elsewhere. We Pakistanis are in no way inferior to other countries in terms of talent. All that is missing is awareness.

When not busy participating in research, Sameed tries to do his part to raise awareness about opportunities for other Pakistanis at CERN. Pakistan is now an associate member at CERN and what that means is that any Pakistani can apply for any job here. This is not limited to just positions for scientists and engineers, and involves things like administration and legal affairs, etc.

Pakistan and CERN

According to Sameed, at the youth level, there is more awareness about CERN in Pakistan now. In fact, Pakistan has one of the highest numbers of applicants to CERN, he said. But most of these are just student positions at the moment, and for higher-level positions, we are still lagging behind. In terms of Pakistanis who are at CERN for the long term, there may be four or five.

Even so, Pakistanis appear to be highly valued at CERN, Sameed revealed. "Pakistani engineers have made a huge contribution to CERN and they are a very well respected community here. They always trust Pakistanis who make it to CERN to do a good job," he said. "There is also immense pride for Pakistanis at CERN due to Dr Abdus Salam's contributions, both to physics as a whole and to CERN during the time he spent here. One of the streets is even named after him," he added.

Asked what advice he had for other young Pakistanis who choose a similar path, the physicist pointed out that there are many opportunities available, not just at CERN but at other high-quality institutes. "Not only are Pakistanis eligible to apply, but they are looking for people from diverse backgrounds with talent," he said. "Even if you don't have the confidence, I would say apply. Because you might think you're not good enough, but the people who are recruiting may believe otherwise."

Sameed reiterated that when he joined CERN, he realised he was as good as anyone else. "My hope for the future is that, in addition to engineers, more Pakistani physicists will join as well."

See the original post:

Where did the antimatter go? - The Express Tribune

Small things misbehaving leads to the greatest question of all – Spectator.co.uk

Helgoland

Carlo Rovelli

Allen Lane, pp. 240, £14.99

Helgoland is a craggy German island in the North Sea. Barely bigger than a few fields, it reaches high above the water on precipitous cliffs and is famous for its sweet air. It has a town and a harbour, and the 1,000-odd inhabitants speak a distinct dialect. In the summer of 1925, the 23-year-old physicist Werner Heisenberg went there to sort out his hay fever.

Helgoland is a slightly misleading title for Carlo Rovelli's inspiring, chaotic, delightfully unsatisfactory book of popular quantum physics. It isn't about Heisenberg's months there or his mathematical insights; Helgoland is Rovelli's shorthand for Heisenberg's pellucid state of mind. On Helgoland, says Rovelli, Heisenberg almost got the philosophical approach to quantum theory right. Ever since, we've been getting it wrong.

The discovery of a quantum world began with experimental results. Certain things were taking place in German physics labs that should not be. Atoms were misbehaving. When scientists in Göttingen and Berlin crouched in front of the latest clever electronic instruments and peered, Alice-like, into the wonderland of the very small, what they saw shocked them bolt upright. Wonderland was ridiculous. There, logic was (and still is) fundamentally different.

Translated up to our size, the following nonsense was apparently perfectly possible: throw a full tankard across the hall in a Bierstube, let somebody notice (as it passes overhead) that this tankard has, say, a picture of a stag on it, and the beer inside turns green. That simple observation – "hey, look, there's a stag on the tankard" – and ping! the contents of the mug change colour. But if nobody notices the decoration, the beer stays brown. In the quantum world, two defining qualities that have nothing to do with each other (tankard decoration and beer colour) can influence one another just because somebody's looked at them. It's a place for hucksters, not respectable people. Even Einstein, who got his Nobel Prize for figuring out the existence of this strange new world, was appalled: "God does not play dice!" he said. "Don't you tell God what to do," retorted the Danish theoretician Niels Bohr, who was less prudish.

Heisenberg worked for Bohr, and on Helgoland started to make sense of this wayward behaviour of small things. The central point was, he discovered, that everything in quantum land works with exactly the same logic as it does up here except in one particular: the order in which you look at things matters. In the quantum world, if the observer had only kept his mind focused on the beer, and paid no attention to the pretty decoration, it would have stayed brown. Some physicists tried to get round the metaphysical implications of this idea by insisting that there were hidden things secretly linking the subatomic equivalents of beer colour and mug decoration. Others have given up all pretence of common sense and believe ideas much more outlandish than God, such as the existence of multiple worlds in which all possible beer mug decorations and beer colours get to exist somewhere, really and truly, all at once.

Rovelli has a different idea. He says reality doesn't exist. The reason physicists have been led astray by bonkers theories in the 100 years since Helgoland is that they can't bear the thought of not being real.

It was at this point, a third of the way through the book, that I mimicked Heisenberg and took my first long, befuddled walk. Reality doesn't exist? What on earth does that mean? Rovelli's favourite example is a red chair. Red doesn't exist, for sure – everyone knows that philosophical chestnut: it's just the way our brains make sense of light of a certain wavelength. But Rovelli also insists that nothing else about the chair exists either – its weight, its shape – except in its relationship to the person looking at it. And you can keep banging away at this type of argument until you get to the level of the atoms forming the chair. Insisting that anything about this red chair needs to exist outside of relationships is metaphysical neediness.

Part of the fun of Rovelli's book is that your immediate reaction to his ideas – repugnance or delight – isn't meaningless. Without mathematics or experiment, by page 81 your thoughts are at the frontier of quantum theory, and it's time for your second brain-cudgelling walk. If things exist only by virtue of their interaction with other things, what happens to them between times? Do they vanish? Do instants of time also not exist? Does it even make sense to talk this way? Oh dear, oh dear.

Rovelli devotes a precious chapter to the work of the second-century Buddhist philosopher Nagarjuna, who also insists there is no ultimate layer of real things. Another chapter – 15 pages, getting on for a tenth of this short book – is as unexpected as green beer: it's about a fierce philosophical argument Lenin had in 1909 with Aleksandr Bogdanov, the co-founder of the Bolshevik party.

"I have digressed," says Rovelli, once this exuberant and not particularly helpful passage is over, then promptly tips off the other side of his bar stool and quotes Douglas Adams:

The fact that we live at the bottom of a deep gravity well, on the surface of a gas-covered planet going around a nuclear fireball 90 million miles away and think this to be normal is obviously some indication of how skewed our perspective tends to be.

In other words, it's our skewed perspective, not the scientific evidence, that makes us want to believe in the reality of red chairs and atoms.

Rovelli is not a kook. He's a world-famous professor of quantum gravity. His relational interpretation of quantum theory is discussed seriously by leading philosophers and physicists. He's ebullient about his ideas, not crazed by them. He doesn't do a particularly good job of describing in layman's terms the fundamental oddity of quantum theory – he's too easily distracted and too poetical; his metaphors are a little too breathless. But that shouldn't put you off. Do what I did after my third Helgoland walk: read the opening pages of Leonard Susskind's superb popular science book Quantum Mechanics: The Theoretical Minimum. Anybody who can use fractions can understand them. Then set back to work with Helgoland. What follows is joyous excitement.

It feels exactly right that Rovelli teaches at the University of Marseille. In the same spirit as he's written this book, I imagine him strolling along the quai, his sleeves rolled up, hailing the devil-may-care crowd by the boats and then, with a quick glance to either side, slipping into that crazy little bar where the tankards are flying and the beer turns green if you look at it funny.

Link:

Small things misbehaving leads to the greatest question of all - Spectator.co.uk

Understanding the Role of Encryption in GDPR Compliance – tripwire.com

Encryption has been a hot topic of discussion during the implementation phase of most data privacy laws. In an age where organizations handle large volumes of data each day, protecting that sensitive data is critical. Data, now a business-critical asset, must be guarded against malicious hackers looking for opportunities to steal it. For these reasons, most data privacy regulations call for organizations to encrypt their data to help prevent cyber-attacks.

Today's article is about one such data privacy law that repeatedly mentions the adoption of encryption: the GDPR, the EU's data privacy regulation. Although encryption is not mandatory under the GDPR, it is seen as a best practice for protecting personal data. So, let us first understand what data encryption is and then examine the role of encryption in GDPR compliance.

Data encryption is the process of converting readable plaintext into unreadable ciphertext that can only be decrypted with the corresponding key. (Unlike hashing, which is one-way, encryption is reversible by anyone holding the key.) It is one of the most effective measures organizations can adopt to strengthen their data security.
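To make the definition concrete, here is a minimal Python sketch of a symmetric encrypt/decrypt round trip. This is a toy XOR-keystream construction for illustration only – it is not cryptographically secure, and production systems should use a vetted library (for example, AES via the `cryptography` package). All names and values here are hypothetical.

```python
import hashlib


def keystream(key: bytes, length: int) -> bytes:
    # Derive a repeatable byte stream from the key by hashing the key
    # together with a running counter (CTR-style).
    # Toy construction for illustration only -- NOT secure.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same function both encrypts
    # plaintext and decrypts ciphertext.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


plaintext = b"customer record: Jane Doe"
key = b"a-secret-key-held-by-the-controller"

ciphertext = xor_crypt(plaintext, key)   # unreadable without the key
recovered = xor_crypt(ciphertext, key)   # round-trips back to plaintext
```

The point survives the toy: whoever holds the key can recover the data; anyone else sees only ciphertext.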

The purpose of encrypting data is to maintain the confidentiality of sensitive data. Unencrypted data stored on computers and servers, or transmitted over insecure networks, is a frequent cause of data breaches. Storing or transmitting unencrypted data jeopardizes its confidentiality and can lead to data sprawl and hacking.

Encryption plays a crucial role in the security of data. Encryption algorithms help ensure the confidentiality, privacy and integrity of data, and support authentication, access control and non-repudiation of sent data.

So, what does encryption have to do with GDPR? For a better understanding, let us take a closer look at the GDPR and its requirements.

The General Data Protection Regulation (GDPR) is a data privacy law that requires organizations to implement measures to protect the privacy, integrity and confidentiality of personal data. Although the regulation does not explicitly mandate encryption, it requires organizations to enforce appropriate security measures and safeguards. The Regulation recognizes the risk inherent in processing personal data, and so in Article 32(1) it places responsibility on the controller and the processor to implement appropriate technical measures to secure personal data.

While the regulation does not prescribe specific technical and organizational measures, it repeatedly mentions encryption and pseudonymization as appropriate technical and organizational measures for data security. The regulation leaves it to the controller or processor to decide where encryption should be implemented.
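Since the regulation names pseudonymization alongside encryption, here is a hedged sketch of one common approach: replacing direct identifiers with keyed-hash tokens. The key name and token length are illustrative choices, not anything the GDPR prescribes.

```python
import hashlib
import hmac

# Hypothetical secret, stored separately from the pseudonymized dataset
# (e.g. in a key vault) -- this separation is the core idea behind
# pseudonymization.
PSEUDONYM_KEY = b"example-secret-not-for-production"


def pseudonymize(identifier: str) -> str:
    # A keyed hash (HMAC) maps the same identifier to the same token,
    # so records remain linkable across tables, while the token alone
    # reveals nothing about the original value.
    digest = hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]


token = pseudonymize("jane.doe@example.com")
```

Unlike encryption, this mapping is not meant to be reversed in place; re-identification works by re-computing tokens for known identifiers, which only the key holder can do.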

Encryption of personal data also offers practical benefits for controllers and processors. If encrypted data is misplaced, or a storage medium holding encrypted personal data is lost, the incident may not attract penalties as a data breach, provided it is reported to the data protection authorities. And where an incident does occur, under Article 83(2)(c) of the GDPR the authorities may take the use of encryption into consideration when deciding whether to impose fines.

Encryption can be a highly effective technique for achieving GDPR compliance. Although the GDPR's encryption recommendations are not mandatory, encryption remains a powerful data security technique, as it encodes information into a non-readable format that only an authorized party can access and read. A GDPR data encryption strategy can therefore benefit your organization, especially when it comes to preventing data breaches.

Regardless of whether the GDPR or another regulation applies to your organization, encryption forms an integral part of any organization's data security strategy. Implementing data encryption reduces your exposure to data breaches and costly fines, which may far exceed the cost of implementing encryption.

About the Author: Narendra Sahoo (PCI QSA, PCI QPA, CISSP, CISA, and CRISC) is the founder and director of VISTA InfoSec, a global information security consulting firm based in the United States, Singapore and India. Mr. Sahoo has more than 25 years of experience in the IT industry, with expertise in information risk consulting, assessment, and compliance services. VISTA InfoSec specializes in information security audit, consulting and certification services, including GDPR, HIPAA, CCPA, NESA, MAS-TRM, PCI DSS compliance & audit, PCI PIN, SOC 2, PDPA and PDPB, to name a few. Since 2004, the company has worked with organizations across the globe to address the regulatory and information security challenges in their industries. VISTA InfoSec has been instrumental in helping top multinational companies achieve compliance and secure their IT infrastructure.

Editor's Note: The opinions expressed in this guest author article are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.

See original here:
Understanding the Role of Encryption in GDPR Compliance - tripwire.com
