
Quantum computing and its impact on cybersecurity [Q&A] – BetaNews

Quantum computing with its vastly improved processing capability offers the chance of many positive developments in research and science. But it also represents a potential threat to our current encryption models.

How big is quantum's threat to cybersecurity? And should we be taking action on this now? We talked to Skip Sanzeri, QuSecure co-founder and COO, to find out.

BN: What are some of the main trends around quantum computing development?

SS: The quantum computing industry is evolving rapidly. Just a few years ago we struggled to find companies with more than a few dozen qubits, and now we are in the 100-qubit era. Companies such as IBM, IonQ, Google, and PsiQuantum are talking about having a thousand or more qubits by mid-decade. If coherence continues to advance and noise can be reduced, these systems will be even more powerful. Thanks to the exponential nature of qubits in superposition, quantum computing promises amazing things for society -- but job one is cybersecurity.

With the advent of quantum computing upon us, many positive enhancements to our society may be forthcoming, including algorithms to cut global emissions and quantum chemistry for personalized medicine. At the same time, tens of billions of dollars are being spent by foreign nations to develop quantum computers (some of which have been openly declared as 'weaponized'). A quantum computer with approximately 4,000 qubits will be able to break RSA 2048, the primary algorithm that we rely on to keep the world's data safe on the internet. So, we should prepare for the possibility that the first use of quantum computing may be for harm rather than good.

BN: Why is action needed now when we know quantum computers are years away?

SS: 'Store now, decrypt later' attacks are the biggest reason to start upgrading networks and communications to post-quantum cybersecurity (PQC). Foreign nation states are stealing data every second of the day. This data is harvested and stored on computers, waiting to be decrypted. Quantum computers will be able to crack encryption (proven mathematically by Shor's algorithm) once we reach the scale of around 4,000 qubits. We refer to this as 'Q-Day.' So all data that is encrypted with current, non-PQC algorithms is at risk today of a quantum computer decrypting it in the future. For example, if a quantum computer with enough power to crack encryption is developed in five years, data stolen today would still be very valuable if it has 10, 20, or more years of shelf life. National security secrets, bank account information, and electronic health records may have data security requirements of up to 75 years. Making matters worse, many experts estimate that changing our current encryption across an enterprise or government agency could take as long as 10 years. Adding this to the shelf life of data means there are 10 more years of exposed data which attackers can weaponize or use against us. In many cases, we are already behind.
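Shor's algorithm breaks RSA by reducing factoring to period finding. The classical sketch below walks through that reduction for a toy modulus; the period search it brute-forces is exactly the step a large quantum computer would perform exponentially faster, which is why RSA-sized numbers are safe classically but not post-Q-Day.

```python
from math import gcd

def factor_via_period(N, a=2):
    """Factor N by finding the period r of a^x mod N -- the step Shor's
    algorithm accelerates on a quantum computer. This classical search
    is tractable only for toy N, never for RSA 2048."""
    # Find the smallest r > 0 with a^r = 1 (mod N)
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2:
        return None  # odd period: retry with a different base a
    candidate = pow(a, r // 2, N)
    # gcd(candidate +/- 1, N) yields nontrivial factors of N
    p, q = gcd(candidate - 1, N), gcd(candidate + 1, N)
    return sorted({p, q} - {1, N}) or None

print(factor_via_period(15))  # [3, 5]
```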

Therefore, enterprises (and government, which already has mandates in place) should start looking very closely at PQC to encrypt current communications and data. If data is stolen but is protected with post-quantum encryption, it will be safe for decades after Q-Day.

BN: What are the main challenges around addressing the post-quantum cyber threat?

SS: There are a variety of challenges to overcome when thinking about how organizations can become quantum resilient.

First, any change is difficult. Moving from older, legacy systems to newer technologies takes a great deal of planning, time and resources in order to not disrupt operations. Therefore, any upgrades, especially to cybersecurity, need to be backwards compatible so that the upgrade process can move more efficiently.

Second, cutting-edge technology always comes with risk. Betting on new technologies requires significant risk assessment to ensure that upgrades are carefully planned, and best-of-breed vendors are chosen. Using standards-based technologies such as the algorithms that NIST is recommending will help reduce risk. Also looking for companies that have referenceable clients, federally approved credentials, post-quantum cybersecurity, and successful implementations will reduce risk.

BN: What are some of the things organizations should look for in a PQC solution to best protect their data?

SS: Enterprises and government agencies need to look for solutions that are standards-based, backwards compatible, and cryptographically agile. Using NIST algorithms helps satisfy standards risk. Selecting vendors that can transition existing systems and protocols to newer post-quantum protocols is vital so that companies don't have to rip and replace software, which causes disruption and risk. Cryptographic agility means that implementations can use a variety of cryptographic standards, such as any of the NIST finalists, which in turn means an organization can choose its cryptography rather than being locked into one type due to a given vendor's choice. By finding a partner like QuSecure that has an adaptive, orchestrated solution with continuous availability and standardizes on all the NIST finalists, an organization can know it has optimized its choice.
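To make "cryptographic agility" concrete, here is a hypothetical sketch (not QuSecure's or any vendor's actual implementation): key generation is looked up by algorithm name from a registry, so switching to a NIST PQC algorithm becomes a policy change rather than a code rewrite. The algorithm names and the registry itself are illustrative assumptions.

```python
import secrets

# Registry mapping algorithm names to key-generation routines.
ALGORITHMS = {}

def register(name):
    def wrap(fn):
        ALGORITHMS[name] = fn
        return fn
    return wrap

@register("classical-demo")
def classical_keygen():
    return {"alg": "classical-demo", "key": secrets.token_hex(32)}

@register("pqc-demo")
def pqc_keygen():
    # Stand-in for a real PQC library call (e.g. a Kyber binding).
    return {"alg": "pqc-demo", "key": secrets.token_hex(32)}

def generate_key(policy_algorithm):
    # The algorithm comes from policy/configuration, not hard-coded calls,
    # so it can be swapped without a rip-and-replace of the calling code.
    return ALGORITHMS[policy_algorithm]()

key = generate_key("pqc-demo")
```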

BN: There seem to be multiple options in terms of PQC solutions, which are the most optimal and why?

SS: A variety of vendors are coming on the scene to help meet this massive upgrade need. Some solutions focus on Quantum Key Distribution (QKD). QKD is the idea that you can use two devices to transmit keys via entanglement, making the transmission theoretically un-spyable, but it is currently severely range-limited. It is only useful for highly specific applications and requires significant scientific breakthroughs to make it applicable to larger networks. Some vendors offer quantum random number generation (QRNG), which generates genuinely random numbers for use in cryptographic keys. This addresses the threat of pseudo-random keys (programmatically generated random numbers, which are the standard today) being reverse engineered, but QRNG alone does not address the threat posed to public key cryptography by Shor's algorithm attacks.
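The pseudo-random-key risk is easy to demonstrate with the Python standard library (this is a generic illustration, not any vendor's product): a seeded software PRNG is fully reproducible, so an attacker who recovers the seed or internal state can regenerate every key it ever produced. A CSPRNG, or a hardware/quantum entropy source, does not have that weakness.

```python
import random
import secrets

# A seeded PRNG (Mersenne Twister) is deterministic: same seed, same "keys".
rng = random.Random(1234)
weak_key_1 = rng.getrandbits(128)

attacker_rng = random.Random(1234)   # attacker who learns the seed
weak_key_2 = attacker_rng.getrandbits(128)
assert weak_key_1 == weak_key_2      # identical: the key was never secret

# A CSPRNG draws from OS entropy and cannot be replayed from a seed.
strong_key = secrets.token_bytes(16)
```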

Other vendors have teams of mathematicians that offer post-quantum cryptographic algorithms, and these fall into two camps. The first is a class of proprietary cryptographic algorithms, and it is generally not recommended to implement non-standard algorithms in an enterprise or government environment. The second class is a handful of companies that have implemented NIST finalist algorithms and generally offer application-specific implementations. Still other vendors offer consulting services for PQC implementation.

Optimally, an organization should find the right mix of post-quantum cybersecurity software, hardware and services, and ideally utilize a vendor that provides quantum orchestration across the enterprise to all nodes, communications and data. Features such as PQC policy management, key orchestration and backwards compatibility are elements that every organization should review so that implementation is seamless and much easier.



Pasqal, BASF aim to improve weather modeling with quantum computing – VentureBeat

Paris-based Pasqal, a full-stack neutral atom quantum computing provider, and BASF, the multinational chemical juggernaut, are announcing a partnership focused on weather modeling and other computational fluid-dynamics applications. That area of problem-solving relies on solving complex nonlinear differential equations, a task for which, it turns out, Pasqal's hardware and algorithms are especially well-suited.

VentureBeat spoke with Benno Broer, Pasqal's chief commercial officer. Broer was CEO of Qu & Co., which focused on quantum algorithms and software and merged with Pasqal in January of this year.

Broer said that Pasqal's hardware platform implements qubits (quantum bits) using individually trapped atoms that are manipulated with laser beams, and that the company produces 100-qubit systems today. The neutral atom platform supports an analog mode, which enables addressing all of those qubits concurrently, thereby enabling quantum entanglement, where multiple qubits act as a single system and influence each other.

Building on this, Pasqal's algorithm technology can implement quantum neural networks, the quantum computing equivalent of physics-informed neural networks (PINNs), a subset of physics-informed machine learning (PIML). In the PIML world, models can be trained using a combination of data and equations that describe the laws of physics underlying the modeled phenomena.

PIML techniques can be used to solve differential equations, which is the key to attacking computational fluid-dynamics applications, including weather modeling. According to Pasqal's press release, BASF can then use parameters generated by the weather models to simulate crop yields and growth stages as well as to predict drift when applying crop protection products. The weather modeling further serves BASF's digital farming product portfolio, including an advanced crop optimization platform. This takes quantum computing down from the ivory tower and applies it, quite literally, in the field.
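To make the physics-informed idea concrete, here is a minimal classical sketch in plain NumPy (not Pasqal's quantum implementation): the training loss combines the mismatch against data with the residual of the governing equation, here the toy ODE dy/dx = -y, whose exact solution is y = exp(-x). A candidate that satisfies both the data and the physics scores lower than one that satisfies neither.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 50)

def total_loss(y_pred, x, y_data):
    """Physics-informed loss: data mismatch + residual of dy/dx = -y."""
    data_loss = np.mean((y_pred - y_data) ** 2)
    dy_dx = np.gradient(y_pred, x)                 # finite-difference derivative
    physics_loss = np.mean((dy_dx + y_pred) ** 2)  # residual of dy/dx + y = 0
    return data_loss + physics_loss

exact = np.exp(-x)            # satisfies both data and physics
bad = np.ones_like(x)         # satisfies neither
assert total_loss(exact, x, exact) < total_loss(bad, x, exact)
```

In a real PINN this loss would drive gradient descent over a neural network's weights; the quantum variant replaces the network with a parameterized quantum circuit, but the loss structure is the same.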

Perhaps even more intriguing, Broer told VentureBeat that the equations used to model short-term weather patterns and those for long-term climate modeling are, in fact, similar. Scaling up the time dimension could allow the technology now being applied to weather modeling to be applied to climate modeling later, and perhaps be used to mitigate the effects of climate change.

Given the heatwaves impacting so many regions across the world this week, the potential of quantum computing to help mitigate climate change impact is good news. And if we're going to "tech our way out of this" (the phrase attributed to Kleiner Perkins chairman John Doerr), then an approach that combines quantum computing and physics-informed machine learning seems like a good start.



Old computer technology points the way to future of quantum computing – Hope Standard

Researchers have made a breakthrough in quantum technology development that has the potential to leave today's supercomputers in the dust, opening the door to advances in fields including medicine, chemistry, cybersecurity and others that have been out of reach.

In a study published in the journal Nature on Wednesday, researchers from Simon Fraser University in British Columbia said they found a way to create quantum computing processors in silicon chips.

Principal investigator Stephanie Simmons said the team illuminated tiny imperfections on the silicon chips with intense beams of light. The defects act as carriers of information, she said: while the rest of the chip transmits the light, a tiny defect reflects it back and becomes a messenger.

There are many naturally occurring imperfections in silicon. Some of these imperfections can act as quantum bits, or qubits. Scientists call those kinds of imperfections spin qubits. Past research has shown that silicon can produce some of the most stable and long-lived qubits in the industry.

These results unlock immediate opportunities to construct silicon-integrated, telecommunications-band quantum information networks, said the study.

Simmons, who is the university's Canada Research Chair in silicon quantum technologies, said the main challenge with quantum computing was being able to send information to and from qubits.

"People have worked with spin qubits, or defects, in silicon before," Simmons said. "And people have worked with photon qubits in silicon before. But nobody's brought them together like this."

Lead author Daniel Higginbottom called the breakthrough immediately promising because researchers achieved what was considered impossible by combining two known but parallel fields.

Silicon defects were extensively studied from the 1970s through the '90s, while quantum physics has been researched for decades, said Higginbottom, who is a post-doctoral fellow at the university's physics department.

"For the longest time people didn't see any potential for optical technology in silicon defects. But we've really pioneered revisiting these and have found something with applications in quantum technology that's certainly remarkable."

Although it is at an embryonic stage, Simmons said quantum computing is the "rock 'n' roll" future of computers, able to solve anything from simple algebra problems to complex pharmaceutical equations or formulas that unlock deep mysteries of space.

"We're going to be limited by our imaginations at this stage. What's really going to take off is really far outside our predictive capabilities as humans."

The advantage of using silicon chips is that they are widely available, understood and have a giant manufacturing base, she said.

"We can really get it working and we should be able to move more quickly and hopefully bring that capability mainstream much faster."

Some physicists predict quantum computers will become mainstream in about two decades, although Simmons said she thinks it will be much sooner.

In the 1950s, people thought the technology behind transistors was mainly going to be used for hearing aids, she said. No one then predicted that the physics behind a transistor could be applied to Facebook or Google, she added.

"So, we'll have to see how quantum technology plays out over decades in terms of what applications really do resonate with the public," she said. "But there is going to be a lot because people are creative, and these are fundamentally very powerful tools that we're unlocking."

Hina Alam, The Canadian Press



Quantum Computing Market Size, Scope, Growth Opportunities, Trends by Manufacturers And Forecast to 2029 – This Is Ardee

New Jersey, United States: The Quantum Computing Market research guides new entrants to obtain precise market data and communicates with customers to understand their requirements and preferences. It spots outright business opportunities and helps to bring new products into the market. It identifies opportunities in the marketplace. It aims to streamline business procedures and move the business forward. It helps business players make sound decisions. The Quantum Computing market report helps to reduce business risks and provides ways to deal with upcoming challenges. The market information provided here helps new entrants make informed decisions. It covers major regions of the globe such as Europe, North America, Asia Pacific, Middle East, Africa, and Latin America, along with their market sizes.

This unique Quantum Computing Market research report offers extensive strategic plans that help players deal with the current market situation and establish their position. It helps in strengthening your business position. It offers a better understanding of the market and the perspective needed to stay ahead in this competitive market. Organizations can gauge and compare their performance with others in the market on the basis of this prompt market report. This market report offers a clarified picture of varying market tactics and thereby helps business organizations gain bigger profits. You get a clear idea about product launches, trade regulations and expansion of the marketplace through this market report.

Get Full PDF Sample Copy of Report (Including Full TOC, List of Tables & Figures, Chart) @ https://www.verifiedmarketresearch.com/download-sample/?rid=24845

Key Players Mentioned in the Quantum Computing Market Research Report:

QxBranch, QC Ware Corp., International Business Machines Corporation (IBM), D-Wave Systems Inc., 1QB Information Technologies Inc., Cambridge Quantum Computing Ltd, Station Q (Microsoft Corporation), River Lane Research, Rigetti Computing, Research at Google (Google Inc.)

The Quantum Computing Market report consists of important data about the entire market environment for products or services offered by different industry players. It enables industries to know the market scenario of a particular product or service, including demand, supply, market structure, pricing structure, and trend analysis. It is of great assistance in product market development. It further depicts essential data regarding customers, products, competition, and market growth factors. Quantum Computing market research greatly benefits proper decision-making. Future trends are also revealed for particular products or services to help business players make the right investments and launch products into the market.

Quantum Computing Market Segmentation:

Quantum Computing Market, By Offering

Consulting Solutions, Systems

Quantum Computing Market, By Application

Optimization, Machine Learning, Material Simulation

Quantum Computing Market, By End User

Space and Defense, Automotive, Healthcare, Banking and Finance, Chemicals, Energy & Power

Inquire for a Discount on this Premium Report @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=24845

To prepare the TOC, our analysts researched the following:

Report Overview: It includes major players of the Quantum Computing market covered in the research study, research scope, market segments by type, market segments by application, years considered for the research study, and objectives of the report.

Global Growth Trends: This section focuses on industry trends where market drivers and top market trends are shed light upon. It also provides growth rates of key producers operating in the Quantum Computing market. Furthermore, it offers production and capacity analysis where marketing pricing trends, capacity, production, and production value of the Quantum Computing market are discussed.

Market Share by Manufacturers: Here, the report provides details about revenue by manufacturers, production and capacity by manufacturers, price by manufacturers, expansion plans, mergers and acquisitions, and products, market entry dates, distribution, and market areas of key manufacturers.

Market Size by Type: This section concentrates on product type segments where production value market share, price, and production market share by product type are discussed.

Market Size by Application: Besides an overview of the Quantum Computing market by application, it gives a study on the consumption in the Quantum Computing market by application.

Production by Region: Here, the production value growth rate, production growth rate, import and export, and key players of each regional market are provided.

Consumption by Region: This section provides information on the consumption in each regional market studied in the report. The consumption is discussed on the basis of country, application, and product type.

Company Profiles: Almost all leading players of the Quantum Computing market are profiled in this section. The analysts have provided information about their recent developments in the Quantum Computing market, products, revenue, production, business, and company.

Market Forecast by Production: The production and production value forecasts included in this section are for the Quantum Computing market as well as for key regional markets.

Market Forecast by Consumption: The consumption and consumption value forecasts included in this section are for the Quantum Computing market as well as for key regional markets.

Value Chain and Sales Analysis: It deeply analyzes customers, distributors, sales channels, and value chain of the Quantum Computing market.

Key Findings: This section gives a quick look at the important findings of the research study.

For More Information or Query or Customization Before Buying, Visit @ https://www.verifiedmarketresearch.com/product/quantum-computing-market/

About Us: Verified Market Research

Verified Market Research is a leading Global Research and Consulting firm that has been providing advanced analytical research solutions, custom consulting and in-depth data analysis for 10+ years to individuals and companies alike that are looking for accurate, reliable and up-to-date research data and technical consulting. We offer insights into strategic and growth analyses, data necessary to achieve corporate goals, and help make critical revenue decisions.

Our research studies help our clients make superior data-driven decisions, understand market forecasts, capitalize on future opportunities and optimize efficiency by working as their partner to deliver accurate and valuable information. The industries we cover span a large spectrum including Technology, Chemicals, Manufacturing, Energy, Food and Beverages, Automotive, Robotics, Packaging, Construction, Mining & Gas, etc.

We, at Verified Market Research, assist in understanding holistic market indicating factors and most current and future market trends. Our analysts, with their high expertise in data gathering and governance, utilize industry techniques to collate and examine data at all stages. They are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research.

Having serviced over 5000+ clients, we have provided reliable market research services to more than 100 Global Fortune 500 companies such as Amazon, Dell, IBM, Shell, Exxon Mobil, General Electric, Siemens, Microsoft, Sony and Hitachi. We have co-consulted with some of the world's leading consulting firms like McKinsey & Company, Boston Consulting Group, and Bain and Company for custom research and consulting projects for businesses worldwide.

Contact us:

Mr. Edwyne Fernandes

Verified Market Research

US: +1 (650)-781-4080
UK: +44 (753)-715-0008
APAC: +61 (488)-85-9400
US Toll-Free: +1 (800)-782-1768

Email: sales@verifiedmarketresearch.com

Website: https://www.verifiedmarketresearch.com/


Rigetti Computing to Report Second Quarter 2022 Financial Results and Host Conference Call on August 11, 2022 – GlobeNewswire

BERKELEY, Calif., July 19, 2022 (GLOBE NEWSWIRE) -- Rigetti Computing, Inc. ("Rigetti" or the "Company") (Nasdaq: RGTI), a pioneer in hybrid quantum-classical computing, announced today that it will release second quarter 2022 results on Thursday, August 11, 2022, after market close. The Company will host a conference call to discuss its financial results and provide an update on its business operations at 5:00 p.m. ET the same day.

Key details regarding the call are as follows:

Call Date: Thursday, August 11, 2022
Call Time: 5:00 p.m. ET / 2:00 p.m. PT
Webcast Link: https://edge.media-server.com/mmc/p/ndjhqqmy
Live Call Participant Link: https://register.vevent.com/register/BI8f37a843026f4cbab4519e6d16db70d8

Webcast Instructions
You can listen to a live audio webcast of the conference call by visiting the Webcast Link above or the "Events & Presentations" section of the Company's Investor Relations website at https://investors.rigetti.com/. A replay of the conference call will be available at the same locations following the conclusion of the call for one year.

Live Call Participant Instructions
To participate in the live call, you must register using the Live Call Participant Link above. Once registered, you will receive dial-in numbers and a unique PIN. When you dial in, you will input your PIN and be routed into the call. If you register and forget your PIN, or lose the registration confirmation email, simply re-register to receive a new PIN.

About Rigetti
Rigetti is a pioneer in full-stack quantum computing. The Company has operated quantum computers over the cloud since 2017 and serves global enterprise, government, and research clients through its Rigetti Quantum Cloud Services platform. The Company's proprietary quantum-classical infrastructure provides ultra-low latency integration with public and private clouds for high-performance practical quantum computing. Rigetti has developed the industry's first multi-chip quantum processor for scalable quantum computing systems. The Company designs and manufactures its chips in-house at Fab-1, the industry's first dedicated and integrated quantum device manufacturing facility. Rigetti was founded in 2013 by Chad Rigetti and today employs more than 160 people with offices in the United States, U.K. and Australia. Learn more at http://www.rigetti.com.

Contact
Rebecca Malamud
Rigetti Computing
press@rigetti.com

Polly Pearson
Investor Relations
RGTI@investorrelations.com


How to get the most out of Google Drive – Engadget

Google Drive is an incredibly powerful tool for storing and organizing all sorts of data. And best of all, it's available to anyone with a Google account for free (at least to start). Additionally, because Drive holds all your files in the cloud, it offers some important advantages compared to stashing everything locally on your phone or PC. Drive also works on practically any device with an internet connection, which makes it easy to use at home, at school, in the office and everywhere in between.

However, if you're new to Drive, there are some important basics you should know before you transfer over all your data and files. So here's a quick rundown covering the most critical things about Google's popular cloud storage service.

Every Google Drive user gets 15GB of free storage. However, any data you have saved in Google Photos also counts towards that limit. So if you're using that to back up your pictures, you may not have a ton of room left over for documents and files. That said, you can increase your storage in Drive via a Google One subscription, which starts as low as $1.99 a month (or $20 a year) for 100GB of storage and goes up to $9.99 a month (or $100 a year) for 2TB of storage.


For most people, 100GB is more than enough to stash important files, work docs, and family photos. But if you're planning on using Drive as a way to back up all your data, you'll probably want to go with one of the bigger plans. The nice thing is that even though the basic $20-a-year plan is relatively cheap, there are a number of ways to get additional storage for free, at least temporarily. For example, anyone who buys a new Chromebook will get 100GB of space in Drive free for a year, while customers new to Google One may get offers to test the service out with a free one-month subscription.

So before you start uploading all your files, you're going to want to figure out how much storage you need and how much that may (or may not) cost you.

Once you've figured out how much storage you need, you can begin uploading or transferring your files to Drive. For single files or data stored locally on your device, you can simply tap the New button and select the option for File or Folder upload. On a computer, you can also drag and drop files into your browser window when you are on the Drive website. Drive supports a wide variety of file types, including most popular formats like .JPGs, .PNGs, .GIFs, .MP3s, and more. For a full list of supported file types, check out Google's official Help Center.
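Uploads can also be scripted against the Drive v3 API. The sketch below builds the request body that the `files.create` endpoint expects when placing a file inside a folder; the credentialed call itself (via the google-api-python-client library) is left as a comment, since it requires an OAuth or service-account setup that is outside this article's scope.

```python
def drive_file_metadata(name, folder_id=None):
    """Request body for the Drive v3 files.create endpoint; the optional
    'parents' field places the upload inside an existing folder."""
    body = {"name": name}
    if folder_id:
        body["parents"] = [folder_id]
    return body

# With an authenticated service object from google-api-python-client:
# service.files().create(
#     body=drive_file_metadata("report.pdf", "FOLDER_ID"),
#     media_body=MediaFileUpload("report.pdf", resumable=True),
#     fields="id",
# ).execute()
```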

After you have all your files uploaded, you can manage them just like you would locally on your phone or computer. You can create nested folders and drag and drop files from one place to another. And of course, you can look for a specific file or folder by typing in the search box, though it's important to remember that if you're storing a lot of files in Drive, it may take a bit longer to find them (especially if your internet connection isn't very speedy). So if you're able to create a general directory of folders for important projects or data sets on day one, you'll probably save yourself a lot of time and headaches later.


It's also important to note that while you can create new Google Docs, Sheets, Slides, etc. directly within Drive on PC, on mobile you need to install both Drive and the specific productivity app you want individually. That's because while they all work together, they are considered separate apps.

Another good way to use Google Drive to organize your work is to save templates for various projects in Docs. This allows you to start writing a script or create forms without starting from scratch every time. You can also save templates for things like bibliographies, potentially saving students time when trying to cite sources for a research paper.

Alternatively, instead of using dedicated apps, you can share a Google Sheet with roommates to help figure out the cost of utilities and other shared expenses. And while it wasn't strictly designed for this, students have discovered that when places like schools ban or put restrictions on typical messaging apps, you can still chat with friends using Google Docs. All you have to do is invite people to a blank doc and then use real-time collaboration to talk and respond to each other. And once you're done, you can simply delete the doc, or keep it around for another day.

In addition to making cloud storage simple and easy to use, one of Google Drive's most powerful features is its range of collaboration tools. Sharing a file or document with someone else is as simple as hitting the share button and entering their email. Alternatively, Drive can generate a link to send via text, social media or your messaging app of choice. Once someone has access, you'll be able to view or edit the file with them in real time.

That said, it's important to know who you're sharing your files with and how they are using them. For example, it might be really helpful to give editing permission to a teacher or mentor if you're looking for help with an essay, but less so if you're just sharing an ebook with a friend. In addition to the owner of the file, Drive offers three different levels of access: viewer, commenter and editor. And if something goes wrong and you ever want to see an older copy of a Google Doc, Sheet or Slide, you can open the File menu and select the option that says Version history.


Viewers are only able to see and read the document, but don't have the ability to change any of the content. Commenters can view and surface thoughts and questions about the content by using Google's Comment tool, while editors can make changes just like the owner of a doc.

If you want to see files that others have sent you recently, you can click on Google Drive's "Shared with me" tab. And if you have a Google Workspace account through school or work, you can also open the handy Activity Dashboard by clicking on the squiggly icon. (It's in the top right next to the blue Share button on a desktop.) Finally, if you want a fast way to see which files you've shared with others, you can type "to:" into Drive's search box.
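For anyone automating sharing, the same three access levels map onto the Drive v3 API's `permissions.create` endpoint, whose role names differ from the UI labels. The helper below (a small illustrative sketch, not part of any Google client library) builds the request body for granting a user one of those roles:

```python
# UI label -> Drive API role name for permissions.create
ROLE_MAP = {"viewer": "reader", "commenter": "commenter", "editor": "writer"}

def share_request(email, ui_role):
    """Request body for Drive's permissions.create endpoint, granting
    one user the access level shown in the sharing dialog."""
    return {
        "type": "user",
        "role": ROLE_MAP[ui_role],
        "emailAddress": email,
    }
```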

While Google Drive is intended primarily as a way to manage docs and files stored in the cloud, it does support offline access, which can be handy when you don't have a good internet connection. However, there are some steps you need to take before you can get the full benefit of using Drive offline.

Sam Rutherford/Engadget

First, you need to make some changes to your Drive settings while connected to the internet before going offline. On a computer, you need to click the gear icon in the top right corner of your Google Drive browser tab, hit Settings and then check the box next to the Offline menu option. On mobile, you'll need to open the Drive app, find a specific file and then designate it for offline access by enabling the option from the More icon (it's the one that looks like three vertical dots). Once you do that, you'll be able to access, edit and save any changes you make. And the next time your device connects to the internet, it will automatically sync any changes you made to the offline doc to the one saved in the cloud. Meanwhile, on a Chromebook, all you have to do is open up your Google Drive settings, scroll down, check the box next to the Offline option and hit Done.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.

Continue reading here:
How to get the most out of Google Drive - Engadget


It’s So Hot the Clouds Are Melting – Gizmodo

Note: this is just a graphic rendering of a Google server room. We expect the real thing would have Google's logo posted on every floor and ceiling tile as well. Illustration: Novikov Aleksey (Shutterstock)

If you think your office is stifling even with your lone air conditioner buzzing away hour after hour, think how much worse it is for the always-on server banks during this hot, hot summer.

The Register first reported on the cloud server heat-related failures, as a record-breaking heat wave has caused massive havoc not just with civil infrastructure but, apparently, with computer infrastructure as well.

The heat brought havoc to England's cloud computing apparatus. Oracle cloud servers were reportedly taken offline Tuesday afternoon and came back Wednesday at around 11 a.m. UTC. Apparently, the company's London center failed to keep up with the 104 degrees Fahrenheit temperature, automatically instituting a protective shutdown. The report notes that some Oracle Cloud users were delayed in accessing their data hosted on the servers.

Though servers were apparently restored, Oracle noted that a subset of Oracle Integration Cloud resources continued to experience impact even going into Wednesday.

Tuesday morning, Google also noted that several of its UK cloud servers were experiencing hiccups, including extra errors, high latencies and outages in its europe-west2 region, also located in London. Those outages apparently affected a multitude of Google services beyond regular Google Cloud Storage, including Cloud Machine Learning, Cloud Firestore and Cloud Datastore.


All this was apparently tied to a cooling-related failure in one of the buildings that hosts the europe-west2 servers. While the company noted that cooling systems were restored and that some services should be back online, there are still lingering issues. Google did say that, going into Wednesday, people using multiple services could still experience blackouts and service interruptions.

But both Oracle and Google said there should not be any further service interruptions for UK users while they try to get all the various programs back up and running.

It's been noted how unprepared the whole of England has been for a heat wave. Many residential buildings lack any air conditioning. Of course, a building meant to house computer servers will have some kind of HVAC system in place, but it seems they weren't built to withstand those constant 100+ degree temps.

It's another reminder of just how much of our common, connected infrastructure is not built to handle increasing temperatures brought on in part by climate change. We've already seen how intense wildfires have spread all throughout Europe and north Africa, showing up earlier than normal. This past week, intense rains washed out New York City subways. In Texas, multiple massive crypto mining facilities are routinely taking their systems offline so as not to overtax the state's routinely overburdened grid.

Excerpt from:
It's So Hot the Clouds Are Melting - Gizmodo


IBM storage revenues up thanks to mainframe refresh – Blocks and Files

IBM has reported financial results for the second 2022 quarter, ended June 30, with storage hardware sales growing at the start of a new mainframe cycle.

Total group revenues were $15.54 billion, up 16 percent year-on-year at constant currency. Some 5 percent of the increase was attributed to sales to Kyndryl, which was spun off last year into a separately traded entity. Net profit was $1.39 billion, up a tad on the year-ago $1.33 billion.

IBM chairman and CEO Arvind Krishna said: "In the quarter we delivered good revenue performance with balanced growth across our geographies, driven by client demand for our hybrid cloud and AI offerings. The IBM team executed our strategy well. With our first-half results, we continue to expect full-year revenue growth at the high end of our mid-single digit model."

The company divides its business into four segments: Consulting, Software, Red Hat Hybrid Cloud Platform, and Infrastructure. Storage hardware is included in the Infrastructure segment as part of the Distributed Infrastructure sub-segment, but not revealed. Storage software is included in the Software segment and, again, not revealed. The segment results were:

IBM said there was a solid z16 mainframe launch in April and the company released a new generation of Power servers earlier this month. Mainframe sales attract DS8000 storage array sales in their wake.

CFO Jim Kavanaugh said in the earnings call: "Infrastructure performance, which reflects a good start to our z16 product cycle, was up 25 percent. Software and infrastructure each include about seven points of growth from the commercial relationship with Kyndryl."

He added: "Distributed infrastructure revenue grew 17 percent this quarter. This growth was led by storage, driven by both high-end storage tied to the z16 cycle and distributed storage. We also had good performance in high-end Power10." We interpret that as meaning FlashSystems sales were up.

Subscription sales are growing, Kavanaugh said: "Across the four Hybrid Platform and Solution business areas, our annual recurring revenue, or ARR, is nearly $12.9 billion, up 8 percent."

The company had expected full-year free cashflow to be between $10 billion and $10.5 billion but reduced that to $10 billion as a result of withdrawing its business from Russia. It made no comment about supply chain or pandemic issues in its earnings release and presentation, but they did feature in the earnings call. Kavanaugh mentioned the impact of increased component cost and supplier premiums on z16 profitability.

Kavanaugh said third-quarter revenue expectations are for year-on-year growth of 9 to 10 percent at constant currency. Overall, he said: "We are a faster-growing, focused, disciplined company with sound business fundamentals."

Follow this link:
IBM storage revenues up thanks to mainframe refresh Blocks and Files - Blocks and Files


Understand the uses of blockchain in data centers – TechTarget

As with any trend, blockchain technology has a fair share of pros and cons, especially in the context of the data center. However, blockchain's rising influence can provide organizations with a competitive advantage when they apply it appropriately.

Today, blockchain technology is one of the biggest trends in the industry. According to Deloitte's "2021 Global Blockchain Survey," 81% of senior executives of major organizations said they believe blockchain technology is broadly scalable and has achieved mainstream adoption. Furthermore, 78% also reported that their executive teams believe there's a compelling business case for the use of blockchain within their organization, and 80% are discussing how to integrate it into their current strategies.

Blockchain is a highly secure and immutable record-keeping technology, designed so that bad actors can't easily break into the system or forge the data stored on it. This distributed ledger technology records transactions and related data in multiple places at the same time, which prevents a single point of failure and validates every piece of information it stores.

Compared to traditional databases that store data in rows, columns, tables and files, a blockchain is decentralized and managed by computers in a peer-to-peer network. It stores data in chained blocks; during a transaction, each block of data is sent to every computer node in the network, where it's authorized and then attached to the blockchain securely. Once added, a block cannot be changed.

The validation process ensures the data is unique and legitimate with time stamps to prove it. Should someone try to swap out a block, copy it or change its state, the network of computers that constitute the blockchain receives an alert immediately and no one can add new blocks to the chain until the issue is addressed.
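
The chaining and validation described above can be sketched in a few lines of Python. This is a toy illustration of the general technique (a hash linking each block to its predecessor, so tampering with any block breaks the chain), not any production blockchain; all names here are hypothetical:

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    # Hash the block's contents, excluding its own stored hash field.
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    block = {
        "index": len(chain),
        "timestamp": time.time(),  # time stamp proving when the block was added
        "data": data,
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash(block)
    chain.append(block)

def validate(chain: list) -> bool:
    """Recompute every hash; any tampered or reordered block breaks the chain."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain: list = []
add_block(chain, "alice pays bob 5")
add_block(chain, "bob pays carol 2")
print(validate(chain))                   # True
chain[0]["data"] = "alice pays bob 500"  # tampering attempt
print(validate(chain))                   # False
```

In a real blockchain the same check runs on many independent nodes at once, which is what lets the network as a whole detect a forged block rather than relying on any single machine.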

Security is easily the blockchain's biggest advantage, followed by resilience. Each block is continually reconciled by a network of computers. If one node fails, it can't bring the whole system down because all the other nodes have a copy of the ledger.

There are several types of blockchains and a variety of uses across industries. The finance industry currently leads blockchain adoption due to the way the technology can simplify the transaction process and lock it down as well.

With regard to data center architecture, blockchain takes a different approach to data storage.

Blockchain uses decentralization to manage and store data. The blockchain network can consist of dozens, hundreds or thousands of computers spread across the world in various locations. For a blockchain breach to succeed, hackers would have to take down multiple computers in the network -- and even then, blockchain data storage is encrypted, which minimizes the security risk.

These advantages directly compete with traditional data center storage. Data centers house massive amounts of data in a single location. This centralization puts them at risk of natural disasters and outages in the local area. To add some redundancy and prevent data loss, organizations might copy data and store it in other locations; however, this process can be time-consuming and costly, and it creates a surplus of information that also needs protection.

Blockchain data storage can deliver higher levels of security, reliability, redundancy, resilience and transparency. Its distributed nature enables users to have a higher degree of control over where they store their data, which affects accessibility and availability, too.

That said, accessibility and availability can also be drawbacks of the decentralized approach. To retrieve a block of data, the different nodes on the network must sync, validate and pull the block; this can take a significant amount of time, depending on the nodes' locations and loads. Traditional data centers can deliver much faster speeds and higher levels of data availability.

The security of the blockchain, while quite advanced, is also not perfect. As more users adopt the technology, bad actors will get better at finding and exploiting holes in blockchains. For now, however, it provides much better data security than in-house and cloud storage.

Finally, cost is a big factor that determines whether organizations might choose to adopt blockchain. Although blockchain becomes more popular every day, it is still not widely deployed by organizations -- at least not at the same level as cloud storage, which is cheap and available in many forms today.

For data center teams seeking a blockchain approach, start by reevaluating the data center infrastructure. Consider how to pivot resources, and begin implementing and moving toward a decentralized architecture. Start thinking about how to set up a peer-to-peer network that can handle a blockchain workflow.

The rise of blockchain also demands more reliable power, high-performance equipment that can process blocks of data quickly, and more intensive cooling to keep that equipment from overheating during intense computations. The faster users can write and validate blocks, the better. So, make sure to factor in these requirements and invest accordingly.

The traditional data center architecture must fundamentally transform to meet blockchain's demand for higher traffic and availability, including having staff on hand with specialized blockchain skills. This is a big undertaking, but by planning for it now, organizations can keep pace with increasing data processing needs and adapt to client needs as they adopt new business strategies to integrate the blockchain.

See the original post here:
Understand the uses of blockchain in data centers - TechTarget


Redis is ready for CXL memory pooling – Blocks and Files

CXL memory pooling could drastically increase the amount of memory available to in-memory applications, such as Redis with its database. But what does Redis itself think?

Memory pooling with CXL uses the Compute Express Link protocol, built on the PCIe 5.0 bus, to enable servers to access larger pools of memory than they could if they only used local, socket-attached DRAM.

We spoke to Redis CTO and co-founder Yiftach Shoolman, who thinks CXL memory pooling is a positive development.

Blocks and Files: How do you see CXL 2.0 memory pooling affecting Redis?

Yiftach Shoolman: Redis is an in-memory real-time datastore (cache, database, and everything in between); as such, CXL enables Redis to support larger datasets at a lower price per GB as it extends the capacity of each node in the Redis cluster with more cost-effective DRAM.

Furthermore, CXL enables the deployment of more memory per core (or virtual cores) in each node in the Redis cluster. This is great for users who use Redis Enterprise (our commercial software and cloud service) as they can now deploy more databases or caches on the same infrastructure and reduce their deployment costs using the built-in multi-tenant capability of Redis Enterprise.

Blocks and Files: How could Redis manage the access speed discrepancy between a server's directly connected DRAM and the CXL memory pool?

Yiftach Shoolman: Redis Enterprise uses its Redis on Flash technology to enable memory tiering capabilities, in which hot objects are stored in DRAM, and warm objects are stored in SSD. CXL provides another (middle) tier that can potentially extend this hierarchy, i.e. hot objects will be stored in DRAM, the warm objects in CXL (cheap memory or Persistent Memory), and cold objects in SSD.
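
The tiering Shoolman describes can be sketched as a toy promotion/demotion cache: hot objects sit in a small fast tier, and the least recently used objects spill down into larger, slower tiers (DRAM, then CXL-attached memory, then SSD). This is purely an illustrative model of the general technique, not Redis on Flash's actual implementation; `TieredStore` and its capacities are made up:

```python
from collections import OrderedDict

class TieredStore:
    """Toy three-tier store: hot (DRAM), warm (CXL), cold (SSD).
    When a tier exceeds capacity, its least recently used item is demoted."""

    def __init__(self, hot_cap: int = 2, warm_cap: int = 4):
        self.tiers = [OrderedDict(), OrderedDict(), OrderedDict()]  # hot, warm, cold
        self.caps = [hot_cap, warm_cap, float("inf")]

    def put(self, key, value):
        # New and freshly accessed objects always land in the hot tier.
        self.tiers[0][key] = value
        self.tiers[0].move_to_end(key)
        self._demote()

    def get(self, key):
        for tier in self.tiers:
            if key in tier:
                value = tier.pop(key)
                self.put(key, value)  # an access promotes the object back to hot
                return value
        raise KeyError(key)

    def _demote(self):
        # Push least recently used items down one tier whenever a tier overflows.
        for level in range(2):
            while len(self.tiers[level]) > self.caps[level]:
                k, v = self.tiers[level].popitem(last=False)  # evict LRU entry
                self.tiers[level + 1][k] = v

store = TieredStore()
for k in "abcde":
    store.put(k, k.upper())
# 'd' and 'e' are now hot; 'a', 'b', 'c' were demoted to the warm (CXL-like) tier.
```

The design choice mirrored here is that the tiers differ only in capacity and (in real hardware) latency; the application logic is unchanged, which matches Shoolman's point that CXL slots in as a middle tier between DRAM and SSD.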

Blocks and Files: Is there a role for storage class memory in a CXL memory pool?

Yiftach Shoolman: Yes, the CXL standard supports DRAM memory and NAND flash memory like Intel's Persistent Memory. And it's up to the infrastructure providers to decide the ratio between the different memory technologies that are (or will be) available for the users.

Blocks and Files: Will CXL memory pooling be more relevant to customers of hyperscale cloud providers or enterprises?

Yiftach Shoolman: I believe we are still in the early stages of adoption. That said, and theoretically speaking, enterprises have more flexibility to decide about the type of hardware they will use in their infrastructure. But, on the other hand, if the hyperscale cloud providers find CXL attractive, the adoption of this technology for mainstream use cases will be much faster.

Blocks and Files: How might such Redis customers differ in their use of CXL memory pooling? Will it just or mostly be the sheer size of the memory pool?

Yiftach Shoolman: Sizing is just one parameter. The others are the type of memory they would like to use with CXL. Is it regular DRAM, or is it slower but cheaper DRAM or Persistent Memory?

Blocks and Files: How could the data in Redis in a CXL memory pool be persisted and then protected?

Yiftach Shoolman: From the Redis perspective, CXL is just another pool of memory. Therefore each one of the Redis persistent mechanisms will continue to operate as if CXL were a regular DDR memory.

Blocks and Files: Are there other points relevant to Redis and CXL memory pooling?

Yiftach Shoolman: I believe the success of CXL memory pooling is mainly dependent on the adoption of the hyperscale cloud providers. Without it, CXL might be just a niche solution for specific use cases.

Hyperscaler adoption of CXL memory pooling could be a tipping point in Shoolman's thinking. Such large-scale adoption would encourage CXL hardware development and also facilitate cloud bursting of CXL memory-pooled applications by enterprises, giving them access to potentially much larger memory pools than they could afford themselves.

A note about CXL and storage-class memory: Simon Thompson of Lenovo has written: "We eagerly await the arrival of Compute Link Express (CXL) storage class systems!"

Read more:
Redis is ready for CXL memory pooling Blocks and Files - Blocks and Files
