
Google wants to win the quantum computing race by being the tortoise, not the hare – The Next Web

The most exciting name in quantum computing today is Google. Last year's time crystals breakthrough was the culmination of decades of academic effort from the search giant, and it proved Big G is a clear front-runner in the world of cutting-edge quantum physics research.

Despite having virtually no B2B presence in the quantum computing marketplace, the Mountain View company has managed to establish itself as one of the most important players in the field.

Google's position comes as a bit of a surprise when you consider the competition. D-Wave's been making quantum computers for about as long as Google has been in business. And both Microsoft and IBM have built focused quantum computing ecosystems that generate revenue today to offset their massive research expenditures.

But Google's not as big a newcomer to the field as you might imagine. Its quantum ambitions go all the way back to at least 2005-2006, when its AI division began working on algorithms designed to run on D-Wave quantum computing chips.

Eventually, the partnership would pay off and, in 2009, D-Wave and Google demonstrated quantum speedup for an image classification algorithm.

Fast-forward to 2022 and Google's managed to build at least three gate-based quantum processors of its own, demonstrated a new phase of matter (time crystals), and supposedly achieved quantum supremacy. Not bad for a company most people wouldn't associate with the field of quantum physics.

In fact, if you take a look at the whole picture, it's clear that Google (or, to be more accurate, its parent company Alphabet) has its sights set on being the world's premier quantum computing organization.

We've seen this kind of focus before, when the company pivoted from mobile-first to AI-first in 2016. And, arguably, Google's managed to nab the top spot among US AI companies in the time since.

Google's taken the same tried-and-true approach to building out its quantum ambitions. And, based on recent developments, it appears as though the Mountain View company's long-term plans are starting to come into focus.

Google's working with institutions ranging from NASA to Stanford to develop the quantum computing systems of the future. Its work demonstrating quantum advantage in gate-based quantum systems and the aforementioned time crystals breakthrough have cemented it as a stalwart member of the quantum physics world.

But research at the edge is hard to monetize. That's why Microsoft recently partnered up with Pasqal to round out its cloud-based quantum access offerings while it continues to research its far-out topological qubit ideas.

And D-Wave spent decades developing useful quantum computers capable of solving problems right away before it finally began researching futuristic gate-based systems in earnest.

Even IBM, Google's closest running mate in the research field among big tech outfits, has prioritized cloud access for business clients over its own monumental research efforts.

Based on everything we've seen, Google's as capable of fielding a functioning quantum-as-a-service paradigm as any other player in the field. And it may even be ahead of the pack when it comes to the race toward quantum advantage: a quantum computer capable of surpassing every supercomputer on the planet.

In fact, Google Quantum AI, which was founded in partnership with NASA's quantum labs, believes it'll have a gate-based quantum computer capable of quantum advantage within the next decade.

Of course, the competitors (IBM, Microsoft, and D-Wave) have all made similar claims. And that makes this one of the most potentially lucrative races in technology history.

As we've argued, IBM's off to a head start and Microsoft looks poised to dominate this market in a matter of a few years. But Google's got a few aces up its sleeve that could shake everything up.

Parent company Alphabet recently starbursted its SandboxAQ division into its own company, now a Google sibling. It's unclear exactly what SandboxAQ intends to do now that it's spun out, but it's positioned as a quantum-and-AI-as-a-service company. We expect it'll begin servicing business clients in partnership with Google in the very near term.

And, in doing so, Google will shore up its short-term quantum endeavors in much the same way Microsoft has recently. The major difference here is that Alphabet controls both Google and SandboxAQ, whereas Microsoft can cut its Pasqal partnership if the tide changes.

It'll be interesting to see the likes of Alphabet and Microsoft spar over future government contracts for quantum services. Where Microsoft tends to outperform Google in the bidding arena, Big G already has close ties to NASA and is intrinsically involved in its quantum ambitions for the US space program.

At the end of the day, Google's betting it all on its research arms covering a lot of ground over the next ten years. If time crystals and the company's other gate-based quantum computing research veins don't pan out, it could end up lagging too far behind the competition to matter.

Neural's take: Everything we've seen in the past five years tells us the exact opposite is likely to happen.

We can safely assume we haven't seen the last of Google's quantum computing research breakthroughs, and that tells us we could very well be living in the moments right before the slow-and-steady tortoise starts to make up ground on the speedy hare.


3 Quantum Computing Stocks to Buy Before They Go Mainstream – InvestorPlace

Quantum computing stocks are in the limelight, as the technology has tremendous potential to advance big data and artificial intelligence (AI). Analysts highlight that quantum computers could transform many industries, including finance, pharmaceuticals, energy, agriculture and telecom.

Recent metrics suggest the global quantum computing market could reach $9 billion in revenue by 2030, up from $260 million in 2020. Annual average growth is forecast to exceed 40% during the decade, with development gaining pace after 2025.

Regular InvestorPlace users may already know that Microsoft (NASDAQ:MSFT) and Amazon (NASDAQ:AMZN) provide quantum computing services with Azure Quantum and Braket, respectively. But other companies are also carving out their niche in this market, and their stocks are compelling buys.

With that in mind, here are the three best quantum computing stocks to buy for lucrative returns through the decade.

First on our list is Alphabet, the internet media giant with Google as one of its most prominent segments. Google's primary interest in quantum computing stems from its leading role in internet search.

Alphabet issued fourth-quarter 2021 results on Feb. 1. Revenue increased 32% year-over-year (YOY) to $75.3 billion. Net income came in at $20.6 billion, or $30.69 per diluted share, up from $15.2 billion in the prior-year quarter. Cash and equivalents ended the period at $20.9 billion.

Management also announced an upcoming 20-for-1 stock split. Investors are now looking forward to July 1, the split date. Many retail buyers believe these events offer attractive investment opportunities.

In 2019, Google's Sycamore quantum computing chip executed a task in 200 seconds that the company claimed would have taken a supercomputer 10,000 years to perform. The tech giant aims to create a useful, error-corrected quantum computer by 2029.

Understandably, Google will likely invest billions in developing the technology over the next decade. Management has just spun off its quantum computing unit, Sandbox (now SandboxAQ).

GOOGL stock is up 36% over the past year, but down 3.9% since the start of 2022. Shares are trading at 24.2 times forward earnings and 7.3 times trailing sales. The 12-month median price forecast for GOOGL stock is $3,500. After July 1, the stock price and analysts' forecasts will change to reflect the split.

52-week range: $174.42 to $236.86

Dividend Yield: 2%

Prominent technology name Honeywell manufactures numerous high-tech products, ranging from aerospace equipment to medical devices and advanced materials. In late November, it merged its Honeywell Quantum Solutions with Cambridge Quantum Computing to create Quantinuum, the largest quantum computing company in the world.

Wall Street expects Quantinuum to go public by the end of 2022. Investors also seem excited about Quantinuum's first product, a platform-agnostic, device-independent cybersecurity solution.

Honeywell announced Q4 2021 results on Feb. 3. Revenue declined 3% YOY to $8.7 billion. Net income came in at $1.43 billion, or $2.05 per diluted share, up from $1.36 per share a year ago. Cash and equivalents ended the period at nearly $11 billion.

HON stock is down almost 8% over the past year and 6.8% year-to-date. Shares are trading at 22.4 times forward earnings and 3.9 times trailing sales. Meanwhile, the 12-month median price forecast for HON stock stands at $220.

52-week range: $5.91 to $11.37

Rigetti Computing has become a pioneer of full-stack quantum computing. It has launched a multi-chip processor for scalable quantum computing systems.

Its Quantum Cloud Services (QCS) platform serves global enterprises, various agencies of the U.S. government, and leading research centers, including the National Aeronautics and Space Administration (NASA), the U.S. Department of Energy and Palantir Technologies (NYSE:PLTR).

Management announced fiscal 2021 results on March 10. Revenue increased by 48% YOY to $8.2 million. Net loss widened to $38.2 million, or a $1.74 loss per share, compared with $26.1 million a year ago. Cash and equivalents ended the period at $11.7 million.

The tech name was founded in 2013 and went public on March 2 after completing a reverse merger with Supernova Partners Acquisition Company II, a special purpose acquisition company (SPAC).

This deal valued the company at $1.5 billion. Rigetti has received $261.75 million from the deal to accelerate its development of multiple generations of quantum processors and expand its commercial operations.

However, since going public, RGTI stock has lost more than 20%. Meanwhile, the 12-month median price forecast for the stock stands at $19. Investors interested in a young company could consider researching Rigetti further.

On the date of publication, Tezcan Gecgil did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tezcan Gecgil has worked in investment management for over two decades in the U.S. and U.K. In addition to formal higher education in the field, she has also completed all 3 levels of the Chartered Market Technician (CMT) examination. Her passion is for options trading based on technical analysis of fundamentally strong companies. She especially enjoys setting up weekly covered calls for income generation.


The New School at SXSW: The Art, Design, and Social Good of Quantum Computing – The New School News

March 24, 2022

Quantum computing sounds like a program you would find at a university focused on technology, like MIT or Stanford, not a school known for its design, music, and social science programs. But understanding that this new technology is positioned to shape and change the world, The New School has embarked on an initiative to explore the applications of quantum computing in art, design, education, business, and social justice. Recently, the individuals bringing quantum computing to The New School spoke at the 2022 South by Southwest Conference and Festivals (SXSW) to discuss how this emerging technology can be integrated into the creative arts and applied to advance social good.

During the panel, Lin Zhou, senior vice president and chief information officer at The New School; Sven Travis, associate professor of media and design at Parsons School of Design; and Maya Georgieva, senior director of The New School's Innovation Center, discussed the importance of having artists, designers, and social researchers participate in the early development of quantum computing.

"It's extremely rare for creatives to get access to technology in the early days of development. One of the things we're hoping for is that the evolution of quantum could happen in a different way than, for example, artificial intelligence and machine learning," said Travis. "We can go back to any number of technologies over the last couple of decades where we're getting access to it or engaging with it usually after the technology is fairly fully developed."

The computing we're accustomed to, which drives laptops, desktops, websites, and smartphones, takes in information coded as the value of either 1 or 0. In contrast, quantum computing can take in information that exists in more than one state, such as a 1 and a 0 at the same time. The combination of The New School's strength in liberal arts with this cutting-edge technology makes the new course different from those in traditional STEM university programs.

Although quantum computing is still an emerging field, the importance it will have has prompted the university to be proactive in bringing this subject to students. "Whenever there is a technology breakthrough, usually the leading uses are not in the liberal arts. If you think about artificial intelligence [AI], the leading uses for AI are in financial technology, cyber security, and facial and voice recognition. Liberal arts is usually an afterthought. When those problems are figured out, then they say, 'How about music? How about design? How about fashion?'" said Zhou. "This has to stop, because the arts, music, and design impact people's daily lives. Whenever we have a new technology, liberal arts ought to be one of the front-runners as new technology is adopted."

Many liberal arts and design colleges look at computer coding as the new literacy, but Zhou shared how creatives, social researchers and technologists should take a more holistic view toward technology. "In the past, when we talked about literacy, we usually talked about reading and writing. But for this century, it's not enough. When we talk about literacy, we actually mean that everybody should be able to create harmony with technology. Quantum, as the next emerging and breakthrough technology, has profound capability to solve problems that the classical computer cannot solve today. So, from the point of view of all the higher education institutions, we have the obligation to help society adopt the technology," said Zhou. "We know if we don't do it right with privacy, with social justice, those issues, which seem to be very simple, will backfire on us."

Part of Georgieva's mission is to engage the community with emerging technologies. "The opportunity for us is to create events where people can come together, so that students can have a real conversation about their own ideas. It's important to us to give them that space, access to tools, and opportunities to play," said Georgieva.

"Bringing an emerging technology, frontier technology, and code as a language into a creative setting is really fascinating and opens up imaginative projects that may not necessarily take place in a lab. We want to see that impact. We want to be part of explaining what it would mean to live in a world where quantum computing and art is one expression," she adds.

Citing the university's history of innovation and commitment to social change since its founding, Zhou believes The New School has an important role to play in the development of quantum computing. "With The New School, for the past 100 years, we have produced world-class thinkers, designers, and social justice movers. We will continue to focus on leveraging quantum computing, this wonderful technology, on the social front, and leveraging the technology to improve the human condition," said Zhou.


Elderly care? Bring in the robots! – Modern Diplomacy

What is quantum computing? Why do we need quantum computing? According to Moore's law (the complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months and hence quadruples every three years), the density of transistors per unit area on a computing chip doubles every year and a half, which poses two main problems for traditional computers. Firstly, as to computation, high-density transistors will face problems of power consumption and thermal effects. Secondly, the reduction in size will cause the classical theory of transistors to fail, and their performance will deviate from the original design.

Both of these problems will limit the further shrinkage of transistors, thus putting an end to Moore's law. However, even as the traditional computer develops up to the end of Moore's law, it will still be unable to cope with many problems that need to be solved. Say we calculate the ground-state energy of N coupled two-level systems: the number of unknowns will be proportional to 2^N. A specific computation of this kind requires about 2.5 days of simulation time on IBM's supercomputer, but takes about 200 seconds on Google's 53-qubit quantum computer. Qubit is the contraction of quantum bit, the term coined by Benjamin Schumacher to denote the quantum bit, i.e. the basic unit of quantum information.

As the number of qubits continues to increase, conventional computers will soon reach a bottleneck, and almost all conventional computations involving quantum mechanics face the same problem. Hence many researchers started thinking about how to use quantum properties themselves as computational resources as early as the 1970s, an idea that was then summarised by Richard Feynman in 1982.

Hence what advantages do qubits have over traditional computing? The most surprising are none other than the properties of quantum superposition and quantum entanglement. Quantum superposition is a non-classical state that contrasts with empirical intuition; the metaphor is Schrödinger's Cat, which is both alive and dead.

The superposition state, however, is a real state for qubits on microscopic or mesoscopic scales (spatial scales, viewpoints and the like that are intermediate between macroscopic and microscopic scales). Qubits can be found in the superposition of two characteristic quantum states, and this superposition is a non-classical state in which being and non-being coexist in the quantum world. In this state, the qubit is neither 0 nor 1, nor merely uncertain between the two; rather, both outcomes are present with equal probability, like a coin before it lands on the palm of the hand.
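To make the coin analogy concrete, here is a minimal NumPy sketch (our illustration, not from the article) of a single qubit: the Hadamard gate takes the definite state |0⟩ to an equal superposition, and the Born rule yields the 50/50 measurement probabilities described above.

```python
import numpy as np

# A single qubit is a 2-component complex vector; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0               # the state (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2     # Born rule: probability of measuring 0 or 1

print(psi)    # [0.70710678+0.j 0.70710678+0.j]
print(probs)  # [0.5 0.5] -- the coin before it lands: 0 and 1 with equal probability
```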

While in visible nature it is possible to observe a phenomenon without perceptibly influencing it by observation alone (i.e. only by looking at the said phenomenon), in atomic physics and quantum mechanics a finite and, up to a certain point, invisible perturbation is connected to every observation. The uncertainty principle is the recognition of absolute chance and arbitrariness in natural phenomena. On the other hand, as will become clear later, quantum mechanics does not predict a single, well-defined result for the observation or for any observer.

The fact that qubits can undergo quantum evolution in a set of superposition states which is neither 0 nor 1 implies quantum parallelism in the relevant computation. The evolution of each qubit on its own, however, is not sufficient to construct all possible evolutions of a multi-qubit system. We must therefore also make different qubits interact, so that they become intertwined, in order to construct a satisfactory algorithm for such a computation. This special superposition is precisely what is called an entangled quantum state.

Let us take two qubits as an example, forming a typical entangled state. The state of the first qubit is connected to the state of the second qubit; the two connections are in quantum superposition, and we cannot therefore speak of the state either qubit is in at that moment. Hence we talk about entanglement.

There is a more practical view of entanglement in quantum computing: entangled states usually arise from the control of one qubit (the control qubit) over another (the target qubit). The relationship between the control qubit and the target qubit is similar to the aforementioned Schrödinger's Cat. According to this view, if the controlling part is in a state of superposition, the controlled part will be in a superposition of the different controlled situations.

This entanglement process is an important element in quantum computing. We can say that superposition and entanglement synergistically weave the varied parallel evolution of quantum computing. Each measurement can only yield one of the possible states, and the superposition no longer exists after the first measurement. Hence, to obtain the statistical information we need about the superposition state, we have to compute and measure repeatedly.
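The control/target relationship described above is what a CNOT gate implements. The following NumPy sketch (again an illustration, not from the article) puts the control qubit into superposition with a Hadamard gate and then applies CNOT, producing a Bell state whose only measurement outcomes are the perfectly correlated 00 and 11.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)

# CNOT flips the target (second) qubit only when the control (first) qubit is |1>.
# Basis ordering: |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0               # both qubits start in |0>

# Hadamard on the control qubit, then entangle the pair with CNOT.
bell = CNOT @ np.kron(H, I2) @ ket00

print(bell)               # (|00> + |11>) / sqrt(2)
print(np.abs(bell) ** 2)  # [0.5 0. 0. 0.5]: only 00 and 11 occur, perfectly correlated
```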

Therefore, in many quantum algorithms (such as Shor's algorithm for factoring, which solves the problem of decomposing integers into prime factors, and digital quantum simulation), we need to use interference mechanisms after the computation, so that the phase information containing the answer in the superposition state is preserved through constructive interference, while the remaining data is eliminated by destructive interference. In this way, the answer can be obtained with fewer measurements. Most quantum algorithms rely heavily on the phenomena of fluctuation and interference; the relative phase is therefore very important for quantum computing, and this is called quantum coherence. In the hardware design of quantum computers, many considerations concern how to protect the quantum state so as to prolong the coherence lifetime.

Quantum computers have a variety of hardware implementations, but the design considerations are similar. There are three common considerations: qubit operability, measurability, and protection of quantum states. In response to these considerations, cavity quantum electrodynamics (cQED) systems have been developed, and a superconducting quantum system can serve as an example of how quantum computers are implemented. The difference in frequency between the resonant cavity and the qubit means that their coupling tends not to exchange energy quanta but only to generate entanglement, so that the frequency of the resonant cavity shifts with the state of the qubit. Hence the state of the qubit can be deduced by measuring the microwave transmission or reflection spectrum near the resonant frequency on the bit readout line.

The entanglement mechanism between adjacent qubits is provided by capacitive coupling between cross-type capacitors. The coupling effect is controlled by the frequency difference between adjacent qubits. The oscillating behaviour reflects the quantum interference effect, and its gradual disappearance marks the decay of coherence and quantum energy.

The coherence lifetime of qubits is influenced by two factors, one intrinsic and one extrinsic. The extrinsic influence comes mainly from the coupling between the qubit and the quantum-state readout circuit. A filter-like protection mechanism in the microwave cavity between the bit and the readout line can shield the qubit, because the cavity and the qubit have a frequency difference of about 718 MHz. The intrinsic influence comes mainly from losses in the qubit itself and the sensitivity of its frequency to various types of noise, which can usually be suppressed by improved materials and processes and by optimisation of the geometric structure.

Quantum computing has a wide range of applications, currently spanning decryption and cryptography, quantum chemistry, quantum physics, optimisation problems and artificial intelligence. This covers almost all aspects of human society and will have a significant impact on human life once put into practice. However, even the best quantum computers are not yet able to express the advantages of quantum computing. Although the number of qubits on a quantum computer has exceeded 50, the circuit depth required to run useful algorithms is far from sufficient. The main reason is that the error rate of qubits during computation is still very high, even though we can use quantum error correction and fault-tolerant quantum computation. The accuracy requirements will greatly increase the difficulty of producing the hardware and the complexity of the algorithms. At present, the implementation of some well-known algorithms has only reached the level of conceptual demonstration, which is sufficient to demonstrate the feasibility of quantum computing, but practical application still has a long way to go.

But we should remain optimistic because, although general quantum computation still requires better quantum computer hardware, we can still find new algorithms and applications. Moreover, hardware development can also make great strides, just as traditional computers did in their early days. In line with this goal, many existing technology industries could be upgraded in the near future. Research is moving fast thanks also to significant public and private investment, and the first commercial applications will be seen in the short term.

Considering defence and intelligence issues, many governments are funding research in this area. The People's Republic of China and the United States of America have launched multi-year plans worth billions of yuan and dollars. The European Union has also established the Quantum Flagship programme with an investment of one billion euros.


Get a lifetime of VPN protection plus 10TB of cloud storage and backup for $63 – ZDNet


While our modern digital world offers almost unlimited convenience, it's nearly outweighed by the degree of risk we face, such as having our most sensitive information compromised online or losing what's stored locally on our computers. Fortunately, we can now protect our data permanently from both situations with the Lifetime Backup & Security Subscription Bundle. You can even use coupon code DOWNLOADNOW to save an additional 30% during our Best of Digital Sale and get it for only $62.99.

We know we need to back up the files on our computer because losing them would cause chaos. Degoo makes that effortless by backing up your data automatically, encrypting it and even replicating it at the same time. Degoo's Premium Plan also gives you a generous 10TB of cloud storage, which you can access for life.

Users are very happy with what Degoo offers. It's earned an average rating of 4.4 stars out of 5 from almost 600,000 reviews on Google Play and 4.5 stars out of 5 from over 6,500 reviews on Apple's App Store.

KeepSolid VPN Unlimited is the second half of this bundle, and it's the bestselling VPN of all time for many reasons. With this deal, you get unlimited speed and bandwidth on up to five devices, along with maximum security and privacy. That includes military-grade encryption, zero logging and a kill switch. And with more than 500 servers in over 80 locations around the globe, you can also enjoy your favorite content no matter where you are.

Reviewers and users alike love KeepSolid VPN. With more than 10 million customers, it has been named "Best VPN for Laptop" by Laptop Review Pro. Additionally, VPN Special notes, "KeepSolid VPN Unlimited offers amazing services, and its advanced features make it a solid VPN service provider."

Whether working from home, training for a new career or teaching online courses of your own, chances are you've been creating more files and spending more time online over the last couple of years than ever before. And that means backup storage and VPN protection are more critical than ever.

It's hard to imagine that you can buy so much peace of mind for so little. But you can use coupon code DOWNLOADNOW today to save an additional 30% during our Best of Digital Sale and get the Lifetime Backup & Security Subscription Bundle for only $62.99.


4 Actionable Ways to Cut Your Organization's Cloud Costs in Half – TechGenix

Enterprise cloud adoption has been soaring over the past decade. To top it off, traditional business models are still suffering from continued Covid restrictions. That's why you may be looking for new ways to operate in a cloud-native world. Cloud expenses can also get problematic if you don't monitor them, and that can lead cloud bills to skyrocket. That's why companies must reduce cloud costs and optimize usage. That said, reductions must reflect realistic cloud workloads and not cut so deep that workloads suffer.

Learn how you can gain control over your cloud costs in this article. You can usually reduce them by half and end expensive mistakes. Let's begin with understanding your cloud pricing and where all the money is going.

Cloud costs can rapidly rise when companies don't have good strategies to deal with them. That's why you want to, first, plan for and manage dozens of variables. You also want to address a complex web of cloud resources and know where your money is going. Here are the two main expenses that result in high cloud costs.

Companies usually fail to understand the basic differences between operating and capital expenses. Knowing that the cloud is an operating expense, rather than a capital expense, is critical.

Budgeting and managing cloud resources is a continuous process, because no visible capital expense exists in cloud services. Costs accumulate over time and depend on usage, duration, size, and workload attributes.

You have the option to scale resources based on project requirements. Yet administrators tend to overestimate the resources they need for a given workload. This margin of safety ends up costing companies money for services that aren't running.

You may see overprovisioning when resources exist but aren't accessible. This happens when projects aren't archived; users either leave resources available just in case or forget to clean up after themselves.

Instances end up running even after they're no longer needed. One person might also spin up an instance for a short while and forget to shut it down for hours, days, or months at a time. That tends to happen when companies focus on deadlines more than efficiency.

That leads storage, data volumes, and costs to grow over time, and charges compound with every passing month. Developers, in turn, might have an idea of how much storage they have but not how often they use it. That leads to overpaying for high-availability access that might not be needed.

Let's take a look at how exactly you can optimize your cloud costs.

When your cloud costs are always shifting and decision-making is remote, problems arise: expenses become hard to understand, and so does the value you get for them. That's why optimizing cloud costs is becoming an important need for businesses. To help, try applying a consistent organizational policy across teams. That'll help ensure you cut your cloud costs in half.

The first thing to do is decide which applications and data need to go on the cloud. Use cloud-optimized software applications; that helps you take advantage of storage tiering and dynamic scaling. It also means leaving legacy applications running locally.

Another way to optimize your cloud costs is to use cloud services that offer billing APIs. You can also use services that offer portals with visualization for budgeting.

Companies can also look at the resources and costs for applications before using them. This allows you to explore expiration dates and remove cloud resources that don't have tags.

FinOps is a new discipline that brings accountants and engineers together to set usage policies, giving a company cost-effective decision-making. Centralized purchasing rates with various options also give teams cost visibility. FinOps helps with resource accountability as well.

Let's now take a look at the top 4 cost-cutting methods you can use during the optimization process.

When cutting costs, you want a starting point. That's why we've compiled the top 4 cloud cost-cutting methods for you to use.

To optimize your resource usage and cut costs, you can buy committed use discounts. They lock you into one- or three-year commitments. Here, you decide the amount of compute, memory, and other resources you want to buy at a discounted price. You can then distribute resources between machines at a rate you specify. A committed use discount enables you to optimize your compute costs by analyzing your VM spending trends; you can then decide on either a single contract or multiple contracts according to your needs. A back-of-the-envelope comparison like the one sketched below can show whether a commitment pays off.
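Here is a minimal Python sketch of that comparison. The on-demand rate and the discount are hypothetical placeholders, not published Google Cloud prices; substitute your provider's rate card before drawing conclusions.

```python
# Hypothetical rates for illustration only -- substitute your provider's rate card.
ON_DEMAND_RATE = 0.045   # $/vCPU-hour (assumed)
CUD_DISCOUNT = 0.37      # assumed discount for a one-year commitment

def monthly_cost(vcpus: int, hours: float, committed: bool) -> float:
    """Estimate the monthly compute cost of a steady-state workload."""
    discount = CUD_DISCOUNT if committed else 0.0
    return vcpus * hours * ON_DEMAND_RATE * (1 - discount)

# 16 vCPUs running around the clock (~730 hours/month):
print(f"On demand: ${monthly_cost(16, 730, committed=False):,.2f}")  # ~$525.60
print(f"Committed: ${monthly_cost(16, 730, committed=True):,.2f}")   # ~$331.13
```

The point of the sketch is that commitments only pay off for steady workloads; for bursty usage, the on-demand line wins.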

Google's Cloud Free Program (CFP) offers many features for effective cloud costing. That includes a 90-day free trial that provides $300 in free Cloud Billing credits. The idea here is to help you explore Google Cloud and Maps Platform services.

A Free Tier feature enables businesses to use certain Google Cloud products for free. It works by providing you with the full utility but with monthly usage limits. If you stay within the limits, you won't pay even after your trial ends.

The Projects feature helps you group cloud resources to understand your spending better.

You may also like the Google Cloud Billing feature. That's a dashboard that explores the trends in your spending. It predicts how much you're likely to spend in the future, allowing you to change your spending habits.

Establishing budgets for cloud services and ensuring all teams are aware of them is a must. A budget dashboard is a staple for teams using cloud services. You can set up specific budgets for different services and display them to users.

You can also set automated alerts for spend thresholds, which lets the relevant teams or users inspect and analyze an overrun, as in the sketch below.
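A minimal sketch of the thresholding idea, independent of any particular cloud API; in practice the spend figure would come from a billing export or a budgets API, and the threshold values here are illustrative.

```python
# The spend figure would normally come from a billing export or a budgets API;
# here it is a plain input so the thresholding logic stands on its own.
THRESHOLDS = (0.5, 0.9, 1.0)  # alert at 50%, 90%, and 100% of budget (illustrative)

def breached_thresholds(spend: float, budget: float) -> list[float]:
    """Return the budget fractions that current spend has crossed."""
    return [t for t in THRESHOLDS if spend >= t * budget]

for t in breached_thresholds(spend=920.0, budget=1000.0):
    print(f"ALERT: spend has crossed {t:.0%} of the monthly budget")
```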

A cost breakdown report provides a complete view of what you spent. That includes details of on-demand costs and any usage credits. Tracking your savings can also help you become more efficient in the future.

Now that you know all the key cost-cutting measures, let's look at the governance you'll need.

Cloud governance defines how users work with cloud computing services daily. Some of its principles consist of budgets, cost trends, and cost optimization policies. These rules help identify whether a budget is sufficient. They also reduce costs by keeping you informed about committed use discounts or credits.

Governance tools enable you to track cost and cloud usage, and they alert you when total usage may exceed a limit. A robust governance solution will contain proactive protocols that automate tasks and help ensure user compliance. In turn, this saves your company both time and money.

When it comes to cloud costs and planning, every company has different needs. Your cloud strategy should optimize your workflow and increase efficiency. That said, managing cloud storage costs isn't as daunting as it seems.

You can benefit from cost-cutting methods like committed use discounts. These can save you money by helping you understand your needs better. Google's Cloud Free Program and budget alerts can also help you. Implementing cost breakdown reporting ensures user and team accountability and transparency.

Don't pay more than you need to for cloud services. Instead, save a lot by following the simple steps I mentioned in this post.

Google's Cloud Free Program is a free trial that provides $300 in free credits to explore features. It also provides 20+ free products and a few operations. That's a great way for you to explore a cloud platform's capabilities without breaking the bank.

It's a cost management tool or service that provides a value-based pricing model for cloud usage. You can create an estimate with it for your cloud use cases and model a solution before building it. This is a handy tool for estimating cloud costs ahead of time.

Cloud storage is no different from virtual disk space, measured in GB. You can buy storage for on-demand usage or for a fixed period. Storage usage and data transfer are measured in GB, TB, or PB. These costs may seem small at the start but can spiral out of control if you don't monitor them.

Durable Reduced Availability (DRA) storage lets you store data at a lower cost. This storage type has lower data availability than others, making it useful for longer-term storage of non-critical data. You can expect DRA to have the same durability as standard storage pools. In general, it gives you more flexibility and options as you look to save on cloud storage costs.

A budget can either manage the entire Cloud Billing account or focus on a specific set of resources. That includes subaccounts, projects, products and services, and labels. This is useful to attribute costs to specific departments within your organization.

Yes, the number of projects on Google Cloud Platform (GCP) has a limit. Once you reach your quota, though, you can request an increase by filling out a request form specifying the number of extra projects you'll need. That said, that's mainly applicable to larger organizations. Remember, you can always create separate accounts for each team to reduce the need for this. It's also a great way to keep track of your cloud costs.


Google Cloud is making a major change to its VMs – TechRadar

Google Cloud customers will now be able to suspend their virtual machines (VMs) when not in use, which will help lower their cloud spending.

The software giant's cloud computing division has announced that its new Suspend/Resume feature for VMs is now generally available after launching in alpha several years ago.

The new feature works in a similar way to closing the lid of your laptop or putting your desktop PC to sleep. By suspending a Google Compute Engine VM, the state of your instance will be saved to disk so that you can pick up later right where you left off.
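Scripting this is straightforward. The sketch below assumes the google-cloud-compute Python client, which mirrors the REST API's instances.suspend and instances.resume methods; the project, zone, and instance names are placeholders, and the exact client surface should be checked against current documentation.

```python
# Assumes the google-cloud-compute client library, which mirrors the REST
# API's instances.suspend / instances.resume methods; names are placeholders.
from google.cloud import compute_v1

client = compute_v1.InstancesClient()

# Sends the suspend request; per the announcement, billing for cores and
# RAM stops once the instance reaches the SUSPENDED state.
op = client.suspend(project="my-project", zone="us-central1-a",
                    instance="my-instance")
op.result()  # block until the operation completes

# Later, pick up exactly where the workload left off.
client.resume(project="my-project", zone="us-central1-a",
              instance="my-instance").result()
```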

The best part about Suspend/Resume in Google Cloud, though, is that customers no longer need to pay for cores or RAM while their VMs are suspended. However, they will still need to pay the cloud storage costs of their instance memory, as well as other VM running costs like OS licensing, though these may be reduced.

When a Google Cloud customer suspends an instance, an ACPI S3 signal is sent to the instance's operating system just like when you close a laptop's lid or put a PC to sleep.

The company makes the case that using this type of signal allows for broad compatibility with a wide selection of OS images, so that customers don't have to use a cloud-specific OS image or install daemons. At the same time, undocumented and custom OS images that respond to the ACPI S3 signal may also work with Google Cloud's Suspend/Resume feature.

It's also worth noting that storage is dynamically provisioned when Suspend is requested and is separate from the instance's boot disk. Other cloud providers require users to ensure that they have sufficient space in their boot disk to save instance states, which may increase the costs of running VMs.

In a new blog post announcing the general availability of Suspend/Resume, Google Cloud also pointed out that the feature can be used by organizations to deal with demand spikes as they can initialize instances with their critical applications and then suspend them so that they can be resumed later. Although Compute Engine instances can be created quickly, resuming an instance is much faster than creating an entirely new instance.


Cloud Storage Market Analysis by Size, Business Strategies, Share, Growth, Trends, Revenue, Competitive Landscape and Developments Forecast by 2029 -…

The study and estimations of an excellent Cloud Storage Market report help to figure out types of consumers, their views about the product, their buying intentions, and their ideas for the improvement of a product. With the market data in this report, emerging trends along with major drivers, challenges, and opportunities in this industry can be identified and analysed. For a clear and better understanding of facts and figures, the data is represented in the form of graphs and charts. With the studies, insights, and analysis mentioned in the finest Cloud Storage market report, readers get a comprehensible idea of the marketplace with which business decisions can be taken quickly and easily.

The market survey performed in the Cloud Storage business report helps to unearth important information about buyer personas, target audience, current customers, market, competition, and more, e.g. demand for the product or service, potential pricing, and impressions of the branding. The report is prepared using several steps, such as surveys. This research contains a variety of question types, like multiple choice, rankings, and open-ended responses, as well as quantitative and short-answer questions that save time and help to more easily draw conclusions. The categories of questions asked in the market survey while generating the Cloud Storage marketing report include demographic, competitor, industry, brand, and product.

Download Sample Copy of the Report to understand the structure of the complete report @ https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-cloud-storage-market

The cloud storage market is expected to gain market growth in the forecast period of 2022 to 2029. Data Bridge Market Research analyses the market to rise to USD 1,943.6 million by 2029, growing at a CAGR of 24.41% in the above-mentioned forecast period.

Cloud Storage Market Key Trends Analysis

The important factors influencing the growth of the Cloud Storage market have been examined in this report. The driving factors boosting demand for cloud storage and the restraining factors slowing the industry's growth are addressed in depth, along with their implications for the worldwide Cloud Storage market. In addition, the published analysis identifies and discusses in detail the trends that are driving the market and impacting its growth. Other qualitative variables, such as risks connected with operations and key problems faced by market players, are also covered in the report.

Cloud Storage Market Strategic Analysis

The market was studied using several marketing methodologies, such as Porter's Five Forces analysis, player positioning analysis, SWOT analysis, market share analysis, and value chain analysis. In the Porter's Five Forces analysis, market dynamics and factors such as the threat of a Cloud Storage substitute, the threat of new entrants into the Cloud Storage market, buyer bargaining power, supplier bargaining power, and internal rivalry among Cloud Storage providers are analysed to give the report's readers a detailed view of the current market dynamics.

This analysis assists report users in evaluating the Cloud Storage market based on various parameters, such as economies of scale, switching costs, brand loyalty, existing distribution channels, capital investments, manufacturing rights and patents, government regulations, advertising impact, and consumer preference. This simplified data is expected to aid the industry's key decision-makers in their decision-making process. Furthermore, this study answers the crucial question of whether or not new entrants should enter the Cloud Storage industry.

Read Detailed Index of full Research Study @ https://www.databridgemarketresearch.com/reports/global-cloud-storage-market

Leading Key Players Operating in the Cloud Storage Market Include:

Some of the major players operating in the cloud storage market report are Amazon Web Services, Inc., Microsoft, IBM, Oracle, MongoDB, Inc., Rohde & Schwarz, Hewlett-Packard, Dell, Atlantic.Net, VMware, Cisco Systems, Inc., DataDirect Networks, Swisslog Holding AG, Mecalux, S.A., KNAPP AG, Dematic, BEUMER GROUP, Bastian Solutions, Inc., TGW Logistics Group, Fritz Schäfer GmbH, Kardex Group, Daifuku Co., Ltd, Nilkamal and Murata Machinery, Ltd., and Verizon Terremark, among others.

Key Market Segments:

Cloud Storage Market, By Region:

New Business Strategies, Challenges & Policies are mentioned in Table of Content, Request TOC @ https://www.databridgemarketresearch.com/toc/?dbmr=global-cloud-storage-market

Cloud Storage Key Benefits over Global Competitors:

Some of the key questions answered in these Cloud Storage market reports:

Inquire Before Buying This Research Report @ https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-cloud-storage-market

About Data Bridge Market Research, Private Ltd

Data Bridge Market Research Pvt Ltd is a multinational management consulting firm with offices in India and Canada. As an innovative and neoteric market analysis and advisory company with an unmatched durability level and advanced approaches, we are committed to uncovering the best consumer prospects and fostering useful knowledge for your company to succeed in the market.

Data Bridge Market Research is a result of sheer wisdom and practice that was conceived and built in Pune in the year 2015. The company came into existence from the healthcare department with far fewer employees, intending to cover the whole market while providing the best class analysis. Later, the company widened its departments as well as its reach by opening a new office in Gurugram in 2018, where a team of highly qualified personnel joined hands for the growth of the company. Even in the tough times of COVID-19, when the virus slowed down everything around the world, the dedicated team of Data Bridge Market Research worked round the clock to provide quality and support to our client base, which also tells about the excellence in our sleeve.

Data Bridge Market Research has over 500 analysts working in different industries. We have catered to more than 40% of the Fortune 500 companies globally and have a network of more than 5,000 clients around the globe. Our coverage of industries includes

Contact Us

US: +1 888 387 2818
UK: +44 208 089 1725
Hong Kong: +852 8192 7475
Email: [emailprotected]


Lori Borgman: Trying to get her head and data in the clouds – Daily Journal


I have a long history of issues with my cloud. The main problem being that I can't wrap my head around it. Some people can't get their heads out of the clouds; I can't get mine in.

I suffer from the trap of the literal mind. I have to picture things. And not just food or sitting on a shoreline.

Once every week or so my phone tells me it failed to back up because there is not enough cloud storage. Then it prompts me to buy a bigger, better cloud. Why would I buy more of something that I can't comprehend now?

They want me to buy something I can't see. What next? A bridge in Jersey? Hey, I wasn't born yesterday.

Seeing is believing.

If I looked up at the sky and saw a cloud floating by with my name on it, or even just my initials on it, I'd be good. I wouldn't even care if it were a cirrus, cumulus, stratus or nimbus, although one of those huge anvil clouds would be cool.

It would also be nice to see whose cloud is next to my cloud and if there is any cloud aggression going on. That way I could yell, "Hey! You! Get off of my cloud!" The Rolling Stones were in cyberspace before cyberspace was cool.

It's the metaphor that is the problem. Yes, I understand that my calendar, documents, photos, emails and many other things are in a cloud, but a cloud is puffy. A cloud can evaporate and dissolve into nothingness. Why would I want to store my life in something wispy? A vault or a safe room, maybe; a cloud, no.

I would do better if the message on my phone said, "Your reinforced steel file cabinet in the sky is full and you need a bigger one, so pay up."

Work stored in a file cabinet is easy to imagine. A file cabinet is tangible; it holds things, lots of things, and you can even lock it.

For example, I know where all my tax records are. I know where my supporting receipts and invoices are. They're upstairs in a two-drawer file cabinet where both drawers are jammed full and completely inaccessible courtesy of a shoe rack.

I may not be able to open the file cabinet, but I know where the file cabinet is. And that's why the cloud wins. I may not know where my cloud is, but I can access its contents, which I understand are stored on a giant server called a Lexus. Or a Linux. Again, a Lexus I can picture, a Linux I cannot.

I'd be happy with an arrow on a map of the sky marking the Lexus holding my large file cabinet that says, "You are here."

That I can visualize.


Qumulo opens the door to Kubernetes with CSI – Blocks and Files

Scale-out file system supplier Qumulo has made its file services available to Kubernetes-orchestrated containers via a CSI driver plug-in.

CSI, the Container Storage Interface, enables stateless containers orchestrated by Kubernetes to request and consume storage services, such as volume provisioning, data reads and writes, and protection, from external storage. They effectively support statefulness. Qumulo's Core product provides scale-out file services and runs in its own on-premises appliances, in third-party servers in its Server Q form and also, in its Cloud Q incarnation, in the AWS, Azure, and GCP public clouds.

Sean Whalen, Qumulo senior cloud product marketing manager, wrote in a blog: "Now, customers innovating using Kubernetes don't have to set up a storage interface each time a cluster is set up or knocked down; the process is automatic and provides the containerized application maximum exposure to the Qumulo analytics, so that customers can easily understand what's happening across their stored data."

The CSI driver is production preview software from Qumulo and gives containerized applications exposure to Qumulo's analytics, so that customers can understand what's happening across their stored data.

Kubernetes operates a cluster of machines, starting and stopping containers on behalf of its users. CSI allows the Kubernetes orchestrator and individual containers to connect to external (persistent) storage. Qumulo storage will automatically deploy inside a new container and supports the movement of storage from container to container and machine to machine.
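To see what this looks like in practice, here is a sketch of the standard Kubernetes dynamic-provisioning pattern that a CSI driver enables, written with the Kubernetes Python client. The StorageClass name "qumulo" is a placeholder rather than Qumulo's documented driver name.

```python
# The StorageClass name "qumulo" is a placeholder, not Qumulo's documented
# driver name; the pattern itself is standard Kubernetes dynamic provisioning.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="shared-files"),
    spec=client.V1PersistentVolumeClaimSpec(
        storage_class_name="qumulo",         # hypothetical CSI-backed class
        access_modes=["ReadWriteMany"],      # shared file access across pods
        resources=client.V1ResourceRequirements(
            requests={"storage": "100Gi"}),
    ),
)

# The CSI driver provisions the volume on demand when the claim is created.
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc)
```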

Ben Gitenstein, VP of product at Qumulo, said: "Qumulo's new CSI driver enables customers to store unstructured data once but serve it to an infinite number of both native applications and container-based microservices, all without moving data, copying it to disparate systems, or changing their workloads. Customers who store their data on Qumulo can now focus their time on building modern applications, not on moving or managing their data."

Qumulo is not alone here. CSI driver support is table stakes for external storage suppliers. Dell's PowerScale/Isilon already supports CSI, as do HPE's Primera and Alletra products, IBM's FlashSystem, NetApp's ONTAP software, Pure Storage, and Weka with its scale-out, parallel filesystem software.

Beyond CSI, external storage software can be made into a container itself. Examples are Pure's Portworx, MayaData's OpenEBS Mayastor product, Ondat (rebranded from StorageOS), and Robin.io's cloud-native storage. These storage containers execute inside a server's environment and link to the server's own physical storage or to external storage.

StorageOS, for example, aggregates the local disk storage in a cluster of servers (nodes) into one or more virtual block storage pools. Storage in the pool is carved out into virtual volumes and app containers in the nodes mount and access these virtual volumes via the storage container.

When executing in the public clouds, they would use the CSP's storage services. Either on-premises or in the public clouds, Kubernetes will be used to orchestrate and manage the storage containers as well as the application containers for DevOps users.

A storage container runs like any other app container, with no dependencies on proprietary kernels, hardware, storage protocols or other layered services; customers are freed from lock-in to these things. In theory, a storage container should respond more quickly to app container requests for storage services, as the link is direct rather than hopping across network links to an external storage system. The storage container should also scale out beyond the limits of, for example, a dual-controller array.

Storage consultant Chris Evans has said: "I doubt any storage array could cope with creating and destroying hundreds of volumes per hour (or more), whereas on (Ondat) StorageOS, those constructs are mainly in software on the same node as the application, so can be created/destroyed in milliseconds."

It seems possible that there will be a phase 2 in Qumulo's support of containerization, with its Core software eventually becoming cloud-native itself.
