Category Archives: Cloud Computing

Qarnot Computing raises $6.5 million to heat buildings with wasted energy from cloud computing – VentureBeat

Qarnot Computing has raised $6.5 million for its system that captures the heat from computers and repurposes it for residential and business climate systems.

Rather than build a centralized datacenter, which requires huge amounts of energy to cool, Qarnot has developed a distributed architecture that places its processing machines in people's homes in units that look like a typical radiator. The computers are networked to provide high-power cloud computing for clients, and the heat they generate is used to warm a home or apartment.

Qarnot has drawn praise for its innovative approach to tackling the massive energy consumption that's created as more services move online and the number of power-hungry datacenters explodes.

The Paris-based company has been working on its residential cloud computer, the QH1, for several years. With microprocessors embedded in the back, the units are sold to apartment buildings and installed in individual apartments. The machines are connected by fiber optic cables, and Qarnot sells the service as a green cluster of cloud computing. Residents can control the release of heat, much as they would with any traditional climate system. In its home market, Qarnot says it now heats 1,000 housing units, including an entire building in Bordeaux.

More recently, the company has unveiled an industrial version that essentially turns the cloud computing system into a water boiler. The QB1 digital boiler captures the heat released by 24 servers and uses it to heat water circulating in boiler pipes attached to the machines. These units are aimed at large commercial buildings.

The digital boiler is now being used by Casino, one of France's largest grocery store chains, to heat one of its warehouses.

On the cloud computing side, the company has now signed up several major French banks, including BNP Paribas, Société Générale, and Natixis. It has also landed a partnership with Illumination Mac Guff, the Paris-based animation studio that is owned by Universal Pictures and makes the Despicable Me movies.

Qarnot president Paul Benoit said the latest funding will be used for continued research, product development, and sales expansion.


Can cloud computing sustain the remote working surge? – Tech Wire Asia

Microsoft's collaboration platform Teams has shown the strain. Source: Shutterstock

With most of the global white-collar workforce now working remotely, cloud technology has been many businesses' savior. In one way or another, these services are enabling people to interact with one another and access the tools they need to do their jobs.

The COVID-19 pandemic is somewhat of a stress test for cloud computing.

While businesses have long been migrating to cloud services by degrees, the demands of the crisis have certainly produced a recent spike. With an end nowhere in sight, can these services cope with this massive workload in the long run?

Cloud-based collaboration tool Microsoft Teams has already shown the strain it's under, with multiple reports of messaging-related functionality problems due to the increased workload being managed in the backend.

Chances are that most organizations will be utilizing public cloud at some level. Affordability, scalability and ease of maintenance make it very appealing as a first-line solution.

Some rely entirely on it, and others deploy it together with private clouds to create a hybrid cloud strategy.

But using public clouds has its drawbacks. There is, of course, the issue of security, but during these times there is also the issue of capacity.

Capacity might be limited: more connections mean more data center usage, and data center infrastructure might not be able to cope with this surging demand.

Even as the electronics supply chain in China and Korea is slowly recovering, there are already signs of shortages and higher prices for data centers' memory and storage products.

There are also concerns about data rationing.

Currently, cloud providers are still doing a good job in distributing resources among tenants, but at some point rationing measures may need to be implemented to respond to overwhelming demand.

Not all cloud services are going to drown, though. Matthew Prince, co-founder and CEO of Cloudflare, said that providers may have individual challenges spurred by the pandemic; their ability to cope with the shift in usage is highly dependent on their IT architecture.

Major cloud providers such as Amazon have expressed confidence in meeting customer demand for capacity.

By and large, public cloud providers seem to be coping well with the skyrocketing demand; there have been no major cloud crashes just yet.

What providers should really be concerned about are the challenges that will come post-pandemic.

By then, enterprises will have already recognized the unquestionable value of cloud and will double down on cloud migrations.

Cloud providers must make sure that their data infrastructure is prepared to support data at unprecedented scales.

Warren Buffett once remarked: "You only find out who is swimming naked when the tide goes out."

The collateral impact of COVID-19 has already shown that some cloud service providers haven't been wearing the most snug of swimwear. As businesses realize the value cloud can bring to their operations, providers had better be ready to accommodate more business in the post-pandemic surge.

Emily Wong

Emily is a tech writer constantly on the lookout for cool technology and writes about how small and medium enterprises can leverage it. She also reads, runs, and dreams of writing in a mountain cabin.


The Cloud Native Computing Foundation Adds 81 Members to Reach New Heights – Yahoo Finance

Foundation welcomes new members including Cyber Armor, Monzo, Twitter and Ubisoft to help define the future of the cloud native ecosystem

SAN FRANCISCO, March 31, 2020 /PRNewswire/ -- The Cloud Native Computing Foundation (CNCF), which builds sustainable ecosystems for cloud native software, today announced that 81 new members and end user supporters have joined the Foundation, bringing total membership to 560 organizations.

This significant increase in membership comes on the heels of CNCF's new 2019 survey which gathered insights from almost 1,400 developers. The survey found that usage in production of nearly all 24 CNCF projects has increased. Notably, 78 percent of respondents indicated they are using Kubernetes in production, a 20 percent increase from last year's findings.

"We are once again amazed by the level of interest, diversity, and caliber of joining companies," said Dan Kohn, executive director of the Cloud Native Computing Foundation. "Over 200 new members including technology vendors, end users, and non-profit organizations have joined CNCF in the last year alone. Thanks to their support, we can provide a number of services as well as a neutral home for our growing number of diverse open source, cloud native projects that are solving some of today's most pressing issues."

These new members will also join CNCF this summer for the upcoming 2020 KubeCon + CloudNativeCon events, including KubeCon + CloudNativeCon EU in Amsterdam from August 13-16, 2020, and KubeCon + CloudNativeCon NA from November 17-20, 2020 in Boston.

About the newest Silver Members:

About the newest Nonprofit Member:

About the newest End User Members & Supporters:

With the addition of these new members, there are now over 140 organizations in the CNCF End User Community. This group regularly meets to share adoption best practices and feedback on project roadmaps and future projects for CNCF technical leaders to consider.

Additional Resources

About Cloud Native Computing Foundation
Cloud native computing empowers organizations to build and run scalable applications with an open source software stack in public, private, and hybrid clouds. The Cloud Native Computing Foundation (CNCF) hosts critical components of the global technology infrastructure, including Kubernetes, Prometheus, and Envoy. CNCF brings together the industry's top developers, end users, and vendors, and runs the largest open source developer conferences in the world. Supported by more than 500 members, including the world's largest cloud computing and software companies, as well as over 200 innovative startups, CNCF is part of the nonprofit Linux Foundation. For more information, please visit http://www.cncf.io.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page. Linux is a registered trademark of Linus Torvalds.

Media Contact
Jessie Adams-Shore
The Linux Foundation
PR@CNCF.io

View original content to download multimedia: http://www.prnewswire.com/news-releases/the-cloud-native-computing-foundation-adds-81-members-to-reach-new-heights-301032448.html

SOURCE Cloud Native Computing Foundation


Which [r]evolution lies ahead for cloud computing in Southeast Asia? – DatacenterDynamics

Hybrid cloud will continue to appeal to businesses

Business appetite for hybrid cloud grew significantly in 2019. The challenges faced by businesses in terms of new skills, new application needs, legacy IT management and so on are constantly increasing as businesses realize that cloud computing is no panacea. What is at stake is the significant cost associated with the extensive use of public cloud services and the ever more critical need for data control and security. Against this backdrop, businesses are turning away from exclusive public cloud offerings to move part of their data back to a private cloud. At the same time, they are abandoning on-premises cloud computing in favor of a hosted private cloud service that combines the best of both worlds: greater cost control and a higher level of security, all with the elasticity and scalability of the cloud.

According to the Nutanix Enterprise Cloud Index, 92 percent of IT decision-makers say this type of infrastructure best meets their needs. The same report also revealed that Singapore is the leading nation in terms of hybrid cloud adoption because of the country's superior connectivity, in turn providing organizations based here a solid foundation to capitalize on such a technology and remain competitive in our digital economy.

Following in the footsteps of hybrid cloud but going one step further comes multi-cloud: a combination of cloud environments ranging from on-premises cloud to hosted private cloud to public cloud, each dedicated to different use cases. Given that no single cloud today can competitively provide all solutions, the most mature businesses find in multi-cloud the promise of excellence: selecting the best solutions from the entire cloud offering available to build a single application environment in which all components are interdependent.

A business can choose to host its database with one provider, turn to another provider for its compute needs, store its data in yet another location, and orchestrate everything in a multi-cloud architecture. As applications become less and less monolithic and their components communicate in an increasingly standardized way, it is a safe bet that multi-cloud has a bright future ahead of it.
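As a concrete illustration of that kind of architecture, here is a minimal sketch of the component-to-provider map an orchestration layer might consult; the provider names and endpoints are hypothetical placeholders, not references to real services.

```python
# Hypothetical multi-cloud component map: each application component is
# pinned to whichever provider serves it best. All names are invented.
SERVICE_MAP = {
    "database": {"provider": "provider-a", "endpoint": "db.provider-a.example.com"},
    "compute":  {"provider": "provider-b", "endpoint": "api.provider-b.example.com"},
    "storage":  {"provider": "provider-c", "endpoint": "objects.provider-c.example.com"},
}

def endpoint_for(component: str) -> str:
    """Resolve which cloud provider endpoint a component should talk to."""
    try:
        return SERVICE_MAP[component]["endpoint"]
    except KeyError:
        raise ValueError(f"no provider configured for component: {component}")

if __name__ == "__main__":
    # The orchestration layer looks up each dependency at deploy time.
    for component in ("database", "compute", "storage"):
        print(component, "->", endpoint_for(component))
```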

Previously, data security solutions focused on storage or networking capabilities. For example, if you wanted to store encryption keys securely, you had to rely on an HSM (Hardware Security Module), a monolithic solution that was poorly aligned with the cloud concept. The ability to secure data in use, called Confidential Computing, is a big leap forward. More processors will embed this capability, which will therefore be increasingly available in infrastructures.

Organizations are now able to store and run all or part of the software programs that require end-to-end security, thus greatly improving the security of data encryption and, in turn, of entire systems. Data encryption will be more readily available, whether for data in transit or at rest, to enhance data security and give businesses much-needed peace of mind at a time when cyber breaches are becoming increasingly costly. Last year, it was estimated that the average organizational cost of a breach in ASEAN stood at US$2.62 million.

With the introduction of data protection regulations and increased public awareness of this issue, businesses have realized the strategic nature of data sovereignty for themselves. The issue of the legal framework for data goes beyond the scope of cloud providers alone and also affects businesses that use cloud solutions. Local initiatives are multiplying to set the rules for a trusted cloud that meets everyone's expectations in terms of data sovereignty. Taking as an example the recent French-German Gaia-X project, it would not be surprising if, in 2020, private as well as public organizations were to favor their regional ecosystem over the American-Chinese duopoly. We should see the development of new collaborative projects allowing the implementation of more local alternatives, made possible by a collective awareness among European vendors of their ability to provide a relevant cloud offering for the Southeast Asia market.

Many other topics could have been addressed here, such as open source, blockchain, AI and machine learning, but also applications related to smart cities, autonomous cars and connected health. These technologies and fields of application involve the storage, exchange, and processing of a large, sometimes very large, amount of data, and are still in their infancy. In any case, one thing is for sure: society is evolving, and cloud computing will continue to evolve as well in order to better support it. ASEAN, being the fastest-growing Internet market in the world, offers numerous opportunities for businesses here; however, to capitalize on them, there is a need to not only understand the current state of cloud computing but also pay close attention to its evolution in order to stay ahead of competitors.


4 Reasons Why Cloud Technology Is Inseparable from Our Daily Lives – IT News Africa

As we head deeper into 2020, speculation, predictions and technology foresight are everywhere. But when it comes to cloud computing, no other technology has seen such rapid adoption across both the business and consumer sphere.

Cloud has become so inseparable from our everyday existence that speaking about its future as a standalone technology doesn't always make sense. Canon SA suggests four reasons why cloud technology is so important for our lives:

1. Transforming society

What is the future of our society? What does it look like, and what does the cloud have to do with it? Cloud will be a huge driving force in the futurization of towns, workplaces, institutions, and people because it will provide the digital infrastructure of tomorrow's smart cities (where an estimated 68% of the world's population will live by 2050).

Our cities are becoming smarter, but each new connected device creates data that has to be stored and analyzed. It will only be possible to support connected technology at this scale with a combination of edge technology and cloud computing. With this digital infrastructure in place, smart elevators and parking lots, driverless cars and drone taxis, trains and subways, farms and power plants will all be part of the functioning fabric of everyday life.

The cloud will also support emerging technologies such as artificial intelligence and help them to adapt to new platforms such as mobile. For example, while AI has already found its way onto mobile phones, these devices contain a lot of unstructured data such as emails, text messages, and photos. Analyzing unstructured data takes time and processing power that most smartphones don't have locally. With cloud powering the computing, we can expect our phones to become even smarter.

2. Transforming business

Cloud has become indispensable for businesses. One of the key reasons for this is how it has impacted innovation. Without the need for physical infrastructure and the operational and labour costs that come with it, cloud technology removes the typical financial barriers to innovation and digital transformation. Smaller businesses that would traditionally struggle to come up with the upfront investment required for on-premises implementations can access new technology through cloud delivery models.

Meanwhile, cloud has also reduced the risks associated with investment. Expensive, rigid contracts can be a barrier for many smaller companies. The scalability of cloud computing means offerings can grow or shrink back depending on the needs of the company, helping to manage costs and financial risk. These flexible, cloud-based models will continue to grow in popularity, with predictions stating that by the end of 2020, all new companies and 80% of historical vendors will offer subscription-based business models.

Cloud computing also encourages innovation because its speed makes it easy to experiment with new ideas, as feedback can be gathered quickly. If a strategy isn't working, it can be corrected quickly, rather than waiting until it has failed to take stock and draw learnings. This allows businesses to innovate more freely. Meanwhile, if businesses spot opportunities within the market, a cloud infrastructure allows them to respond and harness these opportunities more rapidly.

With so much opportunity in the cloud, businesses will continue to transition, with the worldwide public cloud services market projected to grow 17.5% in 2019 alone, with no sign of abating.

3. Impacting information

At the heart of every business is information. How we access it, harness it and share it is closely tied to how successful any organization can be. And this is where cloud has truly made an impact.

Thanks to cloud, business workers are no longer tied to the office and can access information and collaborate on projects from anywhere in the world, in real-time. This has revolutionized business models. For example, European company OpenDesk uploads furniture designs to the cloud and lets customers download the designs and commission a local manufacturer to build them in their region. This lowers shipping and inventory costs while reducing the company's carbon footprint.

Cloud-based platforms have also enabled businesses to be more efficient. As organizations grow they tend to become ever more siloed, with teams evolving idiosyncratic ways of working and ways of sharing information. Cloud helps companies bridge the gap, allowing all workers to access one place for everything they need. This also makes it more straightforward to create cross-company workflows, where before workers might have doubled up or lost sight of documents as they progressed.

4. Data deluge

An irony of all of this is that the more we use cloud solutions, the more we need them. The so-called data deluge can be attributed to how much we access cloud-based services in our daily lives. Using online systems, social networking, sharing videos, capturing traffic flow, collaborating with colleagues: these all add up to the vast quantities of data that each person and each business generates, and that doesn't even touch on the data created by healthcare, education, science, and the military. All this information needs computing power to manage, store and analyze it, for which we need cloud.

This is why talking about the future of cloud is so erroneous: what we really need to be talking about is the future of society and the future of business. The future of our cities, our workplaces and institutions is heavily dependent on cloud technology, and this will only become truer as more services go serverless and digital infrastructures become more advanced.

It is not enough to say that the future of the cloud is heading in a certain direction. Rather, it would be more correct to make predictions about the future of humanity, society and business and how cloud technology will rise to meet those challenges.

Edited by Luis Monzon



4 things you need to understand about edge computing – VentureBeat

Edge computing has claimed a spot in the technology zeitgeist as one of the topics that signals novelty and cutting-edge thinking. For a few years now, it has been assumed that this way of doing computing is, one way or another, the future. But until recently the discussion has been mostly hypothetical, because the infrastructure required to support edge computing has not been available.

That is now changing as a variety of edge computing resources, from micro data centers to specialized processors to necessary software abstractions, are making their way into the hands of application developers, entrepreneurs, and large enterprises. We can now look beyond the theoretical when answering questions about edge computing's usefulness and implications. So, what does the real-world evidence tell us about this trend? In particular, is the hype around edge computing deserved, or is it misplaced?

Below, I'll outline the current state of the edge computing market. Distilled down, the evidence shows that edge computing is a real phenomenon born of a burgeoning need to decentralize applications for cost and performance reasons. Some aspects of edge computing have been over-hyped, while others have gone under the radar. The following four takeaways attempt to give decision makers a pragmatic view of the edge's capabilities now and in the future.

Edge computing is a paradigm that brings computation and data storage closer to where it is needed. It stands in contrast to the traditional cloud computing model, in which computation is centralized in a handful of hyperscale data centers. For the purposes of this article, the edge can be anywhere that is closer to the end user or device than a traditional cloud data center. It could be 100 miles away, one mile away, on-premises, or on-device. Whatever the approach, the traditional edge computing narrative has emphasized that the power of the edge is to minimize latency, either to improve user experience or to enable new latency-sensitive applications. This does edge computing a disservice. While latency mitigation is an important use case, it is probably not the most valuable one. Another use case for edge computing is to minimize network traffic going to and from the cloud, or what some are calling cloud offload, and this will probably deliver at least as much economic value as latency mitigation.

The underlying driver of cloud offload is immense growth in the amount of data being generated, be it by users, devices, or sensors. "Fundamentally, the edge is a data problem," Chetan Venkatesh, CEO of Macrometa, a startup tackling data challenges in edge computing, told me late last year. Cloud offload has arisen because it costs money to move all this data, and many would rather not move it if they don't have to. Edge computing provides a way to extract value from data where it is generated, never moving it beyond the edge. If necessary, the data can be pruned down to a subset that is more economical to send to the cloud for storage or further analysis.
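To make the pruning idea concrete, here is a minimal sketch of an edge-side aggregator, assuming a generic upload transport rather than any particular vendor's API: raw readings stay on the edge node, and only a compact summary is shipped to the cloud.

```python
import json
import statistics

def summarize(readings):
    """Reduce a window of raw sensor readings to a compact summary."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

def offload(readings, upload):
    """Keep raw data at the edge; send only the summary to the cloud.

    `upload` is a stand-in for whatever transport the deployment uses
    (HTTPS POST, MQTT publish, etc.).
    """
    summary = summarize(readings)
    upload(json.dumps(summary))  # a few hundred bytes instead of megabytes

if __name__ == "__main__":
    # One minute of 100 Hz sensor data stays on the edge node...
    window = [20.0 + 0.01 * i for i in range(6000)]
    # ...and only the summary crosses the network.
    offload(window, upload=print)
```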

A very typical use for cloud offload is to process video or audio data, two of the most bandwidth-hungry data types. A retailer in Asia with 10,000+ locations is processing both, using edge computing for video surveillance and in-store language translation services, according to a contact I spoke to recently who was involved in the deployment. But there are other sources of data that are similarly expensive to transmit to the cloud. According to another contact, a large IT software vendor is analyzing real-time data from its customers' on-premises IT infrastructure to preempt problems and optimize performance. It uses edge computing to avoid backhauling all this data to AWS. Industrial equipment also generates an immense amount of data and is a prime candidate for cloud offload.

Despite early proclamations that the edge would displace the cloud, it is more accurate to say that the edge expands the reach of the cloud. It will not put a dent in the ongoing trend of workloads migrating to the cloud. But there is a flurry of activity underway to extend the cloud formula of on-demand resource availability and abstraction of physical infrastructure to locations increasingly distant from traditional cloud data centers. These edge locations will be managed using tools and approaches evolved from the cloud, and over time the line between cloud and edge will blur.

The fact that the edge and the cloud are part of the same continuum is evident in the edge computing initiatives of public cloud providers like AWS and Microsoft Azure. If you are an enterprise looking to do on-premises edge computing, Amazon will now send you an AWS Outpost: a fully assembled rack of compute and storage that mimics the hardware design of Amazon's own data centers. It is installed in a customer's own data center and monitored, maintained, and upgraded by Amazon. Importantly, Outposts run many of the same services AWS users have come to rely on, like the EC2 compute service, making the edge operationally similar to the cloud. Microsoft has a similar aim with its Azure Stack Edge product. These offerings send a clear signal that the cloud providers envision cloud and edge infrastructure unified under one umbrella.
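As a rough sketch of that operational similarity, the snippet below uses the standard boto3 EC2 call; the AMI and subnet IDs are hypothetical placeholders, and the point is simply that the same API call targets a regional subnet or an Outpost subnet alike.

```python
import boto3

# The same EC2 API call works whether the target subnet lives in an AWS
# region or on an on-premises Outpost; only the SubnetId differs.
ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder AMI ID
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",  # regional or Outpost subnet
)
print(response["Instances"][0]["InstanceId"])
```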

While some applications are best run on-premises, in many cases application owners would like to reap the benefits of edge computing without having to support any on-premises footprint. This requires access to a new kind of infrastructure, something that looks a lot like the cloud but is much more geographically distributed than the few dozen hyperscale data centers that comprise the cloud today. This kind of infrastructure is just now becoming available, and it's likely to evolve in three phases, with each phase extending the edge's reach by means of a wider and wider geographic footprint.

Phase 1: Multi-Region and Multi-Cloud

The first step toward edge computing for a large swath of applications will be something that many might not consider edge computing, but which can be seen as one end of a spectrum that includes all the edge computing approaches. This step is to leverage multiple regions offered by the public cloud providers. For example, AWS has data centers in 22 geographic regions, with four more announced. An AWS customer serving users in both North America and Europe might run its application in both the Northern California region and the Frankfurt region, for instance. Going from one region to multiple regions can drive a big reduction in latency, and for a large set of applications, this will be all that's needed to deliver a good user experience.
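For a flavor of what going multi-region involves at the API level, here is a minimal sketch, assuming two hypothetical bucket names, that writes the same object through region-specific S3 clients so it can be served from whichever region is closer to the user.

```python
import boto3

# Hypothetical buckets, one per serving region, holding the same asset.
REGIONS = {
    "us-west-1": "example-assets-us",     # Northern California
    "eu-central-1": "example-assets-eu",  # Frankfurt
}

def replicate(key: str, body: bytes) -> None:
    """Write the object to a bucket in every serving region."""
    for region, bucket in REGIONS.items():
        s3 = boto3.client("s3", region_name=region)
        s3.put_object(Bucket=bucket, Key=key, Body=body)
        print(f"wrote s3://{bucket}/{key} in {region}")

if __name__ == "__main__":
    replicate("index.html", b"<h1>hello from the nearest region</h1>")
```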

At the same time, there is a trend toward multi-cloud approaches, driven by an array of considerations including cost efficiencies, risk mitigation, avoidance of vendor lock-in, and desire to access best-of-breed services offered by different providers. "Doing multi-cloud and getting it right is a very important strategy and architecture today," Mark Weiner, CMO at distributed cloud startup Volterra, told me. A multi-cloud approach, like a multi-region approach, marks an initial step toward distributed workloads on a spectrum that progresses toward more and more decentralized edge computing approaches.

Phase 2: The Regional Edge

The second phase in the edge's evolution extends the edge a layer deeper, leveraging infrastructure in hundreds or thousands of locations instead of hyperscale data centers in just a few dozen cities. It turns out there is a set of players who already have an infrastructure footprint like this: Content Delivery Networks. CDNs have been engaged in a precursor to edge computing for two decades now, caching static content closer to end users in order to improve performance. While AWS has 22 regions, a typical CDN like Cloudflare has 194.

What's different now is these CDNs have begun to open up their infrastructure to general-purpose workloads, not just static content caching. CDNs like Cloudflare, Fastly, Limelight, StackPath, and Zenlayer all offer some combination of container-as-a-service, VM-as-a-service, bare-metal-as-a-service, and serverless functions today. In other words, they are starting to look more like cloud providers. Forward-thinking cloud providers like Packet and Ridge are also offering up this kind of infrastructure, and in turn AWS has taken an initial step toward offering more regionalized infrastructure, introducing the first of what it calls Local Zones in Los Angeles, with additional ones promised.

Phase 3: The Access Edge

The third phase of the edge's evolution drives the edge even further outward, to the point where it is just one or two network hops away from the end user or device. In traditional telecommunications terminology this is called the Access portion of the network, so this type of architecture has been labeled the Access Edge. The typical form factor for the Access Edge is a micro data center, which could range in size from a single rack to roughly that of a semi trailer, and could be deployed on the side of the road or at the base of a cellular network tower, for example. Behind the scenes, innovations in things like power and cooling are enabling higher and higher densities of infrastructure to be deployed in these small-footprint data centers.

New entrants such as Vapor IO, EdgeMicro, and EdgePresence have begun to build these micro data centers in a handful of US cities. 2019 was the first major buildout year, and 2020-2021 will see continued heavy investment in these buildouts. By 2022, edge data center returns will be in focus for those who made the capital investments in them, and ultimately these returns will reflect the answer to the question: are there enough killer apps for bringing the edge this close to the end user or device?

We are very early in the process of getting an answer to this question. A number of practitioners I've spoken to recently have been skeptical that the micro data centers in the Access Edge are justified by enough marginal benefit over the regional data centers of the Regional Edge. The Regional Edge is already being leveraged in many ways by early adopters, including for a variety of cloud offload use cases as well as latency mitigation in user-experience-sensitive domains like online gaming, ad serving, and e-commerce. By contrast, the applications that need the super-low latencies and very short network routes of the Access Edge tend to sound further off: autonomous vehicles, drones, AR/VR, smart cities, remote-guided surgery. More crucially, these applications must weigh the benefits of the Access Edge against doing the computation locally with an on-premises or on-device approach. However, a killer application for the Access Edge could certainly emerge, perhaps one that is not in the spotlight today. We will know more in a few years.

I've outlined above how edge computing describes a variety of architectures and that the edge can be located in many places. However, the ultimate direction of the industry is one of unification, toward a world in which the same tools and processes can be used to manage cloud and edge workloads regardless of where the edge resides. This will require the evolution of the software used to deploy, scale, and manage applications in the cloud, which has historically been architected with a single data center in mind.

Startups such as Ori, Rafay Systems, and Volterra, and big company initiatives like Google's Anthos, Microsoft's Azure Arc, and VMware's Tanzu are evolving cloud infrastructure software in this way. Virtually all of these products have a common denominator: They are based on Kubernetes, which has emerged as the dominant approach to managing containerized applications. But these products move beyond the initial design of Kubernetes to support a new world of distributed fleets of Kubernetes clusters. These clusters may sit atop heterogeneous pools of infrastructure comprising the edge, on-premises environments, and public clouds, but thanks to these products they can all be managed uniformly.
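To see what uniform fleet management looks like from a client's perspective, here is a minimal sketch using the official Kubernetes Python client; the kubeconfig context names are hypothetical stand-ins for an edge cluster, an on-premises cluster, and a cloud cluster.

```python
from kubernetes import client, config

# Hypothetical kubeconfig contexts, one per environment in the fleet.
CONTEXTS = ["edge-dallas", "onprem-dc1", "cloud-us-east"]

def list_deployments(context: str) -> None:
    """Point the client at one cluster and list its deployments."""
    config.load_kube_config(context=context)
    apps = client.AppsV1Api()
    for dep in apps.list_deployment_for_all_namespaces().items:
        print(f"[{context}] {dep.metadata.namespace}/{dep.metadata.name}")

if __name__ == "__main__":
    # One loop, one tool, many clusters: the premise of fleet management.
    for ctx in CONTEXTS:
        list_deployments(ctx)
```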

Initially, the biggest opportunity for these offerings will be in supporting Phase 1 of the edge's evolution, i.e. moderately distributed deployments that leverage a handful of regions across one or more clouds. But this puts them in a good position to support the evolution to the more distributed edge computing architectures beginning to appear on the horizon. "Solve the multi-cluster management and operations problem today and you're in a good position to address the broader edge computing use cases as they mature," Haseeb Budhani, CEO of Rafay Systems, told me recently.

Now that the resources to support edge computing are emerging, edge-oriented thinking will become more prevalent among those who design and support applications. Following an era in which the defining trend was centralization in a small number of cloud data centers, there is now a countervailing force in favor of increased decentralization. Edge computing is still in the very early stages, but it has moved beyond the theoretical and into the practical. And one thing we know is this industry moves quickly. The cloud as we know it is only 14 years old. In the grand scheme of things, it will not be long before the edge has left a big mark on the computing landscape.

James Falkoff is an investor with Boston-based venture capital firm Converge.


Microsoft ‘might be the best tech stock in this market,’ Jim Cramer says – CNBC

Demand for cloud computing services has spiked during the coronavirus outbreak and Microsoft has been a "huge beneficiary from the lockdowns," CNBC's Jim Cramer said Monday.

"Microsoft's stock is a buy. Of course, I'd like it to come down after this gigantic rally, but Microsoft it might be the best tech stock in this market," the "Mad Money" host said.

Since reaching a low of $132.52 during Wall Street's slide into a bear market, shares of Microsoft have rallied more than 20% to $160.23 as of Monday's close.

On Saturday, the software behemoth's Azure cloud business revealed that cloud usage spiked triple digits on a weekly basis in communities under stay-at-home and social distancing mandates. Cloud services demand soared 775% in those areas, the company said in a blog post. Elsewhere, Microsoft Teams saw a "very significant spike" in usage, with about 44 million users logging more than 900 million meeting and calling minutes on the collaboration platform each day, roughly 20 minutes per user on average.

Windows Virtual Desktop and Power BI traffic usage were also up. The data was released amid the exodus of employees from offices and students from classrooms to remote work and learning environments in efforts to help slow the spread of COVID-19.

"The coronavirus has created a magnificent bull market in cloud computing," Cramer said. "These [Microsoft] numbers are astounding."

Still, Microsoft is not clear of the impact of the fast-spreading virus on the broader economy. The software company, along with other big technology names, is prone to headwinds in a global economy that's been knocked off base. As nonessential businesses across the country have been forced to close, unemployment claims are up significantly, with millions out of work.

As many as 47 million people could lose their jobs, sending the unemployment rate in the United States soaring to 32.1%, according to projections offered Monday by the St. Louis Federal Reserve. About 67 million Americans are believed to be working in jobs that are most at risk of being cut.

"I'm expecting a major worldwide slowdown, so there's a real possibility that Microsoft's booming cloud business could get cut back as millions are laid off," the host said, though he thinks tailwinds remain.

"I think the rapid-fire adoption of Microsoft's cloud platform overrides these macro concerns."

Disclosure: Cramer's charitable trust owns shares of Microsoft.



D-Wave gives anyone working on responses to the COVID-19 pandemic free cloud access to its quantum computers – TechCrunch

D-Wave, the Canadian quantum computing company, today announced that it is giving anyone who is working on responses to the COVID-19 pandemic free access to its Leap 2 quantum computing cloud service. The offer isn't only valid for those focusing on new drugs but is open to any researcher or team working on any aspect of how to solve the current crisis, be that logistics, modeling the spread of the virus or working on novel diagnostics.

One thing that makes the D-Wave program unique is that the company also managed to pull in a number of partners that are already working with it on other projects. These include Volkswagen, DENSO, Jülich Supercomputing Centre, MDR, Menten AI, Sigma-i Tohoku University, Ludwig Maximilian University and OTI Lumionics. These partners will provide engineering expertise to teams that are using Leap 2 for developing solutions to the COVID-19 crisis.

As D-Wave CEO Alan Baratz told me, this project started taking shape about a week and a half ago. In our conversation, he stressed that teams working with Leap 2 will get a commercial license, so there is no need to open source their solutions, and they won't have a one-minute-per-month limit, which are typically the standard restrictions for using D-Wave's cloud service.

"When we launched Leap 2 on February 26th with our hybrid solver service, we launched a quantum computing capability that is now able to solve fairly large problems, large-scale problems, problems at the scale of solving real-world production problems," Baratz told me. "And so we said: look, if nothing else, this could be another tool that could be useful to those working on trying to come up with solutions to the pandemic. And so we should make it available."

He acknowledged that there is no guarantee that the teams that will get access to its systems will come up with any workable solutions. "But what we do know is that we would be remiss if we didn't make this tool available," he said.

Leap is currently available in the U.S., Canada, Japan and 32 countries in Europe. That's also where D-Wave's partners are active and where researchers will be able to make free use of its systems.
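For context on what using the service looks like in code, here is a minimal sketch assuming D-Wave's open source Ocean SDK is installed and a Leap API token is configured locally; the two-variable problem is a toy placeholder, not an epidemiological model.

```python
import dimod
from dwave.system import LeapHybridSampler

# Toy problem: a two-variable binary quadratic model (placeholder only).
bqm = dimod.BinaryQuadraticModel(
    {"x": -1.0, "y": -1.0},   # linear biases
    {("x", "y"): 2.0},        # quadratic bias penalizing x = y = 1
    0.0,                      # constant offset
    dimod.BINARY,
)

# The hybrid sampler ships the problem to the Leap cloud service, which
# combines classical and quantum resources to solve it.
sampler = LeapHybridSampler()
sampleset = sampler.sample(bqm)
print(sampleset.first.sample, sampleset.first.energy)
```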


Creating the optimal cloud defense strategy – ITP.net

Back in the day, the theft and loss of backup tapes and laptops were a primary cause of data breaches. That all changed when systems were redesigned and data at rest was encrypted on portable devices. Not only did we use technology to mitigate a predictable human problem, we also increased the tolerance of failure. A single lapse, such as leaving a laptop in a car, doesn't have to compromise an organisation's data. We need the same level of failure tolerance, with access controls and IT security, in the cloud.

In the cloud, all infrastructure is virtualised and runs as software. Services and servers are not fixed but can shrink, grow, appear, disappear, and transform in the blink of an eye. Cloud services aren't the same as those anchored on-premises. For example, AWS S3 buckets have characteristics of both file shares and web servers, but they are something else entirely.

Practices differ too. You don't patch cloud servers; they are replaced with new software versions. There is also a distinction between the credentials used by an operational instance (like a virtual computer), and those that are accessible by that instance (the services it can call).
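As a rough AWS-flavored illustration of that credential distinction, the sketch below contrasts the identity a workload is currently running as with credentials it can fetch for the services it calls; the role ARN is a hypothetical example.

```python
import boto3

# The identity this workload is currently operating as (e.g., an EC2
# instance profile role picked up from the default credential chain).
sts = boto3.client("sts")
print("running as:", sts.get_caller_identity()["Arn"])

# Separately, the instance may be allowed to obtain *other* credentials
# for services it calls. The role ARN here is a hypothetical example.
assumed = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/example-data-reader",
    RoleSessionName="example-session",
)
creds = assumed["Credentials"]
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
# s3 now acts with the assumed role's permissions, not the instance's.
```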

Cloud computing requires a distinct way of thinking about IT infrastructure.

A recent study by the Cyentia Institute shows that organisations using four different cloud providers have one-quarter the security exposure rate. Organisations with eight clouds have one-eighth the exposure. Both data points could speak to cloud maturity, operational competence, and the ability to manage complexity. Compare this to "lift and shift" cloud strategies, which result in over-provisioned deployments and expensive exercises in wastefulness.

So how do you determine your optimal cloud defense strategy?

Before choosing your deployment model, it is important to note that there isn't one definitive type of cloud out there. The National Institute of Standards and Technology's (NIST) definition of cloud computing lists three cloud service models: infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS). It also lists four deployment models: private, community, public, and hybrid.


Here's a quick summary of how it all works through a security lens:

If you have a hybrid cloud deployment, you'll have to mix and match these threats and defenses. In that case, an additional challenge is to unify your security strategy without having to monitor and configure different controls, in different models and in different environments.

Any strategy and priority decisions should come before the technological reasons. Don't go to the cloud for the sake of it. A desired goal and robust accompanying strategy will show the way and illuminate where deeper training and tooling are needed.


InfiniteIO Introduces Native File Support for Cloud Storage to Accelerate Access, Analytics and Collaboration for Cloud-Native Applications – Business…

AUSTIN, Texas--(BUSINESS WIRE)--InfiniteIO, which offers the world's fastest metadata platform to accelerate applications, today announced new Hybrid Cloud Tiering software that provides native file access for traditional and cloud-native applications. Powered by the InfiniteIO File Metadata Engine, InfiniteIO allows IT teams to access and manage all data migrated to object storage in native file format. By leveraging InfiniteIO's continuous data placement policies with native file format, customers and partners can utilize cloud-native services such as analytics, machine learning and serverless computing, while maintaining internal security and governance, without changing the existing infrastructure or user experience.

"Organizations are facing increased complexity managing and moving data across diverse hybrid cloud infrastructure environments," said Scott Sinclair, senior analyst at Enterprise Strategy Group. "InfiniteIO's metadata-first approach for enabling cloud-native data services can help enterprises standardize and automate data movement activities, which can accelerate cloud adoption, increase data mobility, and unlock the potential of the vast amounts of data they own."

"The public health, critical infrastructure and economic crises that we're seeing on a global scale have underscored the strategic value of data," said Mark Cree, CEO of InfiniteIO. "InfiniteIO's innovations on hybrid cloud computing will speed an organization's ability to extract insights, share information and collaborate at the lowest cost, which will mean a world of difference for today's fast-evolving scientific, medical and commercial applications."

Extending Cloud-Native Workflows for Post-production Processing

InfiniteIO's platform-agnostic data placement policies supporting native file format will provide new opportunities to extend and scale on-premises file workloads to public cloud providers such as Amazon Web Services, Google Cloud Platform, IBM Cloud, Microsoft Azure and Oracle Cloud Platform. Entire files placed by InfiniteIO from any primary NAS system to any S3-based object storage can be stored in cloud-native format. Users and applications will continue to securely access the data as needed whether the data is on-premises or in the public cloud.
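To illustrate what native file format buys in practice, here is a minimal sketch, with hypothetical bucket and object names, of a cloud-native job reading a tiered file straight from S3-compatible object storage, with no proprietary unpacking step.

```python
import boto3

# Hypothetical names: a bucket that receives tiered files and the object
# key under which a migrated file was stored in its native format.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-tiering-bucket", Key="projects/report.csv")

# Because the file was stored natively, any S3-aware tool can consume it
# directly; here we just read the bytes as-is.
data = obj["Body"].read()
print(f"read {len(data)} bytes in native format")
```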

InfiniteIO also today announced extended metadata management support for Hitachi HCP S3 and Pure FlashBlade environments. Customers and partners building out hybrid cloud infrastructure continue to have a broad choice of primary and secondary storage partners for their workflows and archiving requirements. Currently supported S3-based private cloud platforms include: Cloudian, Dell/EMC, HPE, NetApp, Scality and Quantum, among others.

The InfiniteIO File Metadata Engine (IFME) architecture processes metadata to reduce application latency from seconds to microseconds and enable high-performance hybrid clouds. Built on the IFME, the InfiniteIO Application Accelerator enables both on-premises and cloud-migrated file workloads to run faster by responding to metadata requests directly from the network. InfiniteIO Hybrid Cloud Tiering enables IT managers to optimize their IT budgets using the economics of cloud storage with high-performance data tiering and seamless file accessibility.

InfiniteIO plans to offer native file format support in Q2 as part of software release 2.5 for InfiniteIO Hybrid Cloud Tiering. Existing Hybrid Cloud Tiering customers with current maintenance agreements will be able to upgrade their software at no additional cost.

Additional Information

About InfiniteIO

InfiniteIO provides the lowest possible latency for file metadata, enabling applications to run faster, reduce development cycles, and increase data productivity. Based in Austin, Texas, InfiniteIO independently processes file metadata to simultaneously accelerate application performance and hybrid-cloud data tiering for global enterprises, research organizations and media companies. Learn more at http://www.infinite.io or follow the company on Twitter @infiniteio and LinkedIn.
