Arm CPUs To Take A Bite Out Of The HPC Market – The Next Platform
Arm-based servers have had a somewhat checkered history that has seen many abortive attempts to challenge the X86 processor hegemony, but Arm appears bullish about its chances in the high performance computing (HPC) sector, where it believes its licensing model and the energy efficiency of its architecture give it an edge.
Speaking at an HPC community event hosted by Dell, Arm's senior director for the HPC business, Brent Gorda, said that the company is "really driving hard in the HPC community" and highlighted its partnerships with companies such as Nvidia, SiPearl, and Fujitsu to develop Arm-based silicon to drive HPC and AI applications.
In fact, Gorda pointed out that Arm has already secured a place in the HPC industry, right at the top, with the Fugaku supercomputer at the RIKEN laboratory in Japan, which has been ranked as the fastest in the world with its 152,064 48-core Fujitsu A64FX processors.
However, Fujitsu followed the path of acquiring an architecture license from Arm, which meant that it was able to design and manufacture its own custom Arm-compatible processor pretty much to suit its own requirements. This meant the addition of 512-bit Scalable Vector Extension (SVE) units to support the kind of calculations Fujitsu had in mind, plus its own Tofu D high-speed interconnect.
But few HPC sites can design their own chip from scratch. Fortunately, Arm's business model also lets partners take a ready-made core design and add custom modules to it, Gorda explained.
"There's something called a core license whereby you can license Arm Neoverse, which is our IP. And that gives you the core building blocks, the logic itself, around which you customize and build the chip that you want to build," he said.
Surrounding all this is the Arm ServerReady compliance program, which certifies that a specific chip meets compatibility requirements for the Arm server ecosystem.
"Once you pass this certification, the software world is available to you. It guarantees functionality for the software, and you can then pay for supported OS releases like Red Hat," he said.
This ability to customize the chip for a specific application or set of applications is where Arm has an advantage, Gorda claimed, especially given where HPC and AI appear to be heading. Customers can take the Arm core engine plus the on-chip network and add custom accelerators for their target workload.
"Bill Dally from Nvidia will say you can get three orders of magnitude performance improvement by putting custom gates down on your silicon chip. That plays exactly to where Arm is going," he said. "Everybody's got an idea for an accelerator. And if you know your workload well enough, you can optimize that and just get crazy good performance. And in fact, that's the reason why the A64FX is so good. They took ten years, they studied the ten or twelve applications that they had, and they nailed it. The processor came out and it just completely nailed the applications that the Japanese wanted on their system."
Arm launched its Neoverse effort back in 2018 to target datacenter infrastructure rather than the mobile device market. The Neoverse designs were expanded last year and now comprise three families of processor designs: the V series, which emphasizes performance; the N series, which is focused on scale-out applications such as cloud infrastructure; and the E series, which is targeted more at edge applications.
SiPearl, the company involved with the European Processor Initiative (EPI) project, is using the Neoverse V1 design, Gorda disclosed. Meanwhile, the N1 design has been used in the Quicksilver and Mystique Altra server chips from Ampere Computing, the startup founded by former Intel executive Renee James. Amazon's Graviton2 chip that powers some AWS EC2 instances uses the N1 core, and the Graviton3 uses the V1 core. Neoverse V series cores also apparently feature in Nvidia's planned Grace chip aimed at supercomputing, and in a server chip being developed by South Korea's Electronics and Telecommunications Research Institute (ETRI).
One of the issues that has hindered Arm in the server market is software support, with many key software packages developed for X86 processor platforms. When asked if all the pieces are now in place to deploy HPC on Arm, Gorda said that in general, the answer is yes.
"The place where you will find some softness is that, while I believe it's accurate to say all of the ISVs have an Arm port in progress, not all of the ISVs are publicly supporting Arm in silicon just yet. So if you're dependent on ISV licenses and software, you will have to poll your ISV to understand the status of things," he explained.
However, Gorda cited the example of the Lustre parallel file system, widely used in HPC environments. There has been Arm support for the Lustre client for many years, but there are very few Arm-based storage servers, Gorda said, and so Lustre server components for Arm are not currently supported by Whamcloud, the division within DDN that oversees Lustre development. This is something Arm is trying to address, he added.
Gorda also pointed out that Arm acquired Allinea Software, a leading provider of software tools for HPC, about five years ago, in order to bolster Arm's HPC software ecosystem support.
Another factor in Arm's favor is greater power efficiency, according to Gorda. This is something that may become more important as supercomputers expand into exascale territory and ongoing energy costs become a greater concern for HPC operators. Although Arm's Neoverse V architecture emphasizes performance rather than power efficiency, the chips based on it still consume less energy than rival X86 processors, according to Gorda.
"The X86 guarantee is that you can run a 286 binary on it, and all of that historical legacy of being a CISC architecture with a RISC underlying it calls for a whole lot of logic up front in decode, reordering, fixing up instructions. All of that is overhead that goes into the chip and consumes energy," he said. "In contrast, you can think of Arm as a clean sheet of paper, to some extent."
Gorda also claimed that end users no longer care what silicon their software is running on, citing the adoption of Arm-powered cloud servers by the likes of AWS.
"There's new big players in town that get to control the architecture. And the things they care about are different than what historically has been cared about. They care about the cost, they care about the energy consumption, they care about turnaround time, and the software stack running on top of things," he said.
"If you take a look at what Amazon's doing with the Graviton2, they talk about it being 40 per cent cheaper. From an end user's perspective, they don't care what the silicon is, they care that it's 40 per cent cheaper, and that the turnaround time is on par with what they're used to."
Earl Joseph, CEO of HPC analyst firm Hyperion Research, said that he expects to see high growth of HPC servers based on Arm processors over the next several years.
"We expect to see a five-year growth rate of over 31 percent, while the base market moves at around 7 percent to 8 percent," Joseph said. That would equate to Arm-based systems accounting for about 10 percent of the HPC market by 2025, he added.
However, Joseph also cautioned that the revenue numbers can be misleading, since massive supercomputer projects can skew the figures, as the close-to-$1 billion Fugaku system did in 2020.
The market can thus shift dramatically due to such large individual installations, and Hyperion Research said it anticipates two European exascale machines based on Arm processors in 2025.
Many forthcoming HPC systems are expected to feature a mix of processors, with Arm and X86 alongside other processor types, he added.
How to Use pCloud in 2022 [Easy Step-by-Step Instructions] – Cloudwards
pCloud is one of the most popular cloud storage services on the market. It's a reputable, secure, speedy and well-designed service, so it's not surprising that it's a magnet for cloud storage newbies. If you're one of them and want to give pCloud a whirl, this guide will show you how to use pCloud and get it up and running.
Lucky for you, pCloud is exceedingly simple to use, so you'll have an easy enough time getting it set up. It has quite a few advanced features, which you can read about in our pCloud review, but they're intuitively integrated into pCloud's applications. Even better, you can try it with a free account. Read on to learn more about how to set up pCloud.
Like all other cloud services, pCloud lets you upload files to the cloud for remote access. It works by taking your files, encrypting them and sending them to remote cloud servers.
pCloud's user interface is very intuitive and easy to use. Everything you need is clearly labeled and easily found in the interface.
You can access pCloud via its website. Once you have an account, you can access your files via the desktop app or the web interface.
pCloud is one of the better-designed cloud services out there, in terms of both appearance and ease of use. The user experience should be smooth for anyone, as uploading files is as simple as dragging and dropping them to your pCloud drive or the website.
Coming up, we'll be covering how to set up pCloud on your computer and go over its various features. We'll go over topics like uploading files and sharing them, setting up file syncing, setting up a backup and using pCloud's Transfer feature.
The main purpose of pCloud, like any other cloud storage, is to store your files securely so that you can access them remotely and share them with others. However, pCloud has a few more tricks up its sleeve that make it stand out from the crowd.
Take the way pCloud handles file syncing. It attaches a virtual drive to your computer's storage, which looks just like your other drives, such as Local Disk (C:). This virtual drive, or network drive, uses your pCloud storage to store files, essentially expanding your hard drive's storage capacity. The downside is that you need to be online at all times to access this virtual drive, but, really, when are you not online?
pCloud also features the most capable media player we've seen in a cloud storage service. It allows you to create playlists and can automatically sort by artist, album or folder. Its video player is also pretty advanced, giving you the ability to adjust the playback speed and even download a converted version of the video file in a different file format.
Now let's talk security. This is one of pCloud's strengths, yet it's probably also the area where pCloud could stand to improve the most. It offers zero-knowledge encryption, a security measure that prevents anyone but you from accessing your files (not even pCloud can see their contents).
Unfortunately, you have to pay extra for this kind of encryption, which takes the form of an add-on called pCloud Crypto. The service creates a Crypto folder where all files are covered by zero-knowledge protection. Thankfully, its privacy policy is clean, and it allows you to store files on European pCloud servers. This ensures you're covered by strong privacy laws, like the GDPR.
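To make the zero-knowledge idea concrete, here is a minimal Python sketch of client-side encrypt-before-upload: the password never leaves the client, so the provider only ever stores ciphertext. This is not pCloud Crypto's actual implementation (its internals are proprietary); the key derivation and the toy SHA-256 keystream below are assumptions for illustration only and must not be used as a real cipher.

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Key derivation happens on the client; the provider never sees the password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream built from SHA-256 (illustration only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(password: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(password, salt)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return salt + nonce + ct  # only this opaque blob is uploaded

def decrypt(password: str, blob: bytes) -> bytes:
    salt, nonce, ct = blob[:16], blob[16:32], blob[32:]
    key = derive_key(password, salt)
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))
```

The point of the sketch is the data flow, not the cipher: because only `encrypt()`'s output ever reaches the server, anyone without the password (including the provider) sees noise.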
Enhanced Security, Streamlined Automation and Deployment Features Shine in Release 6.0 of StarlingX, the Open Source Platform for Edge – PR Web
AUSTIN, Texas (PRWEB) February 01, 2022
StarlingX, the open source edge computing and IoT cloud platform optimized for low-latency and high-performance applications, is available in its 6.0 release today. StarlingX combines Ceph, OpenStack, Kubernetes and more to create a full-featured cloud software stack that provides everything telecom carriers and enterprises need to deploy an edge cloud on a few servers or hundreds of them.
New features in StarlingX 6.0 include:
***Download StarlingX 6.0 at https://opendev.org/starlingx***
"Since StarlingX was first released in 2018, the StarlingX open source community has continued to advance and mature this unique cloud platform that offers high availability and low latency for edge workloads," said Ildiko Vancsa, Senior Manager, Community & Ecosystem for the Open Infrastructure Foundation. "It is exciting to see the community delivering more advanced functionality for a broad variety of edge applications. The sixth release of the project tackles security enhancements and takes crucial steps towards supporting zero touch deployment and management of edge sites on a large scale that delivers tremendous value as users are deploying the platform in production."
Key Features of StarlingX 6.0
To further support the low-latency and distributed cloud requirements of edge computing and industrial IoT use cases, the community prioritized these features in StarlingX 6.0:
Learn more about these and other features of StarlingX 6.0 in the community's release notes.
OpenInfra Community Drives StarlingX Progress
The StarlingX project launched in 2018, with initial code for the project contributed by Wind River and Intel. Active contributors to the project include Wind River, Intel and 99Cloud. Well-known users of the software in production include T-Systems, Verizon and Vodafone. The StarlingX community is actively collaborating with several other groups such as the OpenInfra Edge Computing Group, ONAP, Akraino and more.
Community Accolades for StarlingX 6.0
"The latest release of StarlingX marks another incredible milestone to advance cloud technologies for mission critical industries. The community has seen tremendous growth in commercial adoption and investments across markets by major organizations and contributors. As a strong ongoing supporter of the project and original contributor to the code base, we look forward to continuing our collaboration and delivering expertise for the distributed cloud by drawing from our technologies such as Wind River Studio, as well as collaboration with key initiatives such as O-RAN. We are encouraged to see continued ecosystem development from a thriving open source community," said Paul Miller, Chief Technology Officer, Wind River.
"The StarlingX community is continuously making significant progress. We're excited to see StarlingX 6.0 available with a lot of enhancements and new features. As the 5G era approaches, StarlingX is a key component to meet edge computing requirements. 99Cloud has witnessed and participated in the StarlingX 6.0 release, which brings the maturity of the edge cloud platform to a new stage. As one of the leading contributors to StarlingX, we'll continuously contribute to the community and work with customers and partners to promote StarlingX 6.0 to more commercial deployment," said Shuquan Huang, Technical Director, 99Cloud Inc.
Project Resources
About StarlingX
StarlingX is the open source edge computing and IoT cloud platform optimized for low latency and high performance applications. It provides a scalable and highly reliable edge infrastructure, tested and available as a complete stack. Applications include industrial IoT, telecom, video delivery and other ultra-low latency use cases. StarlingX ensures compatibility among diverse open source components and provides unique project components for fault management and service management, among others, to ensure high availability of user applications. StarlingX is the ready-for-deployment code base for edge implementations in scalable solutions. StarlingX is an Open Infrastructure Foundation project. http://www.starlingx.io
Second Trojan asteroid confirmed to be leading our planet around the Sun – The Register
Scientists have confirmed the discovery of Earth's second Trojan asteroid leading the planet in its orbit around its nearest star.
Dubbed 2020 XL5, the hunk of space rock was discovered in December 2020. Although excitement surrounded the early observations of a second Earth Trojan, low observational coverage meant uncertainties in the data were too great for a scientific confirmation.
Trojan asteroids are small bodies sharing an orbit with a planet, which remain in a stable orbit approximately 60 degrees ahead of or behind the main body, at the L4 and L5 Lagrange points.
Venus, Mars, Jupiter, Uranus, and Neptune all have them, but it wasn't until 2011 that asteroid 2010 TK7 was found to be the first that Earth could lay claim to. Now a second was confirmed this week.
Around 1.18km across (give or take 80m), 2020 XL5 is probably made of carbon and is the larger of the two Earth Trojan asteroids discovered so far, according to the study published in Nature Communications. Both lead our planet in its trajectory around the Sun.
Toni Santana-Ros, a postdoctoral researcher at Barcelona University's Institut de Ciències del Cosmos, and his team used archival data from the Catalina Sky Survey, which revealed promising data from the Mount Lemmon telescope in Arizona, and the online repository of images from the Víctor M. Blanco Telescope in Chile. They combined this data with optical images of 2020 XL5 from 4m-class telescopes: the Southern Astrophysical Research telescope in Chile and the Lowell Discovery Telescope in Arizona.
They also made new observations using the European Space Agency's Optical Ground Station 1m telescope on Tenerife, Spain, watching the skies from February 9 last year until March 16. The integration of the orbit data employed ESA AstOD orbit determination software.
As well as confirming the finding, their study shows the Earth Trojan's orbit is likely to remain stable for at least 4,000 years.
They suggest the object may have been thrown out of the Solar System's main asteroid belt following an interaction with Jupiter, but more work is needed to confirm the idea.
Because it is bigger than its sibling, the newly confirmed space rock may be a better candidate for a future fly-by mission, the researchers suggested.
Another Massive Display as AMD hails ‘outstanding’ 2021, teases Genoa and Bergamo chips – The Register
AMD has hailed 2021 as an "outstanding" year with each of its business units growing significantly, thanks to strong sales of its Epyc server chips and data centre GPUs. The firm is hoping to continue this with its Genoa chips this year and Bergamo in 2023.
In a conference call to disclose AMD's Q4 and year-end financial results, president and CEO Lisa Su said the firm had exceeded its growth goals and delivered a record year. In particular, she claimed that data centre revenue had more than doubled year-on-year.
In servers, Su said revenue had more than doubled year-over-year and increased by a double-digit percentage sequentially, driven by demand across both cloud and enterprise customers. She also picked out data centre graphics revenue as more than doubling year-on-year, driven by HPC wins for AMD's latest Instinct MI200 accelerators, with platforms coming this quarter from Asus, Dell, HP, Lenovo, Supermicro, and others.
"We're still cloud-weighted relative to enterprise. But enterprise has made a really nice progress. It's a sizable business, and we've made progress with the larger OEMs as well as across a number of regional OEMs," Su said.
AMD's computing and graphics segment's revenue, meanwhile, was up 32 per cent to $2.584bn. Su said this was driven by sales of Ryzen processors and Radeon graphics processors. She also noted the "industry has seen some price increases across the supply chain."
When questioned about the company's own pricing strategies by an analyst, Su said that "without a doubt, the predominant growth is products. So it's units and average selling prices from the mix of the product, and that's the predominant growth."
Although data centre is not broken out into a specific business unit at AMD, Su claimed that revenue for data centre products constituted "a mid-20 percentage of overall revenue" for 2021, and indicated that the firm expected 2022 to be another year of growth based on signals it was getting from customers for current and next-generation products.
"Demand for our product is very strong, and we look forward to another year of significant growth and share gains as we ramp our current products and launch our next wave of Zen 4 CPUs and RDNA 3 GPUs. We have also made significant investments to secure the capacity needed to support our growth in 2022 and beyond," the CEO said.
Further to the supply chain issues, Su said that AMD has made significant investments in wafer capacity as well as substrate capacity, adding: "We feel very good about our progress in the supply chain to meet the 2022 guidance." Looking ahead, Su said that AMD is already sampling its Genoa Epyc processors to customers now and is on track to launch later this year, while shipments of the Bergamo chips are planned to follow in the first half of 2023.
Genoa is set to feature up to 96 Zen 4 cores and next-generation memory and I/O technologies, according to AMD, while Bergamo features a version of the Zen 4 core called Zen 4c that has been specifically optimised for cloud-native computing.
"Bergamo is a high core count, power-efficient CPU that can be used in the same platforms as Genoa. It will feature up to 128 CPU cores and deliver significant performance and power efficiency advantages for cloud workloads," Su claimed.
AMD also recently got clearance from the Chinese regulatory authorities for its planned takeover of FPGA maker Xilinx. Su said that she was "extremely excited about Xilinx" and the combination of AMD and Xilinx technology, saying that the firm has been planning for the integration and has had interest from customers anxious to talk about combined road maps.
Su hinted that there was an opportunity for edge deployments in communications and 5G networks, saying: "As we bring Xilinx into the equation, they have very deep relationships with a number of these accounts. And so we see that as an incremental positive as we think about EPYC in communications."
FPGAs have been finding new uses in the data centre over recent years, as accelerators for AI processing or as part of SmartNICs, and rival Intel has even offered Xeon chips combined with an FPGA for select customers.
For the longer term, Su expressed confidence in AMD's future, based on its roadmap and the commitments it has from customers.
"We are confident in our ability to continue growing significantly faster than the market, based on our expanded roadmap investments and the deep relationships we have established with a broad set of customers who view AMD as a strategic enabler of their success," she said.
How to deliver the benefits of the public cloud experience on-premise – Information Age
In the hybrid cloud era, CIOs are increasingly targeting cloud-like infrastructure in the enterprise, or on-premise
Some organisations need the benefits of the public cloud in an on-premise environment.
The DCIaaS (Dedicated (Local) Cloud Infrastructure-as-a-Service) market will be worth $14 billion by 2025, according to IDC.
ESG Research estimates that 60% of those who have moved to the cloud are now looking at bringing certain workloads back to the enterprise, and the report adds that 46% will invest in bringing the cloud experience on-premise. This is because there are datasets that don't belong in the public cloud, such as servers in manufacturing or 5G towers, or in hospitals and banks. Regulation requirements and security challenges mean that this data must be stored on-premise or on the edge of a network.
There are also huge cost savings that can be achieved by moving workloads back to the enterprise.
Speaking during The IT Press Tour in San Francisco, Siamak Nazari, CEO and co-founder of Nebulon, compared the experience to renting a car. Deploying all workloads to the cloud is viable and cost-effective in the short term, as it helps companies scale, but in the long term this is an expensive route. He did note that there were workloads that absolutely have to be stored on the public cloud.
Research from Andreessen Horowitz says that enterprises can save 50% in costs as a result of cloud repatriation. For example, significant savings can be achieved by moving potentially thousands of enterprise virtual machines back on-premise, since they consume energy, and incur cost, even when idle. When these machines are all turned on at once, this intensifies the cost problem.
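A back-of-the-envelope calculation illustrates why idle cloud VMs add up. All figures below are hypothetical, chosen for round numbers, and are not taken from the Andreessen Horowitz report:

```python
# Hypothetical fleet, purely for illustration (integer cents avoid float drift).
vm_count = 1_000
hourly_rate_cents = 10        # $0.10 per VM-hour, on-demand (assumed)
hours_per_year = 24 * 365     # 8,760
busy_percent = 25             # VMs doing useful work a quarter of the time

# On-demand VMs bill at the same rate whether busy or idle.
annual_bill = vm_count * hourly_rate_cents * hours_per_year // 100   # dollars
idle_spend = annual_bill * (100 - busy_percent) // 100

print(f"annual cloud bill: ${annual_bill:,}")   # $876,000
print(f"spent while idle:  ${idle_spend:,}")    # $657,000
```

Under these assumed numbers, three quarters of the bill pays for machines doing nothing, which is the spend that repatriation (or simply switching idle VMs off) targets.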
The challenge is producing the public cloud experience on-premise: something agile, elastic and flexible that delivers hyperscale infrastructure.
"Enterprises have to adopt a cloud model in the data centre to stay competitive, but they can't; the technology isn't mature," said Nazari.
The standard AWS cloud operation platform for hyperscale infrastructure consists of three areas, which need to be added to for the cloud-experience to be viable in the enterprise.
1. AWS management console: automation, non-disruptive ops and instant updates.
Enterprises need to add Zero Trust to the SaaS delivery model, which is beyond what cloud providers can offer.
2. AWS nitro system: offloaded enterprise data services, secure platforms and isolated fault domains.
On-premise, these services need to be available with any supplier of choice, for the flexibility that cloud providers can't offer.
3. Amazon machine images: consistent deployments, consistent maintenance, and application variety.
Immutable instances or updated servers need to be available from day one, which gives the enterprise the agility and variety necessary to work on the fly.
This equips customers to operate their on-premises infrastructure like a hyperscale public cloud.
A key inhibitor to delivering these IaaS cloud efficiencies on-premises is overcoming the manual, server-by-server deep infrastructure operations in public and private environments.
Nebulon's server-embedded infrastructure software, delivered as a service and implemented on a dedicated card that functions as an IoT endpoint, allows the enterprise to manage deep infrastructure operations such as BIOS updates, SSD firmware updates and component health monitoring.
As an example of the service in action, the COO and co-founder, Craig Nunes, provided a customer case study.
He said: "In its current estate the customer has 10 data centres, 3-tier infrastructure with 250 VM servers and 60 arrays, the equivalent of approximately 2 petabytes of storage."
The project involved decommissioning the entire current environment and replacing it with Nebulon-enabled servers and Nebulon's infrastructure operations SaaS.
"We deployed nearly 100 enabled servers and over a petabyte of storage across four centres."
"The benefits they saw were 40% savings on infrastructure, higher margins on existing services and new customer acquisition benefits."
Cloud computing and high-speed data transfer: the top technology trends for 2022 – theloadstar.com
The rise of cloud computing services is set to change the investment criteria for freight and logistics operators.
Increasing numbers of as-a-service options are transforming digitisation from a capex requirement to an operating expense (opex).
A recent report on the top technology trends for 2022 by Transport Intelligence (Ti) argues that market growth is increasingly enabled by high-speed internet access reducing the time delay in business communications.
And ever-faster connections gave rise to cloud computing: the ability to store and exchange data with server farms anywhere, rather than companies having to establish their own server and data storage facilities.
The supply chain parallel would be just-in-time delivery, and the technology is lowering entry costs and time-to-market for new businesses.
Ti has termed this trend "opex not capex", where software and the processing power required to run it are rented rather than bought or licensed, leading to an explosion in online services and solutions priced similarly to the business models of the old phone companies.
Report author Ken Lyon writes: "It enables many small companies to enter the market and service customers from a much lower cost base; this makes future competition about ideas, innovation and execution."
He adds that hardware has transitioned "from the large expensive mainframe units and servers that resided within organisations, into the enormous (and invisible) server farms, operated by the cloud service vendors."
This theoretically enables much more energy-efficient computing as well; a server farm is more likely to have the critical mass to be able to switch to renewable energy, whereas a server cabinet in an office could not.
For a server farm, energy costs are the main consideration, and many have been set up in Iceland, Norway and Sweden, where hydro-electric or geothermal energy sources are plentiful and colder ambient temperatures reduce the load on cooling systems, helping provide low costs and a competitive advantage.
The Loadstar recently reported on FourKites' acquisition of German carrier-facing service provider NIC-place, and Ti argues this may be the first of many such developments, as consensus arises on standards for information-sharing between platforms.
Application programming interfaces (APIs) are now the standard means of sharing information easily between systems and services; this will continue, and these gateways will become easier to implement.
Increasingly, these types of communications will be automated, machine-to-machine, a pattern frequently referred to as the internet of things (IoT).
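A deliberately simplified sketch of such machine-to-machine exchange: one system serialises a shipment status event as JSON and another parses it, with no human in the loop. The schema, field names and shipment ID here are hypothetical, invented for illustration, not any particular platform's API.

```python
import json
from datetime import datetime, timezone

def build_status_event(shipment_id: str, status: str, lat: float, lon: float) -> str:
    # Payload a telematics device might POST to a visibility platform's
    # REST endpoint (hypothetical schema, for illustration only).
    event = {
        "shipment_id": shipment_id,
        "status": status,
        "position": {"lat": lat, "lon": lon},
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "source": "iot-sensor",
    }
    return json.dumps(event)

# A consuming system parses the same JSON automatically:
payload = build_status_event("SHP-1042", "in_transit", 51.95, 4.14)
event = json.loads(payload)
print(event["shipment_id"], event["status"])
```

In practice the producing side would send this payload over HTTPS with authentication, but the essential point is the shared, machine-readable format that both ends agree on.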
The often hysterical universe of cryptocurrencies and non-fungible tokens masks the usefulness of the technologies behind them, which could be very significant for supply chain visibility. "The notion of an intelligent network that can react to alarms or alerts and swiftly replan and reschedule actions without human intervention is compelling," says the report.
This could provide huge opportunities for supply chain providers, but tempered with a need for greater vigilance and cybersecurity.
But, cybercrime notwithstanding, these developments will lead to better access to data for both providers and their customers, as standardisation enables better integration; decreased time to market, as remote servers drive down the cost of computing; and more nimble businesses, as autonomous systems develop the ability to fix themselves.
However, it takes time to adopt new technologies, especially within large organisations, Mr Lyon notes. "But the demands to adopt new technology will not diminish, so companies must learn to adapt and become more open-minded about the choices that they will need to make."
"This is especially difficult for senior staff who have built career paths around expectations which are no longer relevant," he warns.
Cloud Based Contact Center Market Projected to Surpass USD 45.5 Billion by 2030 with a CAGR of 24.8% – GlobeNewswire
New York, USA, Jan. 31, 2022 (GLOBE NEWSWIRE) -- Market Overview: According to a comprehensive research report by Market Research Future (MRFR), "Cloud Based Contact Center Market Information by Solution, by Vertical, by Application and Region - Forecast to 2030", the market size will reach USD 45.5 billion by 2030, growing at a compound annual growth rate of 24.8%.
Market Scope: The increased use of cloud-based contact centers by different industry verticals, such as healthcare and life sciences, government and public sector, consumer goods and retail, and BFSI, will offer robust opportunities for the market over the forecast period.
Other factors driving market growth include the ability of cloud-based contact centers to track real-time administration metrics through a customizable control panel; growing awareness of features such as auto dialer, real-time monitoring, automatic call distribution (ACD), call center reports, interactive voice response (IVR), omni-channel support, and call center integration; and the growing need for cross-channel communication solutions.
Dominant key players covered in the cloud-based contact center market:
Get Free Sample PDF Brochure: https://www.marketresearchfuture.com/sample_request/6358
Market USP Exclusively Encompassed:
Market Drivers: Growing Need for Cloud Computing to Boost Market Growth. The growing need for cloud computing will boost market growth over the forecast period, thanks to its convenience, flexibility, affordability, and robust scalability. Organizations are therefore increasingly looking to migrate their contact center operations from the traditional on-premise model to the cloud.
Cyber-attacks Impacting Business Operations to Act as Market Restraint. In the current digital world, enterprises store sensitive data that has become a key target for cybercriminals, and contact centers are no exception: because they routinely collect and store large volumes of customer information, they are attractive targets for attack.
High Initial Investment to Act as Market Challenge. The high initial investment and the dearth of trained, skilled professionals may act as market challenges over the forecast period.
Browse In-depth Market Research Report (100 Pages) on Cloud Based Contact Center Market: https://www.marketresearchfuture.com/reports/cloud-based-contact-center-market-6358
Segmentation of Market Covered in the Research: The global cloud-based contact center market has been segmented based on vertical, deployment model, organization size, services, and solution.
By solution, interactive voice response and automatic call distribution will lead the market over the forecast period as organizations focus on streamlining and automating massive call volumes cost-effectively and efficiently.
By services, the global cloud-based contact center market has been segmented into managed services and professional services.
By organization size, large enterprises will spearhead the market over the forecast period.
By deployment model, the public cloud segment will have the lion's share of the market over the forecast period, owing to the low cost associated with public deployment.
By industry, the IT and telecommunication segment will command the market over the forecast period. This will be followed by the BFSI sector as most financial institutions are using cloud-based solutions for making the facilities convenient. The banking sector has turned digital with the growing adoption of cloud platforms.
Regional Analysis: North America to Reign Cloud Based Contact Center Market. North America will lead the cloud-based contact center market over the forecast period. Growth drivers in the region include the presence of several key vendors, among them influential and innovative players such as Cisco Systems Inc., Microsoft Corporation, and Oracle Corporation, and their increasing investments in R&D; growing recognition and adoption of cloud-based solutions and associated services; the adoption of favorable technologies like the IoT; the availability of cloud-based contact centers at lower prices from key vendors; large-scale digitization initiatives; a growing number of startups; organizations increasingly migrating their business operations to the cloud; and the rising trend of remote working, which fuels the adoption of cloud-based contact centers.
Talk to Expert: https://www.marketresearchfuture.com/ask_for_schedule_call/6358
APAC to Have Significant Growth in Cloud Based Contact Center Market. The APAC region will see significant growth in the cloud-based contact center market over the forecast period. Contributing factors include dramatic growth in the data center business, a rising number of startups, customers' willingness to adopt new technologies, significant adoption of cloud-based solutions, ongoing digitalization, new entrants in the cloud contact center market (particularly in Australia) that are driving more established on-premises players to aggressively roll out cloud-based solutions, the availability of cloud-based contact centers at an affordable cost from the region's leading vendors, and large-scale digitalization initiatives.
COVID-19 Impact on the Global Cloud Based Contact Center Market: Organizations chose cloud-based contact center solutions to manage contact center operations during the COVID-19 outbreak, allowing daily operations to be executed remotely. The need for cloud-based contact center solutions surged in 2020 owing to the perks they offer, such as low setup cost, scalability, and flexibility. Most organizations across industries switched to remote working to safeguard the well-being of employees while maintaining operational efficiency, boosting the need for cloud-based solutions. Organizations that had already shifted their contact center operations to the cloud maintained business continuity more easily during the pandemic. Enterprises increasingly prefer cloud-based contact centers for handling outbound and inbound customer communications effectively and for the flexibility to manage customer service operations through a remote workforce.
Competitive Landscape: The cloud-based contact center market is both competitive and fragmented on account of the presence of several international and domestic industry players. These players have pursued an assortment of innovative strategies to stay at the top and meet the burgeoning requirements of their clients, including geographic expansions, collaborations, joint ventures, new product launches, partnerships, and contracts. The players are also investing in various research & development activities.
Related Reports: Hyperscale Data Center Market Research Report: by Component (Servers, Networking, Software, Storage and others), by Services (Consulting, Installation and Deployment, and Maintenance and Support), by Type (Cloud Providers, Colocation Providers, Enterprises), by Tier Type (Tier 1, Tier 2, Tier 3 and Tier 4), by Design Type (Electrical Construction and Mechanical Construction) and by Region (North America, Asia-Pacific, Europe, Middle East and Africa and South America) - Forecast till 2027
Data Center Cooling Market, By Components (Chillers, Economizer, Server Cooling), Cooling Type (Liquid, Air), Service (Professional Service, Managed Service), Organization Size (SMEs, Large Enterprises), Vertical (BFSI, Energy) - Forecast till 2027
Data Center Security Market: by Component (Solution, Services), Data Center Type (Mid-Sized Data Center, Enterprise Data Center, Large Data Center), Vertical (BFSI, Healthcare, IT & Telecommunication, Media & Entertainment) Forecast to 2027
Data Center Power Market Report Information by Component (Solution, Service), Data Center Size (Small & Medium Size, Large Size Enterprise Size Data Center), End-User (IT & Telecommunication, BFSI, Government, Power & Energy) Global Forecast till 2027
About Market Research Future: Market Research Future (MRFR) is a global market research company that takes pride in its services, offering complete and accurate analysis of diverse markets and consumers worldwide. MRFR has the distinguished objective of providing optimal-quality, granular research to clients. Our market research studies, covering products, services, technologies, applications, end users, and market players for global, regional, and country-level market segments, enable our clients to see more, know more, and do more, helping answer their most important questions.
See the rest here:
Cloud Based Contact Center Market Projected to Surpass USD 45.5 Billion by 2030 with a CAGR of 24.8% - GlobeNewswire
Is Palantir the AWS of Data Analytics? – Motley Fool
It comes as no surprise to anyone in the business world that companies are increasingly relying on data for everything they do. According to IDC, more than 80% of an organization's data will be unstructured by 2025. This means that customer records or important legal documents are siloed in different systems, which presents a challenge for traditional data solutions.
As a result, management could spend years working with high-cost consultants attempting to build in-house solutions. Fortunately, Palantir Technologies (NYSE:PLTR) develops a software-as-a-service (SaaS) platform called Foundry. Unlike much of its competition, Foundry provides multiple capabilities as part of a unified platform. This helps prevent the risks that come with disparate technologies and ensures a seamless experience in which data history, security, and privacy are protected.
Amazon (NASDAQ:AMZN) also recognized early on that data was becoming more important for decision-makers within organizations. Just as Amazon identified that aggregating and storing data was going to become a problem, Palantir has identified that unifying and interpreting this data is becoming increasingly challenging as companies acquire more customers and invest in new operating systems.
Image source: Getty Images.
Following its IPO in September 2020, Palantir received a lot of media coverage. It even briefly became a meme stock, and by January 2021, shares had skyrocketed over 250% from its IPO price.
Throughout 2021, Palantir slowly came back to Earth as investors started forming a better understanding of its business. Palantir reported $392 million in revenue during Q3 2021, representing 36% year-over-year growth and 203 total customers. Although this level of growth is impressive, the company has been criticized by Wall Street for being heavily reliant on government contracts, leading some to fear that Palantir will struggle to expand into the private sector in a meaningful way. By comparison, competing software analytics platform Snowflake reported over 5,400 customers, 173% net revenue retention, and revenue growth of 110% year over year for the fiscal quarter ended Oct. 31, 2021.
Although Snowflake and Palantir experienced heavy sell-offs during the final months of 2021, Snowflake currently trades at 106 times its trailing-12-month sales compared to Palantir's 25 times. Despite committing to 30% revenue growth each year over the next four years, the pace at which Palantir is growing compared to peers and its path to penetrate the private sector remain legitimate investor concerns.
As a company acquires more customers, it inherently collects more data, which can be used to hone existing product features. In the early 2000s, Amazon understood that data would become an important pillar for companies and that physical servers could be virtualized, a dynamic now referred to as cloud computing. Cloud computing allows businesses more flexibility and scalability than physical servers, and is often a more cost-effective solution.
In 2006, Amazon made its first strides in server virtualization by introducing a cloud hosting solution called Amazon Web Services (AWS). Perhaps even more innovative was the way Amazon commercialized this product: it drove awareness in the marketplace by introducing a certification program. As businesses increasingly require candidates to be experts in areas such as digital advertising, customer relationship management (CRM), and data analytics, certifications give employers a consistent standard across the employee base for teams that must be well-versed in these software products.
Public health agencies are already leveraging Foundry to create unified catalogues of data to power response workflows including tracking and analyzing the spread of COVID-19and coordinating with hospitals and medical supply manufacturers. Moreover, Palantir is used by financial institutions to combat crimes such as money laundering. Although it is encouraging to see Foundry used for critical business operations, it could be argued that its use cases have much more room to grow across these organizations, rather than combating one specific problem.
In mid-January, Palantir announced that it would be rolling out a certification program for Foundry. The certification program is an important step in an ongoing effort to support Palantir customers in unlocking value from the software. The company noted in the press release that "with its certification program, Palantir is accelerating and expanding the productization of its approach." This means that Palantir is looking to generate network effects within a company through its certification program. As more professionals become proficient in Foundry software, the platform could become more ingrained into the fabric of global operations.
In effect, tools such as AWS or Foundry can become vertically integrated across different product groups and functions in a company. This methodology has helped Amazon sell AWS to millions of customers and grow it into a $50 billion run-rate revenue business. By comparison, Palantir has only guided for revenue of $1.5 billion for calendar 2021, but Palantir is looking to replicate the AWS blueprint as it seeks to penetrate private sector growth. Just as AWS has become a market leader in the cloud, Palantir has a huge opportunity as it looks to build the central operating system for big data.
Amazon used its AWS certification program to create more awareness in the marketplace. In turn, it was able to learn the pain points of businesses of all sizes and create numerous applications for those organizations. The primary goal of Foundry is that it will have such a profound impact within an organization that its use cases begin to compound. In effect, Foundry would become the central nervous system and backbone for data as artificial intelligence and machine learning become more mainstream for companies. Instead of selling Foundry to one organization for one particular use case, Palantir can accelerate and expand the use of its software as more people become certified across different functions throughout a company. The certification program could be a lucrative driver of future business for Palantir, especially as the company seeks to grow beyond its government segment, thereby fueling long-term momentum for the stock.
This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.
More here:
Is Palantir the AWS of Data Analytics? - Motley Fool
iXsystems Outperforms in 2021 with 70% Year-over-Year Growth of TrueNAS Open Storage Deployments – PR Web
"The commitment of our team has been exceptional in supporting customers and the important data storage challenges we address" (Michael Lauth, President and CEO, iXsystems)
SAN JOSE, Calif. (PRWEB) January 31, 2022
iXsystems today announced new growth milestones over the past 12 months with a 70% Year-over-Year revenue increase. The impressive growth is due to rising adoption of TrueNAS Open Storage software and applicable storage systems, including TrueNAS M-Series, TrueNAS R-Series, TrueNAS X-Series, and TrueNAS Mini storage systems.
While the enterprise storage industry is seeing a rise in demand, iX significantly outperformed the market in 2021. Led by TrueNAS Enterprise, the only Open Source unified software-defined storage to provide business-grade capabilities, customers are choosing the platform for its full-featured, unified (block/file/object) storage for both flash performance and disk capacity. Now with over 1.1 million deployments and a new milestone of two exabytes of data under management, TrueNAS has delivered true storage freedom to thousands of organizations around the globe.
Global market revenue for enterprise external OEM storage systems grew 9.7% year over year to $6.9 billion, International Data Corporation (IDC) reported in the company's Worldwide Quarterly Enterprise Storage Systems Tracker. Total external OEM storage capacity shipped was up 27.9% year over year to 22.1 exabytes during the quarter.
Significant iXsystems 12-month milestones achieved include:
- 70% year-over-year total sales growth
- 54% year-over-year international sales growth
- 146% growth in new TrueNAS Enterprise deployments over one petabyte
- More than 500,000 TrueNAS software downloads
- Partner-generated channel sales YoY growth of 152%
- Partner deal registration growth of 154% YoY
- TrueNAS SCALE introduced
- TrueCommand Cloud launched
- TrueCharts introduced, a community catalog of apps for TrueNAS SCALE
- RevMatch Channel Partner Program opened
- Ranking among Top Five SDS Block Storage Solutions by DCIG
- Winner of 2021 Best in Biz Awards
"We are pleased with the company's momentum and technology milestones achieved over the past year," said Michael Lauth, President and CEO for iXsystems. "The commitment of our team has been exceptional in supporting customers and the important data storage challenges we address."
Tweet This: @iXsystems Post 70% Year-over-Year Growth Driven by TrueNAS Enterprise Storage - https://www.ixsystems.com/press-releases/
Additional Resources: To learn more about TrueNAS Open Storage, visit: https://www.truenas.com Follow TrueNAS News on Twitter at: http://twitter.com/iXsystems
About iXsystems and TrueNAS: Through decades of expertise in system design and development of Open Source software (FreeNAS, FreeBSD, OpenZFS, and TrueNAS), iXsystems has become an innovation leader in high-availability storage and servers powered by Open Source solutions. With over one million deployments and backed by the legendary ZFS file system, TrueNAS offers the stability and reliability required for Backup, Multimedia, Cloud Hosting, Virtualization, Hyper-converged Infrastructure, and much more. Since the founding of iXsystems in 2002, thousands of companies, universities, and government organizations have come to rely on the company's enterprise servers, TrueNAS Open Storage, and consultative approach to building IT infrastructure and Private Clouds with Open Source economics.