Category Archives: Cloud Computing

How the edge and the cloud tackle latency, security and bandwidth issues – Information Age

Mark Seddon, CEO of Pact Global, discusses how edge computing is complementing the cloud to tackle latency, security and bandwidth issues

Cloud and edge can work together to increase data efficiency.

Edge computing, an IT deployment designed to put applications and data as close as possible to the users or things that need them, is best understood through its use in the Internet of Things (IoT), as it is the IoT that has produced the need for it. Simply put, the IoT is all the physical objects that connect to the internet and exchange data: thermostats, security cameras, fridge freezers, Alexa, Google Home and even vehicles. Just as the need for increased data storage by individuals and companies created the need for the vast centralised storage capabilities of the cloud, the IoT has created a need for a faster, more secure way to use the same data while using less bandwidth.

The move from personal computing to cloud computing has seen massive amounts of data sent to and stored in huge server farms, largely owned by Google, Amazon, Microsoft and IBM. In order to use cloud data, it must be accessed, processed and analysed before being returned for use. A useful analogy is the home assistant. When you ask Google Home what the weather is going to be like, it processes your speech and sends a compressed version to the cloud, where it is decompressed and processed, perhaps calling an API to fetch the forecast, and the answer is returned to your device. This round-trip data usage creates three main issues: latency, security and bandwidth.

With the rise of the IoT, edge computing is rapidly gaining popularity as it solves the issues the IoT has when interacting with the cloud. If you picture all your smart devices in a circle, the cloud is centralised in the middle of them; edge computing happens on the edge of that cloud. The "edge" refers literally to geographic location: edge computing happens much nearer the device or business that is transmitting the data. These computing resources are decentralised from data centres; they are on the edge, and it is here that the data gets processed. With edge computing, data is scrutinised and analysed at the site of production, with only relevant data being sent to the cloud for storage. This means far less data is sent to the cloud, reducing bandwidth use; privacy and security are handled at the site of the device, making hacking a device much harder; and the speed of interaction with data increases dramatically.
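
To make the "only relevant data leaves the site" idea concrete, below is a minimal sketch, in Python, of the kind of logic an edge gateway might run: a batch of sensor readings is summarised and filtered locally, and only the compact result is forwarded to the cloud. The endpoint URL, threshold and field names are illustrative assumptions rather than any particular vendor's API.

```python
import json
import statistics
import urllib.request

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # hypothetical ingestion endpoint
TEMP_LIMIT_C = 30.0                                  # illustrative anomaly threshold

def summarise_and_filter(readings):
    """Process a batch of readings at the edge; keep only what is worth sending."""
    summary = {
        "count": len(readings),
        "mean_temp_c": round(statistics.mean(r["temp_c"] for r in readings), 2),
        "max_temp_c": max(r["temp_c"] for r in readings),
    }
    anomalies = [r for r in readings if r["temp_c"] > TEMP_LIMIT_C]
    return {"summary": summary, "anomalies": anomalies}

def send_to_cloud(payload):
    """Forward the filtered payload only, keeping bandwidth use low."""
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # error handling omitted for brevity

if __name__ == "__main__":
    batch = [{"sensor": "s1", "temp_c": 21.4}, {"sensor": "s1", "temp_c": 33.9}]
    send_to_cloud(summarise_and_filter(batch))
```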

While edge and cloud computing are often seen as mutually exclusive approaches, larger IoT projects frequently require a combination of both. Take driverless cars as an example. If information from all car sensors had to be sent to the cloud for processing and returned for action, the limits of the network, the exposure to hacking and the delay in response would all make self-driving cars unfeasible. By combining cloud and edge computing, the user has no responsibility for the software being run and the sensors work in real time, while the car still uses centralised data to receive updates and sends processed data back to improve the algorithms.

Despite the maturity of the market, businesses are only recently realising how the IoT will help automate and refine the services they provide. Edge computing and the cloud benefit business in a number of ways: time saved analysing data, downsized storage volumes, and ease of complying with security and data privacy regulations such as the General Data Protection Regulation (GDPR), to name just a few. As the digital and real worlds converge and experiences become increasingly immersive, the proliferation of data being collected is unparalleled and will continue to grow.

Insurance is perfectly positioned to be the first major industry to benefit from the combination of edge computing and the cloud, to create the most immersive user experience with IoT. As an example, home insurance has always been reactive rather than proactive and relied upon users to report a claim to a third party via the phone. Additionally, this data is rarely used to calculate accurate risk. However, with the rise of smart home and IoT devices, home insurers are able to flip this model on its head.

With the use of IoT devices, customers' homes can provide data that can be leveraged to calculate risk more comprehensively, while smart security systems and sensors can be used preventatively in the home, sending both homeowners and their insurers alerts when something isn't right. Flood damage is one of the leading causes of home insurance claims and the perfect example of this. Leak sensors deployed near potential leak areas such as boilers, sinks and washing machines can send notifications to avert a damaging event. If a pipe bursts when no one is home, the sensor can command a smart valve to shut the water off, minimising damage, notifying the customer and minimising the claim.
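
As an illustration of how such a rule can run entirely at the edge, with no cloud round trip before the valve closes, here is a short hedged sketch in Python. The sensor event, valve and notifier interfaces are hypothetical placeholders; a real deployment would use the device vendor's own APIs and a proper event loop.

```python
from dataclasses import dataclass

LEAK_THRESHOLD = 0.6  # illustrative moisture level, 0.0 (dry) to 1.0 (saturated)

@dataclass
class LeakEvent:
    location: str          # e.g. "boiler", "kitchen sink", "washing machine"
    moisture_level: float  # reading from the leak sensor

def handle_reading(event: LeakEvent, valve, notifier) -> bool:
    """Edge rule: shut the water off and raise alerts if a leak is detected.

    `valve` and `notifier` stand in for vendor-specific device APIs.
    Returns True if a leak response was triggered.
    """
    if event.moisture_level < LEAK_THRESHOLD:
        return False              # nothing interesting; nothing needs to go upstream
    valve.close()                 # act locally first, with no cloud latency involved
    notifier.alert_homeowner(f"Possible leak detected near the {event.location}")
    notifier.alert_insurer(event) # insurer sees both the event and the mitigation
    return True
```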

For home insurers, partnering with IoT companies can lead to better relationships with customers and improve the understanding of risk specific to each customer. This understanding of the true risks will enable insurers to reduce premiums.

There is no doubt that the future of IoT for insurance is the combination of both edge and cloud computing, utilising the major advantages of each; maximising the benefits for their customers by minimising damage and providing more accurate premiums, whilst minimising risk and claims costs for companies.

Cloud Computing in K-12 Market -Industry Trends, Opportunities and Forecasts to Global Industry Analysis, Size, Share, Growth, Trends, and Forecast…

The Global Cloud Computing in K-12 market report defines the vital growth factors, opportunities and market segments of the top players during the forecast period from 2019 to 2025. The report offers a complete market outlook and development rate across the past, the present and the forecast period; with concise study, it effectively defines the market value, volume, price trend, and development opportunities. Comprehensive, versatile and up-to-date information on the Cloud Computing in K-12 market is provided in this report.

The latest research report on the Cloud Computing in K-12 market encompasses a detailed compilation of this industry and a credible overview of its segmentation. In short, the study incorporates a generic overview of the Cloud Computing in K-12 market based on its current status and market size, in terms of volume and returns. The study also comprises a summary of important data regarding the geographical terrain of the industry, as well as the industry players that appear to have achieved a powerful status across the Cloud Computing in K-12 market.

For more insights into the Market, request a sample of this report (Including Full TOC, List of Tables & Figures, Chart) @ https://www.marketresearchhub.com/enquiry.php?type=S&repid=2743692&source=atm

Market segment by Type, the product can be split into: SaaS, IaaS, PaaS. Market segment by Application, split into: Training & Consulting, Integration & Migration, Support & Maintenance.

Market segment by Regions/Countries, this report covers: North America, Europe, China, Japan, Southeast Asia, India, Central & South America.

For Information On The Research Approach Used In The Report, Ask to Our Industry [emailprotected] https://www.marketresearchhub.com/enquiry.php?type=E&repid=2743692&source=atm

Complete Analysis of the Cloud Computing in K-12 Market:

Comprehensive, quantifiable analysis of the industry is provided for the period 2019-2025 to help investors capitalize on the essential market opportunities.

The key findings and recommendations highlight vital progressive industry trends in the global Cloud Computing in K-12 market, thereby allowing players to develop effective long-term strategies.

A complete analysis of the factors that drive market evolution is provided in the report.

Opportunities in the market for stakeholders are analyzed by categorizing the high-growth segments of the market.

The numerous opportunities in the Cloud Computing in K-12 market are also given.

Note: Our analysts monitoring the situation across the globe explain that the market will generate remunerative prospects for producers after the COVID-19 crisis. The report aims to provide an additional illustration of the latest scenario, the economic slowdown, and the impact of COVID-19 on the overall industry.

This detailed report on the Cloud Computing in K-12 market largely focuses on prominent facets such as product portfolio, payment channels, service offerings, applications, and technological sophistication. The report lends versatile cues on market size and growth traits, and also offers an in-depth section on opportunity mapping as well as barrier analysis, thus helping report readers pursue growth in the global Cloud Computing in K-12 market.

You can Buy This Report from Here @ https://www.marketresearchhub.com/checkout?rep_id=2743692&licType=S&source=atm

Furthermore, the following points on the global Cloud Computing in K-12 market are covered, along with a detailed study of each:

Production in the global Cloud Computing in K-12 industry is examined by application, type and region, with a price analysis of the players covered.

Revenue and sales are projected for the Cloud Computing in K-12 market, and these and other essential facets are assessed in this section for the foremost regions.

Continuing from earnings, this section studies consumption in the global Cloud Computing in K-12 market. It also sheds light on the gap between consumption and production, and export and import data for Cloud Computing in K-12 are provided in this part.

In this section, key players are studied on the basis of their product portfolio, company profile, volume, price, and earnings in the Cloud Computing in K-12 market.

Beyond the market analysis, information on supply and contact details for manufacturers, consumers and providers are also presented. Additionally, a feasibility study for new investment and a SWOT analysis for new ventures are included.

For More Information Kindly Contact:

marketresearchhub

90 State Street,

Albany NY,

United States 12207

Tel: +1-518-621-2074

USA-Canada Toll Free: 866-997-4948

Email: [emailprotected]

Will 5G Networks Move To Open RAN? – Forbes

Samsung Networks and Verizon announced the first fully virtualized, end-to-end 5G data connection, highlighting the growing opportunity for Open RAN technology in 5G networks.

It is not an easy task to make sense of 5G wireless networks. Not only are there a wide variety of complicated individual pieces needed to create a network, there's also the manner by which those pieces interconnect, the software used to control them, the standards and protocols they need to support, and, well, yeah, it's a lot.

Of course, most people don't really have to worry about how a wireless network of any kind works, as long as it does what it's supposed to do. However, if you want to make sense of the growing applications for 5G and what may or may not be possible at any given time or place, then you should understand at least some of the basic principles behind how it works.

One of the most interesting capabilities promised for 5G, for example, is network slicing. The idea with network slicing is that you can virtualize and then isolate a given set of data traffic or a specific application all within a single shared physical network. So, for example, if you want to separate time-sensitive IoT traffic for manufacturing sensors from the data traffic on a consumer's smartphone, or even provide a private VPN-style connection for a company's employees, network slicing can provide this and much, much more.

In order to do these types of tasks, 5G networks need more compute power and more sophisticated software than previous generations of wireless networks have required. As a result, there's been a lot of discussion about changing the overall architecture of the infrastructure used to power 5G networks. Traditionally, most of the equipment used in cellular wireless networks comes from a few dedicated vendors, notably Ericsson, Nokia and, more recently, Samsung's Networks division. (Huawei has also been a significant player in network equipment infrastructure, but recent geopolitical challenges have begun to reduce its impact outside of China.)

For decades, the traditional network equipment vendors have primarily made very sophisticated, but essentially proprietary, pieces of equipment that are used at the heart of cellular networks around the world. Over the last few years, however, there have been industry-wide movements to open up these proprietary boxes, particularly because of the expanding compute requirements for 5G. In particular, there's been a growing call to open up the RAN, or radio access network, portion of the network infrastructure. The RAN is the part of the mobile network that sits between your smartphone or any other end-user device (including things like IoT sensors, connected cars, etc.) and the core telecom network. The RAN handles the reception and transmission of the wireless signals used to communicate between devices and the main network (and then out to the cloud), so it plays a critical role for applications like network slicing.

The reason for this interest in moving to more flexible, open standards is, in many ways, analogous to the move to virtualization in servers and then the migration to cloud computing models. More sophisticated application demands, along with greater overall demand for bandwidth, have put traditional telecom markets under a great deal of strain. The ability to virtualize and share resources more efficiently, along with standardization around more general-purpose hardware, has much of the same appeal for modern wireless networks as it did for enterprise data centers and cloud computing providers more than a decade ago. Toss in the increasingly sophisticated software used to manage all these resources, and the comparisons to cloud computing, as well as the seeming inevitability of the change for wireless networks, seem undeniable.

But it turns out things aren't quite as simple as that in the telco world. First, while uptime and reliability are incredibly important for cloud-based computing services and the applications that run on them, the utility-like service of wireless networks demands even more reliability. Think about it this way: having Zoom go down for a few hours this week was bad, but if your entire internet (or mobile) connection went down for the same amount of time, it would be much worse. Indeed, to their immense credit, wireless (and broadband) networks have been impressively resilient throughout the course of the pandemic and, thankfully, have proven to be incredibly reliable under tremendous strain.

Part of the reason for this is that the RAN portion of wireless networks is significantly more distributed. While that makes it harder for larger portions to go down at the same time, it makes it much more challenging to upgrade all the different pieces that need to be upgraded in order to make this transition to an open, software-controlled, virtualized environment. Directly related to all of this is that the mindset of many telco network providers is more like that of a utility than of a cutting-edge cloud computing service. At times, the methodical, conservative pace at which many telcos make changes to their networks can be frustrating to those who've become accustomed to the lightning-fast pace at which other aspects of the technology industry move. When it comes to reliability, however, and the dependence we all have on these networks, a more measured pace of change does start to make sense.

Even within these constraints, we've started to see some important movements towards a more open, virtualized network environment. In fact, just this week Verizon teamed with Samsung Networks and Intel to announce that they had completed what they claim is the world's first fully virtualized end-to-end 5G data session over an Open RAN. In particular, the company created a service that provided mobile developers access to AWS's Wavelength edge computing platform over Verizon's 5G network. As simple as that may sound, it actually required a great deal of effort across a host of different elements, such as Samsung's new virtual CU (Central Unit) and virtual DU (Distributed Unit) components, which replace traditional proprietary network infrastructure hardware with software that runs on general-purpose computing hardware.

There has also been growing interest in Open RAN technologies from a number of other places. Not surprisingly, traditional computing and data networking hardware and software vendors (companies like Intel, Cisco, Microsoft, IBM and others), as well as cloud providers like Amazon and Google (in addition to Microsoft), have been eager to see telcos move onto technologies and platforms for which they have a wide range of capable offerings. Interestingly, we've also seen traditional network equipment vendors start to get behind Open RAN. In addition to Samsung Networks' efforts, Nokia has also come out as a strong supporter of the concept, believing that certain aspects of the RAN could benefit from more flexible options.

Carriers have also started to talk more about the potential for Open RAN. In addition to this Verizon announcement, AT&T has publicly discussed its own support for Open RAN on several occasions. In Japan, Rakuten has built a completely virtualized 5G network, relying heavily on Open RAN and other modern network technologies. Of course, Rakuten had the huge benefit, at least from an architectural perspective, of not having any existing legacy network infrastructure to upgrade or replace. As with the first enterprise data centers that started making the move to cloud computing many years back, the big US carriers have a huge installed base of legacy equipment that they can't just easily switch out. As a result, even though there's strong interest in moving to more modern network infrastructures, the move to Open RAN is clearly going to be a multi-year process.

At the same time, there can be challenges in moving to new environments as well. Some of Rakuten's well-documented early issues centered around trying to piece together a wide range of different components from many different vendors. Glitches and delays are fine for a carrier that everyone understands is starting from scratch, but not really acceptable for large, established telcos.

The bottom line is that the drive to adopt Open RAN and other more modern, software-defined, virtualizable network topologies is unquestionably moving forward, but the ultimate pace at which it occurs is going to depend on how easily network equipment vendors can make the transition possible for carriers. Given the incentive of speeding up the delivery of new 5G services, like network slicing, as well as improving the overall efficiency and cost of running those networks, the appeal of moving to Open RAN is strong. At this point, it's just a question of practical realities and how motivated all the various participants are in making it happen.

Disclosure: TECHnalysis Research is a tech industry market research and consulting firm and, like all companies in that field, works with many technology vendors as clients, some of whom may be listed in this article.

3 Stocks That Have Doubled In 2020 and Still Have Room to Grow – Motley Fool

Shares of Datadog (NASDAQ:DDOG), MercadoLibre (NASDAQ:MELI), and Cloudflare (NYSE:NET) have more than doubled in 2020. These stocks have seen more opportunities than challenges in the coronavirus health crisis. They are also poised to continue growing for years to come: the COVID-19 pandemic may have accelerated their growth plans, but the good times should keep rolling when the world returns to normal operations.

Datadog helps other companies monitor their cloud computing services through a variety of data collection and analysis tools. These tools are, of course, presented as cloud-based services themselves. This is software-as-a-service designed to monitor the performance, security, and efficiency of your software-as-a-service solutions.
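
To give a flavour of the data-collection side, the sketch below submits a single custom metric with the `datadog` Python client; the metric name, tags and placeholder keys are illustrative assumptions, and in practice most data arrives through Datadog's agents and integrations rather than hand-rolled calls like this.

```python
import time
from datadog import initialize, api

# Placeholder credentials; real keys would come from a secrets manager.
initialize(api_key="YOUR_API_KEY", app_key="YOUR_APP_KEY")

# Report a hypothetical application-level metric alongside agent-collected data.
api.Metric.send(
    metric="checkout.request.latency_ms",  # made-up metric name
    points=[(int(time.time()), 42.0)],     # (timestamp, value) pairs
    tags=["env:prod", "service:checkout"],
)
```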

The work-from-home policies of the COVID-19 pandemic were tailor-made to boost Datadog's business prospects. The stock has gained 123% year to date.

Top-line sales rose 68% year over year in the second quarter, exceeding both analyst estimates and management's guidance targets. The list of customers with contracts worth at least $100,000 per year grew 71% longer and existing clients renewed their deals at an average of 30% above the expiring arrangement's dollar value.

Datadog expects the new customers to stick around for the long haul because the service monitoring platform tends to become essential to the client's day-to-day operations very quickly.

"We had a few small yet notable new logo wins from two global hotel chains, an amusement park chain, a large U.S. university, and a European airline," CEO Olivier Pomel said in the second-quarter earnings call. "These wins show that even in the face of challenging times for these customers, transforming to ensure business resilience and longevity is a top priority."

The bottom line often looks skimpy, including second-quarter earnings of just $0.05 per share, but that's not what this company is about. Datadog is a classic growth stock, funneling every ounce of available cash into promoting even faster revenue growth rather than adding it to a growing pile of retained earnings. It's a winning business model, with or without artificial tailwinds from a global pandemic.

MercadoLibre started out as the eBay (NASDAQ:EBAY) of Latin America and the Caribbean. That's still true but the company has also branched out into a number of related fields such as digital payment services (Mercado Pago and Mercado Credito), shipping and logistics (Mercado Envios), and e-commerce storefront platforms (Mercado Shops).

The safer-at-home policies of the COVID-19 pandemic were tailor-made to boost MercadoLibre's business prospects. The stock has gained 112% year to date.

Sales were not just growing but accelerating over time when the pandemic came along. The growth curve only turned more sharply upward in the spring and summer of 2020:

MELI Revenue (TTM) data by YCharts

People turning to online shopping instead of going to the local tiendas and supermercados drove MercadoLibre's revenues 61% higher in the second quarter while operating expenses only increased by 15%. You don't have to do a whole lot of advertising to keep the customers coming to an e-commerce portal in times like these. The road ahead involves helping Latin America as a whole get back on track.

"At MercadoLibre, our stated business mission is to democratize commerce and payments," CFO Pedro Arnt said in the second-quarter earnings call. "With so many businesses being hard hit, we have the unique opportunity to connect and empower millions of Latin American entrepreneurs, while continuing to partner with governments across the region in our role as an essential service. Never has our mission been more relevant and never have we felt more determined to fulfill it."

That's a strategy I can get behind, and it should serve MercadoLibre and its investors well in the years to come.

This company runs a global network of digital content delivery servers, distributed-denial-of-service (DDoS) mitigation hubs, and Internet security tools. Clients both large and small depend on Cloudflare to keep their online services available and snappy to users around the world, even while under attack from a DDoS botnet.

(Stop me if you've heard this before!)

The work-from-home policies of the COVID-19 pandemic were tailor-made to boost Cloudflare's business prospects. The stock has gained 123% year to date.

Cloudflare's second-quarter sales rose 48% year over year. The number of large clients increased by 65% and the list of paying customers lengthened by 8% compared to the previous quarter. Meanwhile, Cloudflare continued to protect its clients during one of the busiest periods of online attacks in the history of global networking.

"May was the busiest month the Internet has ever seen for DDoS attacks," CEO Matt Prince said on the earnings call. "June saw an attack against one of our customers that lasted 4 days and peaked at more than 750 million packets per second. Our network didn't flinch. The targeted customer's infrastructure never slowed down, and they weren't even aware until our systems alerted them."

That's the best possible outcome from attacks like these, and Cloudflare has delivered many similar success stories in 2020. Keeping cloud-based services available and fully functional at all times has become an essential business requirement for many companies, including some that never thought of the internet as a business asset before the coronavirus pandemic rolled in.

Cloudflare's proven expertise in protecting cloud-based business services will make a lasting impression on clients who signed on in the rapidly changing networking environment of 2020. This is another high-performing growth stock for the ages.

Cloud Computing: The Solution to a Pandemic-Induced Problem – DevPro Journal

In 2019, Gartner forecasted that global public cloud revenue would nearly double from $182.4 billion in 2018 to $331.2 billion in 2022. However, that failed to take into account the outbreak of COVID-19. Now, we can expect cloud revenues to far more than double as organizations embrace the new normal.

But why? To understand public cloud adoption and usage in 2020, one needs to journey back to the more stable climate of 2012. Thinking of what existed and of what was to come, the National Institute of Standards and Technology (NIST) published the five essential characteristics of cloud computing, which both define public cloud to this day and explain why public cloud is the perfect match for these unstable times. These principles are: on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service.

On-demand self-service is the principle that a cloud customer can adjust their service needs immediately through an online portal rather than through a drawn-out procurement process. When the international danger of COVID-19 became apparent, many enterprises realized they would need to make rapid adjustments to account for massive swings in demand. On-premises or colocation solutions take months to implement and would have resulted in months of lost business. With public cloud, adjustments can instead be made on the fly in a way that is completely transparent to users.
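
As a minimal sketch of what that self-service looks like in practice, and assuming AWS and the boto3 SDK purely as one illustration of the principle, adding capacity becomes a single API call instead of a procurement cycle; the AMI ID, instance type and tag values below are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Self-service provisioning: request two more web servers on demand.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.medium",          # placeholder instance size
    MinCount=2,
    MaxCount=2,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "surge-capacity"}],
    }],
)
print([i["InstanceId"] for i in response["Instances"]])
```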

Broad network access is the principle that computing resources and services can be accessed from anywhere on any kind of device. Once lockdowns started happening, the enterprise world was upended by the fact that its workforce could no longer be on site. Virtual private network (VPN) solutions were suddenly beyond maximum capacity for far longer than a normal continuity of operations scenario, leaving organizations unable to support the immediate influx of remote workers. Since public cloud is easily accessible over the internet, enterprises can jump on it instantly to surmount access challenges, making it the perfect solution to a pandemic-induced problem.

Rapid elasticity is the principle that capabilities can be scaled up or scaled down quickly. Once the pandemic occurred and capacity for things like remote work had to be added, many organizations needed to account for major capacity swings, such as going from a workforce that was 10% remote to one that was 90% remote. Since rapid elasticity allowed public cloud providers to add thousands of servers nightly for organizations, the providers were able to absorb such a massive shift while their customers were able to maintain continuous operations that would have otherwise gone dark.
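
A hedged sketch of that elasticity, again using boto3 as one possible interface: the Auto Scaling group name and capacity figures are assumptions chosen to mirror the 10%-to-90% remote shift described above, not a real configuration.

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

def scale_remote_access_fleet(desired: int) -> None:
    """Scale a (hypothetical) remote-access gateway fleet up or down on demand."""
    autoscaling.set_desired_capacity(
        AutoScalingGroupName="remote-access-gateways",  # placeholder group name
        DesiredCapacity=desired,
        HonorCooldown=False,  # apply the change immediately
    )

scale_remote_access_fleet(desired=90)  # lockdown begins: most of the workforce goes remote
scale_remote_access_fleet(desired=10)  # offices reopen: scale back down and stop paying for idle capacity
```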

Resource pooling is the ability of a service provider to serve multiple customers within the same infrastructure. This became crucial during the pandemic because it allowed organizations of various sizes to leverage the cloud. With on-premises or colocation scenarios, providers would only have been able to serve blue-chip companies, since the waiting list would be too long and the onboarding process too complicated to serve small businesses and nimble start-ups. With resource pooling, however, public cloud providers had the spare capacity and a repeatable onboarding process to accommodate all organizations that needed to address COVID-19 disruptions, not just the major corporate players.

Measured service is the principle that a cloud provider prices its offering as discrete, measured units of service as opposed to indiscriminate, all-or-nothing offerings. This approach offers a more efficient pricing model during unpredictable times, providing customers with consistency and reliability amidst uncertainty. With unpredictable pricing, large customers would be deterred by cost uncertainty and smaller customers would not be able to afford enterprise solutions. However, with measured service, pricing is extremely consistent with usage, which helps organizations of all sizes look out for their bottom line.
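
To illustrate measured service with deliberately made-up numbers: because billing is metered in discrete units, a cost estimate is simply usage multiplied by unit price, as in the small sketch below (the rates are illustrative, not any provider's actual pricing).

```python
# Illustrative pay-per-use estimate; unit prices are made up for the example.
HOURLY_RATE_PER_VM = 0.05    # dollars per VM-hour
PRICE_PER_GB_MONTH = 0.02    # dollars per GB of storage per month

def monthly_estimate(vm_count: int, hours: float, storage_gb: float) -> float:
    """Return an estimated monthly bill for a metered compute-plus-storage workload."""
    compute = vm_count * hours * HOURLY_RATE_PER_VM
    storage = storage_gb * PRICE_PER_GB_MONTH
    return round(compute + storage, 2)

# 20 VMs running for a 730-hour month plus 500 GB of storage:
print(monthly_estimate(vm_count=20, hours=730, storage_gb=500))  # 740.0
```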

In 2012, it was known that the enterprise landscape was on the cusp of major change. However, few envisioned that the five principles of cloud computing put forward then would define how enterprises would thrive or die in 2020's trying times. And, for the enterprises that do thrive thanks to the cloud, the new normal that will exist post-pandemic will be vastly different from the one before.

Insights on the Global Cloud Computing Market | COVID-19 Impact and Analysis of Related Markets Drivers, Opportunities and Threats | Technavio -…

LONDON--(BUSINESS WIRE)--Technavio predicts that the global cloud computing market will grow by USD 190.32 billion during 2019-2023. One of the primary drivers of the market is the growing inclination towards cloud computing in a bid to implement cost-cutting. The growing focus on reducing overall CAPEX has led many businesses to opt for cloud computing solutions. This is because cloud solutions enable the deployment of applications without the need for provisioning hosting capabilities. This results in cost savings, which is fueling the growth of the global cloud computing market. Download Free Sample Report with COVID-19 Impact Analysis

The global cloud computing market is a part of the global internet services and infrastructure market. The global internet services and infrastructure market includes companies providing services and infrastructure for the internet industry including data centers and cloud networking and storage infrastructure. It also includes companies providing web hosting services. Our research reports provide a holistic analysis, market size and forecast, trends, growth drivers, and challenges, as well as vendor analysis covering around 25 vendors.

Latest reports related to cloud computing market analysis

Global Cloud DVR Market 2020-2024

Get FREE Sample Report

Global Bare Metal Cloud Market 2020-2024

Get FREE Sample Report

Global Private Cloud Services Market 2019-2023

Get FREE Sample Report

Technavio's reports are aimed at providing key insights on cloud computing markets by identifying the key drivers, trends, and challenges that are impacting the overall internet services & infrastructure market. The research analyses the impact of these factors on the cloud computing markets, for the present market scenario and over the forecast period. Technavio's reports provide a comprehensive analysis of the vendors and their offerings, major growth strategies adopted by stakeholders, and the key happenings in the market.

Buy 1 Technavio report and get the second for 50% off. Buy 2 Technavio reports and get the third for free.

View market snapshot before purchasing

Internet Services and Infrastructure Market: Segmentation

Internet services and infrastructure, the parent market, includes the global cloud computing market within its scope, and it is further segmented into multiple sub-segments. Technavio's reports identify the high-growth areas and opportunities for vendors operating in each sub-segment of the global cloud computing market. The market is segmented as follows:

Service

Deployment Type

Geography

Register for a free trial today and gain instant access to 17,000+ market research reports

Technavio's SUBSCRIPTION platform

Internet Services and Infrastructure Market: Geographic Segmentation

The global internet services and infrastructure market has been analyzed across key geographical regions to identify region level market dynamics, developments, and the key growth countries for the forecast period. The regional level analysis identifies the market shares, growth momentum, and key leading countries in the market, which include (but are not limited to) the following:

Vendor Landscape

Technavio's industry coverage utilizes multiple sources and tools to gather information about the various stakeholders and their offerings in the market. Sources such as company websites, annual reports, whitepapers, subscription and in-house databases, industry journals, publications, and magazines are used in addition to other relevant sources. The vendor landscape provides a framework to estimate the cloud computing market, while also categorizing the vendors into pure-play, category-focused, or diversified based on their offerings. All market reports provide the key and contributing players across the value chain based on an in-house influence index, developed using multiple industry and market parameters.

About Technavio

Technavio is a leading global technology research and advisory company. Its research and analysis focus on emerging market trends and provide actionable insights to help businesses identify market opportunities and develop effective strategies to optimize their market positions.

With over 500 specialized analysts, Technavio's report library consists of more than 17,000 reports and counting, covering 800 technologies and spanning 50 countries. Its client base consists of enterprises of all sizes, including more than 100 Fortune 500 companies. This growing client base relies on Technavio's comprehensive coverage, extensive research, and actionable market insights to identify opportunities in existing and potential markets and assess their competitive positions within changing market scenarios.

Cloud Computing And Cybersecurity Need Better Oversight Now – Forbes

On Aug. 5, the Office of the Comptroller of the Currency (OCC) handed down a cease and desist order to Capital One for its failure to establish effective risk assessment and management processes before migrating its information technology operations to a cloud operating environment.

While we hear about data breaches on a nearly weekly basis, the Capital One incident is noteworthy because it involved the bank's migration to cloud computing, something that many banks are either in the process of doing or will be doing in the near future.

The $80 million fine Capital One must pay to the US Treasury is pocket change for the bank.

The compliance actions the bank will be required to take will likely prove to be the bigger headache. The OCC's consent order requires Capital One to:

It's hard to believe, but bank executives' concerns regarding cybersecurity are declining (that isn't a typo).

According to Cornerstone Advisors' What's Going On in Banking studies, nearly half of bank executives put cybersecurity on their list of top three concerns for 2018. That percentage declined to 36% in 2019 and dropped even further to 21% in 2020.

Financial Institutions Citing Cybersecurity as a Top Three Concern

What's going on here?

Operational integration is lulling banks into a false sense of (cyber) security.

Cybersecurity policy is becoming business as usual for banks. As a result, bank execs are more confident today than they were three years ago that cybersecurity policies are well-designed and being well-executed.

It's a false sense of security, however, because banks have yet to feel the cybersecurity impact of cloud computing.

Three data points highlight the growth of cloud computing in banking:

As cloud computing within banking grows, the prevalence of cyber breaches for cloud services is growing significantly as well. According to a Verizon study:

Cloud assets were involved in about 24% of breaches this year. Cloud breaches involved an email or web application server 73% of the time, and 77% involved breached credentials.

A new report from Cornerstone Advisors, commissioned by DefenseStorm, Cloud on the Horizon, identifies emerging cloud-related cybersecurity challenges facing banks including:

1) Over-reliance on providers. There is an over-reliance on providers to complete cybersecurity checklists from banks during due diligence. "It would be pretty easy for them to dupe us," said one Chief Information Security Officer (CISO) interviewed for the report.

There is also over-reliance on just a few providers.

Richard Harmon, Managing Director at Cloudera, calls this "cloud concentration risk" and writes that "the consolidation of multiple organizations within one cloud service provider (CSP) presents a more attractive target for cybercriminals."

2) Reporting problems. Bank CISOs have discovered that providers complete banks' due diligence cybersecurity requests for third-party risk management incorrectly.

Transparency has become an issue as well. CISOs cited providers' lack of willingness to show any of their security policies or audits.

One CISO mentioned that when his bank asked a provider for a SOC 2, the vendor produced Amazon Web Services' SOC 2. When the CISO questioned the vendor as to whether it had its own SOC 2, the provider was unaware it even needed one.

3) Technical limitations. Many cloud vendors have cybersecurity limitations. For example, they cannot IP-restrict or require multi-factor authentication for third parties. Configuration is a challenge, as well.

It's not just the vendors' fault. According to Bill Glasby, Chief Technology Officer of Heritage Bank, one issue around cloud security is operators' inability to configure the tools. The problem is that it's all home-brew today.
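
As one concrete example of the kind of control being discussed, here is a minimal sketch of an IP-restriction condition expressed as an AWS IAM-style policy and attached with boto3. The user name, policy name and CIDR range are illustrative assumptions, other clouds have their own equivalents, and this is not presented as the configuration any of the quoted banks use.

```python
import json
import boto3

# Deny all actions for requests that do not originate from the bank's own
# network range (the CIDR below is a documentation placeholder).
ip_restriction_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyRequestsFromOutsideCorporateRange",
        "Effect": "Deny",
        "Action": "*",
        "Resource": "*",
        "Condition": {"NotIpAddress": {"aws:SourceIp": ["203.0.113.0/24"]}},
    }],
}

iam = boto3.client("iam")
iam.put_user_policy(
    UserName="third-party-vendor",           # hypothetical vendor service account
    PolicyName="restrict-to-corporate-ips",  # hypothetical policy name
    PolicyDocument=json.dumps(ip_restriction_policy),
)
```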

Banks' migration to the cloud will necessitate changes to how they govern IT from three perspectives:

1) Contractual. Migrating to the cloud requires switching from traditional security testing to a contract-based model for security testing. Banks can't move to the cloud without caring about and dealing with the contractual clauses with their service providers. In particular, banks should negotiate the reversibility clause with their cloud providers.

One problem, however, according to a CIO interviewed by Cornerstone, is that many cloud providers don't even know what should be written in a reversibility clause.

2) Organizational. Business departments and lines of business end-running IT and buying cloud solutions directly from cloud providers will become more prevalent with a migration to the cloud. IT will have to reinforce its IT governance policies and procedures in order to minimize the risks caused by the solutions implemented by the different business departments.

3) Strategic. Business departments want flexibility and innovation. However, migrating to cloud services typically involves a shift from highly customized to mostly standardized services. This can cause friction between IT and the business, friction that must be resolved with strategic clarity and direction.

To handle the coming wave of cloud-related cybersecurity issues, Cornerstone and DefenseStorm recommend that banks:

For a complimentary copy of the Cloud On The Horizon report click here. To register for the Cloud On The Horizon webinar on August 20 at 2:00pm ET click here.

Worldwide Transaction Monitoring Solutions Industry to 2024 – The Emergence of Cloud Computing Services is Boosting Growth – ResearchAndMarkets.com -…

The "Global Transaction Monitoring Solutions Market 2020-2024" report has been added to ResearchAndMarkets.com's offering.

The transaction monitoring solutions market is poised to grow by $9.97 bn during 2020-2024, progressing at a CAGR of 15% during the forecast period.

The report on the transaction monitoring solutions market provides a holistic analysis, market size and forecast, trends, growth drivers, and challenges, as well as vendor analysis covering around 25 vendors.

The report offers an up-to-date analysis regarding the current global market scenario, latest trends and drivers, and the overall market environment. The market is driven by the emergence of cloud computing services, need for compliance with government regulations, and increased need for greater customer satisfaction.

The transaction monitoring solutions market analysis includes application, deployment, and geographic segments. This study identifies the emergence of advanced technologies as one of the prime reasons driving transaction monitoring solutions market growth during the next few years.

The research presents a detailed picture of the market by way of the study, synthesis, and summation of data from multiple sources, through an analysis of key parameters.

The transaction monitoring solutions market covers the following areas:

Companies Mentioned

Key Topics Covered:

1. Executive Summary

2. Market Landscape

3. Market Sizing

4. Five Forces Analysis

5. Market Segmentation by Deployment

6. Customer landscape

7. Geographic Landscape

8. Drivers, Challenges, and Trends

9. Vendor Landscape

10. Vendor Analysis

11. Appendix

For more information about this report visit https://www.researchandmarkets.com/r/sagqqi

View source version on businesswire.com: https://www.businesswire.com/news/home/20200820005534/en/

Contacts

ResearchAndMarkets.com
Laura Wood, Senior Press Manager
press@researchandmarkets.com
For E.S.T Office Hours Call 1-917-300-0470
For U.S./CAN Toll Free Call 1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

SKYY: A Cloud Computing ETF Not Overly Concentrated In The Big-3 – Seeking Alpha

Source: ParkMyCloud.com

The First Trust Cloud Computing ETF (SKYY) isn't as heavily weighted as you might expect in the stocks of the companies that first jump into most investors' heads when they think of "the cloud". Most investors would assume a cloud-focused ETF or mutual fund would be over-weighted in the holdings of the "Big-3" leading cloud providers: Microsoft (MSFT) for its Azure platform, Amazon (AMZN) for its AWS business, and Alphabet (NASDAQ:GOOG) (NASDAQ:GOOGL) for its Google Cloud ("GCP").

Regardless, SKYY - which seeks to track the "ISE CTA Cloud Computing Index" - is up 26% YTD and 150% over the past 5 years (not counting dividends).

And to be sure, SKYY holds the Big-3 companies in the fund, just not to the level you might have thought - or perhaps desired. Let's take a look at the top-10 holdings as of last week (8/14/2020):

Source: First Trust

As can be seen from the graphic above, SKYY's top-10 holdings have an aggregate 36.76% total allocation. Within that group, the Big-3 have an aggregate weighting of ~13%. That might surprise some investors, considering the relative market share the Big-3 have in the cloud. As the graphic below shows, in Q1 2020, the Big-3 providers accounted for a whopping 55% of total global infrastructure spending:

Source: Canalys.com

Alibaba (NYSE:BABA) had a 6% share and is the second biggest holding in the SKYY ETF with a 4.6% weighting, much more in line with its market share as compared to the Big-3. This relative weighting might indicate the SKYY portfolio managers may be thinking:

Investors should ponder these issues and consider what their comfort level is with various valuation levels, growth expectations, regional considerations, and the associated risks.

Regardless of the relative weighting of the biggest cloud service providers, one thing is clear: growth in global cloud service revenue is nowhere near slowing down. As the graphic below demonstrates, virus or no-virus (note this was a July 2020 forecast), cloud service revenue is expected to grow 19% next year and another 18% in 2022 versus 2021. Or, looking at it in raw numbers: revenue in 2022 is expected to be ~$106 billion higher than full-year 2020 expectations.

Source: Gartner July 2020

Note that the "Others" category in the previous market share graphic held a 38% slice of the global cloud service pie. That, combined with Gartner's growth projections, means that some of the smaller players held in the SKYY ETF will have plenty of opportunities to grow.

Oracle Corporation (ORCL), the #5 holding, saw its cloud-related revenue stay relatively flat year over year (see its Q4 and FY2020 EPS report). The stock is also flat YTD, so it has clearly been an under-performer while yielding ~1.8%. But Oracle is up nicely in the pre-market today due to speculation that the company is teaming up with private equity to challenge Microsoft's (MSFT) bid for TikTok.

Fastly Class A (FSLY) is the #7 holding and is up a whopping 280% this year on the heels of fast-growing revenue and market acceptance of its security and cloud products. VMware (VMW), the #8 holding, is down YTD and is the technology to which much of Oracle's cloud-based solutions are tethered. It is interesting that both of those companies are relatively flat this year.

Arista Networks (ANET) holds down the #9 position and has also been a relative laggard with poor YoY comparisons.

The downside risks should not be sugar-coated (just look at the chart at the end of the article for what happened in March):

Alibaba may face some headline risks due to the ongoing trade battle between the US and China. That said, Alibaba's position in China is not going to be affected much, and if the company needs to remove its US exchange listing, there are a number of other exchanges that would be happy to host the company.

Upside risks include continued big-tech domination in the overall market, acceleration of the work- and school-from-home trends that could lead to more cloud-related activity, and an acceleration of digitization in general due to the 5G rollout, which should be rapidly accelerating over the next 2-3 years, at a minimum.

While SKYY holds solid positions in the Big-3 cloud service providers, they are under-weighted in comparison to their cloud market share. But that might be just fine for investors who feel they are over-valued in today's market. Alibaba is more equally weighted in the fund as compared to its cloud market share, and that might be exactly what an investor who thinks China may grow faster than the US wants to see. Regardless, the global cloud market is set to grow by over $100 billion over the next two years, and that gives the ~60% of the companies that aren't in SKYY's top-10 holdings plenty of opportunities to grow.

Bottom line: SKYY has an excellent 1-year and 5-year track record (see chart below) of performance and is well worth the 0.60% expense ratio.

Source: Seeking Alpha

Disclosure: I am/we are long AMZN, GOOG. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Additional disclosure: I am an engineer, not a CFA. The information and data presented in this article were obtained from company documents and/or sources believed to be reliable, but have not been independently verified. Therefore, the author cannot guarantee their accuracy. Please do your own research and contact a qualified investment advisor. I am not responsible for the investment decisions you make.

Canvas and AWS Educate Make Extensive Teaching Resources Available for Cloud Computing Education – PRNewswire

SALT LAKE CITY, Aug. 18, 2020 /PRNewswire/ --To help students prepare for careers in the growing field of cloud computing, Instructure and Amazon Web Services, Inc. (AWS), specifically AWS Educate, today announced that they are providing educators with extensive cloud computing teaching resources. This collaboration gives educators access to the knowledge, resources, and support they need to build student skills in cloud technology and further prepare them to fill the hundreds of thousands of unfilled cloud computing jobs that exist today.

Instructure will provide its Canvas users access to the AWS Educate Getting Started Resource Center through the Canvas Commons repository, a digital resource library full of educational content for educators to use in the classroom. Educators can then access the educational content and resources from AWS Educate to explore the AWS Cloud, including resources like AWS Promotional Credits, access to labs, and curated content designed to introduce students to cloud computing basics.

"One of the biggest challenges in preparing students for cloud careers is a lack of accessible hands-on resources for educators to effectively teach cloud concepts in the classroom. We are thrilled to expand our collaboration with Instructure to address this opportunity with foundational resources for educators," said Ken Eisner, Director of Worldwide Education Programs for AWS and head of AWS Educate. "Now, educators using Canvas can quickly access and download valuable AWS Educate cloud computing content directly to their Canvas courses without ever leaving the learning management system. By centralizing and simplifying the ability to deliver content, thousands of students can learn skills they need to build in the cloud."

Instructure is hosting a series of webinars together with AWS Educate that will help Canvas educators get started using these cloud computing resources and help them understand the impact they can have in the classroom.

"Among the barriers to teaching cloud computing in schools is a lack of current subject matter expertise and resources to successfully build student skills," said Melissa Loble, Chief Customer Experience Officer at Instructure. "Together with AWS Educate, we can give educators the cloud computing tools and knowledge needed to prepare today's students for tomorrow's workforce."

The first instructional webinar from Instructure will be held today, August 18, 2020, at 10 am MT. To register, visit LINK.

About Instructure:

Instructure helps people grow from the first day of school to the last day of work. More than 30 million people use the Canvas Learning Management Platform for schools and the Bridge Employee Development Platform for businesses. Learn more at http://www.instructure.com.

CONTACT: Cory Edwards, Vice President, Corporate Communications, Instructure, (801) 869-5258, [emailprotected]

SOURCE Canvas; Instructure

http://www.instructure.com
