Category Archives: Cloud Storage
Apache Pulsar vs. Kafka and other data processing technologies – TechTarget
This article discusses how Apache Pulsar handles storage and compares it to how other popular data processing technologies, such as Apache Kafka, deal with storage. Follow this link and take 35% off Apache Pulsar in Action in all formats by entering "ttkjerrumgaard" into the discount code box at checkout.
Apache Pulsar's multilayered architecture completely decouples the message serving layer from the message storage layer, allowing each to scale independently. Traditional distributed data processing technologies such as Hadoop and Spark have taken the approach of co-locating data processing and data storage on the same cluster nodes or instances. That design choice offered a simpler infrastructure and some possible performance benefits from reduced transfer of data over the network, but at the cost of tradeoffs that impact scalability, resiliency and operations.
Pulsar's architecture takes a very different approach, one that's starting to gain traction in a number of cloud-native solutions and that is made possible in part by the significant improvements in network bandwidth that are commonplace today: the separation of compute and storage. Pulsar decouples data serving and data storage into separate layers: data serving is handled by stateless "broker" nodes, while data storage is handled by "bookie" nodes, as shown in Figure 1.
This decoupling has many benefits. For one, it enables each layer to scale independently to provide infinite, elastic capacity. By leveraging the ability of elastic environments (such as cloud and containers) to automatically scale resources up and down, this architecture can dynamically adapt to traffic spikes. It also improves system availability and manageability by significantly reducing the complexity of cluster expansions and upgrades. Further, this design is container-friendly, making Pulsar the ideal technology for hosting a cloud native streaming system. Apache Pulsar is backed by a highly scalable, durable stream storage layer based on Apache BookKeeper that provides strong durability guarantees, distributed data storage and replication and built-in geo-replication.
A natural extension of the multilayered approach is the concept of tiered storage, in which less frequently accessed data can be offloaded to a more cost-effective persistence store such as S3 or Azure Cloud. Pulsar provides the ability to configure the automated offload of data from local disks in the storage layer to those popular cloud storage platforms. These offloads are triggered based upon a predefined storage size or time period and provide you with a safe backup of all your event data while simultaneously freeing up storage capacity on the local disk for incoming data.
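As a rough sketch of how such an offload policy might be configured, the snippet below sets a size-based offload threshold on a namespace through Pulsar's admin REST API using Python's requests library. The admin URL, namespace and threshold value are placeholders, the endpoint path is an assumption about the v2 admin API rather than something stated in the article, and the offload driver itself (for example an S3 bucket and credentials) would still need to be configured on the brokers.

import requests

# Assumed local Pulsar admin endpoint and an example namespace.
ADMIN_URL = 'http://localhost:8080'
NAMESPACE = 'public/default'

# Offload ledger segments to tiered storage once the namespace holds
# more than roughly 10 GB on the bookies (the value is in bytes).
threshold_bytes = 10 * 1024 * 1024 * 1024

# Endpoint path assumed from the v2 admin API; adjust if your deployment differs.
resp = requests.put(
    f'{ADMIN_URL}/admin/v2/namespaces/{NAMESPACE}/offloadThreshold',
    json=threshold_bytes,
)
resp.raise_for_status()
print('Offload threshold set to', threshold_bytes, 'bytes')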
Both Apache Kafka and Apache Pulsar have similar messaging concepts. Clients interact with both systems via topics that are logically separated into multiple partitions. When an unbounded data stream is written to a topic, it is often divided into a fixed number of equal sized groupings known as partitions. This allows the data to be evenly distributed across the system and consumed by multiple clients concurrently.
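To make the topic and partition model above concrete, here is a minimal sketch using the Apache Pulsar Python client (pip install pulsar-client). The broker URL, topic name and subscription name are placeholders, and a partitioned topic would first need to be created through Pulsar's admin tooling; the snippet only illustrates the general produce/consume flow.

import pulsar

# Connect to a Pulsar broker; the service URL is an assumed local default.
client = pulsar.Client('pulsar://localhost:6650')

# Producers and consumers address a (possibly partitioned) topic by name;
# Pulsar spreads messages across the topic's partitions automatically.
producer = client.create_producer('persistent://public/default/events')
for i in range(10):
    producer.send(('event-%d' % i).encode('utf-8'))

# A consumer attaches to the same topic through a named subscription.
consumer = client.subscribe('persistent://public/default/events', 'demo-subscription')
for _ in range(10):
    msg = consumer.receive()
    print('Received:', msg.data())
    consumer.acknowledge(msg)

client.close()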
The fundamental difference between Apache Pulsar and Apache Kafka is the underlying architectural approach each system takes to storing these partitions. Apache Kafka is a partition-centric pub/sub system that is designed to run as a monolithic architecture in which the serving and storage layers are located on the same node.
In Kafka, the partition data is stored as a single continuous piece of data on the leader node, and then replicated to a preconfigured number of replica nodes for redundancy. This design limits the capacity of the partition, and by extension the topic, in two ways. First, since the partition must be stored on local disk, the maximum size of the partition is that of the largest single disk on the host machine (approximately 4 TB in a "fresh" install scenario); second, since the data must be replicated, the partition can only grow to the smallest amount of free disk space available on the replica nodes.
Let's consider a scenario in which you were fortunate enough to have your leader be placed on a new node that can dedicate an entire 4 TB disk to the storage of the partition, and the two replica nodes each only have 1 TB of storage capacity. After you have published 1 TB of data to the topic, Kafka would detect that the replica nodes are unable to receive any more data and all incoming messages on the topic would be halted until space is made available on the replica nodes, as shown in Figure 3. This scenario could potentially lead to data loss, if you have producers that are unable to buffer the messages during this outage.
Once you have identified the issue, you have only two remedies. The first is to make more room on the existing replica nodes by deleting data from the disks, which will result in data loss, since that data belongs to other topics and most likely has not been consumed yet. The other option is to add nodes to the Kafka cluster and "rebalance" the partition so that the newly added nodes serve as the replicas. Unfortunately, this requires recopying the entire 1 TB partition, which is an expensive, time-consuming and error-prone process that requires an enormous amount of network bandwidth and disk I/O. What's worse is that the entire partition is completely offline during this process, which is not an ideal situation for a production application with stringent uptime SLAs.
Unfortunately, recopying of partition data isn't limited to only cluster expansion scenarios in Kafka. Several other failures can trigger data recopying, including replica failures, disk failures or machine failures. This limitation is often missed until users experience a failure in a production scenario.
Within a segment-centric storage architecture, such as the one used by Apache Pulsar, partitions are further broken down into segments that are rolled over based on a preconfigured time or size limit. These segments are then evenly distributed across a number of bookies in the storage layer for redundancy and scale.
Revisiting the scenario we discussed with Apache Kafka, let's now look at the behavior of Apache Pulsar when one of the bookie's disks fills up and can no longer accept incoming data. Since the partition is further broken down into small segments, there is no need to replicate the content of the entire bookie to a newly added bookie. Instead, Pulsar would continue to write incoming message segments to the remaining bookies with storage capacity until the new bookie is added. At that point, traffic instantly and automatically ramps up on the new nodes or new partitions, and old data doesn't have to be recopied.
As we can see in Figure 5, during the period when the fourth bookie stopped accepting segments, incoming data segments 4, 5, 6 and 7 were routed to the remaining active bookies. Once the new bookie was added, segments were routed to it automatically. During this entire process, Pulsar experienced no downtime and was able to continue serving producers and consumers. As you can see, Pulsar's storage system is more flexible and scalable in this type of situation.
About the author: David Kjerrumgaard is the director of solution architecture at Streamlio and a contributor to the Apache Pulsar and Apache NiFi projects.
See more here:
Apache Pulsar vs. Kafka and other data processing technologies - TechTarget
Global Cloud Storage Service Market: Current Status, In-depth Analysis and Forecast Outlook 2026 By Red Hat, Inc., Zadara Storage, IBM Corporation,…
The report on the Global Cloud Storage Service Market is a holistic guide to understanding the various elements that play a crucial role in growth progression. Details pertaining to competitor strategies, the vendor landscape and trend assessment have all been discussed at length to derive logical deductions, based on which new market aspirants as well as established vendors in the global Cloud Storage Service market can maneuver and deliver growth-supportive business decisions.
Request a sample of Cloud Storage Service Market report @ https://www.orbismarketreports.com/sample-request/123056?utm_source=Maia
Also, considering the volatile nature of the market owing to immense technological disruptions, this report is designed to serve as an intelligent investment guide for established market players eyeing sustainable revenue pools and market stability in the Cloud Storage Service market.
Key Players Analysis: Global Cloud Storage Service Market
Red Hat, Inc.
Zadara Storage
IBM Corporation
VMware, Inc.
Microsoft Corporation
EMC Corporation
Qumulo, Inc.
Amazon Web Services, Inc.
Rubrik
Oracle Corporation
Nasuni Corporation
Google Inc.
Rackspace Hosting, Inc.
Hewlett Packard Enterprise Development LP
Browse the complete report @ https://www.orbismarketreports.com/global-cloud-storage-service-market-size-share-growth-analysis-and-forecast-outlook-by-2026?utm_source=Maia
Segment-based Assessment: Global Cloud Storage Service Market
The end-use application segment is thoroughly influenced by fast-transitioning end-user inclinations and preferences. Product- and application-based segments clearly focus on the array of novel changes and new investments made by Cloud Storage Service market forerunners towards improving product qualities to align with end-use needs.
Cloud Storage Service Market Analysis by Types:
Primary storage
Disaster recovery and backup
Cloud storage gateway
Data archiving
Cloud Storage Service Market Analysis by Applications:
Large Enterprises
Small & Medium Enterprises
The subsequent sections of the report include a detailed assessment of core vendors, manufacturers and stakeholders that play decisive roles in maintaining steady growth and revenue stability in the global Cloud Storage Service market. Details about aspirants eyeing seamless penetration, growth-stimulating strategies and expansion schemes deployed by established players have also been echoed in the report to mediate growth-proficient business decisions in the global Cloud Storage Service market.
Major Report Highlights:
1. The Cloud Storage Service market report documents high end data concerning volume and value based developments encompassing crucial elements such as regional, product based and application oriented insights to influence high revenue generation and growth.
2. The Cloud Storage Service market report also specifically outlines key parameters encapsulating market drivers and restraining factors that deflate growth.
3. Key challenges faced by market players have also been broadly discussed in this report section to locate untapped Cloud Storage Service market opportunities.
4. The Cloud Storage Service market report also mirrors exact growth strategies and tactical business decisions braced by frontline players.
5. A top-down assessment of the competition spectrum and regional growth hubs has also been pointed out in great detail to underpin lucrative business decisions.
Report Investments: Logical Guide
1. Investment in this Cloud Storage Service market report is a highly time-saving decision, as the report houses crucial market-relevant information that plays an integral role in the growth process.
2. The report features business priorities at length to draw logical relations with business strategies, needed to induce high growth in global Cloud Storage Service market.
3. The report allows readers to align their pipeline investments and ongoing ones in complete co-ordination with growth objectives.
4. The Cloud Storage Service market report also helps readers to design and implement optimum decision making aligning with commercial viability of products and services echoing consumer interests.
Make an enquiry of this report @ https://www.orbismarketreports.com/enquiry-before-buying/123056?utm_source=Maia
Major Points from Table of Content:
1 Cloud Storage Service Market Overview
2 Global Cloud Storage Service Market Landscape by Player
3 Players Profiles
4 Global Cloud Storage Service Production, Revenue (Value), Price Trend by Type
5 Global Cloud Storage Service Market Analysis by Application
6 Global Cloud Storage Service Production, Consumption, Export, Import by Region (2014-2019)
7 Global Cloud Storage Service Production, Revenue (Value) by Region (2014-2019)
8 Cloud Storage Service Manufacturing Analysis
9 Industrial Chain, Sourcing Strategy and Downstream Buyers
10 Market Dynamics
11 Global Cloud Storage Service Market Forecast (2019-2026)
Continued
ABOUT US:
With unfailing market gauging skills, Orbis Market Reports has been excelling in curating tailored business intelligence data across industry verticals. Constantly thriving to expand our skill development, our strength lies in dedicated intellectuals with dynamic problem solving intent, ever willing to mold boundaries to scale heights in market interpretation. We are equally backed by an elongated list of success stories and case studies that vouch for our extraordinary market research skills and milestones. Orbis Market Reports is a one-stop-solution to all market queries.
CONTACT US:
Address: 6200 Savoy Drive, Suite 630, Houston, TX 77036
Phone: +1 210-667-2421
Mail us: [emailprotected]
Online Harms in the UK: Significant new obligations for online companies and fines of up to 10% of annual global turnover for breach – Lexology
Yesterday was a busy day for Europe in terms of tech regulation. In addition to the ground-breaking announcements made by the European Commission of the new Digital Services Act and the Digital Markets Act, the UK government also published its long-anticipated final response to the Online Harms White Paper. The Online Harms consultation ran from 8 April 2019 to 1 July 2019 and received over 2,400 responses from stakeholders across the technology industry, including online platforms, charities, think-tanks, publishers, individuals and small/medium sized enterprises.
The government's response contains a significant amount of detail and gives us a real insight into the stance the UK will adopt post-Brexit on digital regulation.
As the lie of the land becomes clearer, in future articles we will be breaking down the detail into practical points businesses should be aware of. But, for now, here are some of the headline points that jump out when reading the government's response:
What next?
Unsurprisingly, the devil will be in the detail. The draft Online Safety Bill is expected in the new year, and there are sure to be further twists and turns as it is developed and scrutinised by Parliament during 2021. Look out for further articles over the coming weeks in which we'll be analysing what the government's response means for businesses and how to prepare for what's to come.
View original post here:
Online Harms in the UK: Significant new obligations for online companies and fines of up to 10% of annual global turnover for breach - Lexology
10 Ways To Boost Cloud Performance – CRN
The coronavirus pandemic has proved an endorsement of the need for organizations to move to the cloud to maintain business continuity, service their customers and ensure their remote workforces remain connected and can access the right tools to stay productive.
The percentage of overall IT spending dedicated to cloud is expected to continue to increase at an accelerated pace post-pandemic, with Gartner forecasting that cloud spending will account for 14.2 percent of the total global enterprise IT spending market in 2024, up from 9.1 percent this year.
Cloud providers, independent software vendors and solution providers continue to churn out new products and services to meet the rising demand and expand organizations' cloud capabilities.
Here's a look at 10 recent offerings from Amazon Web Services, Cockroach Labs, CyberArk, Google Cloud, HashiCorp, LogDNA, Microsoft Azure, NetApp, SkyKick and TriggerMesh that fit that bill.
AWS Gateway Load Balancer is designed to make it easier and more cost-effective to deploy, scale and run third-party virtual appliances such as firewalls, intrusion detection/prevention systems and deep packet inspection systems on AWS. It provides a single gateway to distribute traffic across virtual appliances while scaling them based on demand to increase availability and eliminate potential points of network failure. It performs health checks on the virtual appliances and reroutes traffic flows when it discovers unhealthy ones.
CockroachDB 20.2 is the latest version of Cockroach Labs' cloud-native, distributed SQL database. It includes new capabilities for spatial data for net-new workloads for IoT, transportation and environmental applications; a new CockroachDB for Kubernetes option that packages the database with a custom operator optimized for orchestrated deployments; and the addition of enterprise backup-and-restore functionality to the free community option. The release extends CockroachDB's TPC-C Benchmark performance up to 140,000 warehouses with a maximum throughput of 1.7 million transactions per minute, a 40 percent improvement over the past year, according to the company.
The cloud-agnostic CyberArk Cloud Entitlements Manager is a privilege-based, artificial intelligence-powered service designed to strengthen the security of cloud environments by removing excessive cloud permissions. It continuously detects hidden, misconfigured and unused cloud permissions to improve security through a least-privilege, zero-trust approach. It features a centralized dashboard with a single view of permissions across Amazon Web Services (including Amazon Elastic Kubernetes Service), Google Cloud Platform and Microsoft Azure environments.
Google Cloud's serverless Document AI platform is a unified console for document processing that allows users to quickly access all of the cloud provider's form, table and invoice parsers, tools and offerings (including Google Cloud's Procurement DocAI and Lending DocAI) with a unified API. The document understanding offering uses artificial intelligence and machine learning to classify, extract and enrich data from scanned and digital documents at scale, including structured data from unstructured documents, making it easier to understand and analyze.
HashiCorp Waypoint is a new open-source project providing developers with a consistent workflow to build, deploy and release applications across any platform. It enables developers to get their applications from development to production in a single file and deploy them using a single command: waypoint up. Waypoint supports Kubernetes, HashiCorp Nomad, EC2, Google Cloud Run, Azure Container Instances, Docker and Buildpacks, among others. It can be extended with plugins to target any build/deploy/release logic.
LogDNA's new Kubernetes Enrichment for log management provides detailed insight into Kubernetes cluster environments. It's designed to enhance existing application logs in LogDNA with Kubernetes events and resource metrics, empowering developers to resolve deployment issues without deep specialized knowledge of Kubernetes tooling. It provides development teams with a single-pane-of-glass observability experience between a user's underlying Kubernetes infrastructure and deployed services.
Microsoft Cloud For Healthcare is designed to help health-care customers and partners deliver better patient experiences, insight and care while improving workflow efficiency, streamlining collaboration and connecting data across sources. It incorporates offerings from Microsoft Azure, Microsoft 365, Microsoft Dynamics 365 and Microsoft Power Platform in addition to partner health-care solutions. Its architecture has built-in governance and privacy capabilities supporting compliance with HIPAA, GDPR, HITRUST CSF and other regulatory requirements.
Spot Storage by NetApp is a fully managed, continuously optimized block-and-file storage offering for cloud workloads. It automatically matches storage volume size and performance to application requirements, using NetApp technology for volume shaping, thin provisioning, compression and deduplication to deliver storage volumes that reduce storage costs. It will be integrated with Spot optimization products including Ocean (for containers and Kubernetes) and Elastigroup (for virtual machines) to allow automated, application-driven storage and compute management.
The new Cloud Manager is SkyKick's next-generation no-code and low-code automation, workflow and management application designed to help IT service providers securely administer and manage their cloud customers across SaaS, IaaS, PaaS and devices. Features include Command Center, a help desk automation application to aid tier-one support desks in resolving cloud tickets faster; and WorkBench, a low-code workflow automation engine that enables admins to turn PowerShell into Command Center applications with a click.
TriggerMesh Cloud Native Integration Platform 1.0 automates Kubernetes, cloud-native and on-premises infrastructures. The architecture-agnostic, production-ready SaaS offering allows users to integrate services, automate workflows and build event-driven applications out of any on-premises application or cloud service. Features include a declarative API, event transformation to forward events from one format to another, and new bridges to automate infrastructure and bridge workflows between Salesforce, OracleDB, Zendesk, Datadog, Oracle Cloud Security Logs, Splunk and others.
See the original post:
10 Ways To Boost Cloud Performance - CRN
Here are the Top 5 Encrypted Cloud Storage Services – The Mac Observer
To accompany my roundup of encrypted DNS services to use, I wanted to write a roundup of encrypted cloud storage services. These can be used in place of, or in addition to, services like iCloud, Google Drive, Dropbox, and OneDrive.
NextCloud is among the top private services to use because you can host it on your own server, putting control of your data in your hands instead of a company. But you can still choose to have a company host your data for you. NextCloud is also open source.
NextCloud
Crypt.ee can store your files, photos, and notes on servers based in Estonia. It's open source and you only need a username to get started; no personal information like a phone number or even email address is needed to create an account. Crypt.ee is unique in that it doesn't have apps. Instead, it's a web service that you access via a browser. The company says this helps give users deniability.
Crypt.ee
Tresorit is a zero-knowledge, end-to-end encrypted service that uses Microsoft Azure data centers in Ireland and the Netherlands. Depending on the selected plan, each user gets from 200 GB up to 1 TB. One notable feature is version history. Tresorit tracks all file changes and it can restore previous versions to correct mistakes or revert changes.
Tresorit
Sync's notable feature is the ability to send files of any size to any person, even if they don't use Sync. The service also backs up your files in real time and it's easy to recover deleted files and previous versions of files. This company has impressive, affordable storage plans. Its basic plan gives you 2TB for US$8/month (billed annually).
Sync
I'm adding Proton Drive for the sake of future-proofing. It's a service from ProtonMail that is currently in closed beta for paid users. Like Sync, the company is developing end-to-end encrypted link sharing so you can send your files to non-Proton users. I don't know when it will exit the beta but it's likely it will become public in 2021. I'll be keeping an eye out.
Proton Drive
Follow this link:
Here are the Top 5 Encrypted Cloud Storage Services - The Mac Observer
Explained: How the changes in Google free cloud storage policy will impact you – The Indian Express
Written by Shruti Dhapola, Edited by Explained Desk | New Delhi | Updated: December 9, 2020 9:49:31 am
Starting June 1, 2021, Google Photos will no longer be free as earlier.
Google's online cloud storage policy will undergo a major change from June 1, 2021. The reason: the search giant will no longer offer unlimited free storage for uploading photos and videos onto its Google Photos service. But that's not all: Google will also start deleting content from inactive accounts. We explain everything you need to know about Google's new online cloud storage policy and how it impacts you, the user.
What is the existing policy from Google?
Users on a regular Google Account get 15GB of storage space free. This is considerably more than Microsoft, which offers 5GB of free space on OneDrive, and Apple's iCloud, which also offers 5GB.
This 15GB of space counts towards the user's Gmail, Drive and Photos. Drive includes all files, spreadsheets and so on that are created in Google's suite of apps such as Google Docs and Google Sheets. However, photos that were uploaded to the Google Photos app were not counted towards this free space. This applied to all high-resolution photos and videos and express-resolution photos and videos.
According to Google's definition, when you are uploading photos at high resolution, they are compressed to save space. Photos larger than 16MP get resized to 16MP. All videos at more than 1080p resolution are also resized to high-definition 1080p. In express resolution, the photos are at a maximum of 3MP, and videos are at 480p resolution.
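To make those compression rules concrete, here is a small illustrative sketch that applies them to a photo's megapixel count and a video's vertical resolution; the function names and the simple capping logic are assumptions for illustration only, not anything Google publishes.

def high_quality_photo_mp(megapixels: float) -> float:
    # Photos larger than 16MP are resized down to 16MP.
    return min(megapixels, 16.0)

def high_quality_video_lines(vertical_resolution: int) -> int:
    # Videos above 1080p are resized down to 1080p.
    return min(vertical_resolution, 1080)

def express_photo_mp(megapixels: float) -> float:
    # Express quality caps photos at 3MP.
    return min(megapixels, 3.0)

def express_video_lines(vertical_resolution: int) -> int:
    # Express quality caps videos at 480p.
    return min(vertical_resolution, 480)

print(high_quality_photo_mp(24.0))     # 16.0 - a 24MP photo is compressed
print(high_quality_video_lines(2160))  # 1080 - a 4K video is downscaled
print(express_photo_mp(12.0))          # 3.0
print(express_video_lines(1080))       # 480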
Those who were uploading photos and videos at original resolution, meaning there was no compression of the photos or videos, will not be affected by the change at all. That's because if you uploaded photos or videos at original resolution, Google did count these against the online storage available in your account.
READ | Best Google Photos alternatives: Free storage, paid cloud storage plans and more
What is the major change to the policy?
Starting June 1, 2021, Google Photos will no longer be free as earlier. All photos and videos uploaded after June 1, 2021 will be counted towards the account storage. Under the earlier policy it was technically possible for you to keep uploading photos and videos without running out of space on the free account, because that content did not count against the quota. But all that will change next year.
So why is Google making this change?
It is not surprising to see Google make the change considering it has more than 1 billion users for each of its products, and providing so much free cloud storage does not seem like a feasible option.
Google itself explained in a blogpost that people are uploading more content than ever before; in fact, more than 4.3 million GB are added across Gmail, Drive, and Photos every day. The company says it needs to make these changes to keep pace with the growing demand.
Further, Google's move will likely push users who were on the fence into paying for its cloud service, especially those who have now become dependent on Google Photos for uploading and storing their phone's photo gallery.
The company offers paid options for extra storage under the Google One program. In India, it starts at 200GB for Rs 210 per month, 2TB for Rs 650 per month or Rs 6500 per year, 10TB at Rs 3,250 per month and 20TB at Rs 6,500 per month.
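For a quick sense of how the monthly and yearly 2TB prices quoted above compare, here is a small worked example; the figures are simply the ones from this article and the computation is illustrative only.

# Google One 2TB plan in India, per the figures quoted above.
monthly_price_rs = 650
yearly_price_rs = 6500

cost_of_12_months = 12 * monthly_price_rs             # 7,800
annual_saving = cost_of_12_months - yearly_price_rs   # 1,300
print(f'Paying yearly saves Rs {annual_saving}, i.e. '
      f'{annual_saving / monthly_price_rs:.0f} months of the monthly price.')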
I don't want to pay and I have a lot of photos on Google. Does it mean I will have to delete all my earlier photos?
Google is giving some concessions. All photos and videos uploaded before June 1, 2021 will continue to remain free and not count against the storage. So if you have somehow managed to upload more than 15GB of photos and videos to your Google Account till date, don't worry. You don't have to delete them.
But all photos and videos uploaded post June 1, 2021 will be counted against the free space that Google gives you. And if you plan to keep uploading more photos and videos, it might make sense to pay for the service, given the 15GB is divided across Gmail, Photos and Drive.
For users who have a Google One account, which is a paid account, there's nothing to worry about and nothing really changes.
READ | Google Photos: Heres how to export or download all pictures, videos offline
Why is Google deleting content from inactive accounts?
As part of the new policy, Google will delete content from inactive accounts. Any account that has been inactive for more than two years (24 months) might find content deleted from products where the user is inactive. So if you have not used Google Photos for your account for two years, then the company will delete content from that particular product.
But Google One members who are within their storage quota and in good standing will not be impacted by this new inactive policy, says the company. Thankfully, Google will warn you plenty of times before deleting the data and you will have the option of downloading the content as well.
You can keep the account active by periodically visiting Gmail, Google Photos and Google Drive on the web or through a Google app, according to the company. If you have exceeded the storage limit for 2 years, you might also find your content being deleted. Again, Google will warn before it decides to delete all your data from accounts if you cross the storage limit.
Don't miss from Explained | How a telescope in Australia is creating a Google map of the Universe
Read the original:
Explained: How the changes in Google free cloud storage policy will impact you - The Indian Express
Object Matrix Joins the Active Archive Alliance – insideHPC
Boulder, Colo., December 9, 2020 - The Active Archive Alliance today announced that Object Matrix has joined the organization, which promotes modern strategies to solve data growth challenges. Object Matrix joins a growing number of industry-leading storage and IT vendors that support the use of active archive solutions for data lifecycle management.
"We're pleased to welcome Object Matrix to the Active Archive Alliance," said Peter Faulhaber, Chairman of the Active Archive Alliance and President and CEO of FUJIFILM Recording Media U.S.A., Inc. "The Alliance is dedicated to bringing best-of-breed technologies together to provide users with active archive solutions for intelligent data management. Companies across the globe are embracing active archive technologies to manage the complexities of vast data volumes and realize the value of their data."
Rising data volumes are intensifying the need for efficient, cost-effective active archives, which manage data for rapid search, retrieval, and analytics. Active archiving enables organizations to gain value from retained data quickly. By leveraging an intelligent data management layer, active archives allow fast access to archival data across a virtual file system and manage data between storage systems and media types.
Object Matrix is the award-winning software company that pioneered object storage and the modernization of media archives. It enables global collaboration, increases operational efficiencies, and empowers creativity through the deployment of MatrixStore, an on-prem, hybrid and cloud storage platform. Object Matrix's unified deployment approach ensures content spans on-prem and cloud storage, while its focus on the media industry gives it a deep understanding of the challenges organizations face when protecting, processing and sharing video content.
"Now more than ever, media organizations require solutions that enable creative and production teams to self-serve access to content from work or remotely from anywhere," said Jonathan Morgan, CEO, Object Matrix. "Our MatrixStore solution delivers the inherent scalability and interoperability of object-based storage, as well as instant access to data and metadata, and our customers include some of the world's largest media organizations. With MatrixStore, organizations have the protection and governance for the lifetime of any content. We are excited to join the Active Archive Alliance and help promote active archive strategies to solve today's data growth challenges."
Link:
Object Matrix Joins the Active Archive Alliance - insideHPC
What Is Public Cloud and Is It Right for Your Business? – G2
Cloud platforms are a necessity for any business trying to make a mark in the digital world.
The ever-increasing demand for cloud platforms stems from the need of organizations to be updated with the latest data, insights and software at all times to stay relevant in the competition.
While cloud platforms offer innumerable advantages and make our lives easier, they are also increasingly complex in the adoption process of a company trying to figure out its cloud strategy. The sheer number of cloud service types can put you in quite a fix.
In this article, we'll focus on public cloud, the simplest and most convenient cloud platform available, and learn more about the advantages it has to offer and the challenges that come with it.
Public cloud computing refers to cloud migration and storage services provided by third-party cloud vendors over the internet. It gives organizations the ability to choose their resources on demand and provides them with a scalable platform and flexible payment plans.
A private cloud platform is a cloud service provided and managed internally by an organization. Private cloud solutions are highly secure and offer optimized performance benefits since they are hosted on dedicated servers and have robust firewall access protocols.
A public cloud platform, on the other hand, is a cloud service provided by external vendors where common computing and storage resources are shared among the customers with appropriate security protocols, and the resources are managed by the vendor for a fee.
On an enterprise level, both private and public cloud platforms are viable for use in various applications but sticking to either one of them may cause a hindrance to an optimal cloud experience. Organizations are adopting hybrid cloud solutions and multicloud solutions, which is a merger between private and public platforms, depending on the security and infrastructure requirements.
Hybrid cloud strategy uses public cloud infrastructure as a service platform and private cloud data centers for a more secure cloud experience. Multicloud strategy, on the other hand, employs a variety of cloud platforms for different applications in a single network infrastructure.
Public cloud vendors provide shared computing resources to multiple users and give them the liberty to decide the resources as they deem fit for their business. It is called a multi-tenant architecture since multiple tenants (organizations) use the same resources provided by the vendor. Each user is given secure storage for their data and the same servers will be used to host multiple applications belonging to different users.
Public cloud architecture can be categorized according to the three most common service models, explained as follows:
Public cloud vendors provide infrastructures such as servers, storage and networking hardware to users as per their demands. IaaS customers also have access to various other services and functionalities from the vendors such as maintenance, load balancing and process monitoring.
Best known examples of IaaS service providers include Amazon Web Services (AWS), Google Cloud Platform (GCP), Oracle Cloud, IBM, Digital Ocean and Microsoft Azure.
PaaS is a cloud service where the third-party vendor provides the hardware and software components to users to build the application-supporting platforms that they need to run on the cloud. PaaS users don't need to replace their entire IT infrastructure; rather, they just use the vendor's hosted infrastructure services over a web browser.
The best known examples of vendors providing PaaS services include AWS Elastic Beanstalk and Google App Engine.
SaaS is a software on-demand cloud model, where the cloud service providers give the users access to a fully developed application created specifically for distribution. The software updates are rolled out for all users uniformly and organizations can use their own tools with the vendor provided application programming interfaces (APIs).
Best known examples of SaaS providers include Netsuite, Salesforce, and Concur.
Public cloud platforms have been adopted by a huge number of organizations, owing to the reduced hassle and the optimization benefits they offer once the strategy has been put in place. Let's take a detailed look at the multiple advantages public cloud offers and gain a better understanding of its popularity in the market.
The public cloud model uses shared resources, which allows cloud vendors to provide their services at a lower price. The lack of on-premises IT infrastructure and the virtualization of existing operating systems also enable users to reduce spending on hardware and networking assets, allowing them to save more in the long run. Cloud services have become affordable for businesses small and big since they eliminate operational and maintenance costs for the user.
Public cloud services enable users to monitor their network usage, performance capabilities and computing power in real time and plan their further cloud strategy accordingly. They need not depend on the service provider for these insights on a regular basis.
Public cloud platforms offer greater scalability to their users and give them the liberty to use more or fewer resources for every application, as needed. Both infrastructure and traffic scalability (the ability to devote adequate compute power to an unexpected increase in business traffic) are achieved quickly owing to the huge number of computing services and resources these platforms have available at hand.
Cloud data analytics are a huge advantage that public cloud computing provides. Organizations can gather usage metrics of the cloud resources and present business insights for a better future strategy.
In the public cloud computing model, physical resources like networking hardware and storage units and virtual resources like virtual machines and disk snapshots are available for use by all users in a pay-per-use system. Hence, if the situation demands, all resources could be pooled to serve a single user with high performance, lower latency and higher storage needs.
Organizations employing public cloud services are not responsible for the maintenance of the resources provided by the vendors. Both the operational and maintenance costs and trouble are taken care of by the vendor as part of the agreement, leading to a stress-free cloud experience for the users. Since public cloud deployments are done through a web facility, disaster recovery in case of an unforeseen calamity is also taken care of by the cloud management provider.
Since public cloud services are externally sourced, they come with their own share of security implications and control issues. Let us now dive into the cons of public cloud and what companies need to take into consideration while employing this strategy.
Availing a public cloud service entails that the organization will have less control over the physical hardware standards, automation, access management, IT management, and technical support. Since all of these services are outsourced to the external vendor, you might have to adjust quite a few business processes in order to ensure a smoother transition to the cloud.
Since multiple cloud users are hosted on the same platform by the cloud service provider, data vulnerability remains a raging issue in the case of public cloud providers.
While vendors are getting better with their cloud security practices and promises, it does not hide the fact that organizations are trusting an external vendor with their data and processes, and they remain more susceptible to a cyberattack than they would be on a private network.
Since public cloud platforms are hosted over the web, with no dependency whatsoever on the on-premise infrastructure, a connectivity issue might spell trouble with the entire cloud platform.
One way to avoid this would be ensuring that you have strong network connectivity at all times, irrespective of the physical conditions.
For organizations having a complex network dependency between applications or an intertwined business model, public clouds may pose a headache because of the rigidity of the multi-tenant architecture.
Businesses cannot customize and optimize the use of resources on demand, they can only make the best of what the service provider has to offer. This one-size-fits-all methodology might be an issue for companies prone to rapid changes in their business and cloud strategy.
When the GDPR updates rolled in, every organization that did business with the EU pushed harder than ever to get the compliance factors checked. If an organization using public cloud services needs to overcome a compliance hurdle for a particular client or partner, vendors might not be fast enough to provide the service and make all their resources compliant with every incoming change.
Cloud uptime is a critical area of consideration when it comes to adopting a cloud strategy. In the case of a cloud outage on the vendor's side, organizations will have to endure unplanned downtime affecting their business hours.
While public cloud may not be the only ideal solution for your business migrating to the cloud, it is certainly a strong contender to be chosen for a number of applications having simpler cloud storage and security needs.
Migrating to and running your business on the cloud can be made easier by employing multiple cloud environments for different applications as per their individual requirements.
Read more:
What Is Public Cloud and Is It Right for Your Business? - G2
Looking into the crystal ball: Tech predictions for 2021 – ITProPortal
2020 was a year of evolution for many sectors of the tech industry due to the reality of the Covid-19 pandemic. Many tech leaders had to pivot their strategies by shifting to a largely remote-work model and assessing new markets that were affected by unforeseen economic impacts.
The tech industry has seen major changes this year and, because of that, many experts are predicting new or evolved trends in 2021. Below, multiple tech experts highlight their top predictions within the technology industry for the new year.
Krishna Subramanian, president and COO at Komprise:
In 2021, cloud storage costs begin to overtake compute costs. For the past three years, cloud cost optimization has been a key priority for businesses. In fact, Gartner predicted that 80 percent of businesses will outspend their cloud budgets in 2020. The bulk of these costs so far has been in compute, since cloud object storage is relatively cost effective. But this is changing, since cloud file storage is typically ten times more expensive than S3, and file data is way more voluminous than block data, all of which underscores the importance of using cloud file storage just when you need it. In 2021, enterprise IT organizations will begin adopting cloud data management solutions to understand how cloud data is growing and manage its lifecycle efficiently across the various cloud file and object storage options.
Anshu Sharma, CEO and co-founder, Skyflow:
Every large company is on a long-term transition to digital and cloud - so they can effectively compete against the likes of Amazon, and the Silicon Valley startups who are aiming for them. Companies like Nike, JP Morgan Chase, and Walgreens have been trying to transform and in 2020 they all got an unintended boost in their push to the cloud - because they had to. The Fortune 500, having seen relative success with cloud and digital are not going back. They are all doubling down. 2021 will be the year of the digital double down.
Patrick Harr, CEO, SlashNext:
Over the last 30 days, 10 percent of company users were phished, according to live data we compiled across more than 100 large and mid-sized enterprises. Every day, SlashNext Threat Labs detect 21,000 new phishing attacks, almost double the number of threats from a year ago, and SlashNext Threat Labs are seeing an alarming 50-75 percent of attacks getting past conventional phishing defenses to compromise enterprise networks. So, if you think your current defenses will keep you safe, think again. And in 2021, we anticipate this problem will get much, much worse.
Stowe Boyd, analyst, Gigaom:
The pandemic has accelerated the adoption of technologies that were popular before, but which are now essential. One example has been the combination of work chat tools and video conferencing, as typified by Microsoft Teams and Slack. Microsoft has seen a dramatic uptick in usage, and the release of Google's new take on the former GSuite, now known as Google Workspace, which also integrates work chat and video conferencing, represents another challenge for Slack. As the two leaders in what we might think of as 'business operating systems,' Google and Microsoft present a difficult challenge for Slack, since companies will not want to pay extra for functionality they already have access to in their communications and file storage platforms.
Saad Siddiqui, principal, Telstra Ventures:
The unrelenting pace of open source innovation will continue in 2021, particularly in the areas of data analytics and data infrastructure, leaving many Fortune 1000 businesses and other SMBs struggling to keep pace with, and integrate with, open source innovations. Unlike large tech players, who can afford to hire an army of engineers to constantly change things and add their own herbs and spices to improve their operations, most businesses and newer vendors don't have as many talented engineers or can't hire more engineers to keep pace with change. To address this talent wall, we're seeing the rise of a hybrid open source business model, whereby open source data analytics companies like Incorta and infrastructure companies like Rancher Labs monetize closed source, out-of-the-box capabilities that deliver open source innovations while requiring less time and resources for enterprises to derive value.
Cornelia Davis, CTO, Weaveworks:
2021 will see the emergence of common distributed operational patterns widely implemented across all industries, and with the coming of 5G and the edge, this could not be timelier. There are several signals that foreshadow this, such as the adoption of GitOps as a de facto standard and best practice for operating Kubernetes and its workloads. IT systems will not only enjoy greater resilience and security, but having GitOps in place also lays the foundation for massive scalability and growth.
Yiannis Antoniou, analyst, Gigaom:
Responsible AI / ML will become the hottest topic in the cloud ML industry. Given society's increased emphasis on combatting unfairness and bias and the overall interest in better interpretability and explainability of machine learning models, cloud providers will invest and enhance their ML offerings to offer a full suite of responsible ML / AI capabilities that will aim to satisfy and reassure regulators, modelers, management and the market on the fair use of ML. Meanwhile, AI / ML will continue to see explosive growth and usage across the whole industry, with significant enhancements in ease-of-use and UX combining within a responsible AI / ML framework to drive the next growth spurt of this sector.
See the original post here:
Looking into the crystal ball: Tech predictions for 2021 - ITProPortal
How can the cloud industry adapt to a post-COVID world? – IT PRO
One of the unexpected silver linings to the global coronavirus crisis has been the rapid growth the cloud industry has enjoyed. The shift to remote working during the various lockdowns that have taken place over the course of 2020, was largely, if not entirely, facilitated by cloud services. This has meant that while other sectors have struggled and there has been an overall economic downturn, cloud companies have performed relatively well financially.
Although they wouldn't want to characterise the past few months as profiting from the pandemic, the likes of Zoom and Microsoft Teams have surged in usage and revenue, with the latter surpassing 44 million users as early as March. This period has also accelerated many digital transformation projects, with engineers more than capable of carrying out projects at pace and scale, including the traditionally lethargic public sector. This success, however, has been driven entirely by the effects of the pandemic, forcing the industry to question whether, and how, it can adapt once their services are no longer as highly sought after.
While we all rejoiced at the news that a potential COVID-19 vaccine may be available for distribution before the end of the year, shares in a handful of companies dropped sharply in response, including a reduction of at least 15% in the valuation of Zoom.
Whether things go back to the way they were, or cloud companies continue to play a more pivotal role than ever, is yet to be determined. For independent cloud consultant Danielle Royston, the goal of going back to normality in 2021 is misplaced. "There's no point wasting time and energy trying to return to the halcyon days of pre-COVID," she says. "Let's focus instead on some of the positive disruptions we've seen this year. In all the companies I've been at, I've promoted and in some cases fully converted to remote working. I saw this as the inevitable direction that work and society was going, as the cloud computing tools were already there. And it makes sense: a better quality of life for employees, ease of collaboration, cutting the costs of business travel."
This is a trend that Tom Wrenn, cloud investment expert and partner at private equity firm ECI Partners, predicts will continue well into next year, telling ITPro that COVID-19 forced many companies into rapidly adopting cloud-based operations. These, driven by government-enforced lockdowns, allowed them to continue operating remotely. Now, having done a basic shift to cloud-based systems, he adds, 2021 will be the year of full cloud adoption, with businesses starting to optimise all its benefits; for example, data analytics and AI. If rapid investment was needed in 2020, next year businesses will want to see a return on that investment and will expect to see more from their cloud computing providers.
Although the recent transition to remote working is a trend sparked by COVID-19, the consensus is that it's the beginning of a wider cultural shift. Former IBM boss Ginni Rometty is among the latest to suggest as much, claiming mass remote working will continue in some form as part of a broader hybrid model in future. This may involve companies keeping some physical presence while establishing the infrastructure and equipment to allow workers to work remotely as and when desired.
Cisco CTO for UK and Ireland, Chintan Patel, agrees, telling IT Pro that remote working gained widespread acceptance during COVID-19, even in organisations where it was unthinkable before. This means cloud and software as a service (SaaS) tools will continue to remain a crucial part of many setups, even though businesses will mostly return to a form of hybrid model. "For remote working, cloud plays a central role; think secure cloud-based collaboration, accessing cloud-based business applications, and extending the security perimeter to thousands of devices," he explains. "It's important to note, though, that cloud-based consumption models are not limited to remote working only. As to those returning to the offices, we see technology can help make the workplace more secure and efficient. As and when companies prepare for a return to office, they also need to optimise their space, address worker concerns about sanitation and social distancing and plan how to communicate policies and information clearly."
Technology will play a major part in instigating the changes needed in future, with a key role to play for many of the firms that have enjoyed success during the pandemic. While demand for software such as video conferencing platforms may not be as sky-high as it was at the beginning of the pandemic, Wrenn argues the next big step is how cloud companies can eat further into the market share enjoyed by the traditional telephone industry. "More and more businesses are using Microsoft Teams or Zoom to interact," he explains, "when previously they would have used conference lines or even called a person directly due to it being more convenient. Cloud providers need to think about how they can make the most of this opportunity as the way in which people interact changes."
To some extent, we should all consider ourselves lucky the global pandemic happened when it did, given that cloud computing has only recently become as advanced as it is now. Thus, rather than profiting from the pandemic, this period has been the making of the industry. After all, cloud storage, processing, and compute facilities are already set up, and ready to expand easily and automatically, as and when enterprises need, according to Royston, who claims this wouldn't have been the case ten to 15 years ago. "It would've been an epic failure and caused even more disruption and long-term damage to global economies. This year, white-collar workers being able to quickly adapt to working from home in their millions is part of what's helped many sectors stay afloat. And it's because of the investment and ongoing work of hyperscalers over the past few years that's meant businesses can support workers in doing this."
Connectivity, too, will continue to grow in importance as organisations' reliance on SaaS tools increases, Patel adds, with firms expecting more from these companies beyond provision. With cloud infrastructures becoming increasingly diverse, especially with applications adding more layers of complexity, businesses will be looking to strengthen their infrastructure. This will be achieved by gaining deeper visibility across their IT estates, ensuring workloads have continuous access to required resources and running systems that connect and protect at scale - from on-prem to hybrid cloud configurations. This is in addition to using technologies such as machine learning to give customers tools to manage their ever-growing data lakes. This is where providers can step in to guide customers on their migration journeys.
As such, the greatest challenge facing cloud providers, in light of the above, will largely be customer retention, according to Tom Wrenn. "If we take online meeting services as an example, historically businesses would have had to invest in a service, such as [Cisco] WebEx, which is often costly and comes with a lot of equipment," he says. "Today, however, businesses are using Zoom and Teams for this and can just turn services on and off with little upfront investment. This means that customers aren't locked into providers in a way they once were. As a result, cloud computing providers will need to over-deliver for their clients, retaining a high level of customer service as well as ensuring that service levels don't decline as they undergo a huge period of growth."
See more here:
How can the cloud industry adapt to a post-COVID world? - IT PRO