Category Archives: Cloud Hosting
Cloud Accounting versus Traditional Accounting | AccountingWEB – Accountingweb.com (blog)
What's the difference between cloud accounting and traditional accounting? This is a question we are asked often, as providers of cloud-based accounting services.
Read more here:
Cloud Accounting versus Traditional Accounting | AccountingWEB - Accountingweb.com (blog)
IoT apps and event-driven computing reshape cloud services – TechTarget
Tools are always shaped by their uses. When cloud computing first came on the scene, it was a form of hosted virtualization, and its goal was to look like a bare-metal server.
Infrastructure as a service (IaaS) shaped the earliest cloud services, and it still dominates public cloud as well as the private cloud software market. Even so, that doesn't mean it's going to be the source of future cloud opportunity.
Cloud providers are always strategizing for the future, and their plans reveal an important -- and already underway -- shift. Every major public cloud provider has added services to process events. In particular, providers are adding features to help developers build applications for the internet of things (IoT). Could these be the basis for the most transformational set of applications to come along since the internet itself?
Legacy applications follow a pattern that's decades old: Work comes to the applications that support it. In traditional cloud computing, users pay for the processing resources they use. The terms differ, but it's essentially a lease of virtual infrastructure. This is a direct mirror of what happens in a data center -- a server farm is loaded with applications and transactions are routed to the correct server in the pool. This approach is great where work is persistent, as in the case of a retail banking application that runs continuously.
Event-driven and IoT apps change this critical notion of persistence. An event can pop up anywhere, at any time. It would be wasteful, perhaps prohibitively wasteful, to dedicate an IaaS instance to wait around for an event. Or the instance might reside within a data center halfway around the world from where the event occurs. If all possible event sources were matched with traditional cloud hosting points, most would sit idle much of the time, doing little but running up costs.
The reason why there's a specific right or wrong place to process events is simple: delay. Most events have a specific response-time expectation. Imagine a machine that triggers spray paint when an item passes a sensor. Picture a self-driving vehicle that approaches a changing traffic light.
The information flow between an event and the receipt of the appropriate response is called a control loop. Most events require a short control loop, which means that their processing needs to happen close to the point of the event. That requirement forces event-handling processes to disperse out toward the cloud edge -- and to multiply in number.
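As a rough illustration of the control-loop idea, the decision of where to process an event boils down to a latency budget. The sketch below uses invented numbers, not measurements from any real deployment:

```python
# Rough sketch: can an event be handled within its control-loop deadline?
# All figures are illustrative assumptions, not measurements.

def within_control_loop(network_rtt_ms: float, processing_ms: float,
                        deadline_ms: float) -> bool:
    """Return True if the network round trip plus processing fits the deadline."""
    return network_rtt_ms + processing_ms <= deadline_ms

# A paint-sprayer sensor a few network hops away: the tight loop fits locally.
print(within_control_loop(network_rtt_ms=2, processing_ms=5, deadline_ms=20))    # True
# The same event routed to a data center on another continent misses the deadline.
print(within_control_loop(network_rtt_ms=180, processing_ms=5, deadline_ms=20))  # False
```

The point of the toy calculation is that the deadline, not the amount of compute, decides where the event handler must live.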
It's easy to see how the scarcity of events at a given point creates a problem of cloud efficiency and pricing for traditional cloud computing. It's also possible to have too many events. The cloud can allow for cloud bursting, or scaling capacity by spinning up multiple copies of an application component on demand, but it's not that easy.
Few applications written to run on a bare-metal server can seamlessly scale or replace failed instances. Those cloud capabilities aren't common in data centers, where legacy applications run. Moving the applications to the cloud doesn't add the features necessary to scale applications, either.
Multiple copies of an application component require load balancing, and many applications were not designed to allow any copy of a component to handle any event or request. Applications that work by assuming a string of requests in context can't work if half the string goes to one copy of the application and the other half to another. How do we make IoT apps scalable and resilient? They have to be rewritten.
Developers are doing just that, and big cloud providers are responding. In particular, they all see the same IoT-and-event future for the cloud. They have been steadily enhancing their cloud offerings to prepare for that future. Not only do the cloud giants offer special web services to manage IoT devices and connections, they now provide tools to support the kind of programming that IoT apps will demand.
The functional or lambda style of programming doesn't allow an application or component to store data between uses. As a result, all instances of the component can process an event. Cloud providers now offer functional or microservice support instead of simply providing infrastructure, platform or software as a service, because a function cloud is very different.
Where is your function hosted in a function cloud? Everywhere. Nowhere. Functions are activated anywhere they're needed -- when they're needed -- and you pay when you use one. Function clouds for IoT, or any kind of event-driven computing, represent the ultimate flexibility and agility. They also demand that users take care to establish policies on just how much function hosting they are willing to pay for, a decision they'll have to make based on the combination of cost and those pesky control-loop lengths.
Amazon has even allowed for the possibility that IoT will demand cloud applications that migrate outside the cloud. Their Amazon Web Services (AWS) Greengrass platform is a software and middleware framework that lets users execute AWS-compatible functions on their own hardware. This capability will let IoT users do some local processing of events to keep those control loops short, but still host deeper, less time-critical functions in the AWS cloud.
The old cloud model made you pay for hosting instances. In the function cloud, you don't host instances in the usual way. You have extemporaneous execution of functions, as needed. This is what gives rise to the pay-as-you-go or serverless description of the function cloud, but that's short of the mark. You could price any cloud computing service, running any application, on a usage basis, but that doesn't make those cloud services scalable or easily optimized. Without these features, serverless is just a pricing strategy.
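To see why per-invocation billing changes the economics for sparse event sources, a toy cost comparison helps. All prices and volumes here are invented for illustration, not any provider's actual rates:

```python
# Toy cost model: an always-on instance vs. per-invocation function pricing.
# All prices and volumes are invented for illustration.

HOURS_PER_MONTH = 730

def instance_cost(hourly_rate: float) -> float:
    """An always-on instance bills for every hour, busy or idle."""
    return hourly_rate * HOURS_PER_MONTH

def function_cost(invocations: int, price_per_million: float) -> float:
    """A function cloud bills only for invocations actually served."""
    return invocations / 1_000_000 * price_per_million

# A sparse event source generating 100,000 events a month:
sparse = function_cost(100_000, price_per_million=0.20)
always_on = instance_cost(hourly_rate=0.05)
print(f"functions: ${sparse:.2f}/mo vs instance: ${always_on:.2f}/mo")
```

For a source that fires rarely, the idle hours dominate the instance bill, which is exactly the waste the function model avoids; the scalability and optimization features, as the text notes, are what usage-based pricing alone does not provide.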
Developers will have to make changes in applications to accommodate IoT and function clouds. Almost every program or service stores information, and that stored state makes it difficult to scale. The rule of functional programming is statelessness: the output you get from a process is based only on what you provide as input. There are even programming languages designed to enforce stateless behavior on developers; it's not second nature.
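The statelessness rule can be shown in a few lines. This is a generic sketch, not tied to any particular function cloud:

```python
# Stateful: output depends on hidden history, so only the one copy that
# accumulated that history gives the right answer -- hard to scale out.
class StatefulCounter:
    def __init__(self):
        self.total = 0

    def add(self, n: int) -> int:
        self.total += n
        return self.total

# Stateless: the caller supplies all context, so any copy of this function,
# running anywhere, can serve any call.
def add_stateless(total_so_far: int, n: int) -> int:
    return total_so_far + n

c = StatefulCounter()
assert c.add(5) == 5 and c.add(5) == 10          # answer depends on this instance
assert add_stateless(0, 5) == add_stateless(0, 5)  # identical from every copy
```

The stateless version is trivially load-balanced and replaceable; the stateful one is the pattern that, as the text says, forces applications to be rewritten.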
The notion of the function cloud is likely to accelerate a trend that's already started in response to the use of mobile devices and implementation of BYOD policies. Companies have found that they are creating application components designed to format information for mobile devices, interface with apps written for a variety of mobile platforms and provide consistent support from back-end applications often running in data centers.
These forces combine to create a two-tier model of an application. The device-handling front tier lives in the cloud and takes advantage of the cloud's ability to distribute applications globally. The cloud part then creates traditional transactions for the core business applications, wherever they are.
IoT is even more distributed than mobile workers, and some IoT events need short control loops. As a result, cloud hosting of the front-end part of applications could see explosive growth. That puts pressures on the off-ramp of this kind of two-tier application structure because many events might generate many transactions. These transactions can overwhelm core business applications. Cloud providers are working on this, too. Microsoft, for example, has a cloud-distributed version of the service bus typically used to feed business applications with work.
Given that IoT is in its infancy -- and cloud IoT is even younger -- it's easy to wonder why cloud providers are already offering IoT features. There are three reasons. First, IoT could radically increase IT spending, and cloud providers want to grab some of that as potential new revenue. Second, IoT isn't the only thing that generates events. A lot of mobile worker interaction, for example, looks like event processing. Finally, functional programming techniques are being promoted for every kind of processing. IoT apps demand them. Developer tools and conferences are already describing how functional programming techniques can make programs better and more maintainable.
If you're writing functions for any reason, isn't using a function cloud inevitable?
That's the big question that every cloud provider and cloud user needs to think about. Fully scalable applications -- ones that can increase or decrease capacity under load and repair themselves by simply loading another copy -- are very useful to businesses. The functional programming techniques developed for IoT apps, and the function clouds to support those techniques, will remake programs.
Tools are defined by their uses, remember? Well, users are already seeing the cloud of the future in event processing, and IoT will accelerate that trend. IoT's potential to generate events over a wide area, in large numbers, while demanding short control loops will revolutionize cloud use.
Read the original post:
IoT apps and event-driven computing reshape cloud services - TechTarget
Softcat among those trumpeting G-Cloud success – ComputerWeekly.com
Softcat has strengthened its standing as a public sector player on the latest G-Cloud 9 framework, moving into a position where it can offer a significant number of cloud services.
The reseller can now provide 226 services through the framework to a range of public sector customers, including government, emergency services, health and education.
The G-Cloud services are split into lots that cover cloud hosting, software and support. Softcat operates across those areas and has tendered through the evolution of the G-Cloud process.
"Winning a place on this latest framework agreement reinforces our public sector offering and allows us to provide a huge range of services to our customers via a trustworthy procurement route," said Andy Bruen, public sector frameworks manager at Softcat.
"The G-Cloud framework is one of the more straightforward frameworks for our customers to procure under and offers a vast array of different cloud-based IT solutions," he added.
The latest iteration of G-Cloud came into force on 22 May and other public sector channel specialists have also been trumpeting their success in getting listed as approved suppliers.
Jason Clark, CEO and president at Proact, which can offer 33 services, said that it had also been on versions 7 and 8 of G-Cloud.
"With a specialist team covering all aspects of the sector across the whole of mainland UK, Proact's services on G-Cloud 9 can support UK public sector organisations in their digital transformation journeys with flexible, affordable and on-demand services and solutions that can help provide public services in a more effective, more efficient and lower-cost way to the taxpayer," he said.
Plenty of G-Cloud business should be going through the channel, with 50% of the top 50 suppliers being SMEs. So far £1.6bn has been spent on the government's digital marketplace, and 56% of total sales and 64% of volume have been awarded to SMEs.
See the article here:
Softcat among those trumpeting G-Cloud success - ComputerWeekly.com
Managed Services and Cloud Hosting: What the Leading Cloud Hosts Offer – Data Center Knowledge
Brought to you by MSPmentor
From Rackspace to AWS to Azure, there's no shortage of cloud hosting platforms available today.
One thing that sets them apart is the degree of managed services available from each cloud host.
This is a key differentiator that MSPs need to understand when building a managed service offering.
All of the major cloud-hosting platforms provide the same basic thing: Cloud-based infrastructure that organizations can use to run physical and/or virtual servers that host their workloads.
The details of the hosting plans, prices and features of each platform vary, but that's fodder for a separate article.
This article's goal is to compare the extent to which managed services are built into each major cloud platform, how easy it is to obtain third-party managed services and which cloud platforms are most in need of additional managed-service help.
Toward that end, here's how the cloud hosts stack up on the managed services front:
Amazon Web Services (AWS)
AWS is probably the best known cloud host in the market today.
It has a built-in Managed Services feature, but what AWS calls Managed Services is actually just an automation tool.
That said, because AWS has been around for so long, the actual managed services market around AWS is already very crowded.
AWS is an important cloud platform to support if you want to build a comprehensive MSP business that covers all cloud vendors.
But if you're trying to build a niche MSP offering based on cloud hosting, AWS isn't a good place to start.
The AWS managed services market is already saturated.
Microsoft Azure
Azure is also a well-established cloud-hosting platform.
Its features and functionality mirror those of AWS, to a large extent.
Azure doesn't have any built-in managed services offering, and it's somewhat harder to find third-party managed services support for Azure than it is for AWS.
Still, the Azure managed services market is pretty mature.
Rackspace
Rackspace, which began as a cloud infrastructure company, has shifted gears and now focuses heavily on managed services.
Its most recent move in this vein was the TriCore acquisition.
As a result, Rackspace is not a good cloud host to focus on if you want to build an MSP offering.
Rackspace already provides managed support for its infrastructure.
Indeed, Rackspace even offers managed services for other clouds, including AWS and Azure.
This means Rackspace is now a competitor with MSPs in all areas of the cloud.
DigitalOcean
DigitalOcean, which markets itself as a cloud-hosting platform for developers, is not as big a cloud host as AWS or Azure, but it ranks on any shortlist of major cloud providers.
DigitalOcean doesn't offer managed services for its infrastructure, although third-party companies do.
Because DigitalOcean managed services is a smaller market, it is easier for new MSPs to enter.
Linode
Linode is another cloud host that pitches itself as a platform for developers.
It provides hosting on high-performance Linux servers.
Like Rackspace, Linode has expanded its managed service offerings in recent years.
Linode's managed services aren't totally comprehensive, but the company offers backups, incident management and other types of professional services.
There is some opportunity for third-party vendors to add extra managed services around the Linode platform that are not offered by Linode itself.
Vultr
Vultr is a cloud host that focuses on high-performance virtual servers.
The company doesn't offer managed services itself, but it partners with Cloudways to provide professional services.
Still, there is room in the Vultr managed services market for other MSPs.
This article originally appeared on MSPmentor.
The rest is here:
Managed Services and Cloud Hosting: What the Leading Cloud Hosts Offer - Data Center Knowledge
The use of cloud computing in the federal government services – Born2Invest
Traditional IT can call for large data centers and server farms which are a serious investment and require 24/7 IT oversight and energy to power and cool the servers.
Computers and software are now part of everyday life. We constantly use emails and set up websites. Some of us even run our own businesses. We are able to use these services without having to host our own massive IT infrastructure, hiring tons of staff to operate it, spending a lot of money and getting mired in lengthy and complicated procurement processes.
If you can do this easily, why can't the government?
The federal government has an extensive infrastructure, a broad user base in agencies with a variety of missions, and complex suites of applications. To address these challenges, the Federal CIO Council has charged the government to leverage cloud computing services.
According to the National Institute of Standards and Technology (NIST), cloud computing provides scalable IT capabilities that are offered as a service over the Internet to multiple users. Many users share pooled IT resources, reducing costs and resulting in greater computing efficiency. The federal government is focusing on security, privacy, and procurement as it moves towards cloud computing.
The federal government has hundreds of these data centers around the country that often perform similar tasks, such as providing email or web hosting, and they are generally used at a fraction of their capacity. They typically have large carbon footprints due to their enormous energy consumption and must comply with strict environmental controls.
Cloud computing can be viewed as the green computing option, as it promotes sustainability and has a much smaller carbon footprint by limiting duplicated efforts and utilizing computing power more efficiently.
Cloud computing also offers scalability, meaning capacity and processing power can be scaled on demand. It is always evolving and is not an immediate solution for all government computing needs, but it can give the federal government the same opportunity the private sector enjoys: to reduce spending while making better use of staff and resources through a more forward-thinking, environmentally sensitive approach.
See the rest here:
The use of cloud computing in the federal government services - Born2Invest
Three benefits powering the cloud hype – The Media Temple Blog (press release) (blog)
Cloud computing's roots go back to the start of the computer age, when the shared-access mainframes of the '70s and '80s laid its foundation. Less than a decade later, virtual machines (VMs) and virtual private servers (VPS) created new efficiencies in sharing hardware resources. But because everything ran on a single piece of hardware, the infrastructure was expensive to maintain, and any outage or malfunction spelled disaster for sites or apps.
But what exactly is fueling the current cloud hype and, critically, why now?
Where previous iterations required all of the various software components to live in one physical space, the modern cloud distributes software pieces across multiple services that are optimized for their specific use. This has a twofold effect: Not only does it improve the overall performance of your cloud setup, but it also creates redundancies that significantly improve an organization's agility and the uptime of its digital assets.
The cloud can receive software updates instantly, providing near-instant modernization of any existing infrastructure. This allows enterprises to realize cutting-edge scalability, connectivity, cost savings, and security benefits. Being able to update resources or software instantly can mean the difference between dreading spikes in web traffic and welcoming them. It provides the peace of mind that sites and apps will not become outdated and thus fall victim to security breaches, or be brought down under the weight of a DDoS attack.
Hosting high-traffic sites and apps on today's cloud allows enterprises to pay only for what they use. Rather than getting bogged down by significant upfront costs for possibly unused hardware and infrastructure, current cloud models utilize a pay-as-you-go structure. Enterprises can provision additional capabilities as needed, making it easy to accommodate a campaign launch or a special event like an online sale. When that has ended, it's simple to scale back during periods of regular traffic to cut costs and save resources.
Advances in technology have finally made the overall speed, security, and reliability of cloud computing meet the needs of modern organizations on the internet. Now, users can unlock the cloud's unique advantages without sacrificing performance or putting unnecessary pressure on their IT budgets.
Read more from the original source:
Three benefits powering the cloud hype - The Media Temple Blog (press release) (blog)
Product Spotlight: Kodak PRINERGY Cloud – What They Think
Commentary & Analysis
Your print business runs on software, and more and more of that software is being delivered via the cloud, where resources can be precisely controlled and scaling (both up and down) is configurable in real time. Kodak's flagship workflow product, PRINERGY Workflow, is no exception in taking advantage of what cloud computing can do for a printer's production workflow.
By Jennifer Matt Published: June 13, 2017
This article is sponsored by Kodak as part of WhatTheyThink's Print Software Product Spotlight series. In preparing this article, the WhatTheyThink Print Software Section editors conducted original, in-depth research on Kodak's product suite. This Product Spotlight describes what the editors feel are the product suite's strengths in the marketplace. Kodak reviewed the final article for accuracy but had no editorial control over the content.
The cloud is a brilliant way of optimizing computing resources, and virtually every provider of print software must evaluate how its software could benefit from this revolutionary deployment model. For a print business owner, the cloud reduces the amount of IT infrastructure you maintain and manage on-site. As your technical resources become more and more critical to delivering technology-enabled solutions directly to your customers, you want to take hardware and software management off their plates. And as the pace of technology change continues to speed up, investing in hardware gets less appealing when you can rent just what you need, only when you need it, from the cloud.
The key to getting the most out of what the cloud has to offer is to first understand how your print software is being used, and then which aspects of your solution would benefit from cloud deployment. Kodak did not rush into a wholesale move of PRINERGY Workflow to the cloud; that would not have made sense. Can you imagine the large local file transfers that happen around your printing plant today being restricted by your internet bandwidth? Kodak's approach to the cloud is a strategic and incremental migration of the specific workflow services that benefit most from cloud deployment.
A Hybrid Approach to the Cloud
PRINERGY Workflow, the industry's standard for production workflow, has been hosted locally by the printer for decades. This hosting arrangement made sense: the printer's network provided a secure and fast way for PRINERGY Workflow to manage and automate the production workflow. As cloud computing has evolved, Kodak has taken a hybrid approach, utilizing the cloud where it makes the most sense for its customers. The transition from deploying a solution on local servers to cloud deployment is a large change for both Kodak and its customers.
Moving to the cloud requires any organization to make a significant investment so that the services you used to get from your local network work as well as or better than before. Performance is about investment, and partnering with best-in-class providers is a far better approach than trying to build out your own cloud infrastructure. Kodak has partnered with Microsoft Azure for its cloud solution. This allows Kodak to leverage Microsoft's cloud infrastructure and immediately deploy PRINERGY Cloud to data centers across the world for both redundancy and responsiveness.
A hybrid approach to the cloud is ideal for PRINERGY Workflow users. The workflow solution solves so many challenges in the production workflow; not all of them would be served well by a pure cloud solution. Kodak's hybrid approach to the cloud peels off the services that would benefit most from cloud deployment and keeps the services that are better suited for local deployment on servers at the printer.
Decision Analytics
When you move to the cloud, you can share IT resources and solutions. An embedded Business Intelligence (BI) tool can be a costly investment for an individual printer. The cloud gives you the ability to embed tools once and then share them across your entire customer base. Kodak's Decision Analytics is the foundation of the PRINERGY Cloud. What is the number one request by every print business owner when it comes to technology? "I need a report that tells me..." What is the number one request once you deliver a report to a print business owner? "This is great, now I need to see more about this or that..." (another report).
We are in a major transition when it comes to making fact-driven decisions. A snapshot of what your business did (a static report) is not good enough anymore. Your people need real-time access to your business data in a dynamic fashion, rather than the static snapshot of a report. Business Intelligence delivers on that need through the deployment of dashboards that can be drilled into to get to that next-level question. You see dashboards that deliver insights, not data. For far too long we've had to work very hard to make fact-based decisions.
Kodak has embedded a full-featured business intelligence tool into PRINERGY Cloud. This puts you in charge of how you want to see your business data and lets you get to insights quicker. As turn times continue to decrease, the shelf life of printed materials evaporates and order volumes increase, you don't have time to work at getting the data required to make better decisions.
Workflow Services Approach PRINERGY Cloud Services
Kodak PRINERGY Cloud is on a roadmap of incrementally adding the services that make sense in the cloud while continuing to work with your locally deployed PRINERGY server. The first two services offered over the cloud are File Archive and Backup, and Decision Analytics. Backup and archive are an obvious choice for security and protection; we've always moved critical backups to alternative locations. Now you can do that without consuming the time of your mission-critical IT resources. Decision Analytics deployed over the cloud allows Kodak to make one large investment in providing this critical functionality to all PRINERGY Cloud customers as a subscription service.
PRINERGY Cloud Services will continue to add more service layers to the solution while maintaining a workflow that gracefully bridges between locally hosted services and cloud services. This sane approach to the cloud will make it easier for printers to move incrementally to the cloud without large-scale disruption to their business. As Kodak offers more and more tools over the PRINERGY Cloud, printers will move toward managing service-level agreements rather than locally hosted hardware and software.
Visit link:
Product Spotlight: Kodak PRINERGY Cloud - What They Think
ViUX ViUX.Hosting Shared Cloud Hosting | Cloud VPS …
Truth in Hosting: ViUX has a strict NO OVERSELLING policy.
ViUX Hosting does not resell the same Diskspace, Email Accounts (Mailboxes) and other such resources over and over again to countless customers in the expectation and hope that most of them will never come anywhere near using the full amount of resources that they have been promised (sold). Sadly, many Web Hosting Providers do engage in this dishonest and disrespectful practice (called Overselling), and many more are fast adopting the same, because they feel that they must do so in order to compete with those that already have.
Because we at ViUX believe that there should be Truth in Hosting, you will not see any ViUX Hosting Plans offering Unlimited Diskspace, Unlimited Mailboxes, or even 100 GB of Diskspace or 1,000 Mailboxes, all for less than $5 per month. Really!?!
We would consider such an offering to be dishonest, because resource quantities of those levels would cost far more than $5 to provide and still maintain profitability, without engaging in massive overselling of those same resources to a large group of unsuspecting customers. It is for this reason that we also consider overselling to be disrespectful to customers.
To offer such incredible resource levels for such a ridiculously low price is only possible if the provider is large enough to take the hit and is actually willing to lose money to grow its customer base; or, as is most often the case, the provider really has no intention of honoring those resources and has language in its Terms of Service (TOS) and Acceptable Use Policy (AUP) giving it an out by disallowing usage above a certain level (or percentage) and/or by restricting the rate of growth, which, again, is dishonest and disrespectful to those unsuspecting customers.
Further, Shared Hosting cannot meet the needs of every website or customer. If someone actually needs ultra-high resource levels, then that is what Cloud VPS Hosting and Dedicated Servers are for. After all, if Shared Hosting could really meet the needs of those requiring UNLIMITED resources, then it would be the only type of Web Hosting offered, as no other type of Hosting Service would ever be needed.
For more information on the practice of overselling, and details of some Unlimited resources that ViUX Hosting does offer (such as Domains & Traffic), please read our blog post "Truth in Hosting: To Oversell or Not to Oversell" to gain additional insight into this topic.
More here:
ViUX ViUX.Hosting Shared Cloud Hosting | Cloud VPS ...
Virtualization admin? Pivot — pivot now — to a cloud computing career – TechTarget
For those virtualization admins hiding under a virtual rock regarding cloud, I have news for you. Your job isn't safe. No one can put the cloud genie back in the bottle. Cloud computing is here to stay, and virtualization admins need to shift focus to keep up with tomorrow's jobs.
The move to cloud is already happening at all levels, from the smallest through to the largest businesses. Cloud and microservices mark a new iteration of change that is as disruptive as the original arrival of virtualization with VMware -- if not more so.
Virtualization has two phases: consolidation and abstraction.
In the beginning, virtualization's goal was more efficient use of underutilized hardware. Rarely do servers consume all the resources allocated to them. Virtualization admins could reclaim these lost resources and vastly reduce wasted capacity.
In phase two, virtualization developed advanced functions such as live migration, storage migration, high availability and fault tolerance. These capabilities address the issues that arise when several virtual machines share one piece of physical hardware. Automation arrived and made server deployment simple and straightforward.
I argue that this virtualization adoption curve peaked a few years ago -- we are now moving to the next iteration, and you'll need to follow a cloud computing career path to come along.
Even once-conservative technology adopters, such as financial institutions, are jumping on board with the third wave of virtualization.
There is a thirst to cut costs, and automation allows massive cost cuts. There will be job losses, and any virtualization admin who thinks it will never happen to them is fooling themselves. Fewer staff means fewer medical plans and pensions to support. It is not hard to see why the cloud appeals to the bottom line.
There will not be enough cloud computing careers to go around for admins who stick to old, phase-one virtualization working practices.
Consider virtual machine orchestration. In early-phase virtualization environments, VMs still required some level of administration action, such as deployment from a template, to accompany automated steps. Tools such as VMware vRealize Automation or Platform9's managed vSphere stack enable an approved user to request a VM, customized to their specifications, and have it deployed within 10 minutes with no administrator interactions. Larger companies used to have several virtualization admins whose jobs purely entailed VM creation and deployment. Within a year or two, that job role disappeared.
Virtual machines are now moving to cattle status, meaning they're disposable commodities. To scale applications, organizations adopt automation tools that deploy new VMs. It's quicker to deploy another instance of a machine than to troubleshoot a broken one.
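The "cattle" mindset above can be sketched in a few lines of Python. The health check and redeploy calls here are hypothetical stand-ins for a real monitoring and provisioning API; the point is the shape of the logic, not any particular platform:

```python
# Minimal sketch of "cattle, not pets": a failed instance is replaced,
# never repaired in place. is_healthy and redeploy are hypothetical
# stand-ins for a real monitoring check and provisioning call.

def heal_pool(pool, is_healthy, redeploy):
    """Replace any unhealthy VM with a fresh copy; keep healthy ones."""
    healed = []
    for vm in pool:
        if is_healthy(vm):
            healed.append(vm)
        else:
            healed.append(redeploy(vm))  # new instance, same role
    return healed

pool = ["web-00", "web-01(broken)", "web-02"]
result = heal_pool(
    pool,
    is_healthy=lambda vm: "(broken)" not in vm,
    redeploy=lambda vm: vm.split("(")[0] + "-replacement",
)
print(result)  # the broken VM is swapped out, not troubleshot
```

No time is spent diagnosing the broken instance; it is simply discarded and re-created from a known-good template, which is why redeployment beats troubleshooting at scale.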
DevOps does away with manual work; manual deployment is the antithesis of the approach. A key tenet of DevOps is that any task performed more than once in the same way should be scripted, so the IT platform performs the action itself.
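As a minimal illustration of that tenet, here is a Python sketch of an idempotent deployment script. `FakeCloudAPI` and its methods are hypothetical stand-ins for whatever provisioning API (vSphere, a public cloud SDK, etc.) your platform actually exposes:

```python
from dataclasses import dataclass, field

@dataclass
class FakeCloudAPI:
    """Hypothetical in-memory stand-in for a real provisioning API."""
    vms: dict = field(default_factory=dict)

    def list_vms(self, prefix):
        return [name for name in self.vms if name.startswith(prefix)]

    def deploy_from_template(self, name, template):
        self.vms[name] = {"template": template, "state": "running"}

def ensure_vms(api, prefix, template, desired_count):
    """Idempotently bring a named VM pool up to the desired size.

    Running this twice changes nothing the second time -- the repeated
    manual task has become a repeatable script.
    """
    existing = set(api.list_vms(prefix))
    for i in range(desired_count):
        name = f"{prefix}-{i:02d}"
        if name not in existing:
            api.deploy_from_template(name, template)
    return sorted(api.list_vms(prefix))

api = FakeCloudAPI()
print(ensure_vms(api, "web", "ubuntu-22.04", 3))
# Second run is a no-op: the pool is already at the desired size.
print(ensure_vms(api, "web", "ubuntu-22.04", 3))
```

Because the script converges on a desired state rather than blindly creating machines, it can be run by a scheduler or triggered by an approved user's request with no administrator interaction.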
Platform as a service reduces the workload: workloads that used to be custom-built on infrastructure as a service are now provided as a service for developers and companies to consume. For example, the larger cloud vendors offer secure, highly available database hosting that organizations consume without any effort to build and manage the underlying database infrastructure. Little to no database admin input is required, and no server admin, either.
The complexity hasn't gone away -- it has just changed. Management complexity moved from the VMs to orchestration and scaling. Virtualization elements such as high availability and disaster recovery (DR) lost importance, while the IT industry turned its attention to microservices that are scalable, redundant and can be spun up and down at will. Automation means little to no hands-on intervention. For example, you can spin up a cloud infrastructure from a single PowerShell script.
Cloud affects virtualization in secondary ways, too. Businesses are used to having one primary data center and a DR setup in a second data center. Given a relatively modern application set, the entire company infrastructure can restart in the cloud in the event of a disaster. Modern DR management products, such as Zerto and Acronis, let businesses prepopulate and configure DR setups in the cloud, eliminating the costly secondary data center; classic DR locations are now costly relics of waste.
This is the reality for virtualization admins, and the only future is in a cloud computing career. Over time, more applications are built cloud-first to save money from the start; the old, immovable on-site applications go the way of pagers and typewriters.
The reality is that most virtualization admin roles as we know them will vastly shrink or become outmoded over the next decade. A virtual data center requires far fewer staff, and with automation and scripting, a single administrator can manage massive numbers of servers.
There is still time to retool and get on a cloud computing career path. Virtualization admins are luckier than most. While the technology itself may change, these administrators have skills that easily translate to the popular cloud and DevOps arena.
This doesn't mean becoming a code guru or programmer, but to become a DevOps admin, a virtualization admin will need a deep understanding of architectures and of tools such as Docker for containerization, Chef for configuration management and Kubernetes for container orchestration. Learn multiple scripting languages and investigate hyper-converged infrastructure for cloud hosting.
The warning signs are there, fellow admins. It is just a case of doing something about it while you still can.
Here is the original post:
Virtualization admin? Pivot -- pivot now -- to a cloud computing career - TechTarget
Managed Services and Cloud Hosting: What the Leading Cloud Hosts Offer – MSPmentor
From Rackspace to AWS to Azure, there's no shortage of cloud hosting platforms available today.
One thing that sets them apart is the degree of managed services available from each cloud host.
This is a key differentiator that MSPs need to understand when building a managed service offering.
All of the major cloud-hosting platforms provide the same basic thing: Cloud-based infrastructure that organizations can use to run physical and/or virtual servers that host their workloads.
The details of the hosting plans, prices and features of each platform vary, but that's fodder for a separate article.
This article's goal is to compare the extent to which managed services are built into each major cloud platform, how easy it is to obtain third-party managed services and which cloud platforms are most in need of additional managed-service help.
CLOUD PLATFORMS COMPARED
Toward that end, here's how cloud hosts stack up on the managed services front:
Amazon Web Services (AWS)
AWS is probably the best known cloud host in the market today.
It has a built-in Managed Services feature, but what AWS calls Managed Services is actually just an automation tool.
That said, because AWS has been around for so long, the actual managed services market around AWS is already very crowded.
AWS is an important cloud platform to support if you want to build a comprehensive MSP business that covers all cloud vendors.
But if you're trying to build a niche MSP offering based on cloud hosting, AWS isn't a good place to start.
The AWS managed services market is already saturated.
Microsoft Azure
Azure is also a well-established cloud-hosting platform.
Its features and functionality mirror those of AWS, to a large extent.
Azure doesn't have any built-in managed services offering, and it's somewhat harder to find third-party managed services support for Azure than it is for AWS.
Still, the Azure managed services market is pretty mature.
Rackspace
Rackspace, which began as a cloud infrastructure company, has shifted gears and now focuses heavily on managed services.
Its most recent move in this vein was its acquisition of TriCore.
As a result, Rackspace is not a good cloud host to focus on if you want to build an MSP offering.
Rackspace already provides managed support for its infrastructure.
Indeed, Rackspace even offers managed services for other clouds, including AWS and Azure.
This means Rackspace is now a competitor with MSPs in all areas of the cloud.
DigitalOcean
DigitalOcean, which markets itself as a cloud-hosting platform for developers, is not as big a cloud host as AWS or Azure, but it ranks on any shortlist of major cloud providers.
DigitalOcean doesn't offer managed services for its infrastructure, although third-party companies do.
Because DigitalOcean managed services is a smaller market, it is easier for new MSPs to enter.
Linode
Linode is another cloud host that pitches itself as a platform for developers.
It provides hosting on high-performance Linux servers.
Like Rackspace, Linode has expanded its managed service offerings in recent years.
Linode's managed services aren't totally comprehensive, but the company offers backups, incident management and other types of professional services.
There is some opportunity for third-party vendors to add extra managed services around the Linode platform that are not offered by Linode itself.
Vultr
Vultr is a cloud host that focuses on high-performance virtual servers.
The company doesn't offer managed services itself, but it partners with Cloudways to provide professional services.
Still, there is room in the Vultr managed services market for other MSPs.
View original post here:
Managed Services and Cloud Hosting: What the Leading Cloud Hosts Offer - MSPmentor