Category Archives: Cloud Hosting
Get a dedicated server in five minutes – The Register
Sponsored With the advance of cloud computing, organizing IT infrastructure is no harder than organizing pizza delivery.
According to MarketsandMarkets, by the end of 2021 cloud data centers will process over 94% of the total workload. It's easy to explain why businesses migrate to clouds: companies like how simple it is to rent capacity and solutions from providers. However, all of that applies mostly to virtual machines.
The situation around dedicated servers is not as simple. They are, however, still indispensable for certain tasks and allow a client's business to get the most out of its infrastructure. Our review of bare metal servers covers their advantages, when they are indispensable, and how to integrate them seamlessly into corporate IT infrastructure.
Bare metal servers and virtual machines in the cloud have a lot in common:
However, bare metal servers have a host of advantages over virtual machines.
When renting a bare metal server, the user gets full control over isolated equipment. In this single-tenant environment, the client has root access to the server and uses the CPU, memory, and bandwidth on their own, which ensures maximum performance under peak load. Virtual machines are different: many machines may be running on one server, and the user cannot directly control the overall load on that server.
Virtual machines are also less predictable. Their performance can change when they are migrated between servers, which may also lead to extra expenses. Bare metal servers are more stable because they cannot be migrated or tampered with unless the user requests or authorizes it. Their performance is higher and more predictable: bare metal servers don't use virtualization, and computing resources are distributed only among the client's own apps.
Virtual machine users don't know the state of the equipment the provider gives them. It might be cheap and outdated, and the load on the host's infrastructure might be too high because of overselling. With bare metal servers, such problems cannot occur. The client has direct access to the hardware, fully controls the environment, and can check the equipment and its details at any point.
It might seem that launching a virtual machine within an operating system adds an extra layer of safety. In fact, it is the other way around: the operating system on which the virtual machine runs may itself be vulnerable to hacking, exploits, or viruses, and if something happens to that OS, it also affects the security of the virtual environment. Bare metal server users have direct access to the infrastructure, which means they have everything it takes to prevent security issues proactively, especially since bare metal servers also support extra protection measures provided by vendors.
Cloud provider clients don't always realize that bare metal servers can be more economical than virtual machines. At first glance bare metal servers seem more expensive than virtual hosting, but their benefits usually become obvious over the longer term, thanks to fixed service costs. Unlike virtual machines, a dedicated server has a maximum bandwidth limit, so clients don't have to worry about overspending and unexpected charges. As a result, cloud resources on bare metal servers often turn out to be cheaper than renting the same computing capacity, with the same cores, memory, and storage, as virtual hosting.
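To see why fixed pricing can win out over time, here is a rough back-of-the-envelope comparison. All prices and usage figures below are purely illustrative assumptions, not provider quotes:

```python
# Rough, illustrative comparison of a fixed-price bare metal server versus
# usage-billed virtual machines. All numbers are hypothetical assumptions.

BARE_METAL_MONTHLY = 400.00   # flat monthly fee, bandwidth capped, no overage
VM_HOURLY = 0.35              # per-VM hourly rate for comparable specs
VM_EGRESS_PER_GB = 0.08       # usage-based traffic charge

def vm_monthly_cost(vm_count: int, hours: int, egress_gb: float) -> float:
    """Usage-billed cost: compute hours plus metered traffic."""
    return vm_count * hours * VM_HOURLY + egress_gb * VM_EGRESS_PER_GB

# Two always-on VMs pushing 10 TB of traffic in a 730-hour month.
usage_cost = vm_monthly_cost(vm_count=2, hours=730, egress_gb=10_000)

print(f"Bare metal (fixed):        ${BARE_METAL_MONTHLY:,.2f}")
print(f"Virtual machines (metered): ${usage_cost:,.2f}")
# The fixed bill never grows with traffic, which is the predictability
# described above.
```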
There are a lot of tasks for which bare metal servers are a better fit than virtual ones. Here are some of them:
However, no matter how well bare metal servers suit many tasks, it's not always right to pit them against virtual machines. It is often more efficient and profitable to use both solutions within a joint IT infrastructure, especially since cloud services make that easy.
G-Core Labs' new offering, Bare-Metal-as-a-Service, allows businesses to deploy bare metal nodes in the cloud within minutes. It combines the high capacity of traditional dedicated servers with the simplicity of working with virtual machines under the IaaS model.
In combination with the provider's other solutions, the new servers give public cloud users a ready-to-use, flexible infrastructure that allows simultaneous management of dedicated servers and virtual machines, public and private networks, and other cloud products. Clients can use resources both efficiently and economically and gain additional infrastructure scenarios. For example, in a public cloud a customer may run production environments on bare metal servers and deploy extra virtual machines within minutes when the load peaks; that extra capacity is easy to delete afterwards, as the sketch below illustrates.
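The snippet below sketches that burst pattern as control logic only. The client class is a hypothetical stand-in, not G-Core Labs' actual API, and the flavor name is illustrative:

```python
import random
import time

class FakeCloudClient:
    """Hypothetical stand-in for a provider SDK; not a real G-Core Labs API."""
    def __init__(self):
        self._vm_counter = 0

    def current_load(self) -> float:
        # Simulated cluster load between 0.0 and 1.0.
        return random.random()

    def create_vm(self, flavor: str) -> str:
        self._vm_counter += 1
        return f"burst-vm-{self._vm_counter}"

    def delete_vm(self, vm_id: str) -> None:
        print(f"deleted {vm_id}")

def handle_peak(cloud, threshold: float = 0.8, flavor: str = "standard-4"):
    """While load stays above the threshold, add burst VMs alongside the
    bare metal fleet; once the peak passes, delete the extra capacity."""
    burst_vms = []
    while cloud.current_load() > threshold:
        burst_vms.append(cloud.create_vm(flavor))
        time.sleep(1)   # in practice, wait for the new VM to absorb load
    for vm_id in burst_vms:
        cloud.delete_vm(vm_id)

handle_peak(FakeCloudClient())
```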
The combined solution ensures global scalability, high security, and reliability of cloud infrastructure, letting businesses exploit the advantages of both virtual machines and bare metal servers. It is convenient because the former suit one kind of task, such as apps with short-lived load peaks, while the latter are a better fit for other situations, including performance-sensitive apps.
Sponsored by G-Core Labs
Azure to host world's largest geospatial data platform, over 100 petabytes of 3D data – IT Brief New Zealand
3D data company Euclideon has signed a partnership with Microsoft, unveiling plans to build the world's largest geospatial data platform on the Azure cloud.
This month Euclideon plans to launch its visualisation as a service (VaaS) platform, using its high-speed udStream 3D render technology, which can visualise petabytes of data within a second (the geospatial data platform will use more than 100 petabytes of 3D data), all hosted on Azure.
In 2019 the global geospatial analytics market was valued at US$58 billion, and growth of 14.2% per year is predicted until 2027, when the market is expected to be worth more than US$158 billion on the back of increased use of 3D data and IoT-sourced information.
Potential applications range from business process optimisation and smart cities to the creation of digital twins.
Some early adopters of 3D data and visualisation platforms include transport, resources, military, education, and new emerging sectors. Governments also recognise VaaS as providing an opportunity to streamline access to open data sets.
International organisations already utilising Euclideon's technology include Thales Group, Transport and Main Roads Queensland, BHP Billiton, UNSW, Canada's City of Richmond, France's SNCF, and Lockheed Martin in the US.
"3D data represents the biggest big data. Euclideon works with organisations across multiple sectors to help them manage and visualise their growing 3D data collections," says Euclideon CEO Daniel Zhang.
"This is fundamental for applications such as digital twins or smart cities, and we are developing innovative tools and services to help maximise the value that customers can reap from their data.
"Hosting in Azure ensures we have the performance, scalability, and trusted security that we and our customers need, and also positions us for rapid expansion here in Australia and internationally. It also opens the door to multiple Azure services that will help us accelerate innovation for our customers."
Euclideon's Azure-based udCloud is an on-demand cloud-based solution for managing, distributing, and visualising large 3D datasets. It can be used in conjunction with Euclideon udStream, enabling real-time web viewing tools for unlimited streamed data visualisation, and runs on any standard desktop or laptop.
"Euclideon has established an enviable reputation at the very forefront of 3D data management and visualisation," says Microsoft Australia SMC lead Phil Goldie.
"With Euclideon's udCloud and udStream being powered by Azure, it can offer world-leading 3D services that will accelerate innovation, along with the peace of mind that comes from Azure's in-built security, privacy, and value.
"We are proud to partner with Euclideon, and also to explore opportunities for ongoing collaboration between our two organisations."
ScalaHosting and AWS to deliver new SPanel web hosting capabilities – TechRadar
One of the obstacles to cloud hosting adoption has been the increased technical demand on users, which could be difficult for smaller companies to surmount without significant specialized talent acquisition or outsourcing.
This was one of the very things that pushed managed cloud VPS hosting firm ScalaHosting to partner with Amazon Web Services (AWS) and introduce SPanel, offering users an intuitive interface for Cloud VPS management.
The graphical user interface (GUI)-driven SPanel allows users to migrate to the platform quickly and is also compatible with cPanel.
After ScalaHosting finalized its strategic collaboration partnership agreement with AWS, TechRadar Pro sat down with Chris Rusev, CEO and co-founder of ScalaHosting and SPanel.io to iron out the details of the collaboration.
The main idea behind the AWS integration is simple: "Think global, host local". To achieve the highest speed and performance, a website must be as close as possible to its visitors. AWS runs 230+ data centers in 200+ countries globally. Combining that with the quality of modern AWS cloud infrastructure and ScalaHosting's in-house-developed SPanel, which is among the top five cPanel/WHM alternatives, the user gets a complete bundle that lets them scale up their online business in one of the most secure and reliable cloud environments worldwide. In addition, with this move ScalaHosting enters the B2B hosting segment more aggressively, because these days no one gets fired for choosing AWS.
Before all else, SPanel means freedom for our customers. It's completely free to all our customers and it makes them independent from third-party software providers like cPanel. Because it's our own proprietary cloud management platform, we can afford to integrate into it only our own apps and features or free third-party add-ons. Instead of making SPanel commercial and monetizing it as the others do, we decided to make it developer-friendly.
Here, everyone can request new features and vote for existing ones, and we start developing only those with the most votes. Usually it takes from a few weeks to two months at most from request to implementation. In addition, by cutting costs we have lowered the price of a fully managed cloud VPS with a control panel to the level of an average shared hosting plan.
a) Dedicated cloud environment - CPU, RAM & Disk resources are 100% dedicated to you. Your resources are yours and yours alone. Other users on the same server cannot degrade the performance of your websites and services.
b) Security - a VPS gives you much higher security because the whole server is dedicated to you and there is no risk of another website on the same server being used to compromise your data. The fully managed service makes things really easy for the website owner, as they don't need to worry about software updates, security, and daily routine server administration tasks. In comparison, with traditional hosting, no matter how well you isolate your account you'll still be sharing the same OS kernel and software, allowing a hacker to try local exploits to compromise other accounts or the whole server. Having local access to the OS makes hacking hundreds of times easier no matter how well a shared server is configured. With a cloud VPS such scenarios are simply not possible thanks to the dedicated environment.
c) Scalability - a cloud VPS can scale up and down significantly to serve hundreds of thousands of visitors, and you only pay for the CPU/RAM resources you actually used for the period of time you needed them. Extra resources are always available and take less than a minute to add.
d) Dedicated IP - Each VPS comes with a dedicated IP, which is really important for both email deliverability and SEO. A compromised account may send spam or upload a phishing page with malicious code. As a result, the IP of the server usually ends up blacklisted on IP reputation platforms, and your emails start getting delivered to the spam folder instead of the inbox. Losing just one customer because of that will cover the fee for a VPS for a couple of months. If you pay for SEO, the results will be even more dramatic in the long run, as Google also monitors IP reputation. A simple way to keep an eye on this is sketched below.
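One practical way to watch an IP's reputation is to query a public DNS blocklist. A minimal sketch using the dnspython package; the IP address is a documentation example and Spamhaus ZEN is just one widely used blocklist:

```python
# pip install dnspython
import dns.resolver

def is_listed(ip: str, blocklist: str = "zen.spamhaus.org") -> bool:
    """Return True if the IP appears on the given DNS blocklist.
    DNSBLs are queried by reversing the octets and appending the list's zone."""
    query = ".".join(reversed(ip.split("."))) + "." + blocklist
    try:
        dns.resolver.resolve(query, "A")
        return True                      # an answer means the IP is listed
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return False                     # no record: not listed

# Example/documentation address; some blocklists require a non-public resolver.
print(is_listed("203.0.113.10"))
```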
Most small to medium businesses cannot afford large in-house IT and sysadmin teams. For that reason, AWS on its own was not an option, because it does not provide a fully managed service. With this collaboration we can now offer SMBs AWS bundled with 24/7 live "ask anything" technical support, SPanel, SShield security protection, the premium Softaculous one-click installer, and more. That's how we completely offload all server-related problems from the entrepreneur's back and give them the chance to fully focus on what they do best.
In light of the global Covid crisis, many offline SMBs were forced to move online. For these newcomers this is still an unknown, alien world, and they are unaware of how important the type of hosting they choose is for their online business. Shared hosting can silently drive customers away, because no one likes to wait more than a blink when checking out their basket. Our strategy is to keep promoting VPS technologies and to educate people about the benefits of having an isolated cloud environment. We also want to spread the word that these advanced cloud technologies are now affordable and not that much different from traditional hosting price-wise.
With this integration ScalaHosting steps into 24 data centers around the globe, allowing users to host their websites as close as possible to their audience. Our customers can now choose between ScalaHosting's native data centers in Dallas, New York, and Sofia, Bulgaria, and the globally integrated data center network of AWS and DigitalOcean.
More:
ScalaHosting and AWS to deliver new SPanel web hosting capabilities - TechRadar
Aunalytics Announces FedRAMP Ready Status of Its Cloud – GlobeNewswire
SOUTH BEND, Ind., June 29, 2021 (GLOBE NEWSWIRE) -- Aunalytics, a leading data platform company delivering Insights-as-a-Service for enterprise businesses, announced today that its Aunalytics Cloud solution has achieved Federal Risk and Authorization Management Program (FedRAMP) Ready status and is actively working toward FedRAMP certification. Certified cloud-based products help U.S. federal agencies meet increasingly complex regulations and defend against cybersecurity threats, prevent data loss, enforce compliance, and protect agency domains.
FedRAMP is a government-wide program that provides a standardized assessment and authorization process federal agencies are directed to use to ensure security is in place when adopting cloud computing products and services. By applying the FedRAMP framework in their evaluations, government agencies get a uniform assessment and authorization of cloud information security controls, alleviating cloud security concerns and increasing trust in the validity of assessments.
"FedRAMP Ready status and, ultimately, certification represents one of the highest compliance standards and third-party validations of our cloud hosting services, giving federal agencies the utmost confidence that our offering is tested and confirmed to meet the trust principles of confidentiality, availability, security, and privacy," said Kerry Vickers, CISO at Aunalytics. "Meeting these rigorous standards will benefit all of our clients in every industry and enable us to expand our footprint within the government sector by providing federal agencies, as well as defense contractors and others required to use FedRAMP certified suppliers, with a cloud infrastructure that is FedRAMP compliant."
Listed as FedRAMP Ready on the FedRAMP Marketplace, Aunalytics is seeking an agency sponsor as it moves toward the second phase of being FedRAMP authorized.
About Aunalytics
Aunalytics is a data platform company delivering answers for your business. Aunalytics provides Insights-as-a-Service to answer enterprise and mid-sized companies' most important IT and business questions. The Aunalytics cloud-native data platform is built for universal data access, advanced analytics, and AI while unifying disparate data silos into a single golden record of accurate, actionable business information. Its Daybreak™ industry intelligent data mart, combined with the power of the Aunalytics data platform, provides industry-specific data models with built-in queries and AI to ensure access to timely, accurate data and answers to critical business and IT questions. Through its side-by-side digital transformation model, Aunalytics provides on-demand, scalable access to technology, data science, and AI experts to seamlessly transform customers' businesses. To learn more, contact us at +1 855-799-DATA or visit Aunalytics at http://www.aunalytics.com or on Twitter and LinkedIn.
PR Contact: Denise Nelson, The Ventana Group for Aunalytics, (925) 858-5198, dnelson@theventanagroup.com
Vertafore unveils next generation of ImageRight to transform productivity and user experience for insurers – Yahoo Finance
Cloud-based Vertafore Hosting for ImageRight 7 can reduce cost of ownership for carriers, with savings up to 25-50% annually
DENVER, June 29, 2021 /PRNewswire/ -- Vertafore, the leader in modern insurance technology, today announced the next generation of ImageRight, the industry's premier workflow and content management system designed specifically for insurance carriers. The announcement was made during the keynote event of Carrier Week at Accelerate, powered by NetVU.
ImageRight 7 provides a fully reimagined user experience in a web-based interface, enabling users to get more done with streamlined core processes and personalized task lists.
As the first release in Vertafore's multi-year commitment to modernize ImageRight, the update is available anytime and anywhere via modern browsers, further enabling carriers to equip their workforce for success as the "new normal" of work evolves. ImageRight 7 also includes compliance and security updates to keep users current with the latest rules and regulations.
ImageRight 7 provides a state-of-the-art user experience that improves productivity and streamlines tasks with enhancements that include:
An intuitive interface that enables users to easily navigate, discover, and track policies and all related documents.
The ability to easily edit and manage Microsoft Office files right from ImageRight, with all changes kept in sync.
Personalized to-do lists that surface work by priority. To-do lists are updated in real time, reducing the number of clicks to get to the latest tasks.
Vertafore Hosting for ImageRight 7 can save carriers 25-50% annually
Also available with ImageRight 7 is Vertafore Hosting, a new cloud hosting option available via Amazon Web Services. Carriers get ImageRight delivered Software-as-a-Service (SaaS), including automatic version upgrades, as Vertafore takes the hosting burden off carrier IT teams, freeing them up for other priorities.
Data from Vertafore users shows that Vertafore Hosting reduces the total cost of ownership for a carrier with 100 users by an average of $320,000 over three years, or 25-50% in savings annually. The biggest savings come from eliminating capital expenses associated with maintaining their own hardware, operating systems, databases, and backup system infrastructure for ImageRight.
"ImageRight 7 provides the industry's most modern experience for carriers to manage their workflow and content," says Sharmila Ray, head of carrier strategy at Vertafore. "With these latest enhancements and the introduction of Vertafore Hosting, carriers are empowered to drive productivity, reduce their IT costs and better meet the expectations and needs of a modern workforce."
About Vertafore
As North America's InsurTech leader for more than 50 years, Vertafore is modernizing and simplifying insurance distribution so that our customers can focus on what matters most: people. Vertafore's solutions provide end-to-end connectivity, improve the client and agent experience, unlock the power of data, and streamline essential workflows to drive efficiency, productivity, and profitability for independent agencies and carriers. For more information about Vertafore, visit http://www.vertafore.com.
© 2021 Vertafore. Vertafore and the Vertafore logo are registered trademarks of Vertafore. All rights reserved. All other trademarks are the property of their respective owners.
Press Contact: Amanda Urban, urban@nextpr.com, 312-259-1814
SOURCE Vertafore
BMO Global Asset Management (EMEA) Deploys NICE Cloud Compliance Recording for Microsoft Teams to Support Remote Workforce and Full Agility – Business…
HOBOKEN, N.J.--(BUSINESS WIRE)--NICE (NASDAQ: NICE), a leading provider of communication compliance solutions, today announced that BMO Global Asset Management (BMO GAM), the global investment manager owned by BMO Financial Group, has deployed NICE's certified Cloud Compliance Recording solution for Microsoft Teams across its business in Europe, the Middle East and Africa (EMEA) to meet certain regulatory requirements around recording omnichannel communications for regulated employees now working from home.
"As we transitioned to Microsoft Teams to support our remote workforce, we needed to implement a compliance recording solution quickly," said Scott Wilson, Director, Infrastructure & Operations at BMO Global Asset Management. "NICE's fully managed, cloud-based, software-as-a-service offering for Teams compliance recording enabled us to accelerate our adoption of Microsoft Teams. We were also able to free our staff from time-consuming activities like handling security patches, upgrades and other platform management tasks, which are now fully managed by NICE."
Chris Wooten, Executive Vice President, NICE, said, "Leading financial services firms like BMO GAM are embracing remote and hybrid work environments and collaborative communication technologies like Microsoft Teams. NICE is helping to accelerate their digital transformation and ensure faster time-to-value by providing complete recording coverage and contributing to compliance with different regulations worldwide. For firms with a cloud-first strategy, our SaaS recording solution delivers a low-maintenance, touch-free experience, along with other powerful advantages of the cloud, including zero footprint and unprecedented scalability, reliability and security."
NICE's Certified Compliance Recording for Microsoft Teams
Offered as an on-premises or fully managed cloud solution, NICE's all-in-one compliance recording and assurance platform is used by most of the world's leading banks and investment firms to record and retain trade-related conversations from unified communications platforms, turrets, desk phones, and mobile phones. It is the first compliance recording solution to be certified under the Microsoft Teams certification program. Offering complete recording coverage for all Microsoft Teams communications, including voice, video, chat and screen sharing, the solution leverages the Microsoft Azure secure cloud for application hosting, and compliant capture and archiving of regulated employee communications.
Learn More
To learn more about NICE Compliance Recording for Microsoft Teams:
About BMO Global Asset Management
BMO Global Asset Management is a global investment manager with offices in more than 25 cities in 14 countries, delivering service excellence to clients across five continents. Our four major investment centres in Toronto, Chicago, London and Hong Kong are complemented by a network of world-class specialist managers strategically located across the globe: BMO Real Estate Partners, LGM Investments and Pyrford International Ltd. BMO Global Asset Management is a signatory of the United Nations-supported Principles for Responsible Investment initiative (UNPRI).
BMO Global Asset Management is a part of BMO Financial Group, a highly diversified financial services provider based in North America with total assets of CDN $950 billion as of 30 April 2021.
About NICE
NICE (Nasdaq: NICE) is the world's leading provider of both cloud and on-premises enterprise software solutions that empower organizations to make smarter decisions based on advanced analytics of structured and unstructured data. NICE helps organizations of all sizes deliver better customer service, ensure compliance, combat fraud and safeguard citizens. Over 25,000 organizations in more than 150 countries, including over 85 of the Fortune 100 companies, are using NICE solutions. http://www.nice.com.
Trademark Note: NICE and the NICE logo are trademarks or registered trademarks of NICE Ltd. All other marks are trademarks of their respective owners. For a full list of NICE's marks, please see: http://www.nice.com/nice-trademarks.
Forward-Looking Statements
This press release contains forward-looking statements as that term is defined in the Private Securities Litigation Reform Act of 1995. Such forward-looking statements, including the statements by Mr. Wooten, are based on the current beliefs, expectations and assumptions of the management of NICE Ltd. (the "Company"). In some cases, such forward-looking statements can be identified by terms such as "believe," "expect," "seek," "may," "will," "intend," "should," "project," "anticipate," "plan," "estimate," or similar words. Forward-looking statements are subject to a number of risks and uncertainties that could cause the actual results or performance of the Company to differ materially from those described herein, including but not limited to the impact of changes in economic and business conditions, including as a result of the COVID-19 pandemic; competition; successful execution of the Company's growth strategy; success and growth of the Company's cloud Software-as-a-Service business; changes in technology and market requirements; decline in demand for the Company's products; inability to timely develop and introduce new technologies, products and applications; difficulties or delays in absorbing and integrating acquired operations, products, technologies and personnel; loss of market share; an inability to maintain certain marketing and distribution arrangements; the Company's dependency on third-party cloud computing platform providers, hosting facilities and service partners; cyber security attacks or other security breaches against the Company; the effect of newly enacted or modified laws, regulation or standards on the Company and our products; and various other factors and uncertainties discussed in our filings with the U.S. Securities and Exchange Commission (the "SEC"). For a more detailed description of the risk factors and uncertainties affecting the Company, refer to the Company's reports filed from time to time with the SEC, including the Company's Annual Report on Form 20-F. The forward-looking statements contained in this press release are made as of the date of this press release, and the Company undertakes no obligation to update or revise them, except as required by law.
What is GitHub? How to start using the code hosting platform – Business Insider
If you're interested in software or software development, you've likely heard of GitHub.
For a coder, GitHub is akin to what Pinterest offers an interior designer: a place where a person goes not just to upload content, but also for creative inspiration and collaboration.
Here's what you need to know about GitHub and how it relates to coding.
GitHub is, fundamentally, a hosting platform for coders. The cloud-based service allows coders to effectively manage and maintain open-source programming projects while collaborating with others.
To understand how GitHub works, you have to have an understanding of "Git" and the idea of "version control" in relation to Git.
Git, started by Linux creator Linus Torvalds, is an open-source version control system that tracks changes in files over time.
Version control is an important system when it comes to coding. It enables coders to be nimble with programming, and allows for apps to constantly have new version releases, expansion to other platforms, and bug fixes, among other tracked changes.
Version control systems like Git help maintain the integrity and security of ever-evolving code by safeguarding modifications, and those revisions are then hosted by GitHub or an alternative "repository" hosting service, although GitHub is the most popular among developers.
This allows developers to easily collaborate, allowing them to download a new version of the software, make changes, and upload the newest revision. Every developer can see these new changes, download them, and contribute.
Among the sites that host Git repositories (the term for where a Git project is stored, often shortened to "repo"), GitHub is the most popular, and thus has the most to offer collaboratively. Put simply, it's the standard for coders.
There are several features that have made GitHub so popular with developers.
If you're looking for a resource to maintain and share code, you can easily install Git and sign up for GitHub for free. Here's how to get started:
1. First, you'll need to install the Git version control system, which you can download for free. Follow the directions specific to the device you're using.
2. Next, you can create your GitHub account at GitHub.com. A free account will have some limitations, but gives you access to both public and private repositories.
3. With your free account, you can get started right away and create a repository by clicking Create a repository on the GitHub welcome page to start a new project.
From the same page, you can also select Start Learning to take an "Introduction to GitHub" course if you need more expertise before getting started with creating a repository.
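For those who prefer to script their setup, repositories can also be created programmatically through GitHub's REST API. A minimal sketch in Python, assuming a personal access token is exported as GITHUB_TOKEN; the repository name is illustrative:

```python
import os
import requests

# Create a new repository for the authenticated user via GitHub's REST API.
# Assumes a personal access token with repo permissions in GITHUB_TOKEN.
token = os.environ["GITHUB_TOKEN"]

response = requests.post(
    "https://api.github.com/user/repos",
    headers={
        "Authorization": f"token {token}",
        "Accept": "application/vnd.github+json",
    },
    json={
        "name": "my-first-repo",   # illustrative repository name
        "private": False,          # set True for a private repository
        "auto_init": True,         # create an initial commit with a README
    },
    timeout=30,
)
response.raise_for_status()
print("Created:", response.json()["clone_url"])
```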
Increasing Government Resilience with Cloud-Based Disaster Recovery – GovTech
As state and local governments increasingly rely on cloud services, they have a responsibility to protect their data and ensure their systems are secure. This starts with understanding that current practices and solutions are not always secure by default, and with developing best practices to mitigate new risks that may emerge in the future.
While many state and local governments are making progress to strengthen enterprise security, their efforts must focus not only on prevention, but also robust disaster recovery. By moving disaster recovery to the cloud, state and local governments can reduce time and lower cost to recovery while ensuring mission-critical applications and services are available when constituents need them most.
Last fall, a ransomware attack on a web hosting provider forced the company to take its servers offline, making several state and local government websites across the country inaccessible. The pandemic has also expanded the threat landscape: 61 percent of local governments have reported an increase in cyber threats since the beginning of the pandemic, according to recent research from the Center for Digital Government (CDG) and Amazon Web Services (AWS).
State and local government organizations can improve IT resilience with cloud-based disaster recovery while strengthening their overall security posture to combat ransomware threats.
Disaster recovery challenges in government
State and local governments face several challenges when it comes to disaster recovery. Alex Berkov, manager of solutions architecture for CloudEndure Disaster Recovery, a leading cloud-based disaster recovery and business continuity solution offered by AWS, says there's often confusion about what disaster recovery actually encompasses.
"There is a lot of misconception around the difference between backup and disaster recovery. Often what customers call disaster recovery is actually backup," he says. "State and local governments need to ensure security controls are also in place for backups, as these backups can be impacted by ransomware."
Many organizations also cannot adequately test their environment, Berkov adds. They may not do frequent testing from their backups or they may rely on traditional disaster recovery solutions that require them to spend weekends in a physical data center to run tests. State and local agencies also might rely on a magnetic tape backup solution where they dump data out, back it up, and then save it on a physical tape.
All these processes are so labor- and time-intensive that organizations might only do them on an annual or infrequent basis, which leads to inadvertent security gaps. These security gaps are costly for government agencies and can lead them to pay ransoms to maintain business continuity for critical constituent services. In 2019, governments reported 163 ransomware events, with more than $1.8 million in ransoms paid. In 2020, these figures only increased, as outside parties demanded an average payment of just over $570,000, with requested ransoms ranging from $2,500 to $5 million.
Garrett Pollard, a senior enterprise sales specialist for CloudEndure Disaster Recovery at AWS, says it's critical for governments to have a comprehensive continuity strategy given IT's increasing value to the business.
"IT supports so many different revenue streams that any downtime may pose a significant loss," Pollard says.
It's clear the traditional approach to disaster recovery doesn't give state and local governments the agility they need to quickly and effectively respond when security issues occur.
Optimizing operations with the cloud
Moving disaster recovery to the cloud offers cost and operational benefits to state and local governments that can improve their resiliency.
With the cloud, agencies can access a cost-effective data storage solution for their backups instead of building their own solution on premises, says James Perry, solution architect security lead for Worldwide Public Sector, Education, and State and Local Government at AWS. Perry says that with managed cloud services, state and local agencies can see even more benefits, and cloud-based disaster recovery can also lower total cost of ownership.
"Agencies can avoid performing the undifferentiated heavy lifting associated with racking and stacking equipment, hardware procurement processes, and so on," he says.
While agencies can save on technology costs, there are also the somewhat intangible costs associated with time to recovery. In 2020, state and local governments lost 773 days to downtime. In government, this could mean days, if not weeks, when constituent data is compromised or when constituent services and applications aren't operating at their full capacity.
Moving to the cloud also allows agencies to take advantage of automation and reduce demands on IT staff. With the cloud, they gain access to advanced disaster recovery capabilities because cloud-based solutions can be more easily upgraded. Additionally, agencies can take advantage of artificial intelligence and machine learning capabilities to automate threat response.
"It gives them the opportunity to focus the IT resources they have on more strategic initiatives," Perry says about cloud-based disaster recovery. "Instead of them buying software, managing software inventories, and installing hardware, they can enhance their business applications and deliver value to citizens."
Berkov says moving disaster recovery to the cloud doesn't require a massive IT effort for government agencies. Even those that operate in a largely on-premises or hybrid environment can seamlessly make this transition.
"It's a very easy entry point for organizations that are either on-prem or hybrid, because it doesn't change how they operate their production infrastructure. They can maintain their production infrastructure, wherever it may be, and the solution can do all the replication, management, and orchestration of their resources," Berkov says.
Some organizations are already seeing improved operational impact from cloud-based disaster recovery. One state agency, for example, experienced a ransomware event that affected its entire on-premises infrastructure, including a database that contained all its employees' password information. Backups for the agency's business-critical applications were also compromised during the event, leaving it without any backups from which to recover.
Rather than undergo a lengthy hardware procurement process and entirely rebuild its data center, the agency decided to shift its entire IT operation to the cloud. It was able to restore all of its mission-critical applications in a cloud environment in less than two weeks. The agency has also realized significant cost savings: it is now running its IT operation at 40 percent of what it would cost to run it on premises.
Best practices for moving disaster recovery to the cloud
Disaster recovery in the cloud addresses several key challenges for state and local agencies by providing a flexible, scalable solution that can reduce time and lower cost to recovery while helping to address budget constraints and minimize unintended security risks.
State and local agencies should consider the following as they transition to cloud-based disaster recovery and compare solutions.
Establish recovery time objectives
Before a state or local agency enlists the services of a cloud provider, it should clearly map out and understand its disaster recovery needs, Pollard says.
"Sit down and take a hard look at your business and establish what your recovery time objectives are for each application. It's a very common exercise where you take a step back, analyze the data, and see what the recovery times are so you can figure out which solution is the best fit," he says.
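As a simple illustration of that exercise, the sketch below maps applications to recovery time objectives and suggests a tier of solution for each. The application names and thresholds are assumptions for illustration only, not recommendations:

```python
# Illustrative only: classify applications by recovery time objective (RTO).
applications = {
    "permit-portal": 1,            # RTO in hours
    "payroll": 4,
    "public-records-archive": 48,
}

def recovery_tier(rto_hours: int) -> str:
    """Map an RTO to a rough class of disaster recovery solution."""
    if rto_hours <= 1:
        return "continuous replication (warm standby)"
    if rto_hours <= 8:
        return "cloud-based DR with frequent snapshots"
    return "scheduled backups with on-demand restore"

for app, rto in sorted(applications.items(), key=lambda item: item[1]):
    print(f"{app}: RTO {rto}h -> {recovery_tier(rto)}")
```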
Plan for flexibility and scalability
"You need to make sure the solution is effective not just for today, but can handle any future growth," Berkov says. "The other thing you need to consider, particularly when it comes to ransomware, is the flexibility and insurance of having different recovery forms. That way, if you are hit by ransomware, your organization can go back to a previous point in time just as quickly as you can fail over."
He adds: "The cloud really does, from a scalability perspective, give organizations the option to right-size their disaster recovery environment. You don't need to over-provision anything; it's scalable and it's elastic, so you only use what you need."
Ensure data governance and compliance
Whether an organization operates on premises or in the cloud, good data governance is critical to effective disaster recovery.
Its important for government organizations to have compliance across all of their workloads, which is why they should work with a cloud provider who has public sector expertise and a solid track record of managing these types of workloads.
Perry says some of the questions organizations should ask potential cloud providers include: How do you encrypt the data in transit? How do you encrypt it at rest? How can we make sure that only the right people have access to the data? "They're [government agencies] often learning about how the cloud operates and all of the compliance benefits it provides. So, there's a learning process, and how to extend their governance processes, auditing, and monitoring [activities] to the cloud is part of that."
Test, test, test
Testing is critical when it comes to disaster recovery.
"You don't want to wait until an event happens before you test," Berkov says. "The cloud opens up the ability for you to test on your own schedule at any time with really no impact. It also allows you to increase the frequency of those tests, so you can make sure that, as your environment changes, you can validate it and verify everything is running properly."
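A concrete, low-tech form of that validation is simply confirming that restored files match the originals. A minimal sketch using checksums; the directory paths are placeholders:

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large backups do not need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(source_dir: str, restored_dir: str) -> list:
    """Return relative paths of files whose restored copy is missing or differs."""
    mismatches = []
    for original in Path(source_dir).rglob("*"):
        if not original.is_file():
            continue
        restored = Path(restored_dir) / original.relative_to(source_dir)
        if not restored.exists() or sha256(original) != sha256(restored):
            mismatches.append(str(original.relative_to(source_dir)))
    return mismatches

# Placeholder paths; point these at a production sample and its restored copy.
print(verify_restore("/data/production-sample", "/mnt/dr-restore"))
```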
Conclusion
From ransomware and malware to email phishing schemes and denial-of-service issues, security threats continue to impact state and local governments.
As these organizations try to build a more robust cybersecurity program, effective disaster recovery should be an integral component of their holistic cybersecurity strategy. The traditional approach to disaster recovery with backups from magnetic tape and a reliance on on-premises data centers can be costly and time consuming for state and local governments facing budget cuts and limited IT resources. Government organizations can leverage the cloud to modernize their disaster recovery program and make their IT operations more cost efficient. By doing so, they can improve business continuity and build their resilience.
"The challenge state and local governments face is they often don't have IT staff or security experts to build disaster recovery processes internally and execute them in a consistent way on premises," Perry says. "One of the greatest benefits of the cloud is that the services you need to combat ransomware - whether it's patch management, encryption, firewalls, or intrusion detection - are provided as managed services in the cloud. They're integrated so you have a toolbox that's been built to work together to greatly simplify the IT complexities and challenges your organization faces."
Google and Ericsson team up to help enterprises apply 5G in their operations – SiliconANGLE News
Google LLC today announced it has inked a partnership with Ericsson to develop new joint offerings that will help enterprises harness ultrafast 5G network connections in their technology projects.
Telecommunications providers worldwide are upgrading their networks to the 5G standard, which promises to enable connections up to 100 times faster than before. Those speedy connections allow enterprises to perform tasks that weren't practical until now. Manufacturers, for example, can quickly upload sensor data from their equipment to the cloud for analysis and alert technicians to potential malfunctions in near real time.
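As an illustration of that pattern, the sketch below flags sensor readings that drift outside an expected range and raises an alert for a technician. The readings, limit, and alert channel are made-up assumptions:

```python
# Illustrative near-real-time check on equipment sensor readings.
from statistics import mean

def vibration_exceeds_limit(readings_mm_s, limit_mm_s: float = 7.1) -> bool:
    """Return True if average vibration exceeds the limit (possible malfunction)."""
    return mean(readings_mm_s) > limit_mm_s

# mm/s velocity readings from one machine over the latest window (hypothetical).
latest_window = [5.2, 6.8, 7.9, 8.4, 7.7]

if vibration_exceeds_limit(latest_window):
    # In production this would publish to an alerting service; here we just print.
    print("ALERT: vibration above limit, notify maintenance technician")
```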
Google is building products that enable organizations to apply 5G in their operations with less effort. As part of its strategy, the search giant has partnered with numerous telecommunications providers and tech firms worldwide to help bring its 5G products to more enterprises around the globe. The new partnership with Ericsson, one of the biggest players in the 5G ecosystem, marks a notable milestone in that effort.
Sweden-based Ericsson is a major supplier of networking equipment to the telecommunications sector. The company makes, among other products, hardware and software for building 5G networks. As part of the partnership announced today, Google will work with Ericsson to bring joint offerings to market that will combine their respective technologies to help enterprises use 5G more effectively.
The search giant shared a few early details about the effort this morning as the Mobile World Congress conference in Barcelona began its second day. The joint offerings, it said, will be built with the help of D15 Labs, a research and development center operated by Ericsson that has an on-premises 5G network for testing new technologies.
The companies say that they've already completed functional onboarding of Ericsson 5G on Anthos to enable telco edge and on-premises use cases.
The telco edge is a term that describes an emerging business model whereby carriers lease some of their computing infrastructure to enterprises. A manufacturer, for example, may wish to run the monitoring application it uses to track the health of its factory equipment in close physical proximity to the factory. If a carrier has computing hardware available near the factory, it can make the infrastructure available to the manufacturer, which can in turn use it to host the monitoring application.
This approach is gaining traction because hosting workloads close to a company's assets reduces network latency and thereby speeds up processing. Google is targeting such telco edge use cases with its Anthos platform, which provides a kind of operating system for running applications at the edge of the network in proximity to enterprise assets. Anthos also supports other types of environments.
Google and Ericsson are exploring new ways of combining Anthos and 5G infrastructure not only at the latter firm's D15 Labs center but also on a wireless network operated by Italian carrier TIM. The companies are using Google's cloud technology and Ericsson's 5G gear for a pilot project focused on running enterprise applications at the edge of the network. The applications, the companies said, aim to help organizations in manufacturing, transportation and a number of other sectors improve operational efficiency.
"Organizations have a tremendous opportunity to digitally transform their businesses with 5G and cloud capabilities like artificial intelligence and machine learning at the edge," said Google Cloud Chief Executive Officer Thomas Kurian.
Frequent Disaster Recovery Testing Is Critical To Meeting Recovery Objectives, New iland Study Finds – GlobeNewswire
HOUSTON, June 29, 2021 (GLOBE NEWSWIRE) -- iland, a leading VMware-based provider for application hosting, data protection and disaster recovery services delivered on the iland Secure Cloud Platform, today released the findings of its research into organizations' disaster recovery readiness. The study found that as organizations work diligently to support evolving business needs, while at the same time battling cybercrime and other threats to critical data, the majority of disaster recovery solutions are not tested on a regular basis. More importantly, as the IT estate changes over time, the survey indicated most disaster recovery solutions would not meet recovery objectives.
The research, "When Plan B Goes Wrong: Avoiding the Pitfalls of DRaaS," surveyed 150 technical and business decision makers from organizations drawn from a wide cross-section of U.S. enterprises, each employing a minimum of 500 people. The objectives of the research were to establish what DR systems organizations currently have in place, how often plans are tested and whether enterprises are confident in their ability to recover from disaster as swiftly and easily as possible.
Key findings include:
"With the rise in remote work and the frequency and impact of cybercrime growing each year, having a comprehensive disaster recovery strategy in place is critical to ensure organizations can defend, protect and quickly recover from data loss," said Scott Sparvero, CEO at iland. "As we found in our research, disaster recovery implementations are on the rise, but regular testing is falling behind. This means that as IT teams deploy new resources to support increasing workload requirements, the disaster recovery plan needs to be updated in kind. Regular testing can quickly uncover any potential disaster recovery shortfalls. Working with a DRaaS provider like iland gives enterprises confidence in their DR solution. Through planned testing intervals, iland ensures that organizations are ready to recover as swiftly and quickly as possible."
"While many organizations have been slow to embrace the cloud and DRaaS, the study indicates that the pandemic is likely to be accelerating the transition given the increased focus on remote work and access," said Justin Augat, vice president of product marketing at iland. "Enterprises that have not yet done so must give very serious thought to the status of disaster recovery in their organization, and find the right platform to meet expectations of business continuity. iland is proven to protect a customer's critical applications, and provides tangible benefits such as real-time replication to an increasing number of businesses."
About iland
iland is a global cloud service provider of secure and compliant hosting for infrastructure (IaaS), disaster recovery (DRaaS) and backup as a service (BaaS). They are recognised by industry analysts as a leader in disaster recovery. The award-winning iland Secure Cloud Console natively combines deep layered security, predictive analytics and compliance to deliver unmatched visibility and ease of management for all of iland's cloud services. Headquartered in Houston, Texas and London, UK, and Sydney, Australia, iland delivers cloud services from its cloud regions throughout North America, Europe, Australia, and Asia. Learn more at http://www.iland.com.