Category Archives: Cloud Servers

Legacy IT: The hidden problem of digital transformation – SC Magazine

Companies may want to undertake digital transformation, but they often start with legacy servers built in the early 2000s. Today's columnist, Hemanta Swain, formerly of TiVo, offers some insights on how to secure legacy IT systems. (Photo: Jemimus, Creative Commons Attribution 2.0 Generic, CC BY 2.0)

Legacy IT has become the dirty little secret of digital transformation. These systems, which include servers, OSes, and applications, are relied on by almost every organization for business-critical activities, and many CISOs struggle to protect them from attackers.

During my time as CISO for a public company, I got a first-hand look at the depth of the legacy challenge. We had more than 1,000 servers in use that were built in 2003 but no longer supported by vendors, and more than 200 legacy servers were designated for business-critical activity that drove significant annual revenue. It's a non-starter to take these servers offline, and protecting them comes at a significant cost.

The cost and complexity of protecting legacy systems

The complexity of legacy systems lies in the IT team's inability to update and maintain them. Many of these systems and apps have been in use for years and may contain millions of lines of code. Changing the code could impact one of the revenue-generating applications that keeps the business running.

On top of this, legacy systems are nearly impossible to patch. This makes them incredibly vulnerable and a target for attack. So how can organizations protect the systems that serve as the core of their business?

Legacy security can't protect legacy systems

Companies have to absorb the cost of protecting legacy systems within current cybersecurity spending. As such, organizations try to retrofit existing solutions like firewalls and endpoint protection.

Digital transformation has made this approach obsolete. Modern infrastructure, data centers, and the move to hybrid clouds give attackers more pathways to target these vulnerable systems.

Legacy systems that were once used by a handful of on-premises applications can now be used by hundreds of applications both on-premises and in the cloud. Containers may even interact with the mainframe. These are connections that firewalls were simply not built to secure. Many firewalls are also legacy devices and don't integrate with modern applications and environments. Using them to secure legacy systems against outside intrusion simply increases the total cost of ownership without actually securing the systems against modern threats.

CISOs require a convergence of security approaches that protect legacy assets, while also minimizing threats across modern assets. The approach we evaluated and trusted was based on the core principles of Zero Trust.

Improve legacy systems with Zero Trust

Establishing Zero Trust around legacy systems and applications requires four critical components: visibility of legacy assets, micro-segmentation, identity management and continuous monitoring.

Companies find it challenging to obtain a proper view of existing legacy assets, but doing so is vital to the security of the organization. It's not enough to secure most assets; it takes only one overlooked server for attackers to breach the organization.

After an acquisition, the first step we took was to create a full view of the entire ecosystem and map everything from legacy systems to cloud environments, containers, and applications. By understanding which workloads present the most risk, we could deduce the prime starting points for enforcing Zero Trust.

It's a recipe for inconsistent policy and blind spots to start on the path to Zero Trust with anything less than a holistic view of the entire network. Taking a holistic view empowers security teams to identify the critical areas to start the second step: implementing micro-segmentation.

While firewalls have been the traditional choice for segmenting assets from networks, they're not built to protect legacy and unpatched assets at such a granular level. Older techniques such as firewalls and VLANs are costly to own and maintain, and they frequently place similar legacy systems in a single silo. For an attacker, it's like shooting fish in a barrel: a single intrusion can lead to multiple critical systems being exploited.

In addition, security and operations teams need to constantly update rules and policies between the firewalls and the applications and assets they're supposed to protect. This leads to overly permissive policies that may improve workflow but significantly undermine the security posture the organization is trying to build.

We used Guardicore Centra micro-segmentation technology, which let us build tight, granular security policies to prevent lateral movement. In addition, security teams can deploy micro-segmentation across the entire infrastructure and across workloads of all types, including data centers, cloud, and modern applications. This eliminates high-risk gaps in security across the infrastructure.
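The default-deny, allow-list model that makes micro-segmentation effective against lateral movement can be sketched in a few lines. This is only an illustrative sketch, not Guardicore Centra's actual policy engine; the workload names and ports below are invented:

```python
# Illustrative micro-segmentation policy: an explicit allow-list of flows.
# Anything not listed is denied by default, which is what blocks an
# attacker from pivoting laterally into a legacy server.
ALLOWED_FLOWS = {
    # (source workload, destination workload, destination port)
    ("billing-app", "legacy-db-2003", 1433),   # app tier -> legacy database
    ("backup-agent", "legacy-db-2003", 22),    # nightly backup over SSH
}

def is_allowed(src: str, dst: str, port: int) -> bool:
    """Default-deny: a flow passes only if it is explicitly allow-listed."""
    return (src, dst, port) in ALLOWED_FLOWS

# A compromised web front end trying to reach the legacy database is blocked,
# while the sanctioned application path still works:
print(is_allowed("web-frontend", "legacy-db-2003", 1433))  # False
print(is_allowed("billing-app", "legacy-db-2003", 1433))   # True
```

Real segmentation products enforce this at the network or host level, but the policy logic reduces to the same explicit allow-list idea.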

It's very important to enhance the organization's identity and access management platform. Proper user identity management plays a critical role in Zero Trust. Users need access to systems and applications, and security teams must grant that access based on each individual user's role, automating verification before access is granted to minimize the operational burden and enable scale.
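The role-based, verify-before-grant check described above can be sketched as follows. The role names and system mappings are hypothetical, invented purely for illustration:

```python
# Hypothetical role-to-system mapping for automated, role-based access
# checks in front of legacy systems. Nothing is granted by default.
ROLE_PERMISSIONS = {
    "dba": {"legacy-db-2003"},
    "ops": {"legacy-db-2003", "legacy-erp"},
    "developer": set(),  # no direct access to legacy production systems
}

def grant_access(user_role: str, system: str) -> bool:
    """Verify the user's role explicitly before granting access."""
    return system in ROLE_PERMISSIONS.get(user_role, set())

print(grant_access("dba", "legacy-db-2003"))    # True
print(grant_access("developer", "legacy-erp"))  # False
```

In practice this check would sit behind the identity provider, so access decisions scale without manual review of every request.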

Micro-segmentation technology offers deep visualization capabilities that make policy management easier and provide capabilities to manage segmentation based on application usage. Applying micro-segmentation across production infrastructure helps to minimize the risks with proper visualization of modern and legacy workloads. This enables the enforcement of server-level policy, which allows only specific workflows between legacy systems and between modern environments and applications to and from the legacy systems.

Legacy systems and applications continue to present a tough challenge for organizations. They're business critical, but incredibly hard to maintain and properly secure. As organizations embark on digital transformation and introduce hybrid cloud, new applications, and new data centers, the problem is exacerbated.

Securing the business starts with securing the critical assets that make the business run. Visibility of the infrastructure, combined with micro-segmentation and continuous monitoring, controls the risk of legacy systems by building tight segmentation policies that attackers can't exploit. And don't neglect enforcing basic security hygiene across the enterprise.

Hemanta Swain, senior independent consultant, former chief information security officer, TiVo

See the rest here:
Legacy IT: The hidden problem of digital transformation - SC Magazine

TGen Leverages phoenixNAP’s Hardware-as-a-Service Powered by Intel to Empower COVID-19 Research – PR Web

Empowering COVID-19 Research with Cutting-Edge Tech

PHOENIX (PRWEB) December 23, 2020

phoenixNAP, a global IT services provider offering security-focused cloud infrastructure, dedicated servers, colocation, and specialized Infrastructure-as-a-Service technology solutions, announced a case study detailing its collaboration with Intel on building an IT platform for a COVID-19 project by Translational Genomics Research Institute (TGen), an affiliate of City of Hope.

In an effort to help the global fight against COVID-19, TGen proposed the creation of a centralized platform for knowledge and information sharing between researchers from all over the world. The platform is intended to automatically pull data related to COVID-19 sequenced genomes from multiple sources and provide an aggregated dataset to enable comparative research. This would help identify previously uncharacterized elements in the SARS-CoV-2 genome and observe important correlations among them for the purpose of improving diagnostics, vaccine constructs, and treatments for COVID-19.

Considering the volume and complexity of biomedical data, the platform needed powerful hardware to ensure seamless processing, reliable storage, and global availability. phoenixNAP and Intel collaborated to provide a customized solution to support these needs. phoenixNAP's hardware-as-a-service (HaaS), powered by Intel Xeon Dual Gold 6258R CPUs and Intel NVMe drives (P4610) with Intel VROC, Intel NICs, and Intel Optane persistent memory, met the needs of the project. The ultrafast network experience is enabled through a customized implementation of Intel Tofino Programmable Ethernet Switch products, which Intel has offered since its acquisition of Barefoot Networks in June 2019.

"We needed a robust computational environment for large data volumes and sophisticated analytical tools. We have maintained compute infrastructure with phoenixNAP for years, but we needed to expand and customize it to support this project. We got a more streamlined, powerful infrastructure that will give us enough power and memory, while at the same time providing us with a great degree of flexibility as our research expands. Intel Optane PMem emerged as a logical solution to support large data sets," said Glen Otero, VP of Scientific Computing, TGen.

"Healthcare is becoming more intelligent, distributed, and personalized. Intel technologies are helping to enable a new era of smart, connected, value-based patient care, remote medicine and monitoring, individually tailored treatment plans, and more-efficient clinical operations. Intel-enabled technologies help optimize workflow to lower research and development costs, improve operational efficiency, speed time to market, and improve patient health," said Rachel Mushahwar, VP and GM, Intel US Sales, Enterprise, Government and Cloud Server Providers.

"TGen is doing an amazing job every day, and this project is one example of how they are actively working to produce life-changing results. We discussed their project and knew that Intel would be open to collaborating with us on building a proper platform for it. We are excited to have the opportunity to work with both Intel and TGen on something this relevant to the entire world," said Ian McClarty, President of phoenixNAP.

TGen has so far identified several new features in the SARS-CoV-2 genome and continues to focus on making new contributions to the cause. Its project addresses a critical need of the global biomedical community and promises to enhance further research on COVID-19. It also demonstrates the potential of using innovative technology to make a difference in the lives of millions of people.

Download full case study here: https://phoenixnap.com/company/customer-experience/tgen

About phoenixNAP

phoenixNAP is a global IT services provider with a focus on cybersecurity and compliance-readiness, whose progressive Infrastructure-as-a-Service solutions are delivered from strategic edge locations worldwide. Its cloud, dedicated server, hardware leasing, and colocation options are built to meet the always-evolving requirements of IT businesses. Providing comprehensive disaster recovery solutions, a DDoS-protected global network, and hybrid IT deployments with software- and hardware-based security, phoenixNAP fully supports its clients' business continuity planning. Offering scalable and resilient opex solutions with expert staff to assist, phoenixNAP supports growth and innovation in businesses of any size, enabling their digital transformation.

phoenixNAP is a Premier Service Provider in the VMware Cloud Provider Program and a Platinum Veeam Cloud & Service Provider partner. phoenixNAP is also a PCI DSS Validated Service Provider and its flagship facility is SOC Type 1 and SOC Type 2 audited.


Go here to see the original:
TGen Leverages phoenixNAP's Hardware-as-a-Service Powered by Intel to Empower COVID-19 Research - PR Web

Global Cloud Server Market Share, Competition Analysis, COVID-19 Impact Analysis & Projected Recovery, and Market Sizing & Forecast to 2026 -…

A recent market research report added to Reportspedia is an in-depth analysis of Global Cloud Server Market 2020-2026.

This report examines all the key factors influencing the growth of the global Cloud Server market, including the demand-supply scenario, pricing structure, profit margins, production, and the value chain. A regional assessment of the global Cloud Server market unlocks a plethora of untapped opportunities in regional and domestic marketplaces.

Top Key Players Profiled in this report are:

NEC, Google Inc., VMware, Amazon, IBM Corporation, Liquid Web, Dell Inc., Cisco Corp., Hitachi, Microsoft Corporation, Fujitsu, Oracle Corp., Hewlett-Packard, and Rackspace

Get the PDF sample copy of this report @ https://www.reportspedia.com/report/business-services/global-cloud-server-market-report-2020-by-key-players,-types,-applications,-countries,-market-size,-forecast-to-2026-(based-on-2020-covid-19-worldwide-spread)/69584#request_sample

The latest report on the Cloud Server Market contains a detailed analysis of this marketplace and entails information about various industry segmentations. According to the report, the market is presumed to amass substantial revenue by the end of the forecast duration while expanding at a decent growth rate.

In addition, the research report provides a comprehensive analysis of the key segments of the Cloud Server market. An outline of each market segment such as type, application, and region are also provided in the report.

Major Product Types covered are:

Hybrid Cloud, Private Cloud, Public Cloud, and Others

Major Applications covered are:

Banking, Financial Services and Insurance (BFSI); Education; Government; Healthcare and Life Sciences; Manufacturing; Media and Entertainment; Retail; Telecommunication and IT; Transportation and Logistics; Travel and Hospitality; and Others

Get a 40% discount (upcoming Christmas & New Year offer) on this premium report @ https://www.reportspedia.com/discount_inquiry/discount/69584

On the basis of region, the market is evaluated across:

Cloud Server Market COVID-19 Impact Analysis:

The COVID-19 outbreak was sudden, and its danger was not fully appreciated when it first struck the Chinese city of Wuhan. Although the city was shut down, the infection spread across China like wildfire. Within a few months, it spread to neighboring countries and then to every corner of the globe. The World Health Organization declared it a pandemic, and it has so far caused massive losses in many countries.

Do You Have Any Query Or Specific Requirement? Ask to Our Industry Expert @

https://www.reportspedia.com/report/business-services/global-cloud-server-market-report-2020-by-key-players,-types,-applications,-countries,-market-size,-forecast-to-2026-(based-on-2020-covid-19-worldwide-spread)/69584#inquiry_before_buying

The study objectives of this report are:

Table of Contents

Global Cloud Server Market Research Report 2020-2026

Chapter 1 Cloud Server Overview of the Market

Chapter 2 Economic Impact on Industrial Sector

Chapter 3 Global Cloud Server Market Competition by Manufacturers

Chapter 4 Global Production, Revenue (Value) by region

Chapter 5 International Supply (Manufacturing), Consumption, Export, Regional Importation

Chapter 6 Global Production, Income (Price), Trend In Price In Type

Chapter 7 Global Cloud Server Market Performance Analysis

Chapter 8 Production Cost Analysis

Chapter 9 Industrial Liaison, Surveillance Strategy and Consumers

Chapter 10 Marketing Strategic Review, Distributors / Traders

Chapter 11 Cloud Server Market Features Analysis

Chapter 12 Global Cloud Server Market Forecast

To Analyze Details Of Table Of Content (TOC) of Cloud Server Market Report, Visit Here: https://www.reportspedia.com/report/business-services/global-cloud-server-market-report-2020-by-key-players,-types,-applications,-countries,-market-size,-forecast-to-2026-(based-on-2020-covid-19-worldwide-spread)/69584#table_of_contents

Visit link:
Global Cloud Server Market Share, Competition Analysis, COVID-19 Impact Analysis & Projected Recovery, and Market Sizing & Forecast to 2026 -...

Private Cloud Server Market Report, History And Forecast 2020-2025, Breakdown Data By Manufacturers, Key Regions, Types And Application – The Monitor

This high-end research report, highlighting market developments across current and historical timeframes, details market size and dimensions while taking value- and volume-based estimations into account.

The primary aim of this research report is to identify the major growth-favoring elements as well as growth retardants, such as barriers and risks, that could significantly dampen the market's optimistic growth.

Access the PDF sample of the Private Cloud Server market report @ https://www.orbisresearch.com/contacts/request-sample/4068569?utm_source=Atish

Other requisite details portrayed in the report include sections on top-notch vendor assessment, with detailed emphasis on industry forerunners. Sections on trend assessment and their capabilities in favorable decision-making process have also been discussed at length.

Key Players Mentioned in the Report: This report focuses on the top global players covered: Amazon, Microsoft, Google, Dropbox, Seagate, Egnyte, Buffalo Technology, SpiderOak, MEGA, D-Link, ElephantDrive, Mozy Inc., POLKAST, Dell, Just Cloud, and SugarSync

Make an enquiry of Private Cloud Server market report @: https://www.orbisresearch.com/contacts/enquiry-before-buying/4068569?utm_source=Atish

Segmentation Overview

The global Private Cloud Server market has been examined in ample detail to disclose vital market-specific developments across segment categories. Segment classification of the market structure has been developed by our seasoned in-house research experts to allow readers to comprehend the versatility of the market in terms of product and service variation. Additional details on regional expanse and geography-based vendor investments are also discussed extensively, on the basis of which the global Private Cloud Server market is split into type, application, and end-user.

Browse the complete Private Cloud Server market report @ https://www.orbisresearch.com/reports/index/global-private-cloud-server-market-report-history-and-forecast-2014-2025-breakdown-data-by-companies-key-regions-types-and-application?utm_source=Atish

Segment by Type, the product can be split into: User Host and Provider Host

By Application, the market can be split into: Individual, Small Business, and Large Organizations

Geographical Expanse Analysis: Global Private Cloud Server Market

This research report also highlights details on region-wise demarcation, encapsulating details on massive growth opportunities and favorable growth conducive elements that harness sales optimization and revenue expansion. The report is placed to encourage appropriate vendor initiatives aligning with dynamics transitions and customer preferences.

About Us: Orbis Research (orbisresearch.com) is a single point of aid for all your market research requirements. We have a vast database of reports from leading publishers and authors across the globe. We specialize in delivering customized reports as per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients map their needs, and we produce the perfect required market research study for them.

Contact Us: Hector Costello, Senior Manager, Client Engagements, 4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A. Phone No.: USA: +1 (972)-362-8199 | IND: +91 895 659 5155

The rest is here:
Private Cloud Server Market Report, History And Forecast 2020-2025, Breakdown Data By Manufacturers, Key Regions, Types And Application - The Monitor

Bare Metal Cloud Market Poised to Expand at a Robust Pace Over 2025 – Farming Sector

Global Bare Metal Cloud Market: Snapshot

As a public cloud service that lets customers rent hardware resources from a remotely situated service provider, bare metal cloud comes with the primary benefit of flexibility for businesses to meet their specific and diverse requirements. With bare metal cloud services, small and medium enterprises can also troubleshoot their applications without interfering with neighboring virtual machines (VMs). Since bare metal clouds are built from dedicated servers, complications from neighbors are avoided, and they work very well for high-traffic, latency-intolerant workloads as well as big data applications.

Get Free Sample of Research Report @https://www.tmrresearch.com/sample/sample?flag=B&rep_id=2777

Some of the key factors augmenting demand in the global bare metal cloud market are: the critical need for reliable load balancing of latency-sensitive and data-intensive operations, decommissioning of workloads after termination of service level agreements (SLAs), the absence of noisy neighbors and hypervisor taxes, and the advent of fabric virtualization. On the other hand, restraints such as stringent cloud regulations, an expensive model, hindrances pertaining to restoration, and lightweight hypervisors are keeping the bare metal cloud market from attaining its full potential. That being said, the growing use of big data and DevOps applications, micro-services and batch-processing applications, and growing interest in the Open Compute Project (OCP) are anticipated to open new opportunities in this market in the near future. Some of the industry verticals generating demand for bare metal cloud are manufacturing, retail, healthcare, IT and telecommunications, government, and BFSI.

Some of the key audiences of this research report are providers of bare metal cloud, application developers, managed service providers, third-party system integrators, bare metal hardware vendors, regulatory agencies, and governments. The report provides an analytical and figurative assessment of the market's potential during the forecast period of 2017 to 2025.

Global Bare Metal Cloud Market: Overview

Bare metal cloud is an alternative to virtual cloud services and works with the help of a dedicated server. The dedicated server is needed in order to balance and scale this model. However, the dedicated hardware comes without any additional storage; even so, a bare metal cloud server can support huge workloads. The main goal of bare metal cloud is to minimize the overhead incurred by implementing virtualization technology. Despite eliminating virtualization, bare metal cloud services still offer efficiency, scalability, and flexibility. Another benefit of bare metal cloud servers is that they do not require a hypervisor and can be deployed with a cloud-like service model. Bare metal cloud combines features of traditional hosting and infrastructure as a service (IaaS) to support high-performance workloads. For all these reasons, this market is expected to witness high growth in the years to come.

Global Bare Metal Cloud Market: Key Trends

There is high demand for bare metal cloud from the telecom and IT sectors on account of big data, resulting in high demand for effective storage. The advertising sector will also make extensive use of bare metal cloud, a trend anticipated to continue throughout the forecast period. Today, enterprises are switching from conventional hosting services to bare metal cloud on account of the escalating demand for secure storage as well as advancements in the cloud industry. Bare metal cloud solutions offer innumerable benefits such as data security, effective and faster service delivery, efficient data storage, improved performance, streamlined data center operations, and standardized hardware platforms.

Global Bare Metal Cloud Market: Market Potential

The global bare metal cloud market has displayed promising potential as it offers various advantages such as easy maintenance of records, enhanced security, and ability to monitor activities in residential and commercial areas. Bare metal cloud has also found its use and application in providing national security. Because it can help monitor activities, it is enabling countries to fight against terrorism as well as external threats. This is anticipated to create potential growth opportunities within the global bare metal cloud market.

Check Exclusive Discount on this report @https://www.tmrresearch.com/sample/sample?flag=D&rep_id=2777

Global Bare Metal Cloud Market: Regional Outlook

On the basis of geography, the global bare metal cloud market is segmented into Asia Pacific, North America, Latin America, Europe, and the Middle East and Africa. Of these, North America has been leading the market on account of the increasing focus on research and development in cloud technology. The European bare metal cloud market is also estimated to expand at a fast pace, with key contributions from Germany, Spain, and the UK. However, it is Asia Pacific that is anticipated to expand at the fastest pace during the forecast period on account of the increasing number of new market players. The digicloud initiative undertaken by the government of Singapore to offer IaaS and SaaS along with bare metal servers is also an important factor driving the growth of the Asia Pacific bare metal cloud market.

Global Bare Metal Cloud Market: Competitive Landscape

Key players in the market are concentrating on achieving organic growth and implementing various strategies to maintain their positions. The report profiles leading players operating in the market: Rackspace Hosting, Inc. (U.S.), CenturyLink, Inc. (U.S.), IBM Corporation (U.S.), Media Temple (U.S.), and Internap Corporation (U.S.).

About TMR Research

TMR Research is a premier provider of customized market research and consulting services to business entities keen on succeeding in today's supercharged economic climate. Armed with an experienced, dedicated, and dynamic team of analysts, we are redefining the way our clients conduct business by providing them with authoritative and trusted research studies in tune with the latest methodologies and market trends.

Read more here:
Bare Metal Cloud Market Poised to Expand at a Robust Pace Over 2025 - Farming Sector

Dedicated server or cloud server: which one to choose? – Business Matters

Before making a decision, the most appropriate way to resolve this dilemma is to understand the differences between dedicated server and cloud server.

Some issues must be very well clarified, such as the way of functioning, the advantages and disadvantages of each alternative. It is also necessary to understand on what occasions one or another technology is more suitable.

Below, we will show in detail the answers to each of these questions. Thus, you will be able to visualize the best option for your business.

First, you need to understand what a dedicated server is. Basically, it is a computer that allows remote access to files and software. In other words: we are talking about a physical machine, also known as Bare Metal.

Perhaps the best example of a dedicated server is Proto Compute, a dedicated server specifically designed for high-performance operations, data security, IP pools, and stability. Proto Compute provides features such as reboot, boot-up control, and shutdown, which make it easy for users to remotely control and configure the operating system. It runs on Supermicro-based systems that support both hot storage and less critical cold storage. Proto Compute is provided by Heficed, a high-tech server provider.

Dedicated servers are very popular with companies that manage large amounts of data but don't want to lose any privacy.

The term dedicated is used to indicate that only one customer has access to the hardware. Not to mention that the server is configured according to the company's needs. This option is still frequently adopted, as it allows a high density of resources in a unitary environment, which means cost savings.

Suppose a machine occupies only one unit in the data center, for example. It can still hold up to 528 GB of memory, with 8 SSDs of up to 3.8 TB each (roughly 30 TB of raw storage). This option is indicated for those who intend to implement their own virtualization, or in cases where a large amount of resources is required on a single machine.

This service option provides some advantages to companies. See the main ones below!

As the server is dedicated, all resources are used only by the contracting company, so all of its performance can be used exclusively. This is an important feature for businesses that need high performance in processing and IOPS.

Another considerable benefit of this technology is stability. Since all resources are centralized, operation is also simpler. But, as the equipment works in isolation (standalone), it is not always possible to maintain redundancy of all components when a hardware problem occurs.

There is the redundancy of cards, controllers and sources, for example. However, in the event of a complete hardware loss, the entire environment will be unavailable. Cloud environments (IaaS), on the other hand, work with a greater contingency for the loss of hardware, enabling a wide availability of resources.

Security is also seen as a major advantage of this server model, since only authorized persons can access the system. However, when the entire operation of the company is concentrated on a single server, users end up with unrestricted access, since access cannot be limited to specific parts of the environment, a restriction that is easily configured on segmented servers.

Before we talk about the advantages of using the cloud, let's quickly define it. The terms infrastructure in the cloud and virtual data center refer to infrastructure resources created through virtualization for the purpose of data processing. Thus, when hiring a cloud computing service, the customer does not pay for equipment, but for virtual resources capable of processing and storing different workloads.

Cloud servers offer a number of benefits, as you can see below:

This feature allows the customer to increase or decrease the computational resources of his service according to the business need.

Because it offers the possibility to scale resources, cost is an attraction of the cloud: you pay only for the services that are actually used. In addition, there is a reduction in spending on hardware and software for a proprietary data center.
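The pay-for-what-you-use economics can be made concrete with a small calculation. All prices below are invented for illustration; real rates vary by provider:

```python
# Hypothetical comparison: a flat-fee dedicated server versus a cloud
# server billed only for the hours actually used.
DEDICATED_MONTHLY = 400.00  # flat fee, paid whether the server is busy or idle
CLOUD_HOURLY = 0.90         # pay-as-you-go rate per hour

def cloud_monthly_cost(hours_used: float) -> float:
    """Cloud billing: charge only for hours actually consumed."""
    return CLOUD_HOURLY * hours_used

# A batch workload running 6 hours a night for 30 nights favors the cloud;
# a machine busy 24/7 favors the flat-fee dedicated server.
print(cloud_monthly_cost(6 * 30))   # about 162, well under the 400 flat fee
print(cloud_monthly_cost(24 * 30))  # about 648, more than the flat fee
```

The crossover point (here, around 444 hours a month) is exactly the kind of profile analysis the article recommends before choosing a model.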

Cloud servers have features that enable high availability in critical cases. Thus, when there are technical problems, new instances are created that maintain all the characteristics of the environment. Imagine that you have several applications on a single system, such as e-mail, ERP, and a database. In this case, the cloud offers a greater guarantee than dedicated servers, on which all applications would be stopped; if they are on a cloud server, they will maintain their availability.

The choice between the two models must take into account the needs of the company, both technological and financial. The cost of the dedicated server is lower; however, the cloud can be segmented, which makes it more accessible, in addition to offering greater availability. What each company must do is identify its profile and which of the two solutions is best suited to its business.

There are situations in which the best alternative is a compromise between both options. If you have a database that needs a high concentration of resources, you can choose a physical server in order to have the benefit of density, while the other services that talk to the database are hosted on virtual servers, taking advantage of segmentation. Before deciding on the best option, some considerations are necessary. No matter where the server is, in the cloud or in the company's own data center, the big difference is related to spending on resources.

The cost of 528 GB of memory on a dedicated physical server is much lower than the cheapest cloud server offering the same capacity, and the same goes for SSD storage. This means you can have a fast, lower-cost, high-quality environment, but without as strong an availability guarantee: the dedicated machine is a single point of failure.

Ultimately, choosing between a dedicated physical server and a cloud server is a decision that requires great care on the part of the company. It is necessary to consider how each option operates, the advantages of each technology, and all their benefits. Only then is it possible to choose the alternative that offers the most value to the business.

Excerpt from:
Dedicated server or cloud server: which one to choose? - Business Matters

4 reasons your business needs to switch to cloud servers – TechEngage

If your business is still running on local servers in this advanced age of technology, it is time to give that a second thought. Cloud computing is on the rise, and in the past few years many businesses have moved their data to cloud servers, mainly because cloud servers are easy to manage and bring plenty of benefits. More than 90% of businesses running their operations online already use cloud computing, including your competitors. So, if you do not want to be left behind and want to speed up your business processes, it is time to switch to cloud servers. In this article, I have listed a few reasons that should convince you to use cloud servers for your business; let's take a look:

One of the major reasons businesses switch to cloud services is that it helps them save money. Once you have moved your business to cloud storage, you no longer have to spend money on costly hardware. You would not need local servers anymore, and you won't have to pay for their maintenance either; the cloud company you use takes care of maintenance, which means less hassle for you.

If you end up losing your business data on in-house servers, you will have to go through a lot of trouble to recover it. In fact, in some cases, you could end up losing your data permanently, and you would not be able to do anything about it. However, by moving your data to cloud servers, you are protecting yourself from a situation like this. Cloud computing offers you plenty of affordable data recovery options, and you are more likely to get all your data back. In this modern age, data is everything, and losing your data could be equivalent to losing your entire business. It is always better to be safe than sorry.

Another great advantage of switching to cloud servers is stronger security for your systems and data. However good your in-house servers are, they are always at risk of a breach and can be taken down by attackers. Cloud servers are a different story: providers invest heavily in hardening them, making them far harder to penetrate. With your data sitting on a well-secured cloud server, it is much less likely to fall into the wrong hands, and you can add two-factor authentication to make it more secure still.

With in-house servers, you cannot access your data from anywhere you want; your options are limited, and in most cases you have to go to the office to reach the files you need. Cloud servers, by contrast, run online, so you can access your business from any part of the world, at any time. If you or your team spend a lot of time out in the field, switching to cloud servers is the wise choice.

See original here:
4 reasons your business needs to switch to cloud servers - TechEngage

Microsoft is designing its own ARM-based processor for Surface and cloud servers – Digital Trends

Microsoft is working on designing its own custom processor for its products, according to a report from Bloomberg. Details are scarce, but the report notes that the new processor is being developed in-house and will use ARM-based designs. It could be used not only to power data centers but also the Surface line of computers.

Though Microsoft has already worked with its partner Qualcomm on ARM-based processors for the Surface Pro X and other Windows 10 on ARM devices like the Galaxy Book S, this move could also signal a bigger shift away from depending on Intel's processors and the Qualcomm partnership.

Microsoft is not alone in making such a move. Apple recently proved that it could successfully shift away from Intel's processors in favor of its own in-house silicon. In November, the iPhone maker announced the M1 chip in a new line of MacBook Air, MacBook Pro, and Mac mini models. That ARM-based M1 has been praised as more powerful, with longer battery life, too.

Previous Microsoft Surface products have all used chips from Intel and AMD. The exception is the Surface Pro X, which features the Microsoft SQ1 and SQ2 chips, co-engineered with Qualcomm. The Surface Laptop 3, meanwhile, featured a custom Surface Edition AMD Ryzen chip.

Intel's Xeon chips power most data centers. AMD has dipped into the server market, too, with its EPYC lineup of processors.

Even if the ARM-based chip Microsoft is working on proves powerful enough, Microsoft would have its work cut out for it in shifting to its own custom ARM-based processor, mainly because of software issues.

Apple depends on Rosetta 2 emulation to run apps that were designed for Intel-based Macs and have not yet been updated for the M1 chip. Microsoft only recently tweaked Windows 10 on ARM to emulate 64-bit as well as 32-bit x86 applications, and that support is still in beta testing. Reports have also shown that Windows 10 performs better under emulation on Apple's M1 Macs than it runs natively on devices like the Surface Pro X.

See the original post here:
Microsoft is designing its own ARM-based processor for Surface and cloud servers - Digital Trends

Cybersecurity expert: After Russian hack, common security tools, including cloud-based multi-factor systems, shown to be less effective in preventing…

Bertrand Cambou, a professor of nanotechnology and cybersecurity at Northern Arizona University, is available to discuss what went wrong in the Russian hack attack revealed this week and what organizations, including the U.S. government, can learn from the attack. Cambou is a senior member of the National Academy of Inventors and is an invention ambassador of the American Association for the Advancement of Science.

Bertrand Cambou

Media coverage has mentioned two specific methods the hackers used:

According to Cambou:

The use of these products is inherently risky; cloud-based email services should not be trusted for security and sensitive operations. In general, software tools with mandatory updates can be used as Trojan malware. These updates are forced on client devices without authentication, and the servers have the upper hand and are able to bypass security.

The weak link for massive attacks is a server or cloud having the authority to infect terminal devices at large scale. Tools like Duo are designed to block malicious users, not malicious cloud services.

It was reported that SolarWinds customers often use Duo multi-factor authentication, which did not prevent the attack.

Because of the recent attack, information already stored in the cloud is suspect, and all government personal computers running the monitoring software should be quarantined, on the assumption that worms were potentially planted in the software stack. In both cases the users were interacting with contaminated networks. This is a really bad situation.

Recommendation:

Implement two-way (mutual) authentication, which is much more secure than cloud-based multi-factor authentication. The objective should be both to prevent a bad server from acting and to block malicious users.
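As an illustration of what two-way authentication means in practice, mutual TLS requires the client to prove its identity with a certificate, not just the server. A minimal server-side sketch with Python's standard `ssl` module follows; the CA file path is a placeholder, and a real deployment would also load the server's own certificate via `load_cert_chain` before use:

```python
import ssl

def mutual_tls_server_context(ca_file=None):
    """Server-side context for two-way (mutual) TLS.

    Unlike one-way TLS, the client must also present a certificate
    signed by our CA; a password or one-time code alone is not enough.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED        # reject clients without a valid cert
    if ca_file:                                # CA that issued the client certificates
        ctx.load_verify_locations(cafile=ca_file)
    return ctx
```

Because both sides authenticate cryptographically, a compromised or impostor server cannot silently push updates to clients that insist on verifying it, which is the scenario Cambou warns about.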

Earlier this year, Cambou hosted industry and military partners on a multimillion-dollar cybersecurity project. Learn more about the grant from the U.S. Air Force.

Read the original post:
Cybersecurity expert: After Russian hack, common security tools, including cloud-based multi-factor systems, shown to be less effective in preventing...

Remote and cloud-based systems to be ruthlessly targeted next year – Help Net Security

Home networks, remote working software and cloud systems will be at the center of a new wave of attacks in 2021, Trend Micro predicts.

Cybercriminals in 2021 will particularly look to home networks as a critical launch pad to compromising corporate IT and IoT networks.

"As we begin to enter a post-pandemic world, the trend for remote working is likely going to stick for many organizations. We predict more aggressive attacks to target corporate data and networks," said Jon Clay, director of global threat communications for Trend Micro.

"Security teams will need to double down on user training, extended detection and response, and adaptive access controls. This past year was all about surviving: now it's time for businesses to thrive, with comprehensive cloud security as their foundation."

The report warns that end users who regularly access sensitive data (e.g. HR professionals accessing employee data, sales managers working with sensitive customer information, or senior executives managing confidential company numbers) will be at greatest risk. Attacks will likely exploit known vulnerabilities in online collaboration and productivity software soon after they are disclosed, rather than zero-days.

Access-as-a-service business models of cybercrime will grow, targeting the home networks of high-value employees as well as corporate IT and IoT networks. IT security teams will need to overhaul work-from-home policies and protections to tackle the complexity of hybrid environments where work and personal data commingle on a single machine. Zero-trust approaches will increasingly be favored to empower and secure distributed workforces.

As third-party integrations proliferate, Trend Micro also warned that exposed APIs will become a preferred new attack vector for cybercriminals, providing access to sensitive customer data, source code, and back-end services.

Cloud systems are another area in which threats will continue to persist in 2021, from unwitting users, misconfigurations, and attackers attempting to take over cloud servers to deploy malicious container images.

Cybercriminals will continue to go where the money is, seeking the greatest financial returns on their attacks. Organizations and security teams must remain nimble and vigilant to stay ahead of criminals.

Continued here:
Remote and cloud-based systems to be ruthlessly targeted next year - Help Net Security