Category Archives: Cloud Servers

Which [r]evolution to expect for cloud computing in 2020? – Data Economy

From a small revolution at first, cloud computing has in recent years evolved into a key strategic development driver not only for businesses but also for governments, in short, for society at large. Whether with the launch of new players or the emergence of the first open source cloud platform, the past decade has been marked by outstanding innovations that have forever transformed the use of IT, both by and for businesses. At the core of these innovations has been data, the new black gold without which future innovations would be compromised.

The new year 2020 heralds the start of a new era in which businesses intend to play a more instrumental role in the cloud offering at hand, including business models, in order to always make the most of their data. Even though it is difficult if not impossible to imagine what the next decade will be like, we can, however, anticipate some trends for 2020:

Hybrid cloud will continue to appeal to businesses

Business appetite for hybrid cloud grew significantly in 2019. The challenges faced by businesses in terms of new skills, new application needs, legacy IT management, etc. are constantly increasing as businesses realize that cloud computing is no panacea.

What is at stake is the significant cost associated with the extensive use of public cloud services and the ever more critical need for data control and security. Against this backdrop, businesses are turning away from exclusive public cloud offerings to move part of their data back to a private cloud.

On the other hand, they are abandoning on-premises cloud computing in favor of a hosted private cloud service that combines the best of both worlds: greater cost control and a higher level of security, all with the elasticity and scalability of the cloud.

According to the Nutanix Enterprise Cloud Index, 92% of IT decision makers say this type of infrastructure best meets their needs.

and prefigures the advent of multi-cloud

Following in the footsteps of hybrid cloud but going one step further, there comes multi-cloud: a combination of cloud environments ranging from on-premises cloud to hosted private cloud to public cloud, each dedicated to different use cases.

Given that no single cloud today can competitively provide for all solutions, the most mature businesses find in multi-cloud the promise of excellence: selecting the best solutions from the entire cloud offering available to build a single application environment, in which all components are interdependent.

A business can choose to host its database with one provider, turn to another provider for its compute needs, store its data in yet another location, and orchestrate everything in a multi-cloud architecture.

As applications become less and less monolithic and their components communicate in an increasingly standardized way, it is a safe bet that multi-cloud has a bright future ahead of it.

Increased investment will be made in orchestration and monitoring

While 2019 saw a decline in cloud budgets due to the ongoing consolidation, 2020 is expected to see investment pick up, increasing at a rate of 6.9% per year to reach $90.9 billion by 2023, according to IDC forecasts, in both public and private cloud. Initially seen as peripheral solutions, automation capabilities were the first to develop with the rise of containerization, providing simpler and faster portability.

Now has come the time for orchestration capabilities, which are at the heart of business concerns in order to better control traffic and align costs with actual needs and usage. First of all, Kubernetes is becoming the default orchestration technology. The aim here is to have applications that are capable of communicating natively and requesting resources in real time, deploying volumes and orchestrating everything based on application needs, the current contract and/or the parameters provided by the operator.
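As a concrete illustration of "aligning costs with actual needs and usage", resource requests and limits are the declarative lever Kubernetes schedules against. Below is a minimal sketch of a Deployment manifest expressed as a plain Python dict; the application name, image and sizing values are illustrative, not from the article.

```python
import json

# Minimal Kubernetes Deployment spec as a plain dict.
# All names (app "billing-api", image, CPU/memory figures) are illustrative.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "billing-api"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "billing-api"}},
        "template": {
            "metadata": {"labels": {"app": "billing-api"}},
            "spec": {
                "containers": [{
                    "name": "billing-api",
                    "image": "registry.example.com/billing-api:1.4.2",
                    # The scheduler places pods based on "requests" and caps
                    # usage at "limits" - the cost/usage alignment lever.
                    "resources": {
                        "requests": {"cpu": "250m", "memory": "256Mi"},
                        "limits": {"cpu": "1", "memory": "512Mi"},
                    },
                }],
            },
        },
    },
}

# The manifest serializes cleanly to the JSON the API server would accept.
assert json.loads(json.dumps(deployment))["spec"]["replicas"] == 3
```

An operator changes the requested figures (or a vertical autoscaler does it for them), and the orchestrator reconciles the running state accordingly.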

On the private cloud side, investment associated with data monitoring and observability capabilities will increase, allowing a more detailed understanding of infrastructure activity, in particular thanks to Machine Learning and AI applications.


Security will remain a strategic issue

Previously, data security solutions focused on storage or networking capabilities. For example, if you wanted to store encryption keys securely, you had to rely on an HSM (Hardware Security Module), a monolithic solution that was poorly aligned with the cloud concept.

The ability to secure data in use, called Confidential Computing, is a big leap forward. More processors will embed this capability, which will therefore be increasingly available in infrastructures.

OVHcloud was one of the first to offer bare-metal offerings integrating and providing APIs to lease servers with this Confidential Computing capability built in. For instance, we already have partners who use such servers to offer key management features.

These servers, therefore, now make it possible to store and run all or part of software programs that require end-to-end security, thus greatly improving the security of data encryption and, in turn, of entire systems. Data encryption will be more readily available, whether for data in transit or at rest, to enhance data security.
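To make the data-at-rest idea concrete, here is a minimal, standard-library-only sketch of symmetric encryption built from a counter-mode keystream. This is a pedagogical stand-in only: a real deployment would use an authenticated cipher such as AES-GCM, with the key held in an HSM or a Confidential Computing enclave rather than in process memory.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XORing data with a SHA-256-derived keystream.

    CTR-style construction for illustration only; production systems
    should use an authenticated cipher such as AES-GCM.
    """
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        # Each keystream block depends on key, per-message nonce and a
        # running counter, so blocks never repeat within a message.
        out.extend(hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

key = secrets.token_bytes(32)    # data-encryption key (would live in an HSM/enclave)
nonce = secrets.token_bytes(16)  # must be unique per message
ciphertext = keystream_xor(key, nonce, b"customer record")
assert keystream_xor(key, nonce, ciphertext) == b"customer record"  # same op decrypts
```

The point of Confidential Computing is that even while this routine runs, the key and plaintext stay inside encrypted memory the host cannot inspect.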

Locally-based cloud services will be increasingly in demand for legal reasons

With the introduction of data protection regulations and increased public awareness of this issue, businesses have realized the strategic nature of data sovereignty for themselves.

The issue of the legal framework for data goes beyond the scope of cloud providers alone and also affects businesses that use cloud solutions. Local initiatives are multiplying to set the rules for a trusted cloud, which meets everyone's expectations in terms of data sovereignty.

Taking the recent Franco-German Gaia-X project as an example, it would not be surprising if, in 2020, private as well as public organizations were to favor their regional ecosystem over the American-Chinese duopoly. We should see the development of new collaborative projects allowing the implementation of more local alternatives, made possible by a collective awareness among European vendors of their ability to provide a relevant cloud offering.

Many other topics could have been addressed here, such as open source, blockchain, AI and machine learning, but also applications related to smart cities, autonomous cars and connected health.

These technologies and fields of application involve the storage, exchange and processing of large, sometimes very large, amounts of data, and are still in their infancy. In any case, one thing is for sure: society is evolving and cloud computing will continue to evolve as well, in order to better support it.


See the article here:
Which [r]evolution to expect for cloud computing in 2020? - Data Economy

Top 10 Cloud Computing Groups on LinkedIn in 2020 – Analytics Insight

The professional networking site LinkedIn is a premier place for technology companies to gather information, connect with industry contemporaries, share ideas and develop a network. Moreover, any cloud professional, be it a cloud engineer or an IT professional specializing in operating cloud solutions, can gain additional insights into what the smartest in the industry are talking about. For them, the various cloud computing LinkedIn professional groups are a proficient space to begin with. Through these, cloud professionals can stay aware and ahead in such a wide market space. We have therefore compiled a list of the top 10 cloud computing groups on LinkedIn in 2020, where professionals with similar expertise come together to discover new insights and best practices, and to learn about the software and tools their contemporaries use in their daily operations.

About: A group for Cloud Computing, Cyber Security & Virtualization professionals to expand their network of contacts, share ideas and discuss industry-related topics.

The group covers VMware, SaaS, PaaS, Cloud Security, Cloud Computing & Server Virtualization technologies, Enterprise 2.0 Applications, technologies and architectures, CRM, cloud services, data center, Software as a Service, and on-demand applications.

Members Count: 494,628 professionals

About: This is a group for people involved in the current big thing: cloud computing.

Members Count: 436,745 professionals

About: An exclusive group for Virtualization & Cloud Computing professionals to network and discuss industry-related topics such as Virtualization, VMware, Microsoft Hyper-V, Citrix Xen, Security, Enterprise, Mobile, Storage, VCP, VCDX, Cloud Power, SaaS, PaaS, Data Storage, Security. Technical Q&A and news are all supported and encouraged.

Members Count: 97,520 professionals

About: Amazon Web Services is a global cloud solution provider, and so far the main vendor to offer real tools for businesses to either move their applications to the cloud or build new solutions. Here you are welcome to link up with other users of AWS, or you may want to learn from others and follow news of developers putting new solutions into the AWS Cloud.

Members Count: 40,910 professionals

About: The IBM Cloud LinkedIn Group is for Cloud experts, customers, Business Partners, analysts, and other stakeholders to discuss, share, and collaborate on Cloud Computing. This network will help better understand the potential of Cloud Computing, the attributes of Cloud Computing, and what it means as an evolving model for IT infrastructures building on concepts such as virtualization, utility and grid computing, and SaaS and driving expectations for access, consistency, and quality in the user computing experience.

Members Count: 28,149 professionals

About: The Telecom IT Updates group facilitates relationships, education, and new opportunities for professionals in the IT and Telecom industries. This group is sponsored by Telarus, Inc.

Members Count: 70,627 professionals

About: The Cloud Storage group was formed in order to provide a common ground for the introduction and advancement of cloud storage and computing technology.

Members Count: 45,664 professionals

About: The Cloud Networking group was formed in order to provide a common ground for the introduction and advancement of cloud networking and distributed network computing technology.

Members Count: 12,711 professionals

About: The Cloud Computing and SaaS Best Practices Group is an open forum for Directors, VPs, and C-level executives working in Product Management, Operations and Information Technology. Join in to interact with peers and discuss best practices in Cloud Computing, cloud hosting, and SaaS.

Members Count: 2,829 professionals

About: Two of the most vibrant information technology trends are open source software and cloud computing. In combination, they are multiplying value at a near Moore's Law rate. This group is for discussion and exchange of information about open source cloud computing (OSCC).

Members Count: 2,325 professionals

Continue reading here:
Top 10 Cloud Computing Groups on LinkedIn in 2020 - Analytics Insight

Southeast Asia Cloud Computing Market size to reach USD 40.32 billion by 2025, according to a new research report – WhaTech Technology and Markets News

IaaS held an approximate 55% share of the cloud computing market as of 2017, followed by Platform as a Service (PaaS). IaaS is expected to witness a fair amount of growth in the coming years, owing to the increasing volume of business-critical financial data and other significant information among businesses in multiple sectors.

The report on the Southeast Asia cloud computing market documents a comprehensive study of different aspects of the cloud computing market. It focuses on the market's steady growth in spite of changing market movements.

Every market intelligence report covers certain important parameters that can help analysts define the market situation. It includes a thorough analysis of market trends, market shares and revenue growth patterns and the volume and value of the market.

It also covers methodical research.

Report: http://www.adroitmarketresearch.com/contactsample/383

The Southeast Asia cloud computing market revenue is estimated to reach USD 40.32 billion by 2025, driven by the increasing demand for cloud computing among emerging small and medium size business organizations in this region. Cloud computing has proven to be the ultimate leapfrog technology that allows companies in small countries such as Indonesia, Thailand or Myanmar to connect to the rest of the world and compete with it.

Cloud computing uses a network of remote servers on the internet to manage, store and process data instead of using a local server. This technology has gained popularity among smaller and growing businesses due to its cost effectiveness.

The Southeast Asia cloud computing market share growth parallels the growing demand for data. Data access has been one of the key cloud computing market trends, driving and assisting the growth of small businesses, improvement in the ecommerce industry and the development of new technologies such as artificial intelligence (AI).

Read Complete Details at: www.adroitmarketresearch.com/industring-market

Singapore cloud computing market revenue share, by organization size, 2017 (%)

On the basis of organization size, the Southeast Asia cloud computing market was dominated by small businesses, primarily due to the presence of a large volume of small businesses across the region. Small businesses have financial restraints and have to operate on really tight budgets.

Therefore the installation and management of physical servers proves to be a costly affair for these businesses. Cloud computing platforms can cut costs as well as render these small companies more competitive in the regional as well as the global markets.

These platforms establish a robust IT foundation for companies to incorporate the latest wave of technological developments into their operations. This is one of the major cloud computing market trends that is expected to propel the market for this segment during the forecast period.

The Southeast Asia cloud computing market is anticipated to be driven by Singapore during the forecast period and is expected to grow at a CAGR of more than 13%. A recent study by the Asia Cloud Computing Association (ACCA) ranked Singapore as the most cloud-ready of 14 Asia Pacific countries.

This is mainly due to the high quality of broadband services, enhanced cybersecurity and levels of business sophistication.
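For readers unfamiliar with the metric, a compound annual growth rate (CAGR) compounds year over year, so 13% adds up quickly. A small sketch of the arithmetic (the base value of 1.0 is illustrative, not a figure from the report):

```python
def project(value_now: float, cagr: float, years: int) -> float:
    """Compound annual growth: V_t = V_0 * (1 + CAGR) ** t."""
    return value_now * (1 + cagr) ** years

# At a 13% CAGR, a market roughly triples over nine years.
print(round(project(1.0, 0.13, 9), 2))  # -> 3.0
```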

The Southeast Asia cloud computing market consists of major participants, which include Amazon, Akamai Technologies, CA Technologies, Alibaba, Cisco Systems and Google Inc., among others. The cloud computing market leaders are looking to this region for further expansion.

For instance, in August 2018, Google announced the building of its new data center in Singapore, and Alibaba Cloud announced its second infrastructure zone in Malaysia. The expansion of Google's data centers in Singapore takes the company's total investment to USD 850 million.

The launch of Alibaba's new infrastructure in Malaysia will be certified for SAP hosting and bring new products such as elastic computing, database, networking and monitoring services to the market. These new developments from the cloud computing market leaders are expected to sustain the growth of the market during the forecast period.

Key segments of the Southeast Asia cloud computing market

Deployment Overview, 2015-2025 (USD million)

Product Overview, 2015-2025 (USD million)

Organization Size Overview, 2015-2025 (USD million)


Country Overview, 2015-2025 (USD million)

Enquire more details of the report at: www.adroitmarketresearch.com/researchreport/383


Follow this link:
Southeast Asia Cloud Computing Market size to USD 40.32 billion by 2025 according to a new research report - WhaTech Technology and Markets News

Current research: Cloud Hosting Service Market status and prospect to 2026 – WhaTech Technology and Markets News

Key Players: A2 Hosting, SiteGround, InMotion, HostGator, DreamHost, 1&1 IONOS, Cloudways, Bytemark Cloud, Hostwinds, Liquid Web Hosting, AccuWeb, FatCow, BlueHost, Vultr.

2020 Report on Global Cloud Hosting Service Market is a professional and comprehensive report on the Cloud Hosting Service industry.

#Download Free PDF Sample Brochure of report Global Cloud Hosting Service Market 2020, with 134 pages and in-depth TOC analysis @ http://www.reportsnreports.com/contactme=2891895

The report focuses on the leading market competitors, profiling each Cloud Hosting Service company with a SWOT analysis to illustrate the competitive nature of the Cloud Hosting Service market globally. Moreover, the report covers each company's recent market evolution, market shares, associations and level of investment with other leading Cloud Hosting Service companies, and the monetary settlements impacting the Cloud Hosting Service market in recent years.

Development policies and plans are discussed, and manufacturing processes and cost structures are also analyzed. This report also states import/export consumption, supply and demand figures, cost, price, revenue and gross margins.

The report focuses on global major leading Cloud Hosting Service Industry players providing information such as company profiles, product picture and specification, capacity, production, price, cost, revenue and contact information. Upstream raw materials and equipment and downstream demand analysis is also carried out.

The Cloud Hosting Service industry's development trends and marketing channels are analyzed. Finally, the feasibility of new investment projects is assessed and overall research conclusions are offered.

Access this Latest Research Report @ http://www.reportsnreports.com/contactme=2891895

Geographically, this report is categorized into various main regions, covering the sales, proceeds, market share and expansion rate (percent) of Cloud Hosting Service in North America, Asia-Pacific, South America, Europe, and the Middle East and Africa.

Market segment by Type, the product can be split into:
Linux Servers - Cloud
Windows Servers - Cloud

Market segment by Application, split into:
Commercial Operation
Government Department
Others

List of Tables

Table 1. Cloud Hosting Service Key Market Segments
Table 2. Key Players Covered: Ranking by Cloud Hosting Service Revenue
Table 3. Ranking of Global Top Cloud Hosting Service Manufacturers by Revenue (US$ Million) in 2019
Table 4. Global Cloud Hosting Service Market Size Growth Rate by Type (US$ Million): 2020 VS 2026
Table 5. Key Players of Linux Servers - Cloud
Table 6. Key Players of Windows Servers - Cloud
Table 7. Global Cloud Hosting Service Market Size Growth by Application (US$ Million): 2020 VS 2026
Table 8. Global Cloud Hosting Service Market Size by Regions (US$ Million): 2020 VS 2026
Table 9. Global Cloud Hosting Service Market Size by Regions (2015-2020) (US$ Million)
Table 10. Global Cloud Hosting Service Market Share by Regions (2015-2020)
Table 11. Global Cloud Hosting Service Forecasted Market Size by Regions (2021-2026) (US$ Million)
Table 12. Global Cloud Hosting Service Market Share by Regions (2021-2026)
Table 13. Market Top Trends
Table 14. Key Drivers: Impact Analysis
Table 15. Key Challenges
Table 16. Cloud Hosting Service Market Growth Strategy
Table 17. Main Points Interviewed from Key Cloud Hosting Service Players
Table 18. Global Cloud Hosting Service Revenue by Players (2015-2020) (Million US$)

Download Free Sample Report @ http://www.reportsnreports.com/contactme=2891895

In the end, the Global Cloud Hosting Service Market report's conclusion notes the estimations of the industry veterans.


See the original post here:
Current research: Cloud Hosting Service Market status and prospect to 2026 - WhaTech Technology and Markets News

Tachyum’s Reference Design Will Be Used In a 2021 AI/HPC Supercomputer – Business Wire

SANTA CLARA, Calif.--(BUSINESS WIRE)--Semiconductor company Tachyum Inc. announced today that its Prodigy Processor AI/HPC Reference Design will be used in a supercomputer which will be deployed in 2021. This reference design will provide customers, partners, OEMs and Original Design Manufacturers (ODMs) a proven blueprint for building and deploying ultra-high performance Exascale AI/HPC supercomputers and datacenters in 2021.

Tachyum's Prodigy Universal Processor, slated for commercial availability in 2021, is a 64-core processor with a clock speed in excess of 4GHz that brings new Universal Computing capabilities to market. In normal datacenter workloads, Prodigy handily outperforms the fastest processors while consuming one-tenth the electrical power, at one-third the cost. In AI applications, Prodigy outperforms GPUs and TPUs on neural net training and inference workloads, and is orders of magnitude easier to program.

The reference design platform is architected to deliver the power-performance benefits of Prodigy to wide-ranging applications across verticals, including AI/HPC supercomputing, edge computing, datacenter operations (public and private cloud), telecommunications and automotive. Today's datacenters are under relentless pressure to support both regular and AI-powered workloads. To address this, a heterogeneous infrastructure is being built, with x86 servers for regular workloads and GPU/TPU accelerators for AI workloads. Additional hardware, plus time and energy, are needed to move the enormous amount of data between the two computing silos.

Datacenters are provisioned with enough servers to accommodate peak customer demand; during off-peak hours, more than 50% of most cloud datacenter servers are powered down to save electricity costs. Servers equipped with Prodigy offer the highest performance at the lowest cost to power conventional applications and also provide low-cost AI on demand. Idle Prodigy servers can be seamlessly and dynamically powered up and used for AI training or inference workloads. With Prodigy, provisioning an AI environment becomes CAPEX-free, since the idle servers powered up to handle AI workloads are already on the books as capital equipment.

"We are excited that customers are interested in our technology. There is never a more exciting time for a startup than THE FIRST CUSTOMER, especially one who wants to deploy an AI/HPC supercomputer in 2021. The preparations at the supercomputer site are expected to start later this year, so that infrastructure is ready when Tachyum's first hardware arrives. Tachyum has responded to its customers' and partners' need for an easy blueprint to enable hyperscalers and qualified ODMs and OEMs to begin changing the competitive landscape, while accelerating the democratization of AI/HPC in the process," said Dr. Radoslav "Rado" Danilak, Tachyum founder and CEO. "Every Prodigy-equipped datacenter is also a low-cost, scalable AI/HPC datacenter."

Tachyum's Prodigy Universal Processor is the smallest and fastest general purpose 64-core processor developed to date, requiring 10x less processor power and reducing processor cost by 3x. Prodigy will directly enable a 32-Tensor Exaflop supercomputer and allow the building of machines more powerful than the human brain in 2021, years ahead of industry expectations. Prodigy reduces datacenter annual TCO (total cost of ownership) by 4x, through its disruptive processor architecture and a smart compiler that has made many parts of the hardware found in typical processors redundant. Fewer transistors and fewer, shorter wires, thanks to a smaller, simpler core, translate into much greater speed and power efficiency for the Prodigy processor.

Interested organizations can obtain the Prodigy server reference design, or consult with Tachyum at https://www.tachyum.com/contact.shtml.

Follow Tachyum: https://twitter.com/tachyum https://www.linkedin.com/company/tachyum https://www.facebook.com/Tachyum/

About Tachyum

Named for the Greek prefix tachy, meaning speed, combined with the suffix -um, indicating an element (e.g. lithium), Tachyum is meant to evoke the notion of an element of speed. Tachyum emerged from stealth mode in 2017 to engineer disruptive intelligent information processing products. Tachyum's founders have a track record of solving problems caused by device physics in semiconductors to deliver transformational products to global markets, and are backed by IPM Growth, the Central & Eastern European venture capital platform, as Tachyum's lead investor. For more information visit: http://tachyum.com.

Read more:
Tachyum's Reference Design Will Be Used In a 2021 AI/HPC Supercomputer - Business Wire

Cohesity loses cohesion: Rapidly diversifying firm has an identity problem – Blocks and Files

You thought you understood Cohesity well enough. It supplies hyperconverged secondary storage. It is basically doing a Nutanix on the secondary storage market, converging the file use cases for test and dev, compliance and other copy data users. The San Jose firm makes a golden master copy of a file and farms out virtual copies of it for temporary use, saving storage space.

Only it doesn't just do this. It provides a backup appliance. It tiers to the cloud. It provides file storage. It archives data. It can do disaster recovery. It can migrate data as well. So what is Cohesity in product positioning terms?

That's a tough question to answer, in that it doesn't fit in the standard product boxes. There are three main boxes to consider here: file storage, backup, and data management. We can easily populate these boxes with suppliers because that's mostly how they define themselves: by product fit to a market sector. A diagram shows what we mean:

Certain companies and products are known for file storage Isilon, NetApp and Qumulo, for example.

Certain companies are known for backup, masses of them in fact, starting with Acronis and Asigra and running through the alphabet to Veeam and Veritas.

Other companies are known for copy data management, such as Actifio, Cohesity itself, and Delphix.

Some suppliers are known for file life cycle management, such as Komprise, and for file access acceleration, such as InfiniteIO.

Where Cohesity fits, according to Michael Letschin, its director for Technology Advocacy, who briefed us, is in all three boxes. As we understand it, Cohesity's technology is based on SpanFS, a highly scale-out filesystem with some unique properties. For example, it can receive and present files using the NFS, SMB and S3 protocols at the same time.

Cohesity's software runs in scale-out clusters which are managed, in single or multiple geos, by the Helios SaaS facility.

Its generalised file data processing platform receives data from a variety of sources, does things with it, and makes it available to a variety of target use cases.

As a file store, Letschin said, it cannot do tier 0 file access work; it's not equipped for that low latency, high speed access activity. NetApp, Isilon and Qumulo can rest easy in that use case. But Cohesity can do the tier 1, tier 2 and tier 3 work, what we can call the secondary file data or unstructured data world. And here, because of the breadth of its coverage, the firm could potentially reign supreme.

Backup is a way to get data onto its platform, an on-ramp, an ingest method. It built a HW/SW appliance product to do that, but is now switching to a software-only product available through subscription. This can run on-premises or in the public cloud. Cohesity can back up applications in physical servers and in virtual servers (vSphere, Hyper-V, Acropolis). It can back up relational and, via its Imanis acquisition, distributed databases. It can back up Kubernetes-orchestrated containerised systems.

The product can be a target system for backup products, such as Veeam. It can write backup data out to archives in the public cloud (AWS, Azure, GCP) and also to tape via a QStar gateway. The archive data can be written in an immutable form (write once, read many, or WORM).

It can tier file data to the cloud, leaving a reference access stub behind, and so save space on primary (tier 0-class) filers. And it can supply data to test and dev, with personally identifiable information detected and masked out. It can move backed-up VMs to the cloud, ready to spin up if a disaster happens (CloudSpin), and even run them on the Cohesity cluster as a stop-gap.

Third parties have built applications that use Cohesity's software to do extra things, such as the ClamAV anti-virus product and the firm's own Splunk facility for small log file populations.

Customers can download these from the Cohesity Marketplace, running on Cohesity's distributed infrastructure and using the Cohesity App SDK to access the data managed by the Cohesity DataPlatform. They have to buy the licence from the vendor or partner directly.

Almost all its functions are policy-driven and set up through a clean UI.

It would seem that a lot of what a customer might want to do with secondary file/unstructured data can be done with Cohesity's software. (We're using secondary data to mean non-tier 0 data.)

This is why trying to position Cohesity in a single standard file storage activity-related box is like nailing jelly to a wall. All of which, its execs must be hoping, makes for remarkably sticky software.

Read this article:
Cohesity loses cohesion: Rapidly diversifying firm has an identity problem - Blocks and Files

Options for the Windows Server 2008 End of Life Blues – ITPro Today

Windows Server 2008 has finally passed its end of life. Some IT operations will still have servers running it, though, since some industry- or workplace-specific applications don't work and play well on more recent Windows Server offerings. This means these shops are especially vulnerable to new security threats aimed at unsupported operating systems.

Microsoft isn't entirely abandoning those still dependent on Windows Server 2008. For three more years, users can continue to receive support by taking advantage of Microsoft's Extended Security Update program, which promises to supply "critical" and "important" security patches to those with active Software Assurance or subscription licenses. With a few restrictions, the program is also available to those still using SQL Server 2008, with patches limited to "critical" updates.

Related: Say Goodbye to Windows Server 2008 and Hello to Azure?

Extended Security Update is expensive, however, weighing in at "75% of the full license cost annually," according to Microsoft. That can represent a broad range of pricing, since licensing costs for any Windows Server version vary widely across editions. When Windows Server 2008 R2 hit the market, for example, a license could be as inexpensive as $469 yearly for the Web Server edition, or as expensive as $3,999 for the Enterprise edition.
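Applying Microsoft's quoted 75% figure to those example list prices gives a feel for the annual outlay. The helper function below is ours; the prices and rate are the ones cited above.

```python
def esu_annual_cost(license_cost: float, rate: float = 0.75) -> float:
    """Extended Security Updates: roughly 75% of the full license cost per year."""
    return license_cost * rate

# Example list prices from the article (Windows Server 2008 R2 era).
for edition, price in [("Web Server", 469.0), ("Enterprise", 3999.0)]:
    print(f"{edition}: ${esu_annual_cost(price):,.2f} per year")
# -> Web Server: $351.75 per year
# -> Enterprise: $2,999.25 per year
```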

For those who want to consider support options beyond the Microsoft offering and want to keep their instance on-premises instead of lifting-and-shifting to the cloud, there is only one solution that fits the bill.

0Patch (as in "zero-patch," and not to be confused with Oracle's OPatch utility) is a service of Slovenia-based ACROS Security that typically supplies security fixes to companies running currently supported versions of Windows. The fixes either address critical zero-day exploits that haven't yet been addressed by the vendor, or act as a stopgap measure while vendor-supplied patches are being tested.

0Patch will keep some no-longer-supported software, including Windows Server 2008 as well as Windows 7, patched against security issues at a cost of a little over $25 per machine annually, with volume discounts starting at 20 computers.

ACROS CEO Mitja Kolsek told ITPro Today that while some of the patches might be based on vendor supplied patches, "We create a lot of patches ourselves."

"While having access to a vendor's patch is helpful in determining what the original developers thought was the best way of fixing the vulnerability, we often fix in a different way to minimize the code we change," he said. "Sometimes our fix is also better that the vendor's."

In addition, he said, the company has fixes for some security issues that have yet to be patched by Microsoft.

The company's need to "minimize" the changed code is something potential users may want to consider before signing up for the service. A fix that 0Patch supplies is not a traditional patch, which replaces an entire changed file or application on the hard drive, but a memory-resident "micropatch" applied on the fly.

"0patch Agent is designed to inject a dynamic load library (DLL) into each running process so that it can then apply and un-apply micropatches in that process," 0Patch explains on its website. "While there are some processes that don't let themselves get injected this way, most processes will spend an additional 600-700 KB of memory each for hosting that DLL. On a typical Windows 10 system with [about] 100 running processes this means a memory consumption of 60-70 MB."

When asked if 0Patch's system presents a new security worry for users, Kolsek replied: "While we're trying hard to avoid that and utilize 20-plus years of experience in finding vulnerabilities, it's almost sure that there are vulnerabilities in our product, as are there in any other software product. We can also micropatch our own product, so fixing can be fast and deployment of the fix instant and unobtrusive for the user."

For those unwilling to pay for Microsoft support or to rely on a third party's unique solution for continued security updates following Windows Server 2008 end of life, the only solutions involve moving to the cloud.

The easiest solution here is probably Microsoft's, which will supply free security updates for three years to organizations that move their Windows Server 2008 workloads to its Azure cloud to run as a VM or managed instance.

A little more complex, but perhaps a more complete long-term solution, is being offered by Amazon Web Services with its End-of-Support Migration Program (EMP) for Windows Server. With this program, users upload their unsupported workloads to the cloud and upgrade to a supported version of Windows Server in the process, using a compatibility layer to do things like redirect APIs that have changed.

AWS says that the EMP technology is offered without cost, although users will have to pay a fee to have applications assessed and repackaged.

Other than these solutions, IT shops can face the final Windows Server 2008 end of life by upgrading to a newer version on their own, or they can continue to ride bareback and hope any security holes that surface don't lead to an attack by the black hats.

That last option is not recommended. As Rocky used to say to Bullwinkle, "That trick never works."

View post:
Options for the Windows Server 2008 End of Life Blues - ITPro Today

Maintaining Uptime in the Data Center Is No Game of Checkers – Data Economy

It's only Monday, but somehow there is already enough tech news to fill up your working day.

So to help you keep on top of things, Data Economy has listed the top five things you need to know today.

A recently published report predicts that the global data centres market will grow at a CAGR of 15.1% over the forecast period from 2019 to 2025, according to ReportLinker.

The study on the data centres market covers the analysis of the leading geographies, such as North America, Europe, Asia-Pacific, and RoW, for the period of 2017 to 2025.

This time last week, Google Drive, Docs, Sheets, and Slides all briefly went down due to an unknown issue that affected the consumer and business versions of Google's productivity apps.

The word processing app, which also hosts Google's spreadsheet service Sheets and submission host Forms among others, failed to load for users worldwide at around 6:30 p.m. GMT on Monday, with many complaining of error messages saying systems had detected unusual traffic.

"Google Docs is utilised by millions of individuals and businesses worldwide, and an outage of this scale has undoubtedly led to frustration for the students and workers whose service has been disrupted," said Tim Dunton, MD of Nimbus Hosting.

"In fact, in today's digital climate, simple, safe and stress-free websites must be considered a priority for organisations."

"Therefore, all businesses must ensure they have efficient cyber security protocols in place, including a website and internal IT infrastructure which is modern, secure and constantly kept up to date."

Amdocs has announced the availability of its cloud-native Amdocs Service & Network Automation solution to support all aspects of service design, inventory and orchestration across physical, logical and virtual elements for all lines of business (LOBs), including enterprise/B2B, mobile, consumer broadband, as well as NFV and 5G-based services.

Amdocs also announced that a number of its customers are accelerating their network transformation and NFV journey with the deployment of the Amdocs cloud-native operations automation suite, including three integrated services providers in EMEA, a provider of telecommunications services in APAC, and an MSO in North America.

Pulse Secure has announced that Alex Thurber is joining the executive team as Chief Revenue Officer, where he will be responsible for global sales strategy, management and team development.


"We are committed to providing the easiest, most effective solutions for enterprises to secure access to applications and data across their hybrid IT environments," said Sudhakar Ramakrishna, CEO of Pulse Secure.

"We are excited to have Alex join our team and spearhead our forward momentum in the marketplace."

"Alex has the experience to drive growth of our industry-leading Zero Trust solutions into an ever-evolving security landscape."

Edge computing firm Scale Computing reports it achieved record sales in Q4, driven by its OEM partnerships and edge-based deal activity, exiting 2019 at a growth rate of over 90% in total software revenue.

As a result, Scale Computing recorded its best year yet, extending worldwide sales as it added hundreds of new customers, including gains in distributed enterprises, strategic partnerships, channel growth, and an expansion of its HC3 Edge product portfolio.

"Across industries and segments, companies are looking to drive out complexity and maximize application uptime," said Jeff Ready, CEO and co-founder of Scale Computing.

"Nowhere is this more apparent than in edge computing, where hands-on IT personnel and on-site resources are limited."

"Whether at the edge or in the data centre, customers want to reduce the complexity and minimize the cost of their IT processes, and at Scale Computing, we are delivering competitive solutions that solve the needs of our customers around the world, as evidenced by our ratings on Gartner Peer Insights, Spiceworks, TechValidate and TrustRadius."

"In 2020, we anticipate even higher growth for Scale Computing as a leading player in the edge computing and hyperconverged space, and we look forward to the successes this year will bring."

Read more here:
Maintaining Uptime in the Data Center Is No Game of Checkers - Data Economy

Using the Cloud: Seven Top Security Threats to Know About – Infosecurity Magazine

It is often taken for granted that cloud solutions will become the default option for businesses in the next few years. Enterprises that decide to migrate their resources to the cloud cite security as one of the major advantages of doing so (alongside scalability, cost optimization and fast deployment). Unfortunately, hackers are also turning their attention to the cloud, and there are several ways they can pose a serious threat to your operations. It is important to be aware of the problems that may occur in order to prevent them.

Top Security Issues for the Cloud

Lack of Awareness

Most threats to your cybersecurity are external, but to prevent or deal with them you need your employees to be aware of potential issues. Well-trained employees are among the best investments you can make to improve your company's security, as attackers often rely on human error, lack of attention (or knowledge) or social engineering techniques to spread ransomware or steal credentials. Allocate the time and budget for appropriate training and make sure it is updated regularly. Everyone knows that a strange email from an unknown domain should be treated with suspicion, but how many people are aware that SharePoint or Skype can be used to attack their organization? Prevention is not only better than cure; it is also cheaper.

Data Breaches

Data breaches or data leaks are among the top security concerns for all organizations, as they may result in losing even more than just data. Reputation, credibility, money and even customers are all at risk.

Data Loss (and No Backup)

Human error, an accident or a natural catastrophe can lead to permanent loss of data. Make backup one of your priorities and consider using an external disaster recovery center (DRC) to avoid such a situation.

Denial of Service (DoS) Attacks

DoS (denial of service) and DDoS (distributed denial of service) attacks can shut down your services and make them unavailable to users. Attackers can flood your systems with more traffic than your servers can cope with. If all cloud servers are affected, it becomes impossible for a company to manage its business.

Cryptojacking

This relatively new form of attack is becoming increasingly common. Cyber-criminals access your cloud computing resources and use cloud computing power to mine for crypto-currencies such as Bitcoin. Such an attack can be difficult to detect, as your systems still work, but are slower than usual. It is often mistaken for a processing power or network issue.
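Because a cryptojacked system still works, just more slowly, one common detection heuristic is to flag sustained, unexplained CPU load rather than a single spike. The sketch below illustrates that idea only; the threshold and window values are invented assumptions, and real monitoring would draw samples from your cloud provider's metrics or an APM tool:

```python
# Minimal sketch of a sustained-load heuristic for spotting cryptojacking.
# A single busy reading is normal; many consecutive near-100% readings with
# no corresponding workload is worth investigating.

def sustained_high_load(samples, threshold=0.9, min_consecutive=5):
    """Return True if `samples` (CPU utilisation per reading, 0.0-1.0)
    stay above `threshold` for at least `min_consecutive` readings in a row."""
    run = 0
    for s in samples:
        run = run + 1 if s > threshold else 0
        if run >= min_consecutive:
            return True
    return False

# Example: a machine pegged near 100% CPU for five readings in a row.
readings = [0.35, 0.95, 0.97, 0.96, 0.99, 0.98]
print(sustained_high_load(readings))  # True
```

A heuristic like this produces leads, not verdicts; correlating flagged periods with scheduled jobs and deployments is what separates a miner from a legitimate batch workload.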

Hijacked Accounts

If a hacker gains access to your system through an internal staff account, they can penetrate your virtual resources without being detected for a long time. As the most widespread techniques for this kind of attack involve phishing emails and password cracking, it is vital to provide your employees with appropriate training. In addition, make sure the principle of least privilege (the minimal access rule) is in place, so everybody can access only those applications, systems or databases that are necessary for them to do their jobs.
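The minimal access rule boils down to deny-by-default: each user gets an explicit allow-list, and everything else is rejected. A toy sketch (user and resource names are invented for illustration; a real deployment would use your identity provider's role model):

```python
# Deny-by-default access check illustrating the minimal access rule.
# Each user is mapped to the only resources they need for their job.
ACCESS = {
    "alice": {"crm", "billing"},
    "bob": {"crm"},
}

def can_access(user, resource):
    """Unknown users and unlisted resources are denied by default."""
    return resource in ACCESS.get(user, set())

print(can_access("bob", "crm"))      # True
print(can_access("bob", "billing"))  # False: not on bob's allow-list
```

The important property is that a hijacked account can only reach what that one employee could reach, which limits the blast radius of a successful phishing attack.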

Non-Secure Applications

Even if your own system is secure, you can still be let down by external applications which may present a serious risk to your cloud security. Ensure that your cybersecurity team establishes whether an application is suitable for your network. Warn your employees not to download applications straight from the web before receiving approval from the IT team.

New Tech Means New Vulnerabilities

Knowledge is power. Once you are aware of potential threats to your cloud environment, you can take steps to prevent them. Ask your IT team to re-think your cybersecurity strategy and work on a new, updated plan. If you do not have in-house security experts, think about outsourcing your IT security (or cloud together with security if you have not migrated yet) to an external company experienced in delivering such solutions. Comarch is one of the biggest Polish IT companies delivering both software and an extensive portfolio of IT services. Trust in the best.

See more here:
Using the Cloud: Seven Top Security Threats to Know About - Infosecurity Magazine

What Is A Data Passport: Building Trust, Data Privacy And Security In The Cloud – Forbes

Data privacy is one of the most critical elements of business today and also one of the most challenging. Sixty percent of businesses reported they suffered a data breach caused by a vendor or third party in 2018, and according to Gemalto, more than 6 million records are lost or stolen every day.

That represents a huge problem for businesses, especially in highly regulated and sensitive industries like banking. Governments around the world are starting to take notice, and new policies in Europe and elsewhere are beginning to take effect that require businesses to be more accountable for data breaches and more proactive about privacy and security.

Thats why several companies have introduced data passports as a way to protect data in the cloud and in transit.

What is a data passport?

Data passport technology is based on classic mainframe technology, which today can include full encryption of your data, ensuring that every piece of data is encrypted. When each piece of data is encrypted, it can't be used even if it is stolen.

Data passports allow you to extend encryption technology that used to be available only on a physical mainframe to cloud computing. Each piece of data in the cloud has a passport assigned to it, and with the passport you can verify whether the data is being misused, whether the passport is still valid, and so on.

These data passports also give companies the ability to protect data and revoke access to it at any time, across a multi-cloud environment. Because the data carries its passport and its encryption with it, enterprises can secure their data wherever it travels.
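The core mechanics described here, metadata that travels with the data and can be checked for validity and revocation before use, can be sketched in miniature. This is a toy illustration of the concept, not IBM's implementation: the field names and signing scheme are invented, a real system would also encrypt the payload itself, and it would use managed keys rather than a hard-coded secret:

```python
# Toy "data passport": issuer-signed metadata (expiry, revocation flag)
# sealed together with the payload via HMAC, so a consumer can verify
# validity and integrity before using the data.
import hashlib
import hmac
import json
import time

SECRET = b"issuer-signing-key"  # illustrative only; use managed keys in practice

def issue_passport(payload: bytes, ttl_seconds: int) -> dict:
    meta = {"expires": time.time() + ttl_seconds, "revoked": False}
    body = json.dumps(meta, sort_keys=True).encode() + payload
    meta["sig"] = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"meta": meta, "payload": payload}

def verify(passport: dict) -> bool:
    meta = dict(passport["meta"])
    sig = meta.pop("sig")
    body = json.dumps(meta, sort_keys=True).encode() + passport["payload"]
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    # Valid only if untampered, not revoked, and not expired.
    return (hmac.compare_digest(sig, expected)
            and not meta["revoked"]
            and time.time() < meta["expires"])

p = issue_passport(b"customer-record", ttl_seconds=3600)
print(verify(p))              # True
p["meta"]["revoked"] = True   # tampering or revocation...
print(verify(p))              # False: the passport no longer verifies
```

The point of the sketch is that the check travels with the data: any recipient holding the verification key can refuse to use a payload whose passport is expired, revoked, or altered, regardless of which cloud it arrived from.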

And that's the most significant development that makes data passports so unique and important: the protection and enforcement of data privacy and security are available on and off any given platform as it travels with the data.

Why does data need a passport?

One of the biggest problems with data security is that so much of our computing these days takes place not in a physical mainframe, but in the cloud.

It used to be that data thieves might have to break into a physical space to steal hard drives or mainframes in order to steal data. Not any more.

With more and more computing of all kinds taking place in the cloud, that data can become extremely vulnerable. In fact, data is at its most vulnerable when moving between parties through the cloud, and the growing use of multi-cloud environments only exacerbates the problem.

Imagine a single piece of information that must be transmitted from Company A to Company B. Company A knows that its servers are secure, and Company B feels that its data is also secure. But what about the space in between?

Data passports allow the data to carry its own encryption with it, so that even if it is intercepted, it's useless. This is extremely valuable for companies and industries that transmit data in multi-cloud environments, and will be especially useful in highly regulated industries like banking and insurance.

Data passports are the future

"Encryption everywhere" is the watchword right now for industry experts and analysts, and data passports are the technology that will help make that happen.

This technology is new, but experts believe that it will become a new standard for privacy and security in any industry that collects, uses, and transmits data, which is practically any industry in the world today.

This kind of total encryption available with a data passport should also increase consumer trust, and help companies repair any damage that may have been done to their reputations in the past through data breaches and losses.

To learn more, have a look at my conversation with Ivo Koerner from IBM, in which we discuss the concept and importance of data passports:

Continued here:
What Is A Data Passport: Building Trust, Data Privacy And Security In The Cloud - Forbes