Category Archives: Cloud Servers

NetApp To Acquire Instaclustr: Moving Up The App Stack – CRN

NetApp Thursday said it plans to acquire Instaclustr, a developer of a platform for delivering fully managed open-source database, pipeline and workflow applications as a service. With the acquisition, Sunnyvale, Calif.-based NetApp said it aims to take its technology to yet a higher layer above its traditional storage focus.

The acquisition is expected to close in 30 to 45 days, subject to regulatory approval. NetApp declined to discuss the value of the acquisition.

NetApp's acquisition of Instaclustr, with its ability to run open-source databases on the cloud and on-premises, is part of what has become NetApp's centerpiece of optimizing the cloud for customers, said Anthony Lye, NetApp's executive vice president and general manager for public cloud services.

[Related: NetApp CEO George Kurian: Dell, HPE Are Doing What We Did In 2014]

"We're helping clients manage storage from on-prem to the cloud," Lye told CRN. "Our OnCommand Insight is now a multitenant, cloud-based monitoring platform. And we optimize storage to compute with Spot."

NetApp's Instaclustr acquisition is the latest in a series of deals that have moved the vendor from a focus on storage to one of optimizing data and applications across public clouds and on-premises.

It comes just a month after NetApp acquired Fylamynt, which brought CloudOps automation to its Spot portfolio of cloud-native services.

NetApp in June 2020 acquired Spot, which develops technology to manage and optimize compute instances on public clouds.

Since then, NetApp has expanded the Spot portfolio to include its Ocean Kubernetes DevOps technology; its CloudJumper acquisition, which gave it the ability to better manage virtual desktop infrastructure and is now known as Spot PC; its CloudHawk security technology, now known as Spot Security; and its Data Mechanics acquisition for optimizing Apache Spark analytics, now known as Ocean for Apache Spark.

NetApp's move to optimize hybrid multi-cloud environments above the storage layer is paying off, Lye said.

"A couple years ago, our goal was to reach $1 billion in ARR [annual recurring revenue] by fiscal year 2025," he said. "Last week, we said at our investor conference we expect $2 billion in ARR by the end of fiscal year 2026."

For NetApp, the goal is to bring all its goodness to wider platforms, Lye said.

"A lot of customers tell us they love our tools, and that we have these cool services," he said. "But they ask us, what else can we do for them? So we have Spot PC, which lets us run their virtual desktop infrastructures. We have Data Mechanics, which provides customers with a fully managed Spark service. Instaclustr will sit nicely on top of everything we do."

Customers have alternatives to Instaclustr, but they don't offer the capabilities NetApp can with Instaclustr, Lye said.

"So for the ability to work across different open-source projects and multi-cloud environments, Instaclustr can be a very valuable service for us."

Lye said NetApp wants to be part of Platform as a Service and not just Infrastructure as a Service.

"We want to shift as more decisions are made by the application teams than by the IT teams," he said. "IT teams used to say they need a data center, servers, storage and, on top of that, virtualization, Linux and databases. The last people to get in on the decision-making were the app team. Now I want to flip that around and let the app pick the infrastructure. We can do that not just at the storage, compute and network layer, but also at the app layer. Customers don't want to deal with the infrastructure."

"NetApp's acquisition of Instaclustr is another brilliant move," said John Woodall, vice president of engineering and NetApp enablement at General Datatech, a Dallas-based solution provider and longtime NetApp channel partner.

"In my opinion, NetApp is accelerating their transition towards more automated application pipelines and a DevOps perspective," Woodall told CRN. "They're moving up the application stack. Instaclustr fits well with the Spot portfolio. Customers really understand how Spot works with applications."

Instaclustr lets NetApp move toward more open-source, cloud-native database applications, and helps it do more at the application layer and not just at the infrastructure layer, Woodall said.

"This has big ramifications for making applications more cloud-aware," he said. "Anthony [Lye] is clearly moving farther into the application and database end of the stack, and making NetApp more application-aware, topology-aware, cloud-aware and even security-aware. This is a different story for NetApp. It's indicative of the transition to running as a services-led, hybrid cloud capabilities-driven business."

See the original post here:
NetApp To Acquire Instaclustr: Moving Up The App Stack - CRN

SMBStream for Accelerated VPN-Less Access to SMB shares, is Now Available in the AWS Marketplace – IT News Online

PR.com 2022-04-09

London, CA April 09, 2022 --(PR.com)-- Storage Made Easy, with a mission of simplifying storage for everyone, announced today that their new SMBStream product can now be launched directly from the AWS Marketplace.

SMBStream provides high-performance, secure access to file servers in the cloud, in data centers, and between geographically distributed offices across the world. Unlike using a VPN, users and applications have speedy access to the file data they need in real-time, and the solution scales as more users are added.

Launching SMBStream from the AWS Marketplace makes it even easier to consolidate file servers into the cloud, to include remote storage in cloud workloads and to integrate distributed file storage into the Enterprise File Fabric platform.

SMBStream Highlights:

Real-time Access - Users are able to access live file storage over the internet. Real-time access means there is no office cache to procure, no snapshots to synchronize, and no global locking challenges.
Fast - SMBStream enables productive use of remote file systems from distributed offices. It improves remote file access up to 15 times compared to a traditional VPN.
Secure - Adds key authentication, repudiation and AES-256 encryption for secure access over the public internet.
Vendor Neutral - Extends the reach of your SMB-compatible file servers, including Amazon FSx, Nasuni, and NetApp Cloud Volumes.

For more information about SMBStream visit: https://storagemadeeasy.com/smbstream/

Contact Information: Storage Made Easy, Mariado Martinez, Marketing Manager, +442086432885, Contact via Email, http://StorageMadeEasy.com

Read the full story here: https://www.pr.com/press-release/858959

Press Release Distributed by PR.com

More here:
SMBStream for Accelerated VPN-Less Access to SMB shares, is Now Available in the AWS Marketplace - IT News Online

How to combine the power of cloud and edge computing – Raconteur

Like companies all around the world, US fast-food chain Taco Bell responded to the pandemic's commercial impact by accelerating its shift to the cloud. As customers' traditional patterns of restaurant and drive-through consumption changed rapidly and permanently to include kiosk, mobile and web ordering, often through third-party delivery services, Taco Bell moved the remainder of its group IT to cloud services.

But this 100% cloud-based approach stops at the restaurant door. Given that many of its 7,000 outlets don't have fast and/or reliable internet connections, the company has recognised the limitations of the public cloud model and augmented its approach with edge computing. This set-up enables the company to process data near the physical point at which it is created, with only a periodic requirement to feed the most valuable material back to the cloud and receive updates from it.

Taco Bell is just one of thousands of firms seeking to exploit the fast-evolving and much-hyped distributed IT capability that edge computing can offer.

"Edge computing is getting so much attention now because organisations have accepted that there are things that cloud does poorly," observes Bob Gill, vice-president of research at Gartner and the founder of the consultancy's edge research community.

Issues of latency (time-lag) and limited bandwidth when moving data are key potential weaknesses of the centralised cloud model. These drive a clear distinction between the use cases for cloud and edge computing. But the edge is also a focus for many organisations because they want to add intelligence to much of the equipment that sits within their operations and to apply AI-powered automation at those endpoints.

Early adopters include manufacturers implementing edge computing in their plants as part of their Industry 4.0 plans; logistics groups seeking to give some autonomy to dispersed assets; healthcare providers with medical equipment scattered across hospitals; and energy companies operating widely dispersed generation facilities.

"For such applications to be viable and efficient, their data must be processed as close to the point of origin or consumption as possible," says George Elissaios, director of product management at Amazon Web Services. "With edge computing, these applications can have lower latency, faster response times and give end customers a better experience. Edge computing can also aid interconnectivity by reducing the amount of data that needs to be backhauled to datacentres."

In some ways, the emergence of edge computing represents a new topology for IT. So says Paul Savill, global practice leader for networking and edge computing at Kyndryl, the provider of managed infrastructure services that was recently spun out of IBM.

"Companies are looking at the edge as a third landing spot for their data and applications. It's a new tier between the public cloud and the intelligence at an end device (a robot, say)," he explains.

But most organisations don't expect their edge and cloud implementations to exist as distinct entities. Rather, they want to find ways to blend the scalability and flexibility they have achieved with the cloud with the responsiveness and autonomy of internet-of-things (IoT) and satellite processors installed at the edge.

Gill believes that cloud and edge are pure yin and yang. Each does things the other doesn't do well. When put together effectively, they are highly symbiotic.

They will need to be, as more and more intelligence is moved to the edge. More than 75 billion smart digital devices will be deployed worldwide by 2025, according to projections by research group IHS Markit. And it is neither desirable nor realistic for these to be interacting continuously with the cloud.


"When you start to add in multiple devices, you see a vast increase in the volume, velocity and variety of the data they generate," says Greg Hanson, vice-president of data management company Informatica in EMEA and Latin America. "You simply can't keep moving all of that data into a central point without incurring a significant cost and becoming reliant on network bandwidth and infrastructure."

In such situations, edge IT performs a vital data-thinning function. Satellite processors sitting close to the end points filter out the most valuable material, collate it and dispatch it to the cloud periodically for heavyweight analysis, the training of machine-learning algorithms and longer-term storage. Processors at the edge can also apply data security and privacy rules locally to ensure regulatory compliance.
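To make the data-thinning pattern concrete, here is a minimal sketch of an edge process that keeps only out-of-range readings and ships a compact summary to the cloud on a fixed interval. The sensor source, endpoint URL, thresholds and interval are all hypothetical placeholders, not anything prescribed by the article.

```python
import json
import random
import statistics
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.com/ingest"   # hypothetical cloud ingest URL
UPLOAD_INTERVAL_S = 300                          # ship a summary every 5 minutes
NORMAL_RANGE = (15.0, 85.0)                      # readings inside this band are "uninteresting"

def read_sensor() -> float:
    """Stand-in for a real device read; simulates a noisy temperature sensor."""
    return random.gauss(50.0, 20.0)

def summarize(readings, anomalies):
    """Collapse raw readings into the compact record the cloud actually needs."""
    return {
        "count": len(readings),
        "mean": statistics.fmean(readings) if readings else None,
        "anomalies": anomalies,            # only out-of-range values travel upstream in full
        "window_end": time.time(),
    }

def push_to_cloud(summary):
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

def run():
    readings, anomalies = [], []
    window_start = time.time()
    while True:
        value = read_sensor()
        readings.append(value)
        if not (NORMAL_RANGE[0] <= value <= NORMAL_RANGE[1]):
            anomalies.append(value)        # local filtering keeps the backhaul small
        if time.time() - window_start >= UPLOAD_INTERVAL_S:
            push_to_cloud(summarize(readings, anomalies))
            readings, anomalies = [], []
            window_start = time.time()
```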

Gill notes that edge computing has shifted quickly from concept and hype to successful implementations. In many vertical industries, it is generating revenue, saving money, improving safety, enhancing the customer experience and enabling entirely new applications and data models.

Before achieving such gains, many edge pioneers are likely to have surmounted numerous significant challenges. Given that the technology is immature, there are few widely accepted standards that businesses can apply to it. This means that they're often faced with an overwhelmingly wide range of designs for tech ranging from sensors and operating systems to software stacks and data management methods.

Such complexity is reflected in a widespread shortage of specialist expertise. As Savill notes: "Many companies don't have all the skills they need to roll out edge computing. They're short of people with real competence in the orchestration of these distributed application architectures."

The goal may be to blend cloud and edge seamlessly into a unified model, but the starting points can be very different. There are two fundamentally different, though not totally contradictory, schools of thought, according to Gill. The "cloud out" perspective, favoured by big cloud service providers such as Amazon, Microsoft and Google, views the edge as an extension of the cloud model that extends the capabilities of their products.

The other approach is known as "edge in". In this case, organisations develop edge-native applications that occasionally reach up to the cloud to, say, pass data on to train a machine-learning algorithm.

Adherents of either approach are seeing significant returns on their investments when they get it right.

"We may be in the early phase of exploiting that combination of IoT, edge and cloud, but the capabilities enabling these distributed architectures (the software control and orchestration tools and the integration capabilities) have already reached the point where they're highly effective," Savill reports. "Some companies that are figuring this out are seeing operational savings of 30% to 40% compared with more traditional configurations."

In doing so, they are also heralding a large-scale resurgence of the edifice that cloud helped to tear down: on-premises IT, albeit in a different form.

"In the next 10 to 20 years, the on-premises profile for most companies will not be servers," Elissaios predicts. "It will be connected devices, and billions of them."

See the article here:
How to combine the power of cloud and edge computing - Raconteur

The smart lock market is estimated to be valued at USD 2.1 billion in 2022 and reach USD 3.9 billion by 2027, registering a CAGR of 12.9% – Yahoo…

ReportLinker

The market is expected to register this CAGR between 2022 and 2027. According to a United Nations report published in July 2018, the global urban population is expected to increase to 4.46 billion in 2021 and 6.68 billion by 2050. Urbanization is expected to increase in 600 large cities worldwide by 2025.

New York, April 08, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Smart Lock Market with COVID-19 Impact by Lock Type, Communication Protocol, Unlocking Mechanism, Vertical and Region - Global Forecast to 2027" - https://www.reportlinker.com/p05169655/?utm_source=GNW

Emerging nations are also witnessing rapid urbanization with the development of smart cities. In emerging nations, the concentration of industrial development in cities leads to a growing demand for improved infrastructure. This would ultimately lead to the development of several new and renovated educational and healthcare institutions, public administration offices, shopping malls, stores, and warehouses. Thus, infrastructure development would boost the demand for advanced biometric solutions and smart locks, particularly in technologically advancing countries such as India, China, and Brazil.

The increasing adoption of the Internet of Things has enabled the use of various cyber-physical devices such as smartphones, connected cars, and wearable devices. Smart locks can be operated through the remote servers of the manufacturers of smart locks.

All information regarding the properties of a smart lock and its virtual key is stored in the vendor's cloud server. Smart locks allow for easy sharing of keys with other authorized persons, and any unauthorized entry or breakage of the lock can be monitored and reported.

However, the vendor cloud server can be attacked through code injection, cross-site scripting, password eavesdropping, and other means. Data communication between a smartphone and a smart lock through the Bluetooth Low Energy (BLE) protocol can also be hacked through a man-in-the-middle (MITM) attack.

Server locks & latches: The fastest-growing lock type in the smart lock market. Server locks and latches use cloud-based server systems to access doors from a remote location. The server cover prevents unauthorized access to the user's server.

Locking the front door (available on some models) avoids unauthorized access to the installed drives. Noke is one of the startups working on this model.

The Noke web-based portal provides users with a comprehensive tool to manage lock functions, track digital keys, share real-time data, and even integrate users' existing platforms.

Wi-Fi: The fastest-growing communication protocol in the smart lock market.

Wi-Fi is one of the key technologies responsible for the increasing implementation of the IoT in smart locks. Wi-Fi can be accessed through various devices such as smartphones, personal computers, and tablets.

These devices can be connected to the Internet through a wireless access point. The Wi-Fi Direct standard enables users to connect any two devices without a wireless router.

The growth in the adoption of smartphones and tablets has increased the application scope of IoT in smart locks equipped with Wi-Fi connectivity. Key players offering Wi-Fi-based smart locks are LockState (US), Allegion (Ireland), Gate, and ASSA ABLOY (Sweden).

Touch-based: The fastest-growing unlocking mechanism in the smart lock market. The touch-based unlocking mechanism in smart locks uses the fingerprint recognition technique. Fingerprint recognition is an effective and simple method for identifying and authenticating individuals.

One technology that should be of particular interest to manufacturers in the smart home industry is biometrics. Biometric authentication can complement the new smart home trend and add real value to modern domestic security solutions.

Unlike password-protected smart locks, biometric authentication uses personally identifiable information stored securely on-device (whether the lock itself or a fingerprint-secured access card) for maximum privacy. This makes biometrics difficult to hack and near-impossible to spoof, ensuring that homes stay considerably safer than using password-secured, internet-enabled, or traditional key locks.

Residential: The fastest-growing vertical in the smart lock market. Smart locks are increasingly used in individual houses to ensure security and safety. Controlled access that allows the entry of only authorized persons is considered the critical function of asset security.

Smart locks have become an integral part of smart homes, as they help control the door locks remotely, protecting people and property. The growth of this market can be attributed to the increasing demand for smart homes and rising urbanization across the globe.

The growing urban population is increasing the need for better infrastructure equipped with security systems to protect people and property.

North America: The largest region in the global smart lock market.

North America is one of the most technologically advanced regions and is a large market for smart lock technology. The growing awareness about home security solutions, benefits provided by smart locks such as connectivity through smart devices, and their remote access features are driving the regional market's growth.

In addition, the recent upswing in the trend of smart homes, rising adoption of IoT-based services, and the large presence of smart lock vendors are the factors supporting the growth of the smart lock market in North America. The US, Canada, and Mexico are the key countries contributing to the growth of the smart lock market in this region.

The study contains insights from various industry experts, ranging from component suppliers to Tier 1 companies and OEMs. The break-up of the primaries is as follows: By Company Type: Tier 1 (55%), Tier 2 (25%), and Tier 3 (20%); By Designation: C-level Executives (75%) and Managers (25%); By Region: APAC (40%), RoW (30%), Europe (20%), and North America (10%).

Key players operating in the smart lock market are ASSA ABLOY AB (Sweden), dormakaba Group (Switzerland), Spectrum Brands, Inc. (US), SALTO Systems, S.L. (Spain), Allegion plc (Ireland), Honeywell International Inc. (US), Dahua Technology Co., Ltd (China), Samsung Electronics Co., Ltd. (South Korea), Vivint, Inc. (US), ZKTECO CO., LTD. (China), igloohome Pte Ltd (Singapore), RemoteLock (US), Onity (US), Master Lock Company LLC. (US), MIWA Lock Co. (Japan), SentriLock (US), Avent Security (China), HavenLock, Inc. (US), Shenzhen Vians Electric Lock Co., Ltd. (China), Anviz Global Inc. (US), CANDY HOUSE, Inc. (US), AMADAS (South Korea), Thekeywe (South Korea), Gate Video Smart Lock (US), and DESSMANN Schliessanlagen GmbH (Germany).

Research Coverage: The report segments the smart lock market and forecasts its size, by value, based on lock type, communication protocol, unlocking mechanism, vertical, and region. It also provides a comprehensive review of market drivers, restraints, opportunities, and challenges in the smart lock market, and covers qualitative aspects in addition to the quantitative aspects of these markets.

Key Benefits of Buying the Report: The report will help the leaders/new entrants in this market with information on the closest approximations of the revenue numbers for the overall market and the sub-segments. It will also help stakeholders gain more insights to better position their businesses and plan suitable go-to-market strategies.

The report also helps stakeholders understand the pulse of the smart lock market and provides them information on key market drivers, restraints, challenges, and opportunities. The report also covers the COVID-19 impact on the smart lock market. Read the full report: https://www.reportlinker.com/p05169655/?utm_source=GNW

About ReportLinker: ReportLinker is an award-winning market research solution. ReportLinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.



Here is the original post:
The smart lock market is estimated to be valued at USD 2.1 billion in 2022 and reach USD 3.9 billion by 2027, registering a CAGR of 12.9% - Yahoo...

Storage requirements for AI, ML and analytics in 2022 – ComputerWeekly.com

Artificial intelligence (AI) and machine learning (ML) promise to transform whole areas of the economy and society, if they are not already doing so. From driverless cars to customer service bots, AI and ML-based systems are driving the next wave of business automation.

They are also massive consumers of data. After a decade or so of relatively steady growth, the data used by AI and ML models has grown exponentially as scientists and engineers strive to improve the accuracy of their systems. This puts new and sometimes extreme demands on IT systems, including storage.

AI, ML and analytics require large volumes of data, mostly in unstructured formats. "All these environments are leveraging vast amounts of unstructured data," says Patrick Smith, field CTO for Europe, the Middle East and Africa (EMEA) at supplier Pure Storage. "It is a world of unstructured data, not blocks or databases."

Training AI and ML models in particular uses larger datasets for more accurate predictions. As Vibin Vijay, an AI and ML specialist at OCF, points out, a basic proof-of-concept model on a single server might expect to be 80% accurate.

With training on a cluster of servers, this will move to 98% or even 99.99% accuracy. But this puts its own demands on IT infrastructure. "Almost all developers work on the basis that more data is better, especially in the training phase. This results in massive collections, at least petabytes, of data that the organisation is forced to manage," says Scott Baker, CMO at IBM Storage.

Storage systems can become a bottleneck. The latest advanced analytics applications make heavy use of CPUs and especially GPU clusters, connected via technology such as Nvidia InfiniBand. Developers are even looking at connecting storage directly to GPUs.

"In AI and ML workloads, the learning phase typically employs powerful GPUs that are expensive and in high demand," says Brad King, co-founder and field CTO at supplier Scality. "They can chew through massive volumes of data and can often wait idly for more data due to storage limitations."

Data volumes are generally large. Large is a relative term, of course, but in general, for extracting usable insights from data, the more pertinent data available, the better the insights.

The challenge is to provide high-performance storage at scale and within budget. As OCF's Vijay points out, designers might want all storage on high-performance tier 0 flash, but this is rarely, if ever, practical. And because of the way AI and ML work, especially in the training phases, it might not be needed.

Instead, organisations are deploying tiered storage, moving data up and down through the tiers, all the way from flash to the cloud and even tape. "You're looking for the right data, in the right place, at the right cost," says Vijay.
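As a rough illustration of the "right data, in the right place, at the right cost" idea, the sketch below applies a simple recency-based placement policy across three hypothetical tiers (flash, object, archive). The tier names, thresholds and dataset records are invented for illustration; a real data-management layer would weigh many more signals.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical tiers, ordered from fastest/most expensive to slowest/cheapest.
TIERS = [
    ("flash",   timedelta(days=7)),     # hot: touched within the last week
    ("object",  timedelta(days=90)),    # warm: touched within the last quarter
    ("archive", None),                  # cold: everything else (tape or cloud archive)
]

@dataclass
class Dataset:
    name: str
    last_accessed: datetime
    current_tier: str

def target_tier(ds: Dataset, now: datetime) -> str:
    """Pick the cheapest tier whose recency rule the dataset still satisfies."""
    age = now - ds.last_accessed
    for tier, max_age in TIERS:
        if max_age is None or age <= max_age:
            return tier
    return TIERS[-1][0]

def plan_moves(datasets, now=None):
    """Return (dataset, from_tier, to_tier) for every dataset that should move."""
    now = now or datetime.utcnow()
    return [
        (ds.name, ds.current_tier, target_tier(ds, now))
        for ds in datasets
        if target_tier(ds, now) != ds.current_tier
    ]

if __name__ == "__main__":
    now = datetime.utcnow()
    training_sets = [
        Dataset("images-current", now - timedelta(days=2), "object"),
        Dataset("logs-2021",      now - timedelta(days=400), "flash"),
    ]
    for move in plan_moves(training_sets, now):
        print("move %s from %s to %s" % move)
```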

Firms also need to think about data retention. Data scientists cannot predict which information is needed for future models, and analytics improve with access to historical data. Cost-effective, long-term data archiving remains important.

There is no single option that meets all the storage needs for AI, ML and analytics. The conventional idea that analytics is a high-throughput, high-I/O workload best suited to block storage has to be balanced against data volumes, data types, the speed of decision-making and, of course, budgets. An AI training environment makes different demands to a web-based recommendation engine working in real time.

"Block storage has traditionally been well suited for high-throughput and high-I/O workloads, where low latency is important," says Tom Christensen, global technology adviser at Hitachi Vantara. "However, with the advent of modern data analytics workloads, including AI, ML and even data lakes, traditional block-based platforms have been found lacking in the ability to meet the scale-out demand that the computational side of these platforms creates. As such, a file and object-based approach must be adopted to support these modern workloads."

Block-based systems retain the edge in raw performance, and support data centralisation and advanced features. According to IBM's Scott Baker, block storage arrays support application programming interfaces (APIs) that AI and ML developers can use to improve repeated operations or even offload storage-specific processing to the array. It would be wrong to rule out block storage completely, especially where the need is for high IOPS and low latency.

Against this, there is the need to build specific storage area networks for block storage (usually Fibre Channel) and the overheads that come with block storage relying on an off-array (host-based) file system. As Baker points out, this becomes even more difficult if an AI system uses more than one OS.

As a result, system architects favour file or object-based storage for AI and ML. Object storage is built with large, petabyte capacity in mind, and is built to scale. It is also designed to support applications such as the internet of things (IoT).

Erasure coding provides data protection, and the advanced metadata support in object systems can benefit AI and ML applications.
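To unpack the erasure coding point: data is split into k data fragments plus m parity fragments, and any k of the k+m fragments are enough to rebuild the original, so losing a drive or node does not mean losing data. The toy sketch below uses a single XOR parity shard (a "k+1" scheme tolerating one loss) purely to show the principle; production object stores use Reed-Solomon-style codes with multiple parity shards.

```python
def encode(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard (a toy 'k+1' code)."""
    if len(data) % k:
        data += b"\x00" * (k - len(data) % k)            # pad to a multiple of k
    size = len(data) // k
    shards = [bytearray(data[i * size:(i + 1) * size]) for i in range(k)]
    parity = bytearray(size)
    for shard in shards:
        for i, b in enumerate(shard):
            parity[i] ^= b
    return shards + [parity]

def decode(shards, lost_index):
    """Rebuild one missing shard by XOR-ing the survivors, then reassemble the data."""
    k = len(shards) - 1                                   # last shard is parity
    size = len(next(s for s in shards if s is not None))
    rebuilt = bytearray(size)
    for idx, shard in enumerate(shards):
        if idx != lost_index:
            for i, b in enumerate(shard):
                rebuilt[i] ^= b
    shards = list(shards)
    shards[lost_index] = rebuilt
    return b"".join(bytes(s) for s in shards[:k])

if __name__ == "__main__":
    original = b"training-batch-000042"
    stored = encode(original, k=4)
    stored[2] = None                                      # simulate losing one shard
    assert decode(stored, lost_index=2).rstrip(b"\x00") == original
```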

Against this, object storage lags behind block systems for performance, although the gap is closing with newer, high-performance object technologies. And application support varies, with not all AI, ML or analytics tools supporting AWS's S3 interface, the de facto standard for object.

Cloud storage is largely object-based, but offers other advantages for AI and ML projects. Chief among these are flexibility and low up-front costs.

The principal disadvantages of cloud storage are latency, and potential data egress costs. Cloud storage is a good choice for cloud-based AI and ML systems, but it is harder to justify where data needs to be extracted and loaded onto local servers for processing, because this increases cost. But the cloud is economical for long-term data archiving.

Unsurprisingly, suppliers do not recommend a single solution for AI, ML or analytics the number of applications is too broad. Instead, they recommend looking at the business requirements behind the project, as well as looking to the future.

"Understanding what outcomes or business purpose you need should always be your first thought when choosing how to manage and store your data," says Paul Brook, director of data analytics and AI for EMEA at Dell. "Sometimes the same data may be needed on different occasions and for different purposes."

Brook points to convergence between block and file storage in single appliances, and systems that can bridge the gap between file and object storage through a single file system. This will help AI and ML developers by providing more common storage architecture.

HPE, for example, recommends on-premises, cloud and hybrid options for AI, and sees convergence between AI and high-performance computing. NetApp promotes its cloud-connected, all-flash storage system ONTAP for AI.

At Cloudian, CTO Gary Ogasawara expects to see convergence between the high-performance batch processing of the data warehouse and streaming data processing architectures. This will push users toward object solutions.

"Block and file storage have architectural limitations that make scaling beyond a certain point cost-prohibitive," he says. "Object storage provides limitless, highly cost-effective scalability. Object storage's advanced metadata capabilities are another key advantage in supporting AI/ML workloads."
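As a hedged illustration of how object metadata can serve AI/ML pipelines, the sketch below uses the S3 API via boto3 to attach labels and lineage to training objects at write time and read them back later. The bucket, keys and metadata fields are hypothetical; the point is simply that descriptive metadata travels with each object and can be queried without opening the data itself.

```python
import boto3

s3 = boto3.client("s3")                     # credentials and region come from the environment
BUCKET = "ml-training-data"                 # hypothetical bucket name

def put_sample(key: str, payload: bytes, label: str, source: str):
    """Store a training sample and attach user-defined metadata alongside it."""
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=payload,
        Metadata={"label": label, "source": source, "schema-version": "1"},
    )

def labels_for_prefix(prefix: str):
    """Walk a prefix and read back each object's metadata (one HEAD request per object)."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=prefix):
        for obj in page.get("Contents", []):
            head = s3.head_object(Bucket=BUCKET, Key=obj["Key"])
            yield obj["Key"], head["Metadata"].get("label")

if __name__ == "__main__":
    put_sample("images/cat_0001.jpg", b"...", label="cat", source="crawler-v2")
    for key, label in labels_for_prefix("images/"):
        print(key, label)
```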

It is also vital to plan for storage at the outset, because without adequate storage, project performance will suffer.

"In order to successfully implement advanced AI and ML workloads, a proper storage strategy is as important as the advanced computation platform you choose," says Hitachi Vantara's Christensen. "Underpowering a complex, distributed and very expensive computation platform will net lower-performing results, diminishing the quality of your outcome and ultimately reducing the time to value."

Continued here:
Storage requirements for AI, ML and analytics in 2022 - ComputerWeekly.com

Carbon reduction in cloud – S&P Global

Introduction

We are entering an era in which cloud computing is no longer a separate IT category; it is IT. The use of the cloud in all its forms is growing exponentially, and investment in cloud infrastructure by big hyperscale cloud providers (data centers, servers, storage and data transit) to meet demand is running at over $70 billion per year. As a result, there is a lot of interest in, on the one hand, just how much energy hyperscalers consume, and, on the other hand, the carbon reduction potential of the cloud. So, what is going on, and what is the data based on?

Energy conservation (saving) relies on better efficiencies, so the most effective strategies will focus on squeezing out as much of these efficiencies as possible, getting the energy requirement as low as possible and then looking at how to decarbonize the remainder. Simply moving to green energy is not going to deliver the desired or the best carbon reduction. The best place to go from a carbon reduction perspective is the cloud. This is the case despite a significant amount of overprovisioning in some regions and underprovisioning in others, because the hyperscalers do not have sufficient visibility into demand and so must build at least 30% extra capacity into their data center servers and storage to avoid capacity constraints. Not all workloads can go to the cloud, so for those organizations the next best thing may be a multi-tenant facility, whose power usage effectiveness, or PUE, score will be lower than could be achieved on premises. Multi-tenant providers will have better access to green energy. Only after that does an enterprise data center make sense, perhaps for proximity or multicity requirements.

Isn't the carbon reduction potential of the cloud an oxymoron?

Using the results from its 2020-21 surveys, 451 Research's data center team built a data model that looks at how much energy is used by typical IT equipment and how much energy is used by typical data centers. With this, it compared enterprise setups with cloud setups to see if they are similar or different in their energy use. The model determined that cloud servers are much more efficient in general than enterprise servers, partly because they are newer and there is more new equipment in the server farm of a typical cloud supplier. But that is only part of the story: cloud servers are also much more highly utilized, which makes them more efficient. The team found the savings amounted to as much as a 60% efficiency improvement. The implication is that by just switching servers (moving to the cloud), enterprises can save up to 60% of their energy usage.

Next, our team looked at the data center building itself and found that typical cloud facilities are also much more efficient. Although they are seen as consuming a lot of electricity, they are typically much more efficient than enterprise data centers that are not fully utilized, especially those with older equipment. Ignoring the change in hardware mentioned above, simply moving to the cloud provider's data center delivers an additional 10% to 15% boost in energy efficiency.

Overall, 451 Research data finds that if they move their IT to the cloud, enterprises can save up to 85% of energy usage, resulting in a smaller carbon footprint. In some cases, even more of a reduction can be seen if the cloud facilities deploy completely renewable carbon-free energy. Here, organizations can improve just by moving to the cloud.

By contrast, when an organization houses its own IT stack, it must account for all of its energy use and, therefore, waste. By moving to the cloud, it is further abstracting its applications from the devices themselves (from an accounting perspective, anyway), and therefore the organization is only responsible for the emissions associated with the energy it consumes. It is like pay per use, but in terms of emissions. This really only works in the cloud, as in a leased data center it is a bit harder for an organization to absolve itself of the responsibility of added waste. This can be debated, but it is far less cut and dried with the cloud.

Thus, there are essentially three levers that determine the level of saving: Equipment is more efficient, mostly because of enhanced efficiency of processors, over and above the now diminishing impact of Moore's Law; equipment is better utilized, which is one of the biggest determinants of overall operating cost, according to 451 Research's Cloud Price Index; and access to a green energy grid brings additional benefits. This presents a strong underlying message in terms of cloud migration.

Regional differences

The IT savings found are fairly consistent geographically, at 60% to 68% depending on the country. The U.S., for example, is at the higher end. The typical U.S. enterprise virtualizes servers much more and tends to have greater utilization of servers compared with Asia-Pacific and EU enterprises.

On the data center side, the U.S. has the most efficient data centers among the enterprises surveyed, followed by Europe and the Asia-Pacific; that is partly due to climatic differences, as a hotter climate means greater energy use for cooling. The key differences, however, come back to the access to essentially carbon-free power, and here Europe has far greater access to green power than the U.S. or the Asia-Pacific.

What about PUE?

Energy transformations taking place in most organizations are focused on PUE. The industry and enterprises have focused so much on that data center efficiency metric partly because it is relatively easy to at least envision, and there are actions that can be taken that make immediate and obvious enhancements. The servers, however, are what is important, though it is much harder for enterprises to virtualize their servers and ensure each server is being deployed at a high level of efficiency. This is where the cloud suppliers are the answer.
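For readers unfamiliar with the metric, PUE is simply total facility energy divided by the energy delivered to the IT equipment, so 1.0 is the theoretical floor. The snippet below works through the arithmetic with made-up figures and notes why PUE alone misses the server-utilization point raised here.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total energy entering the facility / energy used by IT gear."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative (not measured) figures: an enterprise server room vs. a hyperscale hall.
enterprise = pue(total_facility_kwh=1_800_000, it_equipment_kwh=1_000_000)   # 1.80
hyperscale = pue(total_facility_kwh=1_120_000, it_equipment_kwh=1_000_000)   # 1.12

print(f"enterprise PUE: {enterprise:.2f}, hyperscale PUE: {hyperscale:.2f}")
# Note that PUE says nothing about how busy the servers are: a facility full of
# mostly idle machines can score a "good" PUE while still wasting most of its
# IT energy, which is exactly the utilization argument made in the text above.
```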

It is also an organizational issue. Facility managers, whose responsibility is the building itself, will fixate on PUE. To them, it is the network administrators who are dealing with the servers, and so, from their perspective, the whole utilization story is not their job. The opposite is also the case, where IT sees access to green energy as the facility's problem. Looking on the data center floor and then outside the data center walls has always been somebody else's responsibility. The same has been the case for multi-tenant data center providers, but they are realizing that incentivizing customers to improve utilization rates helps drive down the PUE of the facility and increases overall efficiency.

Other factors

Additional factors now include consumer demand for sustainable and carbon-free IT services as part of their supply chains. There are now software packages, such as Ledger8760, that can show customers the greenness of the energy they consume on an hour-by-hour basis, something that was not available 12 months ago. In addition, cloud providers are starting to expose their end users to the relative greenness of each workload.

Enterprises will increasingly face investor and regulatory requirements for sustainability. The largest cloud suppliers and even some of the biggest enterprises are also coming under pressure from Greenpeace and other organizations.

This article was published by S&P Global Market Intelligence and not by S&P Global Ratings, which is a separately managed division of S&P Global.

451 Research is part of S&P Global Market Intelligence.

View post:
Carbon reduction in cloud - S&P Global

Cloud Types: Everything You Need to Know in 2022 – Nerdbot

The Rise of Cloud Computing and Cloud Storage

Cloud computing is like a delivery service on the internet. When you access files, databases, networks, or other resources online, the cloud delivers them to you. For example, when you search for an image in Google, it will return a number of photos in just a few seconds. This is because the photos delivered to you are stored in a cloud.

Another example is when you watch something on Netflix. Reportedly, there are more than 221 million Netflix subscribers as of the last quarter of 2021. However, these subscribers do not have to download an entire movie to watch it. What Netflix does is load the movie directly from the cloud and play it on your device. Super convenient, right?

Cloud computing acts as a fast and reliable library that delivers whatever resources you want from the internet. The best part is, you will get the results in an instant and there is no need for downloads. Gone are the days when our devices' storage was a big issue. We can now enjoy resources without having to sacrifice our devices' storage and speed!

On a larger scale, cloud computing can be a collection of data centers and networks. It can also be a file backup resource for huge companies. As we all know, the cloud is disaster-proof. With the rise of social media, cloud computing grew with it as well. Especially its cloud storage model.

You can notice how we post photos and videos online, but did you ever wonder how they stayed there? Where does Facebook or Instagram store our files after uploading? Where do they store our credentials and personal information after we save them on our accounts? The answer? It's all stored in the cloud! These large companies might even have entire buildings solely for cloud storage! Amazing, isn't it?

Cloud storage helped us access any digital data uploaded for public use. It acts as our virtual hard drive without the risk of device malfunction. However versatile it may be, clouds are not all created equal. There are different types of cloud storage for users' different needs.

As the name suggests, a private cloud is owned by a private entity or company. These types of clouds act like internal storage and can only be accessed from the inside. It is also maintained and protected by the entity and its own firewalls.

Private clouds are the best option for companies that want full control of the data they are handling. However, the cost of maintenance can be hard to manage when using a private cloud for storage.

Public clouds can be accessed by anyone with permission. It only allows a few administrative controls and can be accessed by just using the internet. Typically, a public cloud is hosted by a different service provider that offers cloud servers. Hence, it does not need any maintenance from the user.

Google Drive is one popular example of a public cloud. You can share a file with anyone as long as they have your permission to do so. Public clouds are affordable, convenient, secure, and most importantly, reliable. They can also be upgraded to larger cloud storage if ever you run out of space.

Hybrid Clouds allow the users to have a mix of both a private and public cloud in one. A hybrid cloud provides you with the security of a private cloud but with the customization of public clouds. This gives the user more control over their data.

A community cloud storage is a kind of private cloud. However, it only provides cloud storage for specific types of businesses or communities. This type of cloud storage follows the specific rules, guidelines, and requirements of the said community.

Proxy servers act as a link or middleman between the user and the internet. They allow users to browse the internet more freely and securely. Proxies mask your IP address, which minimizes your traces and hides your exact location. Just like clouds, proxies have different types too. For starters, there are free proxies for web scraping, data gathering, or web crawling. But what difference can they make for cloud computing?

The internet alone is not the safest place to store valuable data. It is prone to breaches and hackers. Thankfully, cloud storage providers encrypt their clouds with multiple firewalls and safety nets to avoid chaos. However, as developers add more security to clouds, hackers also put in extra effort to break the firewalls down. It would surely not hurt to have extra protection, right?

Proxies are also helpful in gathering data online and web scraping. As the proxy servers mask your identity online, they also hide you from hackers, phishers, and breachers. This makes your cloud credentials, passwords, and important data safe.
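As a minimal sketch of how traffic is routed through a proxy, the snippet below uses the widely available Python requests library. The proxy address, credentials and target URL are placeholders; the same pattern applies whether the client is a web scraper or a tool talking to cloud storage, and the remote server only ever sees the proxy's IP address.

```python
import requests

# Placeholder proxy address; a real deployment would use its provider's host, port and credentials.
PROXIES = {
    "http": "http://user:password@proxy.example.com:8080",
    "https": "http://user:password@proxy.example.com:8080",
}

def fetch_via_proxy(url: str) -> str:
    """Fetch a page with all traffic routed through the proxy, hiding the caller's own IP."""
    response = requests.get(url, proxies=PROXIES, timeout=30)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # httpbin echoes the caller's apparent origin IP, which should now be the proxy's address.
    print(fetch_via_proxy("https://httpbin.org/ip"))
```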

Proxy servers can enhance the experience of the user. Physical devices have their limitations. They are prone to hardware malfunction and physical damage. By using clouds, this will not be the case. As the clouds store data on the internet, all the processes will also be online. It can go to many different routes and perform a lot of different processes just for a user to get the information that they want.

However, by using proxy servers, the traffic does not need to go through as many routes. This makes it easier for the user to get to the information and reduces latency. Proxies also provide a smoother flow of internet traffic between servers.

Because of cloud service providers, you can access your data and files online. This is also the case for proxy servers. There are proxy server providers which you can use to mask your identity as you access your data.

This means that compatibility will not be an issue, saving you quite a lot. On the other hand, hardware and maintenance will not cost you a dime. These service providers have it already covered for you.

There are a lot of reasons why you should use cloud storage. It is easy to maintain, affordable, secure, and accessible to all. Cloud storage has different types that cater to different needs.

Of course, that goes with proxy servers too. Using a proxy along with it is a great move for additional cybersecurity. But there are different types of it too. Make sure to choose the one that best suits you!

Go here to see the original:
Cloud Types: Everything You Need to Know in 2022 - Nerdbot

4 Best Web Hosting Services & Providers Of 2022 – Blog – The Island Now

When looking for the best web hosting provider for your online presence or business, there are thousands of options to explore on the internet. You can choose from VPS hosting to shared hosting and cloud hosting. The range is limitless. The vast number of web hosting providers in the web-hosting space makes it easy for anyone to create and manage an online presence. However, not all these platforms are created equally.

The hosting platform you choose is critical to the success of your website. The last thing you want is your business website to be down when you need it the most. Reputable web hosting platforms ensure your website is always online, helping you avoid unnecessary downtime. Not having to worry about your website's technical aspects is a blessing, and that is what every potential website owner should aim at.

However, with the vast number of web hosting platforms, it can be challenging to separate the wheat from the chaff, especially for newbies. There are many good web hosting providers; however, the top option may sometimes be relative. The pick that best suits your business boils down to the key features you need. Nevertheless, there are general features to keep your eyes on when browsing for the best web hosting.

This article will review four of the best web hosting platforms on the market today. We used several criteria to evaluate our options before presenting them to you. This guide includes providers you cannot go wrong with when looking for the best web hosting services to build or host a website. Also, they are excellent for managing any type of website. Read on to see our expert recommendations.

The Best Web Hosting Services For Small & Large Businesses:
1. Bluehost - Overall Best Web Hosting Provider
2. GreenGeeks - Most Popular Eco-Friendly Web Hosting
3. Nexcess - Hosting Platform For Growth and Scalability
4. Hostinger - Affordable Web Hosting With Outstanding Uptime

#1. Bluehost - Overall Best Web Hosting Provider

Bluehost is undoubtedly one of the best web hosting platforms on the market. If you are looking for a web hosting service that can help you elevate your online business, count on Bluehost. At the writing of this article, Bluehost has more than two million websites across the globe.

Bluehost is very easy to use. It has a straightforward platform suitable for both beginners and professionals. You do not need to be tech-savvy to host your website with this platform. They have listed and organized all their services, making it easy to choose what you want.

When it comes to features, Bluehost has loads of them. From managed hosting and WooCommerce to WordPress hosting, this platform has it all regarding website hosting. They also stand out in pricing, offering top-quality services for affordable rates. They have a plan for individuals, businesses, organizations, and more.

In addition to helping you host your website, this platform has a website builder. Hence, you can build and host your website on Bluehost. They stand out in the industry in many ways. You cannot go wrong when choosing them for your website services.

Highlights

Essential Features: Bluehost offers a wide range of features, starting with its cheapest plan. The most affordable plan comes with several great features, including a website builder and a domain name. Plus, it includes a one-click install for Joomla, WordPress, and Drupal. The basic plan offers 50 GB storage and unmetered bandwidth, excellent for starters. All their tiers come with free SSL and email accounts. Furthermore, this company offers traditional shared hosting. Also, it provides managed, VPS, and dedicated WordPress hosting.

High-End Performance: Bluehost boasts impressive performance. Since its emergence in 2017, it has been delivering remarkable site performance. The average uptime of this hosting platform is 99.97 percent, the highest in the industry. The average loading time of Bluehost is 324 ms, excellent even for high-traffic websites.

Friendly Pricing: Bluehost also stands out from most of its rivals due to its pocket-friendly prices. It offers many affordable plans to accommodate online businesses of all sizes. All the plans of this company come with a 30-day money-back guarantee.

24/7 Customer Support: In addition to having a simple and easy-to-navigate website, Bluehost provides excellent, 24/7 customer support. You can also reach the support team via different channels like email, phone, ticketing system, and live chat. The support team has a vast knowledge of the entire system. Thus, it is willing to help you whenever you consult them.

Advanced Security: We also love this platform because of the high-end security it provides. It offers top-notch security for all types of websites, including WordPress sites. All their plans come with a free SSL certificate, an important security feature for all websites.

Dozens of Templates: Bluehost has thousands of templates for creating your own WordPress website on the platform.

Bluehost's Great Deals:

Basic Plan: Bluehost's Basic Plan costs $2.95 per month when you pay annually. This plan is for only one website. It comes with 50 GB SSD storage, 24/7 customer support, and custom themes. Plus, you get AI-driven templates, drag-and-drop functionality, WordPress integration, and free CDN and SSL certificates.

Plus Plan: The Plus Plan costs $5.45 per month when you pay yearly. This plan comes with all Basic plan features, including unlimited websites, free Office 365 (30 days), and unlimited SSD storage.

Choice Plus Plan: The Choice Plus plan is $5.45 per month for 12 months, but it auto-renews at $18.99. The additional features of this plan include free automated backup for one year and a free domain for one year.

Pro Plan: This plan costs $13.95 per month when paid annually, with additional features like optimized CPU resources and free dedicated IP.

Pros

Read more here:
4 Best Web Hosting Services & Providers Of 2022 - Blog - The Island Now

Crypto-hackers have to play ‘capture-the-flag in the cloud’ to exploit victims’ servers – PC Gamer

Illegal cryptocurrency mining outfits that hack servers for profit are having to fight each other for limited resources within the hijacked cloud space. So, on top of getting ahead of the hacked system's security, there's a silent battle ensuing behind the scenes between potential profiteers.

And while it may sound like great fun to watch cryptominers pathetically scuffling over server scraps, this is a fierce contest, one that encourages a certain level of innovation from the involved parties. Their in-fighting only makes them stronger, faster, more agile.

The use of malware to turn profit in the cryptocurrency space has been on the rise in recent years, with security reports in 2018 seeing a 4,000% rise, and it's only been getting more prevalent over the years. After all, why use your own resources when you can hack into someone else's?

As Trend Micro reports, more and more of these illicit cryptocurrency mining outfits are turning to cloud-based servers to maximise profit on wider, more powerful hardware arrays, but it's not always as simple as shouting "I'm in," and watching the zeros roll in.

Trend Micro's recent research paper (PDF warning) goes into more detail, but the crux (outlined in a blog post) is this: "The battle to take and retain control over a victim's servers is a major driving force for the evolution of these groups' tools and techniques, prompting them to constantly improve their ability to remove competitors from compromised systems and, at the same time, resist their own removal."

The competing groups will utilise kill scripts to knock out rivals, 'obfuscate' code to make it harder to understand, and increase persistence mechanisms such as continual password updates to keep the competition at bay. All the while, batting off backlash from the hacked system's security protocols.

It seems illegal cryptocurrency miners have forgotten the fifth rule of fight club: One fight at a time, fellas.

With the competition being so hot, groups are continually churning out "new exploits that enable them to attack systems that their competitors cannot and, at the same time, they constantly improve both their ability to resist being deleted by competitors."

The report cites a rivalry between Kinsing and 8220, two groups that target WebLogic vulnerabilities and are constantly found pushing back against one another within the infected system, "sometimes even several times a day."

Trend Micro is calling it "a sort of capture-the-flag in the cloud."

This kind of hacking commotion is only going to become more rampant as we move into a more cloud-based future. And this almost parodic dance illegal cryptocurrency miners have found themselves in (having to act as both attacker and defender) will only serve to improve their tactics.

Link:
Crypto-hackers have to play 'capture-the-flag in the cloud' to exploit victims' servers - PC Gamer

Safeguarding Cloud-Based Data & Mitigating the Cyber Risks Associated with a Remote Workforce – JD Supra

[author: Stephen O'Maley]

INTRODUCTION

Efficiency, scalability, speed, increased cost savings, and advanced security for highly sensitive data remain in high demand by users of eDiscovery services. To meet that demand, cloud technology promised several of those benefits.

However, the advanced security of the data depends on how an eDiscovery service provider implements, maintains, and manages sensitive client information.

This issue has become more significant as the majority of the workforce is dispersed and often working from unsecured home environments, which has driven an increased usage of cloud services. That greater cloud usage has opened the door to riskier data storage scenarios that might not be fully apparent to users of eDiscovery services. Furthermore, the firms providing these services may not be knowledgeable about all of the risks inherent to their activities and processes.

Because the industry has moved toward commoditization over customization, the workforce within some eDiscovery providers consists largely of junior staff who should follow strict protocols and procedures while in the office. While these activities may have been proven and vetted in the office environment to meet minimum security standards, the majority of employees are not likely to be mindful of the security risks inherent to working at home.

This paper examines the inherent risks surrounding the protection of client electronic data on cloud-based platforms that have arisen with the proliferation of the at-home work setting. It also explains why it's important for users of eDiscovery services to scrutinize the technical capabilities, practices, and experience of the professionals that will be handling their data to ensure proper precautions are in place.

THE CLOUD: A SOLUTION THAT INTRODUCES ADDITIONAL RISKS

Many eDiscovery providers have recently migrated hosted client data from private data centers to public or private cloud environments. As hosted data volumes increased, so did the complexities involved in scaling the physical resources required to maintain private hosting environments in a way that met the speed, efficiency, redundancy, and security requirements of clients. Consequently, eDiscovery providers began reexamining the risks and costs associated with their hosted portfolios and many of them turned to the cloud as a solution. But this also introduced other issues as well that may not have been fully reconciled to date and may have been exacerbated by the pandemic.

Security

It is not uncommon for an organization's most sensitive data to be found on eDiscovery platforms. That data often includes privileged communications, business strategy decisions, trade secret information, potentially embarrassing personal communications, and other confidential communications from its employees, leadership, and legal counsel. Cloud hosting services that are run by eDiscovery providers have a range of security capabilities that are often unexamined by the eDiscovery user.

Due to the increasing sophistication of state and non-state cyber hackers, there is continued and mounting risk of infiltration by hostile actors. This was illustrated in the 2020 SolarWinds attack on the U.S. government. In that scenario, a trusted technology service firm tasked with maintaining the computing environment within several of the world's most secure data centers provided the doorway for hackers to access the country's most sensitive data.

Then there are the inherent risks with at-home working environments that have increased due to the COVID-19 pandemic. With the advancement and continued adoption of IoT (Internet of Things) devices and the expansion of high-bandwidth Internet services for residential consumers, there exist multiple pathways for trusted home-based Wi-Fi connected services in the form of smart devices (smart speakers, thermostats, alarm systems, TVs, etc.) to become compromised in an environment that isn't usually monitored for malicious network activity. This is compounded when employees of eDiscovery providers lack experience or knowledge around network security risks.

Reliability

Cloud services offer the promise of unparalleled reliability with limited downtime for the document review operations of eDiscovery users. Although there may be regularly scheduled maintenance windows, emergency outages do happen occasionally. Consider Google's outage in December of 2020. Disaster-related outages to users of eDiscovery services hosted in the cloud can have severe impacts on a client's ability to meet court-mandated and other production timelines.

Data protection and privacy concerns

Cloud hosting solutions can and often do provide data storage local to regional jurisdictions that require personally identifiable information (PII) redaction and identification before transferring that information to another country (such as the United States). This offers the promise of eDiscovery providers having locally available data storage in the region requiring the privacy regulations.

However, given the multitude of regions throughout the globe with data privacy regulations, a user of eDiscovery services should not assume that their data is being hosted in accordance with local regulations. In general, users of eDiscovery services should confirm with their providers where the physical servers that will be housing the protected data are located.
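If the hosted data happens to sit in S3-compatible object storage, one practical way to act on this advice is a simple region check such as the hedged boto3 sketch below. The bucket names and allowed regions are hypothetical, and the other major clouds expose equivalent location APIs.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical buckets holding hosted review data, and the regions the matter permits.
BUCKETS = ["acme-matter-123-review", "acme-matter-123-productions"]
ALLOWED_REGIONS = {"eu-west-1", "eu-central-1"}          # e.g. data must stay in the EU

def bucket_region(bucket: str) -> str:
    # get_bucket_location returns None for the legacy us-east-1 region.
    loc = s3.get_bucket_location(Bucket=bucket)["LocationConstraint"]
    return loc or "us-east-1"

for bucket in BUCKETS:
    region = bucket_region(bucket)
    status = "OK" if region in ALLOWED_REGIONS else "OUT OF REGION"
    print(f"{bucket}: {region} [{status}]")
```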

Additionally, with the majority of the staff of eDiscovery providers working from home due to the pandemic, it may be important to ask how global data privacy regulations are being addressed in that setting.

Global context

Cybercrime is projected to have cost the global economy nearly $1 trillion in 2020. Furthermore, hacking and infiltrations into government and business entities are increasingly viewed as the best way for adverse nations and other bad actors to have the greatest impact on their targets. This is all intensified by the global pandemic, when at-home working environments and increased use of social engineering in generally insecure environments present added risks to the security of data under management.

HOW TO ENSURE YOUR DATA IS SECURE

What are some of the ways that users of cloud-based eDiscovery services can verify that their data is being safeguarded?

Cloud security

One important step to take is to ask if the cloud-based eDiscovery solution has been certified to various security standards. While this isn't a guarantee that your data is not exposed, it does provide some level of comfort that security protocols are tested on a regular basis by an impartial third party. Some certifications that are relevant here include SOC 2 Type 2, ISO 27001, ISO 27017, and ISO 27018, as well as certifications that indicate the hosting provider is mindful of data privacy regulations and HIPAA requirements.

It's important to differentiate certifications that are attributed to the cloud operator as opposed to the data hosting service provider. For example, AWS, Google, and Microsoft Azure have a number of sophisticated data security certifications associated with their upstream operation of the cloud environment.

However, it's important to note that an eDiscovery platform running within that cloud environment employs its own security protocols to allow reviewers to access documents, and as a result does not inherit all of the security controls that exist on the base-layer cloud offering. Make sure you know what security protocols and certifications your application of choice can directly lay claim to.

Work from home security considerations

This presents additional considerations. Many eDiscovery providers will point to employee handbooks and corporate policy documents as an initial answer, but in this unprecedented time, it is unlikely that those guidelines anticipated a scenario where the majority of the workforce was working from disparate outside and nonsecure locations.

Depending upon the technical environment available at the eDiscovery provider, measures can be taken to come close to the network restrictions in place in the office. No solution will be 100 percent risk-free, but there are best practices that can be implemented to mitigate major risks. For example, the provider can take a centralized security approach through the use of a VPN (virtual private network) connection to the office environment that restricts access to non-essential networks and prevents employees from using non-work-issued computers.

It's also crucial to be aware of the different levels of security restrictions appropriate for employees focused on different aspects of the eDiscovery process. For instance, someone performing document review likely requires less access to sensitive client data than the project manager in charge of organizing the review. It's necessary to understand what at-home procedures your provider is using and how that affects the safety and exposure of your data.

CONCLUSION

Notwithstanding the issues that have arisen, cloud-based eDiscovery solutions provide users numerous advantages in tackling the unprecedented challenges being faced in the post-COVID world. At the same time, it's equally important for users to know and understand what protections providers are enacting to safeguard their data. Cloud storage solutions address issues faced by aging technical infrastructure, can greatly bolster cybersecurity, and provide eDiscovery providers the flexibility to operate in a global setting. The added risks posed by work-from-home environments due to the pandemic mean that buyers of these services should closely monitor the whereabouts, protection, and technical environments employed by the firms working with their sensitive data.

Go here to read the rest:
Safeguarding Cloud-Based Data & Mitigating the Cyber Risks Associated with a Remote Workforce - JD Supra