Category Archives: Cloud Storage
Frustrated with your company's cloud obsession? What goes around, comes around – ComputerWeekly.com
Many of you reading this will be familiar with the experience of having to bite your lip as you watch your organisation pursue a cloud-first agenda in an almost religious manner. You know that the obsession with moving everything into the public cloud as quickly as possible is blinkered and misguided, but you go along with it because everyone seems so committed and you don't want to look like a naysayer.
This kind of experience came up during a recent briefing with Jeff Denworth, CMO and co-founder of VAST Data. If you're not familiar with VAST, it's a company focused on extremely high-volume storage management. Its solutions were originally designed to address the needs of customers who want cloud-like scalability, flexibility and ease of use, but delivered via an on-prem infrastructure. As Denworth says: "Our customers are generally dealing with upwards of 5PB of data, and you need to think differently when working at this level."
We'll get into that "need to think differently" theme in future discussions, but if you're aching to learn more right now, check out the VAST website, where you can geek out on the company's disaggregated and shared-everything architecture.
Back to the current discussion: we asked Denworth to describe his ideal customer, to which he responded: "Apart from having a need to store and manage data at scale, it's mostly about timing. Our proposition resonates particularly well with customers that have been aggressively pursuing a cloud migration strategy, have gained enough experience to figure out that it's not the Nirvana they thought it would be, but haven't yet got around to letting their storage specialists go."
As an aside, we chatted about cloud evangelists who seem to build their careers by going from company to company, encouraging each to shift everything to the cloud while downsizing internal IT, then moving on to their next job just before all of the problems with an obsessive cloud approach become obvious.
How much this happens in exactly this manner is debatable, but we've been tracking the way challenges accumulate as cloud services proliferate for over a decade. Put this together with activity becoming even more distributed, and many IT teams are seeing cost, risk and other issues escalate even further.
Does this mean that public cloud services are inherently bad news? Of course not; it's just that cloud adoption should be regarded as a potential means to an end, rather than an end in its own right. Our advice is always to focus on service delivery objectives, and when you do this you generally end up with some kind of hybrid/multi-cloud approach – a topic we'll be publishing some new research on soon (watch this space).
In the meantime, there's now enough experience out there to provide the insights necessary to challenge the cloud crusaders and inject a little more rationality into the discussion. And players like VAST and others are consistently demonstrating that it's now possible to build on-premises systems that can operate reliably, securely and cost-effectively at extreme scale.
Storage news ticker – May the 4th be with you – Blocks and Files
Seagate celebrates Star Wars Day with the introduction of three collectible disk drives inspired by Boba Fett, Grogu, and The Mandalorian. Seagate's marketeers have gone over the top, saying these Special Edition FireCuda External Hard Drives are available with "three unique aesthetics that represent each legendary character: the cool demeanor of legendary bounty hunter Boba Fett, the joyful look of Grogu, and the honorable and unwavering purpose of the Mandalorian." The drives come equipped with customizable RGB LED lighting, and each design features a default custom character light out of the box: a flashy red for Boba Fett, a glowing blue for Grogu, and a bold blue for the Mandalorian. How can buyers resist?
Parsec Labs, which supplies enterprise data management and protection products, announced the general release of v5.0 of its data management suite, with the ability to deploy Parsec's products as a virtual appliance. Parsec's software supports all data types (block, file and object), and can "easily, reliably, and securely" move data from any source to any target, including to the cloud or between clouds.
Cloud storage provider Backblaze has published its latest drive stats table and presented the numbers in a new way, using a four-box quadrant-style diagram – a Drive Stats Failure Square – measuring drive age and failure rate. It charts which top drive models are Winners, Challengers, Muddlers, or candidates to become Retirees. Here's the base table, which leaves you with an impression of just how reliable most disk drives are:
Standout points include the top-performing drives over the long haul:
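The quadrant idea is straightforward to reproduce for any drive fleet. A minimal sketch in Python, using illustrative age and failure-rate cut-offs (these thresholds, and the exact quadrant assignments, are assumptions, not Backblaze's published values):

```python
def classify_drive(avg_age_years, annualized_failure_rate,
                   age_cutoff=4.0, afr_cutoff=1.5):
    """Place a drive model in one quadrant of a 'failure square'.

    avg_age_years: average age of the model's deployed drives.
    annualized_failure_rate: AFR as a percentage (e.g. 1.5 means 1.5%).
    The cut-offs are illustrative defaults only.
    """
    old = avg_age_years >= age_cutoff
    failing = annualized_failure_rate >= afr_cutoff
    if old and not failing:
        return "Winner"       # proven reliable over the long haul
    if not old and not failing:
        return "Challenger"   # young and, so far, reliable
    if not old and failing:
        return "Muddler"      # young but already failing
    return "Retiree"          # old and failing: candidate for retirement
```

Feeding each model's average age and AFR from a stats table through this function reproduces the four-bucket view.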
Composable computing supplier Liqid is collaborating with Samsung (DDR5-based DRAM CXL module) and Tanzanite Silicon Solutions (memory pooling technology) to demonstrate composable memory via the Compute Express Link (CXL) 2.0 protocol at Dell Technologies World 2022. CXL can decouple DRAM from the CPU. Liqid's Matrix composable disaggregated infrastructure (CDI) software can pool and compose memory in tandem with GPU, NVMe, persistent memory, FPGA, and other accelerator devices. It claims this allows previously static DRAM resources to be shared "for exponentially higher performance, reduced software stack complexity, and lower overall system cost," permitting users to focus on accelerating time to results for target workloads rather than maintaining physical hardware.
The demo config consisted of two Xeon-based Archer City systems (codenamed Sapphire Rapids), along with Tanzanite's SLIC SoC implemented in an Intel Agilex FPGA, demonstrating clustered/tiered memory allocated across two hosts and orchestrated using Liqid Matrix CDI software.
NAKIVO has released v10.6 of its Backup & Replication software, with new functionality for protecting file shares on NAS devices and Windows/Linux file servers. The 10.6 release also expands the backup scope of Microsoft 365 and VMware vSphere environments, and adds a ransomware-resilient deployment option with a hardened AMI. With NAS Backup, customers can back up entire file shares or specific folders within those shares to a variety of backup targets, and recover files and folders to the location of their choice. The feature supports backup, backup copy and recovery of file shares using the NFS and SMB (CIFS) protocols.
Pavilion Data Systems announced the certification of NVMe/TCP for VMware vSphere 7 Update 3, and says its HyperParallel Flash array now has the world's broadest certified support for vSphere 7 Update 3, spanning NVMe/RDMA, NVMe/TCP, NFS, and iSCSI. Administrators can change protocols on the fly through an intuitive GUI. Customers decrease risk, enjoy choice and control, and can easily deploy Pavilion into existing NAS or iSCSI storage networks, replacing legacy systems that lack the performance, capacity, and ability to leverage NVMe-oF.
Kiran Bhageshpur has been hired as Qumulo's CTO. He was the co-founder and CEO of Igneous, which crashed in late 2020, with its unstructured data management-as-a-service (UDMaaS) assets bought by Rubrik. Kiran returns after a two-year career break. Welcome back to the fray.
Samsung has developed a JEDEC-compliant UFS v4.0 technology using seventh-generation V-NAND (176-layer). It provides a speed of up to 23.2Gbit/sec per lane – double that of UFS 3.1. The sequential read speed is 4.2GB/sec and sequential write runs at 2.8GB/sec; these are approximately 2x and 1.6x faster, respectively, than the previous UFS 3.1 product. Samsung says UFS 4.0 is a good match for 5G smartphones. Its UFS 4.0 delivers a sequential read speed of up to 6.0MB/sec per milliampere (mA) – a 46 per cent improvement over UFS 3.1 – which will allow smartphones to run longer on the same battery capacity. Samsung will be mass-producing UFS 4.0, sized at 11mm x 13mm x 1mm, in Q3, with capacities up to 1TB.
Seagate is shipping 20TB-plus drives in high volume, including 22TB drives to select customers suspected to be among the hyperscalers. It will provide updates as these products become more widely available. The company's maximum-capacity publicly available drives are its 20TB Exos, IronWolf Pro and SkyHawk products. We don't know if the 22TB HDDs are conventional or use shingled magnetic recording, but suspect they are conventional.
Veeam has appointed a CRO. John Jester comes from Google Cloud and Microsoft. He will oversee Veeam's three regional sales heads: Paul Strelzik (SVP & GM Americas), Daniel Fried (GM & SVP EMEA) and Shaun McLagan (SVP APC).
Verbatim in Japan has warranted an external SSD for ten years. It's a write-once SSD called the SWOVA128G, with 128GB of capacity and a USB 3.2 Gen 1 interface at 5Gbit/sec. The drive can be used to protect against accidental file deletion and overwriting, and avoids difficult backup/restore processes with a simple file copy. You could keep tax returns and similar long-life data on it – which is a need in Japan, apparently.
WANdisco announced its Edge to Cloud offering, which replicates edge data to a central site, such as the public cloud. Moving this data to the cloud is the only way for companies with thousands of edge centers to analyze sensor data collectively. Edge to Cloud moves multiple petabytes of data per day – over an exabyte of data per year. "Making sense of all your IoT data is the business opportunity of a generation. It's what will turn companies in sectors like manufacturing, telecommunications, energy and utilities, healthcare and automotive into data companies," said David Richards, co-founder and CEO of WANdisco.
GFT leverages Google Cloud to simplify AI deployments at scale in manufacturing, reducing dependencies on data science experts – PR Newswire
GFT drives factory-floor digital transformation with support for new Google Cloud industry-specific solutions
NEW YORK, May 5, 2022 /PRNewswire/ -- Recruiting data scientists is one of the greatest challenges in the current labor market: analysts say that there was a shortage of 250,000 data scientists globally even in 2020. The situation has only worsened since then.
AI solutions have the potential to significantly boost productivity in manufacturing. Usually, skilled data scientists are key for AI projects to be successful. Google Cloud has now introduced new purpose-built solutions that can be used by manufacturing engineers without requiring the help of specialized data scientists or additional integration code in order to scale digital transformation pilots into production. This will facilitate digital transformation efforts in the manufacturing segment. As a Google Cloud Global Partner, GFT has tested and implemented these solutions at Top 10 automotive and manufacturing leaders.
"These solutions will democratize access to data on the manufacturing floor. They will generate more competitiveness for industries, and manufacturing engineers will gain the opportunity to acquire AI-based skill sets," says Marco Santos, President of GFT USA and Latin America. "Our first-hand experience tells us manufacturers can expect a major boost on their manufacturing floor."
"Transforming factory-floor operations with data and analytics is hugely important to manufacturers today," said Charlie Sheridan, Technical Director, Industry Solutions, Manufacturing, Google Cloud. "We're thrilled to work with partners like GFT to extend the reach of our core technology and provide customers with the foundational technologies needed to solve business challenges and scale smart factory implementations."
The new Google Cloud solutions supported by GFT include:
The exposed data can be further applied to a growing set of industry-specific use cases, such as:
With Google's new solutions, it is possible to collect data, normalize it, analyze it, then use it for strategic decisions, providing factory-floor engineers with the tools to be self-sufficient. This is also an opportunity to introduce more engineers to AI-based engineering.
GFT has been collaborating with Google Cloud for many years and won the prestigious 'Google Cloud 2019 EMEA Breakthrough Partner of the Year' award. Today, the company has 340+ experienced engineers and architects, 200+ certifications and 94 current engagements with Google Cloud all over the world.
To learn more about Google Cloud's new manufacturing solutions, please visit this page or join panel discussions at Google Cloud's Manufacturer Spotlight event.
SOURCE GFT Technologies AG
5 Qualities of a Great Cloud Provider for Small Business – The Tech Report
The popularity of cloud providers for small businesses has increased immensely over the last couple of years. However, the lack of a standard methodology for evaluating cloud service providers (CSPs), and the fact that every CSP is unique, can make choosing the right one for your small business quite tricky.
We will see more consumer-friendly cloud providers for small businesses in the coming years. However, there's still a risk you might come across a flop in your search for the perfect cloud provider and enhanced office automation.
With that said, here are some great qualities you should look for. They'll help you spot a great cloud provider for small business owners.
When you outsource to a cloud provider, you will essentially rely on the servers in their data center.
It's crucial to ensure your provider's data center has physical and environmental controls – monitoring and alerting, security traps, redundancies, and staffing – that keep the environment operational and secure.
When it comes to cloud security, you want to know exactly what your security goals are, what measures each provider offers, and the kind of mechanisms they use to keep your data and applications safe.
Security has proven to be a key priority in cloud computing. However, it's crucial to ask specific and extensive questions about your particular use cases, industry, and legal needs, among other issues you might have.
It may all seem a bit overwhelming. However, you must understand this is an essential aspect of operating in the cloud.
Technology is only as good as the people who create it.
Therefore, as you choose a cloud provider, you must ensure that the company behind your cloud is deserving of your trust. Check through their client list to see whether the company is well-known. Ask if theyve worked with organizations similar to yours before.
If there haven't been any significant security incidents, we can say they might be a trustworthy lot in general.
On top of that, you should consider the support the team will offer you. If you need help quickly, will they be able to offer it? Some cloud providers only offer support through a phone call center or chat service. That may or may not be acceptable to you.
Therefore, it's crucial to find a cloud provider who caters to your needs. With that said, before choosing a cloud service, you must inquire about the level and type of support they will provide.
Complete disaster recovery capabilities are critical.
Backups are essential, but they arent enough on their own. In the case of internet, power, or hardware problems, your cloud should have automatic failovers.
Your data should be live-replicated to a backup site. That way, your environment is protected even if your original data center is destroyed.
See how the provider reacts when you ask them the "what if" question. That way, you can tell whether their cloud services are reliable.
The cloud provider should document their malware protection, permissions, patching policies and procedures, incident reporting, and response policy.
In addition to that, the cloud provider should share the documentation with you.
When it comes to technology, complacency and laxity are very dangerous things. Hackers are advancing their methods every day, and so should your cloud provider. Your cloud provider must have a systematic roadmap for moving forward.
The roadmap could involve migrating to a much better data center, implementing two-factor authentication, data loss prevention, or even access-based enumeration.
When choosing a great cloud provider for your small business, you must consider how the architecture of the selected cloud provider can be seamlessly integrated into your business workflow today and in the future.
On top of that, when making your decision, you should consider the available cloud storage designs.
The three major suppliers have similar designs and offer many types of storage to meet diverse needs when it comes to storage architecture. However, you must keep in mind that they all have different forms of archive storage. So it helps to know the variations between each cloud provider.
Each service provides options for regularly storing and accessing data vs. infrequently retrieving data (hot and cool storage). While cool storage is typically less expensive, it comes with a wide range of limitations.
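The hot-versus-cool trade-off comes down to how often you retrieve data: cool tiers charge less at rest but typically add per-GB retrieval fees. A back-of-envelope comparison in Python, using made-up rates rather than any real provider's pricing:

```python
def monthly_cost(gb_stored, gb_retrieved, storage_rate, retrieval_rate):
    """Total monthly cost: at-rest storage plus per-GB retrieval fees."""
    return gb_stored * storage_rate + gb_retrieved * retrieval_rate

# Illustrative rates only -- real CSP pricing varies by provider,
# region, and tier, and often adds minimum-retention and API charges.
HOT = {"storage_rate": 0.020, "retrieval_rate": 0.000}   # $/GB-month, $/GB
COOL = {"storage_rate": 0.004, "retrieval_rate": 0.010}

def cheaper_tier(gb_stored, gb_retrieved):
    """Pick the cheaper tier for a given storage and retrieval profile."""
    hot = monthly_cost(gb_stored, gb_retrieved, **HOT)
    cool = monthly_cost(gb_stored, gb_retrieved, **COOL)
    return "cool" if cool < hot else "hot"
```

With these illustrative numbers, 1TB stored with little retrieval favors the cool tier, while heavy monthly retrieval of the same data flips the answer to hot – which is exactly the "wide range of limitations" trade-off worth pricing out before committing.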
There you have it: the minimum requirements when selecting a cloud provider. A cloud provider who offers anything short of this won't be able to satisfy your needs.
All the same, as you choose your cloud provider, make sure you ask all the tough questions. Trust your gut. Also, keep in mind that a great cloud provider for small business owners should be able to customize its services to fit their business needs.
New infosec products of the week: May 6, 2022 – Help Net Security
Here's a look at the most interesting products from the past week, featuring releases from AuditBoard, BIO-key, Data Theorem, Enpass, Microsoft, N-able, and Uptycs.
Microsoft has announced the stand-alone version of Microsoft Defender for Business, which aims to bring enterprise-grade endpoint security to SMBs, including endpoint detection and response capabilities to protect against ransomware and other sophisticated cyberthreats.
Cove Data Protection provides streamlined, cloud-first backup, disaster recovery, and archiving for physical and virtual servers, workstations, and Microsoft 365 data, managed from a single web-based dashboard. It also delivers fully managed cloud storage, with 30 data centers to keep backups stored in region, to respect data sovereignty.
BIO-key announced upgrades to BIO-key MobileAuth, a multi-factor authentication (MFA) mobile app that integrates the power of IBB across multiple authentication modalities, allowing customers to build a more complete MFA strategy. In addition to the newest IBB-powered modalities, MobileAuth supports device-based biometric authentication methods including Apple Face ID, Touch ID, and Android Biometrics, as well as support for push token authentication.
With Enpass Business, all passwords remain within the trusted boundaries of the organizations local IT systems. Enterprises have the option to store data on employee devices, or use their existing cloud storage, enabling them to maintain control over their data without the need to host additional servers.
Data Theorems new supply chain product can automatically categorize assets under known vendors, allow customers to add additional new vendors, curate individual assets under any vendor, and alert on increases in policy violations and high embed rates of third-party vendors within key applications.
Uptycs announced new cloud infrastructure entitlement management (CIEM) capabilities that strengthen its cloud security posture management (CSPM) offering. These new capabilities provide Security and Governance, Risk, and Compliance teams with continuous monitoring of cloud services, identities, and entitlements so they can reduce their cloud risk.
AuditBoard announced a set of new automation capabilities for its CrossComply solution. These features combine automation with scalability to accelerate security compliance programs with automated framework mapping, evidence collection, and continuous monitoring, as well as providing the ability for teams to leverage applications and data sources.
Newly Announced Speakers to Address 5G, DPUs, Confidential Computing, Sustainability and Other Hot Topics at Open Infrastructure Summit Berlin – PR…
AUSTIN, Texas (PRWEB) May 05, 2022
The Open Infrastructure (OpenInfra) Foundation today announced additional speakers for its upcoming Open Infrastructure Summit Berlin in June as well as the nominees for the 2022 Superuser Award.
Joining the Summit agenda are leaders from BBC R&D, Bloomberg, BMW, Ciena, European Centre for Medium-Range Weather Forecasts, Federal Ministry of Germany, Fungible, Nubificus, Skatteverket, Société Générale and Vexxhost. Additional presenters, announced previously, include users from Adobe, AWS, Canonical, China Mobile, OVHCloud, Red Hat, SovereignCloudStack, Volvo and Workday.
At the Open Infrastructure Summit Berlin, June 7-9, OpenInfra users and supporters will engage in more than 100 sessions around open infrastructure use cases like cloud computing, edge computing, hardware enablement and security. The Summit is focused on helping users compose, integrate and operate these different technologies at scale to solve real problems for the next decade.
*** Register today: https://openinfra.dev/summit/ ***
New speakers, including several Linux OpenStack Kubernetes Infrastructure (LOKI) users, presenting at the Summit include:
*Superuser Award Nominees*
A highlight of the Open Infrastructure Summit Berlin will be the presentation of the Superuser Award. Elected by members of the Open Infrastructure community, the individual or team that wins the Superuser Award is lauded for the unique nature of their use case as well as their integration and application of open infrastructure. Meet the 2022 Superuser Awards nominees, which include:
Canonical and Wind River are the Headline Sponsors for the Open Infrastructure Summit Berlin 2022, and T-Systems, Vexxhost and Mirantis are the Premier Sponsors. Bloomberg, Red Hat, B-1 Systems and OVHCloud are Spotlight Sponsors. Members of the media can contact jennifer@cathey.co for information about how to cover the event.
*About the Open Infrastructure Foundation*
The OpenInfra Foundation builds communities who write open source infrastructure software that runs in production. With the support of over 110,000 individuals in 187 countries, the OpenInfra Foundation hosts open source projects and communities of practice, including infrastructure for AI, container native apps, edge computing and data center clouds. Join the OpenInfra movement: http://www.openinfra.dev
How to prevent Dropbox photo import on Windows 11 – BollyInside
This tutorial covers how to prevent Dropbox photo import on Windows 11. We hope you find this guide useful; if so, please share it after reading.
Hundreds of cloud storage options are currently available for desktop operating systems, but only a few stand out. Cloud storage services like Dropbox, Google Drive, and OneDrive let you save files online, and each offers a free plan for individuals. In this article, we'll take a look at Dropbox, which offers each customer 2GB of free storage. If you're new to Dropbox, read on to learn more about the cloud storage service. If you're a Dropbox user, you probably know that Windows prompts you to import photos and videos to Dropbox every time you insert a memory card or USB stick. While this is a useful feature, many users may wish to disable it. So if you want to block Dropbox photo import on Windows 10 and Windows 11, you have come to the right place.
The feature that triggers Dropbox's prompt when you insert a removable device is called AutoPlay, so to stop Dropbox importing photos we need to disable AutoPlay in Windows 10/11. You can disable AutoPlay for all media and devices entirely: in the AutoPlay settings, toggle the switch next to "Use AutoPlay for all media and devices" to Off.
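For admins who prefer scripting the change, the Settings toggle corresponds to a per-user registry value. A hedged Python sketch – the `DisableAutoplay` DWORD under the `AutoplayHandlers` key is the standard per-user AutoPlay switch, but apply it at your own risk:

```python
import sys

# Per-user registry location of the AutoPlay master switch.
AUTOPLAY_KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\AutoplayHandlers"
AUTOPLAY_VALUE = "DisableAutoplay"  # 1 = AutoPlay off, 0 = AutoPlay on

def disable_autoplay():
    """Turn AutoPlay off for the current user, mirroring the
    'Use AutoPlay for all media and devices' toggle in Settings."""
    if sys.platform != "win32":
        raise OSError("AutoPlay settings only exist on Windows")
    import winreg  # only importable on Windows
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, AUTOPLAY_KEY,
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, AUTOPLAY_VALUE, 0, winreg.REG_DWORD, 1)
```

Writing 0 instead of 1 (or using the Settings toggle) restores the default prompt behaviour.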
We hope this article on preventing Dropbox photo import on Windows 11 has helped. If not, you can ask questions via the contact forum section; if it did, please share it with your family and friends.
Dell’s Rola Dagher: Security ‘At The Heart Of Everything We Do’ – CRN
Dell's Global Channel Chief On Security Push And Channel Partner Vision
Cybersecurity took center stage at Dell Technologies World 2022, a topic that typically isn't the focus of discussion for the worldwide market leader in servers, storage and hyperconverged infrastructure.
However, with a slew of new security launches at Dell's flagship event – from Dell PowerProtect Cyber Recovery for Microsoft Azure to fight ransomware attacks, to Dell Apex Cyber Recovery Services to simplify recovery from cyberattacks – Dell is pushing partners more than ever to sell its growing security portfolio.
"From a channel perspective, the solutions we're putting out there in security are at the heart of everything we do," said Rola Dagher, Dell's worldwide channel chief in charge of driving the company's $59 billion channel business, in an interview with CRN.
"When you're talking about technology – be it multi-cloud, be it whatever you build as a partner and as a customer – security has to be the foundation of everything. It's like building this huge, massive house, and then keeping the door unlocked. So you really need to ensure security is at the heart of the conversation," said Dagher. "Security has to be the foundation of everything that [partners are] building."
Dell's global channel leader said its cybersecurity push was led by channel partners.
"For us, our partners have been key to providing us the feedback in terms of the solutions they want to build around what we do," she said.
Dells global channel partner business climbed 27 percent in fiscal year 2022 with huge revenue growth in client systems and servers. The Round Rock, Texas-based PC and infrastructure giant generated a record-breaking sales year of $101 billion, an increase of 17 percent annually.
[Related: Alienware To McConaughey: Cool Scenes From Dell Tech World]
This week Dell hosted its first in-person Dell Technologies World event since 2019, due to the COVID-19 pandemic.
More than 5,500 customers, channel partners and other IT professionals flocked to Las Vegas for Dell Technologies World 2022 to hear about the company's strategy and future from the likes of CEO Michael Dell, Vice Chairman Jeff Clarke and Co-Chief Operating Officer Chuck Whitten.
Rola Dagher talks to CRN about Dell's security strategy, partner training and enablement, and her message to channel partners.
PeerGFS adds AI/ML anomaly detection to distributed file system product – ComputerWeekly.com
File management software maker Peer Software plans to launch Linux file server compatibility this summer, along with enhanced artificial intelligence/machine learning (AI/ML)-based file access anomaly detection and storage audits.
Peer's PeerGFS file service allows multi-site file access and sharing, with hub-and-spoke and peer-to-peer failover and replication between sites. The product started out nearly 30 years ago, specialising in taking Windows DFS replication (Distributed File System and DFS-R) beyond Windows storage.
It provides distributed storage for customers who want to use Windows file servers alongside third-party NAS storage. In so doing, it claims to be different to providers of similar file management software, such as Nasuni.
"Nasuni uses a proprietary namespace to overlay customer files," said CEO Jimmy Tam. "We still base ourselves on Windows DFS, so there's nothing in the data path to obfuscate things."
Peer also plays in a similar space as the likes of Ctera and Panzura.
PeerGFS enables a distributed file system to be created across mixed storage systems that include Windows, NetApp Data ONTAP, Dell EMC Isilon/VNX/Unity, Nutanix Files, S3, and Azure Blob, with asynchronous near real-time replication across the systems. Linux server support is set to be added this summer.
PeerGFS requires deployment of its Peer management centre, which is the brain that organises input/output (I/O) traffic flows, and can be run on a physical or virtual server.
Below that, there are Peer agents which are tailored to storage platform application programming interfaces (APIs) and which log file events such as access, changes, saves, and so on. The agents forward messaging via the management centre to other PeerGFS instances.
That event flow provides the core of the functionality, with customers able to set file sharing and replication between specified folders, with additional features such as size limits on folders, pinning of files, and so on.
A so-called network of brokers sets policies on file event flows so that, for example, data is replicated between specified folders or kept within set geographies.
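That policy routing can be illustrated with a toy model. Everything below – the event fields, the folder-to-site policy table, the function name – is a hypothetical sketch for illustration, not Peer's actual API:

```python
from dataclasses import dataclass

@dataclass
class FileEvent:
    path: str    # e.g. "/projects/emea/report.docx"
    kind: str    # "create", "change", "delete", ...
    origin: str  # site that produced the event

# Illustrative policy table: which sites may hold data under each folder.
POLICIES = {
    "/projects/emea": {"london", "frankfurt"},               # keep within EU
    "/projects/shared": {"london", "frankfurt", "new-york"},
}

def targets_for(event):
    """Replication targets permitted by policy, excluding the origin site."""
    for folder, sites in POLICIES.items():
        if event.path.startswith(folder):
            return sites - {event.origin}
    return set()  # no matching policy: do not replicate
```

A change made in London to a file under the EU-only folder would thus fan out to Frankfurt but never to New York, which is the geography-keeping behaviour described above.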
Key use cases are global file sharing and collaboration, with local caching possible. Deltas are picked up from file event streams and changes replicated between locations. That provides for rapid failover between sites should a server or storage system go down, with RPOs claimed to be cut to near zero, according to Tam.
There is also automated failback, as Peer's software brings the primary system back up when it is ready, with Tam adding that failover functionality has proved very popular in virtual desktop infrastructure (VDI) deployments.
PeerGFS will also handle file sharing wherever an instance of the Windows file system is running, including in the AWS, Azure and Google clouds, said Tam, although he was keen to point out that many customers don't want to use the cloud for any or all of their data.
"Lots of customers, such as those in the military or finance, don't want to use the cloud for reasons of security," said Tam. "We say that you don't have to be cloud-first – you can be cloud-friendly or cloud-optional."
Beyond file management services, Peer is trying to add value with auditing and behaviour monitoring, including via AI/ML. It already has malicious event detection (MED), which includes the use of bait files that do not generate file events by default – but if they do, such as when ransomware crawls the system, alerts are raised.
Similarly, trap folders use recursive directory structures to slow down ransomware as it crawls the file system. Meanwhile, patterns from known ransomware variants can be matched against event streams in file systems.
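The bait-file idea can be shown in a few lines. This is a conceptual sketch, not Peer's MED code; the decoy file name and function names are invented for illustration:

```python
import hashlib
from pathlib import Path

BAIT_NAME = "~sentinel.docx"  # hypothetical decoy file name

def plant_bait(folder: Path) -> str:
    """Create a bait file that no legitimate user or process should touch;
    return its content hash as the baseline."""
    bait = folder / BAIT_NAME
    bait.write_bytes(b"decoy payload")
    return hashlib.sha256(bait.read_bytes()).hexdigest()

def bait_tripped(folder: Path, baseline: str) -> bool:
    """A missing or rewritten bait file suggests something is crawling and
    encrypting files, and should raise an alert."""
    bait = folder / BAIT_NAME
    if not bait.exists():
        return True
    return hashlib.sha256(bait.read_bytes()).hexdigest() != baseline
```

Because the bait is invisible to normal workflows, any event against it is a high-confidence signal rather than noise to be filtered.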
Planned additions to MED include collecting event stream data per user to develop a whitelist of expected behaviour against which anomalous activity, ranging from bad human actors to machine-generated actions, can be identified.
"We'd generate a pattern of what normal looks like and monitor for deviations," added Tam.
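One simple way to "generate a pattern of what normal looks like and monitor for deviations" is a per-user baseline of event counts with a standard-deviation threshold. This is a generic anomaly-detection sketch, assumed for illustration, and not how Peer's planned feature is necessarily built:

```python
from statistics import mean, pstdev

def build_baseline(history: list) -> tuple:
    """Summarise a user's past per-interval file event counts as
    (mean, standard deviation)."""
    return mean(history), pstdev(history)

def is_anomalous(count: int, baseline: tuple, threshold: float = 3.0) -> bool:
    """Flag an interval whose event count deviates more than `threshold`
    standard deviations from the user's norm."""
    mu, sigma = baseline
    if sigma == 0:
        return count != mu
    return abs(count - mu) / sigma > threshold
```

A user who normally touches around ten files per interval would not trip the check at eleven, but a burst of hundreds of events, typical of ransomware encrypting a tree, would.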
Also coming in the summer are file access audits that will report in detail on user interaction with files, as well as trend alerting and support for the network file system (NFS) protocol to bring Linux compatibility.
Go here to read the rest:
PeerGFS adds AI/ML anomaly detection to distributed file system product - ComputerWeekly.com
5 Must-Have Features of Backup as a Service For Hybrid Environments – CIO
As the value and business criticality of data increases, so do the challenges of backup, recovery, and data management. These challenges are exacerbated by exploding data growth, increasing SLA requirements, and an evolving threat and compliance landscape. Although hybrid cloud environments provide businesses with greater agility, protecting and managing apps and data across the core data center, private cloud, and public cloud can prove increasingly complex and costly.
Many IT organizations that have cobbled together data protection solutions over the years now find themselves saddled with rigid, siloed infrastructure based on an equally rigid backup approach. Organizations that postpone modernization complain about being stuck in maintenance mode with disparate, multi-vendor backup and recovery systems that are complex and expensive to maintain. Multiple touch points of administration slow down production, and the costs of software licensing, disruptive upgrades, and over-provisioning can add up fast. Resources are tight but not protecting business-critical data and apps can jeopardize the health of the business.
This situation challenges IT organizations to find a solution that simply and reliably safeguards their most important and valuable assets. Modern cloud services are designed to do a better job of protecting data and apps in hybrid cloud environments, and to simplify operations and keep costs down.
When it comes to data protection modernization, most businesses realize they cannot afford to wait. According to ESG, 57% of organizations expect to increase spending on data protection in 2022, and 26% identify data backup and recovery as a top-5 area of data center modernization planned for the next 12 to 18 months.
If you are considering the transition to data protection as a service (DPaaS), you're not alone. Recent research indicates the importance of these emerging services in cloud-centric strategies. According to IDC, DPaaS is the fastest-growing segment of the data protection market, with a forecast 19.1% CAGR through 2025.
In large part, that's because cloud services improve efficiencies by reducing cost, risk, and complexity. Protecting data on premises and in a hybrid cloud environment sets you up to deliver on future SLAs, enabling you to meet demanding RPOs and RTOs while keeping your business moving ahead.
New backup-as-a-service offerings have redefined backup and recovery with the simplicity and flexibility of the cloud experience. Cloud-native services can eliminate the complexity of protecting your data and free you from the day-to-day hassle of managing backup infrastructure. This approach lets you meet SLAs in hybrid cloud environments and simplifies your infrastructure, driving significant value for your organization.
Resilient data protection is key to always-on availability for data and applications in today's changing hybrid cloud environments. While every organization has its own set of requirements, I would advise you to focus on cost efficiency, simplicity, performance, scalability, and future-readiness when architecting your strategy and evaluating new technologies. The simplest choice: a backup-as-a-service solution that integrates all of these features in a pay-as-you-go consumption model.
Modern solutions are architected to support today's challenging IT environments. Introduced in September 2021 as part of HPE GreenLake for data protection, HPE Backup and Recovery Service is designed to deliver five key benefits to hybrid cloud environments:
HPE Backup and Recovery Service brings the simplicity of the cloud experience to on-prem and cloud environments. The service breaks down the silos of a typical backup deployment, supporting any primary storage array to protect your VMware virtual machines (VMs), and there's no need to manage backup hardware, software, or cloud infrastructure. The solution can be deployed quickly and managed simply through a single console. Policy-driven, HPE Backup and Recovery Service organizes VMs into protection groups, making it easy to apply policies to multiple VMs or datastores and to automate protection at scale.
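The protection-group idea, one policy applied to many VMs at once, can be sketched as a small data model. This is an illustrative abstraction, not HPE's schema or API; all class and field names here are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    """A backup policy; field names are illustrative only."""
    name: str
    interval_hours: int
    retention_days: int

@dataclass
class ProtectionGroup:
    """Group VMs so that a single policy covers all of them at once."""
    name: str
    policy: Policy
    vms: list = field(default_factory=list)

    def add_vm(self, vm: str) -> None:
        self.vms.append(vm)

    def assignments(self) -> dict:
        """Every VM in the group inherits the group's policy."""
        return {vm: self.policy.name for vm in self.vms}
```

The point of the grouping is that adding a VM to the group is the whole job: the policy follows automatically, which is what makes protection manageable at scale.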
No modern data security solution is complete without ransomware protection. The most effective way to protect backup data from cyberattacks is to keep it hidden from attackers. Ransomware cant infect and encrypt what it cannot access.
HPE Backup and Recovery Service creates backup stores that are not directly accessible by the operating system. Backup images are therefore inaccessible to ransomware, ensuring backup data security and enabling reliable restores. And with backup data immutability, users can also prevent a backup from being inadvertently or maliciously deleted or modified before the configured retention date. Once the retention/immutability date has been set, it cannot be reduced, and the backup is safe from attack or accidental deletion.
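The extend-only retention rule described above can be captured in a few lines: the retention date may move later but never earlier, and deletion is refused until it expires. This is a conceptual sketch of the general immutability pattern, not HPE's implementation or API:

```python
from datetime import datetime

class ImmutableBackup:
    """Extend-only retention: the date may be pushed later, never pulled
    earlier, and deletion is refused until it expires (illustrative class)."""

    def __init__(self, retain_until: datetime):
        self.retain_until = retain_until

    def extend_retention(self, new_date: datetime) -> None:
        if new_date <= self.retain_until:
            raise ValueError("retention can only be extended, never reduced")
        self.retain_until = new_date

    def delete(self, now: datetime) -> bool:
        if now < self.retain_until:
            raise PermissionError("backup is immutable until retention expires")
        return True
```

Because there is no code path that shortens the window, neither an attacker with stolen credentials nor a careless administrator can expire a backup early.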
Data is the lifeblood of your organization. Data protection modernization provides your organization with an opportunity to:
With HPE Backup and Recovery Service, get the peace of mind that your data is rapidly recoverable, always secure, and delivering value to your business without compromise. See how it works in this 4-minute demo.
You can also register for a 90-day free trial of the HPE Backup and Recovery Service.* All features of the cloud service are available during the trial period and after evaluation, with HPE support.
*Cloud storage capacity is fixed at 5 TB during the trial period.
____________________________________
Ashwin Shetty is a Product Marketing Manager for HPE Storage. In this role, Ashwin is responsible for helping customers understand the value of modernizing data protection with HPE Backup and Recovery Service, HPE StoreOnce, HPE RMC, and HPE StoreEver Tape. Prior to joining HPE, Ashwin worked in the sales and marketing groups of Oracle and HCL.
See more here:
5 Must-Have Features of Backup as a Service For Hybrid Environments - CIO