Category Archives: Cloud Storage

The Obsessive World of Digital Music Collectors – Pitchfork

When it comes to playing the music, his sprawling setup involves multiple computers, tablets, monitors, speakers, and headphones. His collection stretches back to 2006, though he's got boxes filled with even older data CDs and hard drives, his adolescence encased in digital amber, ready to be unearthed.

Not all collectors are so chaotic. Marty Sartini Garner, a copywriter at the Los Angeles Philharmonic and music critic who has written for Pitchfork, occupies a place somewhere near the other end of the spectrum. He limits his CD rips to one per day, and moves them to an aging iPod Classic that he listens to with his AirPods and a Bluetooth adapter. This more minimal setup allows him to put down his phone, close the laptop, and narrow his focus to whatever he's listening to. He'll download records from Bandcamp, but likes the IRL experience of walking to Fingerprints, his local record shop in Long Beach, California, known for its impressive stock of new and used CDs. He still keeps a vinyl collection and an analog stereo, but his latest move necessitated a pruning of his physical collection.

My personal setup splits the difference between Nappy's maximalist strategy and Garner's more Zen-like approach. I still use the same iTunes library I started in 2003, when I bought my first iPod. And I subscribe to Apple Music, which includes a service called iTunes Match that either matches songs in my library or uploads my own copy to their servers, letting me stream or download any song in my collection from any Apple device associated with my Apple ID. I currently have about 47,000 songs across 700GB, and the library is stored on a pair of hard drives inside my desktop computer that are backed up twice internally: once to a single drive every night, and again to a Time Machine backup that saves hourly incremental backups. Every few months I swap a third backup in and out of a fireproof safe at my parents' house, just in case my building burns down. This process may sound like a lot, but in reality, it's mostly self-sustaining: once I add a record to the collection, it's almost immediately available to play in any room of my home, or on any device I own, anywhere with an internet connection.
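For readers curious what the self-sustaining part of a routine like this can look like, below is a minimal sketch of a nightly mirror job. The paths, and the choice of Python itself, are illustrative assumptions rather than the author's actual tooling; a real setup would layer Time Machine or similar incremental backups on top.

```python
# Minimal sketch of a nightly "mirror the library to a second drive" job.
# Paths are hypothetical; a real setup would add incremental/versioned backups.
import shutil
from datetime import date
from pathlib import Path

LIBRARY = Path("/Volumes/Music/iTunes")   # assumed location of the library
MIRROR = Path("/Volumes/Backup/iTunes")   # assumed second internal drive

def nightly_mirror() -> None:
    """Copy the library onto the mirror drive, overwriting existing files."""
    shutil.copytree(LIBRARY, MIRROR, dirs_exist_ok=True)
    # Leave a breadcrumb so a quick glance shows when the mirror last ran.
    (MIRROR / "last_backup.txt").write_text(date.today().isoformat())

if __name__ == "__main__":
    nightly_mirror()
```

Scheduled via cron or launchd, a job like this runs unattended, which is what makes the whole routine feel hands-off once it is in place.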

Every collector's setup is different, and what works for you should be based on what gear you already have, your budget for what you don't, and the effort you're willing to put in. Storage space has never been cheaper: even the most durable 4TB hard drives can be had for less than $100 nowadays. Most of the equipment you'll need (computer, smartphone, stereo, headphones, router) you likely already own. You might even have a dusty CD collection waiting to be ripped, old drives that can be repurposed, or a cloud storage account that can be set up for automatic backups.

You don't need thousands of dollars, a huge home, or an advanced degree to maintain a digital collection. Some collectors, including me, can be intense, but it doesn't have to be that serious to be rewarding. Everyone started with one file, one song, one album. All it takes is time, effort, and a little bit of love.

This week, we're exploring how music and technology intersect, and what today's trends and innovations might mean for the future. Read more here.

Read more from the original source:
The Obsessive World of Digital Music Collectors - Pitchfork

Microsoft urged to do more to address European cloud antitrust complaints – ComputerWeekly.com

The founding member of a coalition of tech firms that accused Microsoft of anti-competitive behaviour, based on how it sells and packages its cloud services in Europe, has claimed the software giant must do more to address the antitrust complaints being levied against it.

Microsoft published a blog post this week acknowledging the antitrust concerns that have been raised with regulators and authorities about its cloud-related business practices in Europe, which it also used to outline a series of meaningful actions it would take to address the issues raised.

As detailed in a report on Reuters last month, these concerns are known to have prompted the European Commission's antitrust authorities to send a questionnaire to Microsoft customers and competitors, asking for their views on Microsoft's cloud-related licensing deals.

"The commission has information that Microsoft may be using its potentially dominant position in certain software markets to foreclose competition regarding certain cloud computing services," the questionnaire said, as reported by Reuters.

This information is based on complaints filed with the European Commission by several European cloud service providers, including German file sync and share software maker Nextcloud and French infrastructure-as-a-service (IaaS) provider OVHcloud.

Nextcloud's antitrust complaint, filed in early 2021, takes umbrage at the way Microsoft bundles its OneDrive cloud storage service and online collaboration platform Teams in with its flagship Windows operating system. It claims this practice is aggressively pushing consumers to sign up and hand over their data to Microsoft.

Nextcloud's complaint has since won the support of more than 50 tech firms and non-profit organisations, leading to the formation of a coalition that is collectively speaking out against how Microsoft sells and packages its cloud software in Europe. The company has also filed a complaint against Microsoft of a similar nature with Germany's own antitrust authorities.

In a blog post dated 18 May 2022, Microsoft president and vice-chair Brad Smith said the company was taking meaningful action on the complaints being raised against it, including the adoption of five pledges that it claims will shape its approach to doing business in Europe in years to come.

These pledges include commitments to ensuring its public cloud meets Europe's needs and serves Europe's values, that its platforms are set up to ensure the success of European software developers, and that it will provide support for European cloud providers through partnership.

The remaining two pledges made by Microsoft include a commitment to "ensure our cloud offerings meet European governments' sovereign needs, in partnership with local trusted technology providers" and a vow to "recognise that European governments are regulating technology, and we will adapt to and support these efforts".

According to Microsoft, these pledges mark the start of the work it is doing to address regulatory concerns, and are intended to "guide all aspects of our cloud business, enhance transparency for the public, and help us to better support Europe's technology needs".

In addition, the company said it was also taking steps to ensure European cloud providers could more easily host a wider variety of Microsoft products on their cloud infrastructure.

It added: "This will make European cloud providers more competitive by enabling them to better serve customers."

"While these actions are broad, they are also not necessarily exhaustive," continued Smith. "As I said in a video meeting a few weeks ago with the CEO of a European cloud provider, our immediate goal is to turn a long list of issues into a shorter list of issues."

"In other words, let's move rapidly so we can learn quickly. Today we're taking a big step, but not necessarily the last step we will need to take, and we look forward to continuing feedback from European cloud providers, customers and regulators," he added.

Speaking to Computer Weekly, Nextcloud CEO Frank Karlitschek said the actions Microsoft was committing to take were indicative of the pressure it is feeling in the wake of the complaints, but there is still more the company should be looking to do.

"The main issue here is that we have a super-dominant position from Microsoft. [It is] really dominating this whole market and this is not healthy," he said. "That's not healthy for the open market, that's not healthy for privacy and it's not healthy for digital sovereignty for Europe. We want the regulators to do something against it to make sure there's fair competition and a level playing field."

In terms of the follow-up action Nextcloud and the coalition would like to see Microsoft take, Karlitschek said a commitment from the company to make parts of its cloud stack open source would be a start.

"Across Europe, you have this movement towards digital sovereignty, where governments want to be in control of their data and applications. So, if you are a government or a company and you use Microsoft or Google or Amazon's service, even if it's hosted in Europe, that's still under US jurisdiction because of the CLOUD Act," he said.

"This is what they're trying to solve here by giving other cloud providers the option to host this Microsoft stack, but obviously this is not enough, because you still have a dependency on Microsoft, because Microsoft is not open source."

He continued: "Digital sovereignty would only come with open source software. What it has proposed so far is interesting and is a move in the right direction, in response to the pressure it is under, but this is not enough."

Data from IT market watcher Synergy Research Group in September 2021 shed some light on the impact the US tech giants' growing hold on the European market was having on the fortunes of local cloud providers.

While the market itself has grown nearly fourfold since 2017 to a value of $8.8bn, European cloud providers have seen their share of the market fall from 27% to 16% during that same time period, although the revenue these firms make has doubled during that time.

Computer Weekly also contacted OVHcloud for its take on Microsofts plans, given it has also raised an antitrust complaint against the company with regulators in the past, and received the following statement in response.

"Microsoft acknowledges the merits of our complaint and we can only regret that it has to go as far as mobilising the relevant authorities to secure a level playing field in Europe, where competition is both open and fair," said the statement.

"We are now waiting to see the concrete implementation conditions of these resolutions and remain committed to defending a level playing field for the European cloud ecosystem."

Go here to see the original:
Microsoft urged to do more to address European cloud antitrust complaints - ComputerWeekly.com

Verizon, AWS expand edge computing to more metro areas – ComputerWeekly.com

Verizon has announced that it is now offering, with AWS, 5G mobile edge computing (MEC) in more US metro areas, with the addition of Nashville, Tennessee and Tampa, Florida.

Through a partnership that began in August 2020, the companies can now provide mobile edge computing via AWS Wavelength Zones in 19 locations in the US, which means 75% of the US population is now within 150 miles of a Wavelength Zone.

The Verizon and AWS edge computing collaboration began with the launch of Verizon 5G Edge with AWS Wavelength. AWS Wavelength extends AWS compute and storage services to the edge of Verizon's public mobile network and provides access to cloud services running in an AWS region, thereby minimising the latency and network hops required to connect from a 5G device to an application hosted on AWS.
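To make the Wavelength Zone concept concrete, here is a small sketch of how a developer might list the Wavelength Zones visible to an AWS account in one region using boto3. The region is a placeholder assumption, and this is illustrative rather than Verizon- or AWS-endorsed tooling.

```python
# Sketch: list the AWS Wavelength Zones visible to an account in one region.
# Assumes boto3 is installed and AWS credentials are already configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

response = ec2.describe_availability_zones(
    AllAvailabilityZones=True,  # include zones the account has not opted into
)

for zone in response["AvailabilityZones"]:
    if zone.get("ZoneType") == "wavelength-zone":
        print(zone["ZoneName"], "-", zone.get("OptInStatus"))
```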

In August 2020, the companies announced the general availability of 5G mobile edge computing via Wavelength Zones in 10 cities across the US.

Verizon 5G Edge with AWS Wavelength is currently available in 19 locations: Atlanta, Boston, Charlotte, Chicago, Dallas, Denver, Detroit, Houston, Las Vegas, Los Angeles, Miami, Minneapolis, Nashville, New York City, Phoenix, the San Francisco Bay Area, Seattle, Tampa and Washington DC.

The relationship evolved to see the companies create technology fully integrating Verizon's private 5G networks and private 5G Edge platform with AWS Outposts, a fully managed service that is said to offer the same AWS infrastructure, services, application programming interfaces (APIs) and tools to virtually any datacentre, colocation space or on-premises facility for a consistent hybrid experience.

For users, being in closer proximity to the applications they use means faster response times: shortening the round trip that data needs to travel significantly reduces lag time, or latency, in getting data to a device from the cloud. For developers and businesses, Verizon said that using 5G Edge with AWS Wavelength allows them to build and deploy a variety of latency-sensitive applications for use cases such as immersive virtual reality (VR) gaming, video distribution, and connected and autonomous vehicles.

"With the ongoing expansion of our mobile edge compute infrastructure, we're enabling developers to build transformational applications that enhance consumers' experiences by moving the data and processing done by applications and services to the edge of Verizon's wireless network and closer to the end-user's device," said Verizon Business CEO Tami Erwin. "By offering both public and private mobile edge compute, we are giving businesses ultimate optionality. This can transform the way companies can leverage predictive analytics, allowing them to improve operational efficiency, mitigate risk and increase revenue."

George Elissaios, director and general manager of AWS EC2 core product management at AWS, added: "With the rapid expansion of AWS Wavelength Zones across the US, even more developers can innovate faster and deploy powerful cloud-based applications to the edge, offering ultra-low latency, high bandwidth, and high performance for these applications. We are excited to collaborate with Verizon to bring AWS services to the edge of the Verizon 5G network across the US to help our customers transform consumer experiences."

Read the original:
Verizon, AWS expand edge computing to more metro areas - ComputerWeekly.com

How to Print Pictures From Google Photos on Android and iPhone – Guiding Tech

Google Photos is a reliable cloud storage platform to save your memories. It lets you access your photos from any device by logging in to your Google account. You can also store your photos in a locked folder, rediscover old photos using the Map View feature, and search pictures by faces in Google Photos.

The app also lets you print your photos, among other features. This post will show you how to print photos from Google Photos on Android and iPhone. Please ensure that you are using the latest version of the app.

Google Photos is the default photo gallery and cloud storage app for many Android phones, especially if you are using a Google Pixel. Users get 15GB of cloud storage for free for saving their pictures and other documents. If you are a shutterbug who loves to click and capture regularly, you can increase your cloud storage by paying a monthly subscription fee.

Before we begin, make sure that your printer and Android phone are connected to the same Wi-Fi network. Follow these steps.

Step 1: Open the Google Photos app.

Step 2: Select the photo that you wish to print.

Step 3: Tap on the three dots at the top-right corner.

Step 4: Swipe left on the top row of options and select Print.

The Print menu will open on your screen.

Step 5: Tap the top arrow to Select a Printer.

Step 6: From the list, select your printer that is connected to the same Wi-Fi network.

Step 7: After you select the printer, tap on the bottom arrow to reveal print settings.

You can now select print settings like color, layout, and number of copies.

Step 8: After selecting your preferences, tap on the Print icon.

Alternatively, you can follow these steps to print photos from Google Photos on Android.

Step 1: After selecting the photo that you wish to print, tap on the Share icon in the bottom-left corner.

Step 2: Swipe left and tap on Print from the Share options.

Step 3: You will once again arrive at the Print page. Select your printer, choose your print settings, and tap on the Print icon.

That's how you can print your photo directly from the Google Photos app. You can also select and print multiple photos at once.

The Google Photos app is also available for iPhone users. This is very helpful in case you have switched from an Android device. You can also sync your Apple Photos library with Google Photos. Follow these steps to print photos from Google Photos on your iPhone.

Step 1: Open the Google Photos app.

Step 2: Select the photo you wish to print.

Step 3: Tap the Share icon at the bottom-left corner.

Step 4: From the Share menu, tap the Share To option.

Step 5: Scroll down and tap on Print.

Step 6: From the Print Options, tap Printer.

Step 7: Select your printer from the list.

Step 8: Select your printer settings.

Step 9: After choosing your print preferences, tap on Print.

Bonus Tip: Print Photos from the Google Photos Web Page

Users can directly access their photos on the Google Photos webpage as well. As with your iPhone or Android, you need to ensure that your computer and printer are connected to the same Wi-Fi network. Follow these steps.

Step 1: Open the Google Photos site in a browser on your computer.

Visit Google Photos

Step 2: Sign in to your Google account. Click on the photo that you wish to print.

Step 3: Press Command + P if you are using a Mac. Press Control + P on Windows.

The Print Settings will appear on your screen.

Step 4: Choose your print preferences and click on Print.

If your printer doesn't support wireless connectivity, you can connect your laptop or phone to the printer using an OTG cable. Then you can follow the same steps mentioned above to print your photos.

The Google Photos app lets you view, store and print high-quality photos. The world of photographs has gone digital with these cloud services. But there are still a lot of folks who love the physical nature of photographs. This feature is helpful, especially for photographers who have to display their work physically and digitally.

See the article here:
How to Print Pictures From Google Photos on Android and iPhone - Guiding Tech

#ThinkBeforeYouClick: Wasabi’s IT Hero Nate Returns to Warn About the Dangers of Ransomware – PR Newswire

The cornerstone of the campaign is Nate's latest music video, the "Ballad of Ransomware," in which he educates the public about common ransomware pitfalls like "clicking on something stupid," not updating passwords, falling for African prince email scams, or anything online that "seems too good to be true." Nate's signature style is used to break through the monotony of typical cybersecurity training that still has not helped slow down the number of ransomware attacks faced today. According to The Long Road Ahead to Ransomware Preparedness, a new Enterprise Strategy Group (ESG) survey of IT and cybersecurity professionals, 79% of respondent organizations reported having experienced a ransomware attack within the last year.

"Cybercriminals extort millions of dollars every year through ransomware with no signs of slowing down. Any person with a computer is at risk for an attack, and that leaves businesses vulnerable every day. We need to find a more relevant way to communicate the danger of cybercriminals with employees at all types of organizations," said Julie Barry, Vice President of Global Brand and Communications, Wasabi Technologies. "Through Nate's voice and his fresh approach, we want IT teams to know that Wasabi has their backs. We don't just offer immutable cloud storage, but also educational resources that support their teams in the mission to combat ransomware in their organizations."

"Data is without a doubt a company's most valuable asset, and protecting that data from ransomware is a top priority for me and my team," said Kyle Burnette, Director of IT Infrastructure & Security, BrightStar Care. "What I love so much about Wasabi's #ThinkBeforeYouClick campaign is that it calls out the everyday cyber tricks in a funny, relatable way that holds people's attention and reminds them to think twice about their online interactions."

It's clear that ransomware is not a matter of if, but when. Protect yourself with proven security best practices, including regular ransomware awareness training and a robust backup and recovery strategy with immutable cloud storage. For more resources, visit wasabi.com/thinkbeforeyouclick and wasabi.com/ransomware.

About Wasabi Technologies

Wasabi provides simple, predictable and affordable hot cloud storage for businesses all over the world. It enables organizations to store and instantly access an unlimited amount of data at 1/5th the price of the competition with no complex tiers or unpredictable egress fees. Trusted by tens of thousands of customers worldwide, Wasabi has been recognized as one of technology's fastest-growing and most visionary companies. Created by Carbonite co-founders and cloud storage pioneers David Friend and Jeff Flowers, Wasabi has secured nearly $275 million in funding to date and is a privately held company based in Boston. Wasabi is a Proud Partner of the Boston Red Sox, and the Official Cloud Storage Partner of Liverpool Football Club and the Boston Bruins.

Follow and connect with Wasabi on Twitter, Facebook, Instagram, and our blog.

Wasabi Technologies PR contact:

Kaley Carpenter, Inkhouse for Wasabi, [emailprotected]

SOURCE Wasabi Technologies

View original post here:
#ThinkBeforeYouClick: Wasabi's IT Hero Nate Returns to Warn About the Dangers of Ransomware - PR Newswire

Visualizing the 5 Pillars of Cloud Architecture – The New Stack – thenewstack.io

Dan Lawyer

Dan Lawyer, chief product officer at Lucid Software, is passionate about creating value by solving problems in delightful ways. Prior to Lucid, he led product and design organizations at Adobe, Ancestry and Vivint.

Getting the most value out of an organization's cloud infrastructure can be a daunting task. But the key considerations can be winnowed down to an easy-to-remember acronym: CROPS, which stands for cost optimization, reliability, operational excellence, performance efficiency and security. (Sometimes, the five cloud pillars are called CORPS, which is the same thing, just in a different order.)

These five pillars are proven guidelines through which companies can design, evaluate and implement cloud architecture in a way that can most effectively scale, ensuring compliance with the relevant standards and saving money over time. And one of the best ways to implement CROPS principles is with real-time cloud visualization. A complete, real-time understanding of your cloud environment will ensure your resources are best utilized and your CROPS get the attention they need.

There are many different aspects of cloud computing that can get costly, such as infrastructure, downtime and staffing. The way to get the most value out of your cloud infrastructure and minimize costs is to eliminate unused components and refine suboptimal processes. This begins by knowing what you're paying for, or in other words, knowing exactly what's in your cloud infrastructure.

Companies that want to know the content of their cloud infrastructure pour resources into an effort to visualize that infrastructure. This is where cloud visualization can play a critical role. The right cloud visualization solution will work seamlessly with your Amazon Web Services, Google Cloud Storage or Azure cloud environments to build an inventory of your cloud components. Then through automation, a resulting diagram can allow you to see all the relationships between resources in your current cloud environment, making it easy to identify where costs can be cut.

When engaging in a cost optimization exercise, such as analyzing opportunities for cost reduction, it's also important to remember that time is money. For example, if something is misconfigured, gets hacked or malfunctions, that could cause costly downtime. Cloud visualization will allow you to compare the intended state of your cloud with its current state through filters. This way, you can more quickly identify the malfunctioning areas and address them without significant downtime.

If you understand your cloud infrastructure, you can more confidently ensure your customers can rely on your organization. With the ability to constantly meet your workload demands and quickly recover from any failures, your customers can count on you to consistently meet their service needs with little interruption to their experience.

A great way to increase reliability in your cloud infrastructure is to set key performance indicators (KPIs) that allow you to both monitor your cloud and alert the proper team members when something within the architecture fails. Using a cloud visualization platform to filter your cloud diagrams and create different visuals of current, optimal and potential cloud infrastructure allows you to compare what is currently happening in the cloud to what should be happening.

When you can quickly identify and fix problems, you'll be able to maintain uptime and establish ongoing reliability.
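As a sketch of what wiring a KPI to an alert can look like in practice, the example below uses an AWS CloudWatch alarm; the instance ID, SNS topic and thresholds are placeholder assumptions, and any monitoring stack offers an equivalent.

```python
# Sketch: alarm when average CPU on an instance stays above 80% for 15 minutes,
# notifying an on-call SNS topic. All identifiers here are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="web-tier-cpu-high",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,              # evaluate in five-minute windows
    EvaluationPeriods=3,     # three consecutive breaches before alarming
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:oncall-alerts"],
)
```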

Striving for operational excellence means creating an environment for your cloud to always function at its best, and this includes continuous improvement. If you neglect to upgrade products or processes to help your cloud environment function at higher levels, you put a ceiling on the levels to which your business can ascend.

It's essential to do constant research to see where and how you can improve your cloud infrastructure and environment. However, improvement doesn't have to be a massive overhaul. Keep improvements small and continuous to balance the need for upgrades while minimizing downtime. One way to help identify these opportunities for improvement is through a cloud visualization platform that allows real-time discussion about improving your cloud environment.

Cloud visualization can enable different presentations of your environment to understand different scenarios. For example, you may need planning architecture designs for communication with engineers, architects and coders. On the other hand, you may need easy-to-understand, simplified diagrams for nontechnical stakeholders from whom you need buy-in. A high-quality cloud visualization solution should be able to automatically generate these different views.

Many factors can impact cloud performance, such as the location of cloud components, latency, load, instance size and monitoring. If any of these factors become a problem, it's essential to have procedures in place that result in minimal deficiencies in performance. For example, if you have cloud components in different locations, a malfunction in one region shouldn't lead to severe downtime and service disruption throughout your whole cloud environment.

The ability to analyze horizontal and vertical scaling structures is invaluable. Ask yourself: How much do we have in our cloud infrastructure? Where is each component and are they working best where they currently reside? When companies can access a comprehensive, dynamic view of their entire cloud environment, they will better understand where each dependency is. You can visualize your auto-scaling groups and compute instance sizes, availability zones and relationships between resources. Then you will be better able to decide how you need to adjust your cloud infrastructure to improve performance.

Each cloud architecture must be able to protect the confidentiality and integrity of your information, systems and assets. A robust and proactive approach to security will also ensure that your organization maintains compliance with all government regulations and cloud security standards, such as the General Data Protection Regulation, System and Organization Controls 2, the Payment Card Industry Data Security Standard and the Health Insurance Portability and Accountability Act. Companies may face disruptive technical debt if they don't realize the need to meet these standards until after deployment.

This is another place where cloud visualization can play a critically valuable role. With real-time visuals, you can stay on top of your cloud security, cloud compliance and internal best practices by visualizing and overlaying your metadata in the context of your diagram. Such metadata may include instance names, security groups, IP addresses and more.

For example, you can develop categories to visualize your cloud based on sensitivity levels and mechanisms such as encryption, tokenization and access control. Additionally, you can use cloud visualization to document where data is stored and how it's transmitted.

Last, you can set up conditional formatting in your cloud visualization solution that allows easy identification of security issues, such as unencrypted databases, instead of spending extended time searching for these issues in your cloud. Conditional formatting can also be valuable for the other pillars, such as determining which resources are underperforming or which cloud components are wasting cost.
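A tiny audit script makes the idea concrete: the check below surfaces unencrypted databases, the same class of issue a conditional-formatting overlay would flag visually. AWS RDS is used as an example provider, and the region is a placeholder assumption.

```python
# Sketch: list RDS database instances whose storage is not encrypted,
# the kind of finding a cloud-visualization overlay would highlight.
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # region is an assumption

paginator = rds.get_paginator("describe_db_instances")
for page in paginator.paginate():
    for db in page["DBInstances"]:
        if not db["StorageEncrypted"]:
            print("UNENCRYPTED:", db["DBInstanceIdentifier"])
```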

Each of the five pillars of cloud architecture plays a vital role in optimizing your cloud environment. Following these principles can help avoid wasting time and money. The ability to dynamically visualize your complex cloud infrastructure and enable real-time collaboration within your cloud environment will help you gain clarity and communicate what is needed to adhere to each of these pillars across your organization. Each stakeholder should be left with little question about the current and future state of your organization's cloud environment, making it easier to build and maintain and pursue future organizational growth.

Featured image via Pixabay.

See the original post here:
Visualizing the 5 Pillars of Cloud Architecture The New Stack - thenewstack.io

5 tips on how to select the most secure backup solution – Wire19

The need for a backup plan is more important than ever. Keeping your systems up and running without interruption is no easy task. Systems are constantly changing and data grows daily, so it makes sense that you need a backup plan to cover those changes. With ransomware on the rise, it's not enough to simply have a traditional backup solution in place. Managers must employ new-generation strategies that protect their businesses' most valuable assets, or else risk suffering costly downtime!

The following rules will help managers choose the most suitable cyber backup solution for their organization.

Investing in a modern backup solution can save your business's bottom line and reputation. With the most secure data protection solutions, businesses get an all-inclusive approach to securing their information, with cybersecurity and backup capabilities that work efficiently alongside other defenses like cloud storage. This enables organizations to stay ahead of today's threats by having powerful tools at each stage.

Source: Acronis

Read next: Do the different types of cybersecurity terms confuse you? Know how each of them differs.

View post:
5 tips on how to select the most secure backup solution - Wire19

Western Digital: The flash roadmap – Blocks and Files

Western Digital execs revealed its disk and flash/SSD roadmaps at a May 10 Investor Day event. We covered the disk part of this in a previous article. Here we look at flash.

EVP Robert Soderbery, head of the flash business unit, talked about growing the capacity of a flash die by increasing the layer count and shrinking the lateral dimensions of a cell. The latter means that more cells can fit in a layer and thus fewer layers are needed to reach a set capacity level. A slide showed a 10 percent increase in lateral density through a 40 percent reduction in cell size.

President of Technology Siva Sivaram used a slide showing that Western Digital's 162-layer NAND, at a 1Tbit x4 die capacity and 100TB wafer size, had a 68mm² cell size compared to Kioxia's and Seagate's 69.6 and 69.3mm² cell sizes, and they are building 176-layer NAND.

Sivaram said Western Digital's charge trap NAND cell had a 40MB/s program performance vs competitors' 60MB/s. He presaged 200+ layer 3D NAND coming, calling it BiCS+. We have previously understood this to be 212 layers and called it BiCS 7. A Sivaram slide showed what looked like string-stacking, called multi-bonding, and penta-level cell (PLC, 5 bits/cell) technologies coming.

BiCS+ will have 55 percent more bits/wafer than BiCS 6 (162-layer), a 60 percent better transfer speed and 15 percent more program bandwidth.

Sivarams 3D NAND roadmap showed a route to 500+ layers in 2032.

Western Digital makes three main classes of SSD: consumer, client and cloud (enterprise), along with automotive and IoT drives, using its own NAND, controllers and firmware. It has a 37 percent share of the consumer SSD market, 20 percent of the client market, but only 8 percent of the cloud market. Soderbery wants to get that cloud market share higher, to 16 percent, and says the cloud SSD market is separating into three segments: compute (for cache and direct access), storage (capacity-optimized), and boot and journaling (endurance-optimized).

BiCS 4 (96-layer) was good for storage (TLC, 3 bits/cell) and boot segments, and BiCS 5 (112-layer) was good for storage (TLC and QLC) and boot. BiCS 6 (162-layer) will be good for compute, storage (TLC, QLC) and boot.

Soderbery sees a significant consumer SSD opportunity as flash replaces 2.5-inch disk. In 2022, 62 percent of consumer drives were disk and 38 percent SSDs. In 2026, that is forecast to have changed to 30 percent disk and 70 percent SSD. He thinks there is a 100EB opportunity in this disk-to-SSD transition, with consumer SSDs having a greater than 45 percent CAGR from 2022 to 2025.

Overall, WD has a 14-16 percent SSD market share target. Wells Fargo analyst Aaron Rakers noted that WD is forecasting flash capacity shipped in the cloud to grow at ~37 percent year-on-year from 2022 to 2027, and told subscribers: "This is where WD's qualification/ramp of their NVMe SSDs is a key focus for the company."

That is probably Soderbery's key goal: get the cloud/enterprise NVMe SSD sales up while not forgoing growth in the consumer and client markets. WD is betting that its cell density and layer count advantages will translate to better price/performance and so enable it to win share, grow its business and, maybe, fend off activist investor Elliott Management.

Read more from the original source:
Western Digital: The flash roadmap Blocks and Files - Blocks and Files

Global cloud spending to hit $495bn in 2022, Gartner says – The National

Global spending on public cloud services is expected to jump 20.4 per cent annually to $495 billion this year, as businesses expedite the pace of their digital transformation in the post-Covid era, US researcher Gartner has said.

Total spending is nearly $84bn more than the amount spent in 2020 and is expected to surge nearly 21.3 per cent yearly to almost $600bn next year.

"Cloud is the powerhouse that drives today's digital organisations," said Sid Nag, research vice president at Gartner.

"CIOs [chief information officers] are beyond the era of irrational exuberance of procuring cloud services and are being thoughtful in their choice of public cloud providers to drive specific, desired business and technology outcomes in their digital transformation journey."

For businesses, moving to a cloud system hosted by a specialised company such as Oracle, Amazon Web Services or SAP is more economical than creating their own infrastructure of servers, hardware and security networks, industry experts said. It also brings down the overall cost of ownership.

In overall cloud spending, infrastructure-as-a-service software is forecast to experience the highest end-user spending growth this year at 30.6 per cent. It will be followed by desktop-as-a-service at 26.6 per cent and platform-as-a-service at 26.1 per cent, Gartner predicted.

In the cloud industry, businesses pay only for the services or resources they use over a period of time.

The new reality of hybrid work is prompting organisations to move away from powering their workforce with traditional client computing solutions, such as desktops and other physical in-office tools and opt for the latest cloud solutions, the Connecticut-based market researcher said.

In the Middle East and North Africa, end-user spending on public cloud is forecast to reach $5.8bn this year, growing 18.8 per cent year-on-year.

Several global players are establishing data centres in the region as the cloud market picks up.

In 2020, IBM unveiled two data centres in the UAE, making its first foray into the Middle East and Africa cloud storage market. In 2019, Amazon Web Services opened three data centres in Bahrain.

Germany's SAP has centres in Dubai, Riyadh and Dammam, which house servers for local cloud computing clients.

Alibaba Cloud, a comparatively smaller player and the cloud computing arm of the Chinese e-commerce company, opened its first regional data centre in Dubai in 2016.

"Public cloud services have become so integral that providers are now forced to address social and political challenges, such as sustainability and data sovereignty," Mr Nag said.

"IT leaders who view the cloud as an enabler rather than an end state will be most successful in their digital transformational journeys. The organisations combining cloud with other emerging technologies will fare even better," he added.


Link:
Global cloud spending to hit $495bn in 2022, Gartner says - The National

The limits and risks of backup as ransomware protection – ComputerWeekly.com

Ransomware has pushed backup and recovery firmly back onto the corporate agenda. Without a sound backup and recovery strategy, firms have little chance of surviving a ransomware attack, even if they pay the ransom.

IBM, for example, named ransomware as the leading cyber security threat in 2021, accounting for 23% of all cyber attacks.

This has forced CIOs to revisit their backup and recovery strategies, says Barnaby Mote, managing director at online backup provider Databarracks. "The paradox is that ransomware has brought backup and recovery back into focus," he says. "If you go back five years, it was a hygiene issue, and not on the CIO or CEO agenda. Now it is again."

High-profile attacks against organisations including shipping company Maersk and US oil network Colonial Pipeline have focused attention on the risks posed by this type of cyber attack and prompted organisations to invest in cyber defences.

But ransomware is becoming smarter, with double- and triple-extortion attacks, and techniques that allow the malware to remain undetected for longer. This puts pressure on that other essential defence against ransomware: good data backups.

"The other factor that has changed dramatically is that when you get a ransomware infection, it doesn't always trigger immediately," says Tony Lock, analyst at Freeform Dynamics. "You might find that the ransomware has been in your system a long time before you noticed it, but it's only now they've triggered it and everything's encrypted."

As a result, organisations have to go back further in time to find clean backups, stretching recovery point objectives (RPOs) to the point where the business is put at risk, or its leaders might even feel they must pay the ransom. "How far do you need to go," says Lock, "so that when you're doing a recovery from your copies, you make sure you're not bringing the infection back with you?"

As Lock suggests, when organisations deal with a ransomware attack, one of the greatest risks is reinfecting systems from a compromised backup. Some of the industry's tried-and-tested backup and recovery and business continuity tools offer little protection against ransomware.

Snapshots record the live state of a system to another location, whether that is on-premise or in the cloud. So, if ransomware hits the production system, there is every chance it will be replicated onto the copy.

Conventional data backup systems face the same risk, copying compromised files to the backup library. And malware authors are adapting ransomware so it actively targets backups, prevents data recovery, or immediately targets any attempt to use recovered files by encrypting them.

Some ransomware (Locky and Crypto, for example) now bypasses production systems altogether and goes straight for backups, knowing that this puts the victim at a real disadvantage. This has forced organisations to look again at their backup strategies.

One option is to use so-called immutable backups. These are backups that, once written, cannot be changed. Backup and recovery suppliers are building immutable backups into their technology, often targeting it specifically as a way to counter ransomware.
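For a sense of the mechanism, here is a minimal sketch using the S3 Object Lock API, which a number of S3-compatible backup targets expose: a default retention rule makes every new object in the bucket immutable for a fixed window. The bucket name is a placeholder, the bucket must be created with Object Lock enabled, and this illustrates the general technique rather than any particular supplier's product.

```python
# Sketch: set a default Object Lock retention rule so every object written
# to this backup bucket is immutable for 14 days. Bucket name is a placeholder;
# the bucket must have been created with Object Lock enabled.
import boto3

s3 = boto3.client("s3")

s3.put_object_lock_configuration(
    Bucket="example-backup-vault",
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "GOVERNANCE", "Days": 14}},
    },
)
```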

The most common method for creating immutable backups is through snapshots. In some respects, a snapshot is always immutable. However, suppliers are taking additional measures to prevent these backups being targeted by ransomware.

Typically, this is by ensuring the backup can only be written to, mounted or erased by the software that created it. Some suppliers go further, such as requiring two people to use a PIN to authorise overwriting a backup.

The issue with snapshots is the volume of data they create, and the fact that those snapshots are often written to tier one storage, for reasons of rapidity and to lessen disruption. This makes snapshots expensive, especially if organisations need to keep days, or even weeks, of backups as a protection against ransomware.

"The issue with snapshot recovery is it will create a lot of additional data," says Databarracks' Mote. "It will work, but has a large impact on the storage you need, and there is the cost of putting it on primary storage."

Another way to protect against ransomware is to air gap storage, especially backups. In some ways this is the safest option, especially if the backups are stored off-site, on write-once (WORM) media such as optical storage, or even tape.

"Personally I like air gaps," says Freeform's Lock. "I'd like the backup to be on something that is totally air-gapped: take a copy on tape and put it somewhere. Preferably with logical and physical air gaps."

The disadvantage of air gaps, especially physical air gaps with off-site storage, is the time it takes to recover data. Recovery time might be too long to ensure business continuity. And if IT teams have to go back through several generations of backups to find ransomware-free copies, the cost of recovering lost data can be high, maybe even higher than the cost of the ransom.

"Time to restore, at scale, is now key," says Patrick Smith, field CTO, Europe, Middle East and Africa (EMEA) at Pure Storage. "This may mean specific solutions for the business-critical applications that need to be online first."

Suppliers are trying to work around this through virtual air-gapped technology, which allows backups to be stored on faster local (or cloud) storage. But for businesses with the most critical data, it is likely that only fully immutable and air-gapped backups will suffice, even if only as a second or third line of defence.

However, CIOs are also looking to augment their backup tools with security measures aimed specifically at ransomware.

Perhaps the greatest risk to an organisation with a solid backup policy is unwittingly re-infecting systems from ransomware hidden in backups.

Firms need to put measures in place to scan backups before they restore to a recovery environment, but again this takes time. And malware authors are adept at hiding their trails.

Anomaly detection is one route suppliers are exploring to check whether backups are safe. According to Freeform Dynamics' Lock, machine learning tools are best placed to pick up changes in data that could be malware. This type of technology is increasingly important as attackers turn to double- and triple-extortion attacks.

"You need to make data protection, observability and checking for anomalies a continuous process," he says.
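As a toy illustration of the kind of continuous check Lock describes (a simple entropy heuristic rather than the machine learning tools he mentions): encrypted files have a near-random byte distribution, so files in a fresh backup whose entropy approaches 8 bits per byte are worth a second look. The directory path and the 7.9 threshold below are illustrative assumptions.

```python
# Toy sketch: flag backed-up files whose byte entropy looks like ciphertext.
# Encrypted files approach 8 bits/byte; most documents and media sit lower
# (already-compressed formats can also score high, so treat hits as leads).
import math
from collections import Counter
from pathlib import Path

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte in the sample."""
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def suspicious_files(backup_dir: str, threshold: float = 7.9):
    """Yield files whose leading 64 KiB exceeds the entropy threshold."""
    for path in Path(backup_dir).rglob("*"):
        if path.is_file() and path.stat().st_size > 0:
            sample = path.read_bytes()[:65536]
            if shannon_entropy(sample) > threshold:
                yield path

if __name__ == "__main__":
    for f in suspicious_files("/backups/latest"):  # path is a placeholder
        print("possible encryption:", f)
```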

Link:
The limits and risks of backup as ransomware protection - ComputerWeekly.com