Category Archives: Cloud Storage

Backup done right on this World Backup Day – Times of India

As the world around us becomes increasingly reliant on technology, it is imperative to understand the importance of backing up data. In reality, most backup platforms are not keeping pace with today's demands, making storage more complex for the user. The amount of data generated in the global data sphere is expected to grow exponentially, reaching 221ZB in 2026. Yet around one-third of individuals do not back up their data. World Backup Day, observed on 31st March every year, is a reminder for everyone - from content producers to large-scale enterprises to organisations in sectors such as healthcare and finance that handle personally identifiable information (PII) - to safeguard their digital heritage for future generations. There is only one way to ensure your data is safe, and that is to back it up regularly.

What is the best approach to adopt when creating a backup strategy or reassessing an existing one? A 3-2-1 approach may serve as a baseline for maintaining data integrity: 3 copies of data, on 2 types of storage media (disk, cloud/tape), with 1 copy offsite (cloud servers). Combining cloud and on-premises backup technologies will ensure businesses' data sets, applications, environments, and users are kept safe and secure. This includes copying data from internal systems to cloud storage, utilising hybrid clouds, and so on. With the advent of generative AI applications like chatbots, companies will be able to train these on their own internal data. With mass data movement from the edge to the cloud for AI and ML, cloud backup helps keep valuable information and data sets safe. Combining the existing backup technologies is the right way forward for organisations that seek to be future ready.

The Hybrid Approach - Cloud, On-Premises, or Both

There is a growing fragmentation of data across disparate infrastructures, including on-premises, cloud, and end-user devices, making it increasingly difficult to ensure successful backups. Most traditional backup methods require different tools to back up data for different environments. With a hybrid approach, you can take advantage of both cloud and on-premises backup; the two can often be complementary. Cloud scalability and security can be leveraged without compromising on-premises control. As a result, several cloud-based solutions are available, including S3-integrated storage-as-a-service for multicloud environments, fully managed data migration services, and seamless data transfers between the edge and the cloud. As well as allowing secondary data sets to be stored remotely, cloud backups can serve as another type of media, provide another location for primary backups or archives, or even consolidate many distributed backups, allowing for greater accessibility and reduced manual intervention. Always-on cloud storage offers a compelling S3 alternative that complements any existing multicloud deployment. Taking a hybrid approach to data protection, backup, and recovery will help ensure business continuity as well as deliver cost-effective data backup solutions.

Ensuring business continuity with cloud storage backup

To accommodate the rapidly changing needs of enterprises, cloud storage for backup is available in many forms - public, private, hybrid, and multicloud. Ransomware protection, enterprise-grade identity management, and data encryption at rest and in flight safeguard critical enterprise data from loss or theft.
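To make the 3-2-1 approach described above concrete, here is a minimal, hedged sketch in Python: it keeps the live data as copy one, mirrors it to a second local medium, and pushes a third copy offsite to an S3-compatible bucket. The paths, bucket name, and endpoint are placeholders for illustration, not a recommendation of any specific provider or tool.

```python
# Minimal 3-2-1 backup sketch (illustrative only).
# Copy 1: the live data. Copy 2: a second local medium (e.g., an external disk).
# Copy 3: an offsite object-storage bucket reached over an S3-compatible API.
# Paths, bucket name, and endpoint below are hypothetical placeholders.
import pathlib
import shutil

import boto3  # assumes S3-compatible credentials are already configured

SOURCE = pathlib.Path("/data/projects")          # copy 1: primary data
SECOND_MEDIUM = pathlib.Path("/mnt/usb_backup")  # copy 2: different media type
OFFSITE_BUCKET = "example-offsite-backups"       # copy 3: offsite cloud bucket

def backup_321():
    s3 = boto3.client("s3", endpoint_url="https://s3.example.com")
    for path in SOURCE.rglob("*"):
        if not path.is_file():
            continue
        rel = path.relative_to(SOURCE)
        # Second copy on a different medium.
        dest = SECOND_MEDIUM / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, dest)
        # Third copy offsite in object storage.
        s3.upload_file(str(path), OFFSITE_BUCKET, str(rel))

if __name__ == "__main__":
    backup_321()
```

In practice a scheduler or backup product would handle retries, versioning, and verification; the sketch only shows where each of the three copies lives.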
Since most businesses are moving to multicloud, using object storage that is designed to support multicloud deployments would be a smart move. Cloud-based object storage can help businesses manage massive amounts of unstructured data with ease, sophisticated management, and scalability.

Data backup is the best line of defence against cyber threats

As organisations' systems become more complex, cyberattacks have increased significantly. Decentralised storage, which lowers the risk of cyberattacks because data is distributed, can help them stay ahead of evolving cybersecurity threats by providing immutable storage that cannot be edited or deleted.

The 3-2-1 backup strategy, combining on-premises and cloud storage with a trusted data storage solutions provider, is a good first step for companies when developing or revising a data backup plan. In this manner, the best value per byte, leading capacity, and proven reliability can be achieved regardless of the number of backups required (over and above the initial 3), storage type (SAN, cloud, etc.), and location (on-premises or off-premises). With always-on cloud storage designed to complement an existing multicloud environment, data access, security, efficiency, and compliance can be improved while empowering businesses to activate backups.

By: Sameer Bhatia, Director of Asia Pacific Consumer Business Group, and Country Manager for India & SAARC, Seagate Technology

The rest is here:
Backup done right on this World Backup Day - Times of India

Google Photos: How to free up space and delete safely backed-up … – 9to5Google

Storage as a whole flows much better throughout Google's Android ecosystem these days. Google Photos, for instance, has a couple of dedicated tools that will do the heavy lifting and delete safely backed-up images from local storage, freeing up precious space. Here's how to use them.

Since Google Photos operates as a cloud-first photo library, there's much less need for users to store photos locally on their devices. A photo stored on the device simply takes up space, though one could argue that locally stored photos are better quality. Fortunately, Google lets you back up images in original quality, though that is going to impact your cloud storage quota.

Backed-up photos take very little time to pull up and view, though you'll need at least some internet connection to do so. That, in combination with the innate ease of sharing images through the cloud, makes a good case for relying on cloud-stored images rather than going the local route.

Of course, we can't ignore the security concerns around cloud photos, though that risk exists everywhere cloud storage is used.

Once a photo is successfully backed up, the original local file can be deleted from your device. Over time, that can add up to tens of gigabytes of photos and video that are essentially stored in two locations: in the cloud and on your phone. Using a tool called "Free up space" in Google Photos, you can let the app automatically remove any local file that has a safely backed-up copy in the cloud.

Once you start the process, Google Photos will take care of the rest. Remember, it will only delete photos that have a copy in the cloud. Any photo that is local only will be safe where it is.
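Google Photos does all of this for you, but the underlying idea of only removing local files that already have a cloud copy can be sketched in a few lines. This is a hypothetical illustration, not how Google Photos is actually implemented; the manifest format and paths are assumptions.

```python
# Hypothetical "free up space" pass: delete a local file only if its name and
# size appear in a manifest of safely backed-up cloud copies.
# Not Google's implementation; manifest format and paths are assumptions.
import json
import pathlib

LOCAL_DIR = pathlib.Path("/sdcard/DCIM/Camera")
MANIFEST = pathlib.Path("cloud_backup_manifest.json")  # e.g. {"IMG_0001.jpg": 4182033, ...}

def free_up_space():
    backed_up = json.loads(MANIFEST.read_text())
    for photo in LOCAL_DIR.iterdir():
        if not photo.is_file():
            continue
        # Only delete when a cloud copy with the same name and size is recorded.
        if backed_up.get(photo.name) == photo.stat().st_size:
            photo.unlink()
        # Local-only photos are left untouched.

if __name__ == "__main__":
    free_up_space()
```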

The deletion process can take as little as three seconds or as long as ten minutes; it depends solely on how much is being deleted.

If you find that none of your photos are backing up to the cloud, or they are but aren't in original quality, you may have to visit your backup settings in Google Photos.

Ensure the toggle is set to on and that you're properly signed into the Google Account you want your images to back up to, especially if you pay for extra storage with Google One. Toward the bottom, you'll see several options:

Here, you can adjust the amount of compression applied when photos are backed up, as well as whether they back up over mobile data. The last setting lets you choose which folders back up automatically.

If you take a lot of screenshots but don't necessarily want them taking up cloud space, you can turn that folder off. The same goes for documents, downloaded images, and images received in Google Messages or other messaging apps.

Note: If you delete an image in Google Photos that isn't backed up to the cloud, that photo only has a limited time before it's no longer recoverable. Head to Library > Trash to recover photos that have been deleted.

Either way, storage is a precious thing on modern devices. Google's "Free up space" tool is useful, especially in conjunction with taking full control of what gets backed up.


See the original post here:
Google Photos: How to free up space and delete safely backed-up ... - 9to5Google

How to Transfer Photos from iCloud to Google Photos – The Mac Observer

If you want to move your photos from iCloud to Google Photos, you're in luck. Since 2021, Apple has made the process of moving your photos from iCloud to Google Photos quite easy. In this guide, you will learn how to transfer your photos from iCloud to Google Photos.

There are several reasons why you may decide to transfer photos from iCloud to Google Photos. You may not be canceling your iCloud subscription at all, instead maintaining the two cloud storage services side by side. Perhaps you've decided to use Google Photos as a backup.

If that's not why you are moving your photos from iCloud to Google Photos, perhaps you've decided to abandon Apple's cloud storage service in favor of Google Photos. I wouldn't be surprised if that's your decision. While both are equally good cloud storage solutions, a Google Photos subscription is priced a bit lower than iCloud's.

That being said, let me walk you through the steps to transfer your photos to Google Photos.

As mentioned, in 2021, Apple made it easier to migrate your photos from iCloud to Google Photos. By easier, I mean you no longer have to download the photos to your Mac and then upload them to Google Photos.

Instead, Apple will take care of the transfer at your request. See the steps below to request Apple to send a copy of your data to Google Photos.

Follow the steps below to request Apple to transfer a copy of your iCloud data.

Once you've completed all the steps above, it will take 3 to 7 days before your photos are transferred to Google Photos. If you decide to cancel the transfer, any photos that have already been transferred to Google Photos will remain on Google's servers.

Requesting Apple to transfer photos and data from iCloud to Google Photos can be done by iCloud users in more than 240 countries. Note that transferring photos does not erase them from iCloud.

Instead, Apple will send a copy of your photos to Google. That also means that Apple will not alter your photos in any way. The copy sent to Google is identical to what you stored on iCloud.

Another thing to remember here is that some data and formats used with iCloud Photos may not be available in Google Photos. These include Live Photos, Smart Albums, and RAW image file support.

Apple maintains a list of file formats that it can transfer to Google Photos in a support document. You may want to check the fine print of the process first before you proceed.

Finally, before you go ahead with requesting Apple to transfer your photos from iCloud to Google Photos, make sure you meet the additional requirements listed below.

Unfortunately, since launching the service, Apple has not offered this kind of support for any additional cloud storage services. However, that doesn't mean you can't transfer your photos to other services. Many, including Dropbox, offer their own migration tools to transfer your iCloud Photos library over. You can also do it manually by downloading your photos from iCloud to your Mac and then uploading them to your chosen cloud storage service.
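For that manual route, a small script can walk an exported iCloud Photos folder and push each file to whatever destination you choose. The sketch below targets an S3-compatible bucket purely as an illustration; the export folder and bucket name are assumptions, and a service such as Dropbox would need its own client library instead.

```python
# Hedged sketch of the manual path: iterate a local folder of photos exported
# from iCloud and upload each file to a destination of your choice.
# The destination here is an S3-compatible bucket; path and bucket are hypothetical.
import pathlib

import boto3

EXPORT_DIR = pathlib.Path("~/Pictures/icloud-export").expanduser()
DEST_BUCKET = "example-photo-archive"

s3 = boto3.client("s3")
for photo in EXPORT_DIR.rglob("*"):
    if photo.is_file():
        key = str(photo.relative_to(EXPORT_DIR))
        s3.upload_file(str(photo), DEST_BUCKET, key)
        print(f"uploaded {key}")
```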

Here is the original post:
How to Transfer Photos from iCloud to Google Photos - The Mac Observer

Missing the T? Data Storage ETL an Oversight, Says KNIME CEO – Solutions Review

Rather than seeing holistically how many people interact with content on our website, on our forum, or on social media, wouldn't it be nice to see activity grouped by organization? We'd see not just an individual's view of the most recent blog but also her colleagues' comments on LinkedIn the following day. It would be even better if we could see the connection between the two, enabling us to distinguish between high engagement on a single team and interest from a new department. Wouldn't it be great if an account manager tasked with growing a given account could spot patterns between support calls, social media comments, and online-store visits, even if some of that data came from a recently acquired company?

The biggest obstacle to continually making sense of (all of our) data is the nasty combination of ever-changing requirements, or questions seeking an answer, with ever-changing data sources that need continuous cleaning, transforming, and integrating. Without first organizing and adding structure to all those data sources, it's impossible to derive interesting insights. The prominent claim that data is the new oil is surprisingly apt. Like oil, data in its raw form is initially useless; only once you refine it does it become valuable and useful.

But how do I get to this state of well-organized data?

The solution for this used to be to build a data warehouse, i.e., define the one and only proper structure once and for all and then live with it. When that turned out to be infeasible, since data and data sources are ever-changing, data lakes became popular, until they also turned out to be, well, rather messy. Then things moved to the cloud, but that didn't really solve the problem of reaching and maintaining a state of well-organized data. Instead of solving it via smart (or not so smart) storage setups, meta-query or federated setups promise another answer. Still, they, too, only solve a part of the puzzle.

Keeping your data accessible goes beyond just figuring out how to store it. Teams also need a way for transformation (the T in ETL) to happen as needed without compromising resources or time. In this piece, we argue that low-code offers exactly that flexibility, giving anyone access to just the insights they need, as they need them.

But first, let's revisit what's been tried so far.

Data warehouses have been the holy grail for ages but are rarely spotted in real life. The truth is that they are easy to imagine, hard to design, and even harder to actually put to work.

Let's say we came up with the one true relational model to structure all the data floating around in an organization. In an automotive plant, for instance, perhaps your database holds manufacturing data (e.g., cycle times, lot priorities), product data (e.g., demands and yields), process data (e.g., control limits and process flows), and equipment data (e.g., status, run time, downtime, etc.). If you can make sure all this data is properly cleaned, transformed, and uploaded, which is a big if, then theoretically you'd see immediate benefits, because the architects of the data warehouse made it easy for you to ask specific questions of your data. Perhaps you'd be able to reduce costs related to equipment failures. Or better optimize inventory because you become familiar with the patterns of demand versus yields. Or improve end-of-line testing for higher product quality.
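As a toy illustration of what that "one true relational model" might look like for the plant example, here is a hedged sketch using SQLite. The table and column names are invented for illustration only, not the author's actual schema; the point is that the structure is fixed up front, which is exactly what makes later additions painful.

```python
# Toy sketch of a fixed warehouse schema for the automotive-plant example.
# Table and column names are hypothetical; the structure is decided up front,
# so new data sources later force schema rework.
import sqlite3

conn = sqlite3.connect("plant_warehouse.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS equipment (
    equipment_id INTEGER PRIMARY KEY,
    status       TEXT,
    run_time_h   REAL,
    downtime_h   REAL
);
CREATE TABLE IF NOT EXISTS manufacturing (
    lot_id        INTEGER PRIMARY KEY,
    equipment_id  INTEGER REFERENCES equipment(equipment_id),
    cycle_time_s  REAL,
    lot_priority  INTEGER
);
CREATE TABLE IF NOT EXISTS product (
    product_id INTEGER PRIMARY KEY,
    demand     INTEGER,
    yield_pct  REAL
);
""")

# A question the architects anticipated becomes a simple query,
# e.g. downtime per machine against the lots it touched.
rows = conn.execute("""
    SELECT e.equipment_id, e.downtime_h, COUNT(m.lot_id) AS affected_lots
    FROM equipment e LEFT JOIN manufacturing m USING (equipment_id)
    GROUP BY e.equipment_id
""").fetchall()
conn.close()
```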

But what happens when we want to add new data from a new machine? Well, we rework the relational model, something that is expensive, difficult, and often politically challenging. And what happens when we want to evaluate our CO2 footprint, so we need to connect data from suppliers and data from logistics? We, again, rework the relational model.

Even if people are successfully using our data warehouse to create new insights, new requirements will pop up that we did not think about when we first designed the structure of our warehouse. So rather than freezing that structure once and for all, this will quickly turn into a never-ending construction site, which will never have a coherent, consistent structure that includes all current data of interest. This will, at the very least, delay finding the answers to new questions but more likely make it simply impossible. Not at all the agile, self-service data warehouse we had in mind when we started this project years ago.

After data warehouses, the industry came up with the idea of a data lake: don't worry about structure (not even in the data itself), just collect it all and figure out later how to organize it when you actually need it. That was made possible by increasingly cheap storage facilities and NoSQL storage setups. Distributed mechanisms to process this data were also developed, MapReduce being one of the most prominent examples back then.

Our manufacturing, product, process, and equipment data is never cleaned or transformed but dumped, as-is, into one centralized storage facility. When analysts want to make sense of this data, they rely on data engineers to custom-build solutions that include cleaning and transforming for each bespoke question. Although we don't need to rebuild an entire relational model, data engineers do need to be involved in answering each and every business question. An old problem also resurfaced: lots of data keeps sitting across the organization in various formats and storage facilities, and even newer data continues to be generated outside of that swamp.

Data lakes force us, just like data warehouses, to ensure all data sits within that one house or lake; we just don't need to worry about structure before moving it there. And that's precisely the issue: the organizing of data into the proper structure still needs to be done; it just gets done later in the process. Instead of structuring the warehouse upfront, we now need to deal with the mechanisms to add structure to the data lake at the time when we look for insights in our data. And we need the help of data engineers to do that.
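A hedged sketch of what "structure later" looks like in practice: raw files are dumped as-is, and a data engineer writes a bespoke cleaning and joining step for each new business question. The file names and columns below are invented for illustration.

```python
# Hedged sketch: in a data lake, cleaning/transforming happens per question,
# typically written by a data engineer. File and column names are hypothetical.
import pandas as pd

# Raw, untouched dumps sitting in the lake.
equipment = pd.read_csv("lake/equipment_dump.csv")      # columns: machine, status, run_time, downtime
lots = pd.read_csv("lake/manufacturing_dump.csv")        # columns: lot_id, machine_id, cycle_time

# Bespoke transformation for one question: downtime per machine vs. lots produced.
equipment = equipment.rename(columns={"machine": "equipment_id"}).dropna(subset=["equipment_id"])
lots = lots.rename(columns={"machine_id": "equipment_id"})

answer = (
    lots.merge(equipment[["equipment_id", "downtime"]], on="equipment_id", how="left")
        .groupby("equipment_id")
        .agg(lots_produced=("lot_id", "count"), downtime=("downtime", "first"))
)
print(answer)
```

Every new question means another script like this one, which is exactly the dependency on data engineers the article describes.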

The next generation of this type of setup moved from on-premises distributed storage clusters to the cloud. The rather limiting MapReduce framework gave room to more flexible processing and analysis frameworks, such as Spark. Still, the two main problems remained: do we really need to move all our data into one cloud to be able to generate meaningful insights from all of our data? And how do we change the structure after it's been living in our data lake? This may work for a new company that starts off with a pure cloud-based strategy and places all of its data into one cloud vendor's hands. But in real life, data has existed before, outside of that cloud, and nobody really wants to lock themselves in with one cloud storage provider forever.

One big problem with all the approaches described so far is the need to put everything into one repository, be that the perfectly architected warehouse, my in-house data lake, or the swamp in the cloud.

Federated approaches try to address this by leaving the data where it is and putting a meta layer on top of everything. That makes everything look like it all sits in one location, but under the hood it builds meta queries ad hoc, which pull the data from different locations and combine them as requested. These approaches obviously have performance bottlenecks (Amdahl's law tells us that the final performance will always depend on the slowest data source needed), but at least they don't require a once-and-for-all upload to one central repository. However, querying data properly is much more than just building distributed database queries. Structuring our distributed data repositories properly for every new query requires expert knowledge for all but basic operations.
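A minimal sketch of the federated idea: leave the data where it is, push a sub-query down to each source, and combine the partial results in a thin meta layer. The sources and connection details below are placeholders, and, as the article notes, the overall latency is bounded by the slowest source.

```python
# Minimal federated-query sketch: each source answers its own sub-query and a
# thin meta layer combines the partial results. Source paths are placeholders.
import sqlite3

import pandas as pd

def query_source(path: str, sql: str) -> pd.DataFrame:
    # In a real federated system these would be different engines and locations.
    with sqlite3.connect(path) as conn:
        return pd.read_sql_query(sql, conn)

# Sub-queries pushed down to two independent stores.
orders = query_source("sales.db", "SELECT customer_id, SUM(amount) AS revenue FROM orders GROUP BY customer_id")
tickets = query_source("support.db", "SELECT customer_id, COUNT(*) AS open_tickets FROM tickets GROUP BY customer_id")

# The meta layer joins the partial results; it is only as fast as the slowest source.
combined = orders.merge(tickets, on="customer_id", how="outer")
print(combined.head())
```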

The central problem with all these approaches is the need to define the overall structure, i.e., how all those data storage fragments fit together: beforehand in the case of data warehouses, at analysis time for data lakes, and through automatic query building for federated approaches.

But the reality is different. In order to truly aggregate and integrate data from disparate sources, we need to understand what the data means so we can apply the right transformations at the right time to arrive at a meaningful structure in a reasonable amount of time. For some isolated aspects of this, automated (or even learning) tools exist, for instance for entity matching in customer databases. But for the majority of these tasks, expert knowledge will always be needed.

Ultimately, the issue is that the global decision of how we store our data is based on a snapshot of reality. Reality changes fast, and our global decision is doomed to be outdated quickly. The process of extracting insights out of all available data is bottlenecked by this one be-all-end-all structure.

This is why the important part of ETL, the Transformation, is either assumed to have been figured out once and for all (in data warehouses), completely neglected (in data lakes), or pushed to a later stage (in federated approaches). But pushing the T to the end has, besides making it someone else's problem, a performance impact as well. If we load and integrate our data without proper transformations, we will often create extremely large and inefficient results. Even just ensuring database joins are done in the right order can change performance by several orders of magnitude. Imagine doing this with untransformed data, where customer or machine IDs don't match, names are spelled differently, and properties are inconsistently labeled. It's impossible to get all of this right without timely domain expertise.
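To see why untransformed keys hurt, consider this hedged sketch in which machine IDs are spelled inconsistently across two sources; normalizing them first, using a rule only a domain expert would know is safe, keeps the join correct and small. The column names and the normalization rule are assumptions for illustration.

```python
# Hedged sketch: joining on untransformed keys ("M-01", "m02", "M 03") either
# drops matches or, with fuzzy matching, inflates the result. Normalizing the
# key first keeps the join correct. Names and rules are hypothetical.
import pandas as pd

sensors = pd.DataFrame({"machine": ["M-01", "m02", "M 03"], "temp": [71.2, 65.0, 80.4]})
maint = pd.DataFrame({"machine_id": ["M01", "M02", "M03"],
                      "last_service": ["2023-01-10", "2023-02-02", "2023-03-15"]})

def normalize_id(raw: str) -> str:
    # Domain rule supplied by the expert: strip separators, uppercase.
    return raw.replace("-", "").replace(" ", "").upper()

sensors["machine_id"] = sensors["machine"].map(normalize_id)

# With normalized keys, every sensor row matches its maintenance record exactly once.
joined = sensors.merge(maint, on="machine_id", how="inner")
print(joined)
```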

Transformation needs to be done where it matters and by the person who understands it.

Low code allows everybody to do it on the fly; SQL or other experts inject their expertise (code) where it's needed. And if a specific type of load, aggregate, transform process should be used by others, it's easy to package it up and make it reusable (and also auditable if needed, because it's documented in one environment: the low-code workflow). Low code serves as a lingua franca that can be used across disciplines. Data engineers, analysts, and even line-of-business users can use the same framework to transform data at any point of the ETL process.

Should that low-code environment be an integral part of (one of) our data storage technologies? Well, no, unless we plan to stick with that data storage environment forever. Much more likely we'll want to keep the door open to add another type of data storage technology in the future, or maybe even switch from one cloud provider to another (or go a completely hybrid path and use different clouds together). In that case, a low-code environment, which, after all, is by now home to much of our experts' domain expertise, should make it easy to switch those transformation processes over to our new data environment.

Why did warehouses fail, and why don't data lakes provide the answer either? Just as with software engineering, the waterfall approach doesn't work for dynamic setups with changing environments and requirements. The process needs to be agile, explorative when needed, and documentable/governable when moved into production. But since data transformation will always require expertise from our domain experts, we need a setup that allows us to add this expertise continuously to the mix as well.

In the end, we need to provide the people who are supposed to use the data with intuitive ways to create the data aggregations and transformations themselves from whatever data sources, however they want. And at the same time we want to keep the doors open for new technologies that will arise, new tools that we want to try out, and new data sources and types that will show up.

Michael Berthold is co-founder of KNIME, the open analytics platform. He recently left his chair at Konstanz University and is now CEO of KNIME AG. Before that, he held positions in both academia (Carnegie Mellon, UC Berkeley) and industry (Intel, Tripos). He has co-authored several books (the second edition of the Guide to Intelligent Data Science appeared recently), is an IEEE Fellow, and a former president of the IEEE-SMC society.

Follow this link:
Missing the T? Data Storage ETL an Oversight, Says KNIME CEO - Solutions Review

Why are Rowan emails moving to the cloud? – The Whit Online

Information Resources and Technology (IRT) announced that students and faculty using Rowan-managed Windows computers and emails ending in @rowan.edu would be moving to the Microsoft cloud-hosted service Exchange Online on the weekend of March 11.

Exchange Online is a cloud-hosted service from Microsoft that provides email and calendaring. Exchange Online also offers an increased storage capacity of 100 gigabytes per mailbox, an upgraded Outlook for web experience and better reliability and accessibility.

Erin O'Neill, Assistant Director of Communications for IRT, explained the university's reasoning for switching to cloud services.

"Rowan University is working to build greater elasticity, resiliency and agility into our campus infrastructure, as well as ensure accessibility and realize greater efficiency. One of the ways we are achieving those goals is by migrating new and existing services, when appropriate, to the cloud. Email for employees and medical students was previously hosted in Rowan University's data centers," O'Neill said.

This move primarily affects medical students at Rowan-Virtua SOM, CMSRU and some graduate students whose data and information were previously housed in Rowan's data centers. Undergraduate students and the remaining graduate students with email addresses ending in @students.rowan.edu will continue to have their information stored on Google Cloud.

Those impacted by this change must upgrade their Windows application to Microsoft 365, either on their own or by selecting the upgrade prompt sent from IRT.

Data stored in cloud storage is transferred to physical servers maintained by third parties, who are in charge of keeping that information safe and accessible to users through public and private internet connections.

"These students may still access their email as they always did, but they may have had to take steps to reconfigure their access following the move to Exchange Online. You can find more information on our website at go.rowan.edu/email," O'Neill said.

Cloud servers such as Microsoft's are considered an extremely secure environment for storing data, as they meet strict security requirements.

"By moving those accounts to Exchange Online, we were able to increase storage capacity, reduce ongoing costs, provide an improved online experience for accessing email and improve the reliability and accessibility of email," O'Neill said.



Continue reading here:
Why are Rowan emails moving to the cloud? - The Whit Online

Cloud Security Market is Set to Grow at a CAGR of 13.9% Leading to … – AccessWire

WILMINGTON, DE / ACCESSWIRE / March 31, 2023 / Transparency Market Research Inc. - According to TMR, the global cloud security market is estimated to grow at a CAGR of 13.9% during the forecast period of 2023-2031.

The market research report suggests that increase in adoption of cloud computing, rise in cyber threat landscape, and need for centralized security management have opened new avenues for market growth. Furthermore, surge in trend of online working models has created immense opportunities for business growth.

Transparency Market Research provides deep insights into company profiles, product ranges, business verticals, and developments. This exhaustive report proves to be crucial in understanding the current market scenario and helps stakeholders make proper decisions.

Get the Recently Updated Report on the Cloud Security Market as a Sample Copy at - https://www.transparencymarketresearch.com/sample/sample.php?flag=S&rep_id=197

Cloud security helps secure cloud storage from security risks such as data breaches and cyber threats. Consequently, the business trend toward greater scalability and agility, along with demand for cost-effective solutions, has played a vital role in shaping the global industry.

Substantial demand for cloud-native security solutions and the development of advanced IT security models such as zero-trust security have proved to be pivotal in supporting market development. The significant need to secure data has led to substantial investments in security systems. This is driving the cloud security market, according to TMR market analysis.

Cloud Security Market: Growth Drivers

Get Customization on this Report for Specific Research Solutions: https://www.transparencymarketresearch.com/sample/sample.php?flag=CR&rep_id=197

Key Findings of Cloud Security Market

Cloud Security Market: Regional Dynamics

North America is projected to dominate the global market due to the emerging trend of adoption of cloud technology across various organizations in the region. Rise in end-use applications in various industries, including IT & Telecom, manufacturing, government, retail, and e-commerce, is likely to fuel the market growth.

The market in Asia Pacific is also estimated to witness significant growth owing to the rise in security threats in the region. Additionally, substantial investments in security systems and solutions to ensure the protection of applications and data have created growth prospects for the cloud security market. Concurrently, rise in focus of government bodies of various countries such as India and China on protecting data may accelerate industry growth.

Cloud Security Market: Competitive Landscape

Key service providers are offering novel products with enhanced features to expand their market reach. Advanced technologies include the integration of applications and cloud security. Market players are focusing on R&D activities to develop technologically advanced products. Strategic collaborations with various stakeholders to enhance cloud security features have led to the simplification of the cloud computing process. This is expected to boost market progress. The competitive landscape in the cloud security market is intense, with companies competing on product quality, innovation, and customer support. Companies that differentiate themselves and offer unique value propositions are more likely to succeed in this highly competitive market.

Key Points from TOC:

Preface

1.1. Market Introduction

1.2. Market Segmentation

1.3. Key Research Objectives

2. Assumptions and Research Methodology

2.1. Research Methodology

2.1.1. List of Primary and Secondary Sources

2.2. Key Assumptions for Data Modelling

3. Executive Summary: Global Cloud Security Market

4. Market Overview

4.1. Market Definition

4.2. Technology/ Product Roadmap

4.3. Market Factor Analysis

4.3.1. Forecast Factors

4.3.2. Ecosystem/ Value Chain Analysis

4.3.3. Market Dynamics (Growth Influencers)

4.3.3.1. Drivers

4.3.3.2. Restraints

4.3.3.3. Opportunities

4.3.3.4. Impact Analysis of Drivers and Restraints

4.4. COVID-19 Impact Analysis

4.4.1. Impact of COVID-19 on Cloud Security Market

4.5. Market Opportunity Assessment - by Region (North America/ Europe/ Asia Pacific/ Middle East & Africa/ South America)

4.5.1. By Security Type

4.5.2. By Service Model

4.5.3. By Enterprise Size

4.5.4. By End-user

TOC Continued

Buy this Premium Research Report | Immediate Delivery Available - https://www.transparencymarketresearch.com/checkout.php?rep_id=197

Prominent players operating in the global market are:

Cloud Security Market: Segmentation

Security Type:

Service Model

Enterprise Size

End-user

Regions

Latest IT & Telecom Industry Reports:

Key Developments in Global Sports Technology Market

Key Players in Unified Communication-as-a-Service (UCaaS) Market

WebRTC Market Outlook 2031

Demand for Better Customer Experiences to Drive Global Retail Analytics Market

Key Players in Global Virtual Reality in Gaming Market

Key Developments in Peer-to-Peer Lending Industry

Increase in Construction Activities to Drive Building Information Modeling (BIM) Industry

IoT in Healthcare 2023

Chatbot Industry Trends

Size of the IT Asset Disposition Market

Industrial Artificial Intelligence Growth Statistics

eGRC Software Market Size

About Transparency Market Research

Transparency Market Research, a global market research company registered in Wilmington, Delaware, United States, provides custom research and consulting services. Our exclusive blend of quantitative forecasting and trends analysis provides forward-looking insights for thousands of decision makers. Our experienced team of Analysts, Researchers, and Consultants uses proprietary data sources and various tools & techniques to gather and analyze information.

Our data repository is continuously updated and revised by a team of research experts, so that it always reflects the latest trends and information. With a broad research and analysis capability, Transparency Market Research employs rigorous primary and secondary research techniques in developing distinctive data sets and research material for business reports.

Contact:

Nikhil Sawlani
Transparency Market Research Inc.
CORPORATE HEADQUARTER DOWNTOWN
1000 N. West Street, Suite 1200, Wilmington, Delaware 19801 USA
Tel: +1-518-618-1030
USA - Canada Toll Free: 866-552-3453
Website: https://www.transparencymarketresearch.com
Blog: https://tmrblog.com
Email: [email protected]

SOURCE: Transparency Market Research Inc.

Read the rest here:
Cloud Security Market is Set to Grow at a CAGR of 13.9% Leading to ... - AccessWire

NetApp sheds light on state of cloud complexity and DX – SecurityBrief New Zealand

NetApp has released the 2023 Cloud Complexity Report, a global survey exploring how technology decision makers are navigating cloud requirements coming from digital transformation and AI initiatives and the complexity of multicloud environments.

The report found that 98% of senior IT leaders have been impacted by increasing cloud complexity in some capacity, potentially leading to poor IT performance, loss in revenue and barriers to business growth.

Ronen Schwartz, Senior Vice President and General Manager, Cloud Storage, NetApp, says, "Our global research report highlights paradigm shifts in how technology leaders look at and manage their cloud initiatives. As cloud adoption accelerates and businesses innovate faster to compete, technology leaders are facing growing pressure to juggle multiple priorities at once causing many to rethink how they manage efficiency and security in this new environment."

Gabie Boko, Chief Marketing Officer, NetApp, comments, "Our global survey data demonstrates the extreme complexity of modern IT environments, and the pressure technology executives are under to show measurable outcomes from cloud investments. At NetApp, we've simplified the complex through our approach, which enables technology executives to increase the speed of innovation, lower costs and improve consistency, flexibility and agility across on-premises and cloud environments."

Key findings from the report include the following:

Cloud complexity hits boiling point

Data complexity has reached a tipping point for companies globally, and tech executives are feeling the pressure to contain its impact on the business. However, technical and organisational challenges may stunt their cloud strategies, with 88% citing working across cloud environments as a barrier, while 32% struggle just to align on a clear vision at the leadership level.

In Asia Pacific, the top business impacts due to increasing complexity of data across their cloud environments are increased skepticism over cloud from leadership (47%), staff not taking full advantage of business applications (47%), increased cybersecurity risk (45%), and lack of visibility into business operations (41%), the survey shows.

Sustainability drives demand for cloud

NetApp finds that sustainability has become an unexpected cloud driver, with nearly eight in ten tech executives citing ESG outcomes as critical to their cloud strategy. However, return on investment (ROI) is a concern among leadership, with 84% of tech executives saying their cloud strategy is already expected to show results across the organisation.

Nearly half of tech executives (49%) report that when cloud strategy discussions happen, cost concerns come up often or all the time. Data regulation and compliance is another cloud driver, with various local regulations promoting their multicloud strategy most or some of the time.

In APAC, 86% of tech executives are already expected to show results across the organisation. Furthermore, 80% of executives in APAC say cloud systems are developed with sustainability goals specifically in mind. Three out of four tech (75%) APAC executives say their multicloud strategy is driven by data sovereignty requirements.

AI increasingly considered a top option

In the next year, more than a third (37%) of tech executives report that half or more of their cloud deployments will be supported by AI-driven applications. Nearly half of tech executives at smaller companies, those with fewer than 250 employees, expect to reach the 50% mark in the next year, and 63% by 2030, while larger companies lag.

In APAC, 56% of tech executives report that half or more of their cloud deployments will be supported by AI-driven applications by 2030. This presents a long-term growth opportunity for AI-driven applications in the region.

Matthew Swinbourne, CTO, Cloud Architecture, NetApp Asia Pacific, comments, "APAC leaders today recognise cloud's importance in producing critical business outcomes such as data sovereignty and sustainability. By addressing the cloud complexity confronting their organisations, they can unlock the best of the cloud and innovate faster to compete.

"With NetApp's unique combination of expertise, capabilities and hyperscaler partnerships, we help customers use the clouds they want, the way they want, while optimising for cost, risk, efficiency, and sustainability."

Follow this link:
NetApp sheds light on state of cloud complexity and DX - SecurityBrief New Zealand

DU Graduate Committed to Building a Better Future – University of Denver Newsroom

As the world shifted to remote work and online learning in the spring of 2020, Cordell Covington took on a major challenge: earning a master's degree while working full-time.

When he graduates this June with a Master of Science in Organizational Leadership from University College and an MBA from the Daniels College of Business, Covington will have accomplished that goal twice over.

Before arriving at DU, Covington spent four years building his leadership skills as an operations and supply chain manager in the Army while stationed at Fort Carson in Colorado Springs. Covington, whose father served in the Air Force and whose brother joined the Marine Corps after high school, says that he started to consider enlisting in the Army after he transferred to Southern Illinois University Edwardsville during his undergraduate studies. "Growing up in an Air Force family, I always had the itch to join the military," he says.

Already partway through earning a bachelor's degree in psychology, he chose to postpone his enlistment until after graduation. "I decided to stick with it," he says. "I was already committed to my education, so I finished that."

With a diploma in hand, Covington moved home for three months before shipping off to basic training and later being stationed at Fort Carson. When he left the Army in 2016, Covington jumped right into a job with the U.S. Chamber of Commerce before earning management positions at Oracle and the Colorado Athletic Club.

Just as the coronavirus pandemic turned the world upside down in 2020, Covington took on a new role as strategy and business development manager with ShareScape, a medical- and legal-focused cloud storage startup.

While settling into his new job, Covington says he was drawn to University College's organizational leadership program, with its remote and hybrid course offerings and commitment to ethical leadership. "That's very high on my priority list," he says. "We've all seen it now and in recent times; unethical leadership can destroy an organization, destroy morale or create hostile environments." And with a concentration in strategic innovation and change, Covington combined the core tenets of leadership with the practical skills needed to effect positive change throughout the organizations he is involved with.

In 2021, Covington was accepted into the professional MBA program at the Daniels College of Business and began taking classes in both programs simultaneously. After traveling to Copenhagen with his PMBA cohort, he added a custom concentration in sustainability. Inspired by the advancements in corporate social and environmental sustainability he witnessed during the trip, Covington says he wants to put those skills to use. "After meeting with all the companies, I was like, 'Hey, I like what they're doing here.' I wanted to learn more, and to be able to bring that back here and expand on organizational sustainability goals internally," he says.

After completing both master's programs in June, Covington wants to put his leadership skills to work. "I would really love to get into management consulting or to find an opportunity in diversity, equity and inclusion consulting," he says. "That's where I want to go, and I'm going to try and get there with everything I've accumulated over the last three years."

And with a passion for environmental, social and economic sustainability and the skills to match, Covington has what it takes.

"I'm here to make my family proud, make those who supported me along the way proud and really to have a positive impact on those I'm around, my community and those who I work with," he says. "I'm here to implement some of the positive change that we need in the world."

After ending a three-year stint as a double master's student working full time, Covington also plans to spend more time with his wife and their two dogs, a Weimaraner named Storm and a Belgian Malinois mix named Atlas, get back onto the basketball court, and take some time off to enjoy a hard-earned vacation.

More here:
DU Graduate Committed to Building a Better Future - University of Denver Newsroom

StorONE boss: We’re heading towards an IPO Blocks and Files – Blocks and Files

Gal Naor, StorONE co-founder and CEO, claims the company is growing very fast and he wants an IPO, possibly within two years.

Naor was a joint founder of Storwize, the real-time compression storage software company started in 2004 and acquired by IBM in 2010 for $140 million. As Storwize had raised $38 million in funding, it was a successful exit. Naor started up StorONE as its CEO with CTO Raz Gordon the following year, 2011, and raised $30 million in funding in 2012.

The sales pitch was to remove inefficiencies from the storage software stack and produce array software that could provision file, block and object storage from the same underlying drives, delivering near full IO speed from the drives without being locked in to any class or capacity of drive type. The product was delivered eight years later.

Now, at an IT Press Tour briefing in Tel Aviv, Naor said the company was cash-flow positive. Ultimately, the aim for the company will be to demonstrate another six or more cash-flow-positive quarters, after which it could potentially IPO. The time is not right for an IPO, Naor said. Why? Because it has only just scored cash-flow positivity, he told us, and the world is facing general economic problems stemming from COVID, the Ukraine-Russia war, bank problems and so forth. These factors could change by 2025, and then an IPO might be a more realistic prospect.

StorONE doesn't seem to have burnt through very much VC cash, with its tech announcements and performance announcements relatively low key.

Why does StorONE think customers are buying its product? From its point of view, the key differentiator is its storage software, the ONE storage engine, a hypervisor that makes better use of a server's compute and memory resources, and of the attached disks and solid state drives, than other, legacy storage software systems. Naor said it represents a completion of the virtualization of computer systems started by VMware (virtual machines) and SDN (software-defined networking).

It is a single all-in-one storage provisioning and management system with block, file and object access protocols to virtual storage containers. StorONE is basically an abstracting storage entity which hides the details of storage drives and their protection from client applications and provides a single storage silo. It treats all drives the same in that it can include different capacity drives in its virtual RAID (vRAID) system. It also supports tiering to different drive classes but does not support data reduction. That's because drive capacity these days is cheap enough for it not to be needed, unlike the time when Storwize was being developed, when drive capacity was limited and expensive.

StorONE protects its data with immutable snapshots, millions of them, providing a highly granular rollback capability. It also monitors snapshot creation rates and if these spike, perhaps due to a ransomware attack, then this is detected and alerts dispatched.
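The article does not describe how StorONE implements that detection, but the general idea of flagging a spike in snapshot-creation rate against a recent baseline can be sketched. This is an illustrative sketch only; the window size, threshold, and sample counts are invented.

```python
# Illustrative sketch (not StorONE's implementation): flag a ransomware-style
# anomaly when the snapshot-creation rate jumps far above its recent baseline.
# Window size, threshold, and sample counts are assumptions.
from collections import deque

WINDOW = 12         # recent intervals used as the baseline
SPIKE_FACTOR = 5.0  # alert when the current rate exceeds 5x the baseline mean

def monitor(snapshot_counts):
    """snapshot_counts: iterable of snapshots created per interval."""
    history = deque(maxlen=WINDOW)
    for interval, count in enumerate(snapshot_counts):
        baseline = sum(history) / len(history) if history else None
        if baseline and count > SPIKE_FACTOR * max(baseline, 1.0):
            print(f"interval {interval}: ALERT - {count} snapshots vs baseline {baseline:.1f}")
        history.append(count)

# Steady workload, then a sudden burst that would trigger an alert.
monitor([10, 12, 11, 9, 10, 13, 11, 10, 220, 240])
```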

Because its software is so efficient, Naor claims, StorONE gets more performance from storage drives, in IOPS terms, than competing software. That lowers its costs to customers, who buy its software based on the number of drives it manages, not the amount of capacity it provides.

He told us customers buy StorONE software for one workload, and then find it is fast enough and economical enough in its use of host server resources that it can run other workloads as well: the classic land-and-expand idea. However, Naor would not say how many customers are using StorONE's software, nor its revenue growth rate, average capacity managed or any other statistic that might provide a hint of actual revenue numbers.

StorONE's strategy is to be the single storage engine needed by its customers. As they adopt bigger drives, so too does StorONE. As they embrace the public cloud, so too does StorONE. It runs on Azure and is available in Azure's marketplace. Its software runs on AWS, but that has not been publicly announced as available, and we understand the same is true for the Google Cloud Platform.

Customers are embracing cloud-native software development using Kubernetes and StorONE has a CSI driver.

The company has five offices and has concentrated its sales growth efforts in the US. It's now broadening that to focus more on Europe and Asia, using channel partners. It has recruited certified ethical hacker Jeff Lamoche to be its chief product evangelist, and in two months he's created five white papers and started running ransomware recovery workshops on StorONE's capabilities.

We think StorONE's progress is accelerating and its relatively under-the-radar approach is changing. Think of it as a supplier of super-efficient, Ceph-like storage that fulfils pretty much any general enterprise storage requirement.

Link:
StorONE boss: We're heading towards an IPO Blocks and Files - Blocks and Files

Public Cloud Market to hit $1 Tn by 2032, Says Global Market Insights Inc. – Yahoo Finance


Major public cloud market participants include Alibaba Group Holding Limited, SAP SE, Oracle Corporation, IBM Corporation, Tencent Cloud and Nutanix.

Selbyville, Delaware, March 29, 2023 (GLOBE NEWSWIRE) --

The public cloud market valuation is expected to reach USD 1 trillion by 2032, as reported in a research study by Global Market Insights Inc.

The growing adoption of cloud computing solutions in developing countries is slated to have a positive impact on the industry outlook. Rapid uptake in emerging countries such as India and Singapore, due to thriving digitalization and the development of advanced network grid infrastructure, is playing a key role in establishing cloud-operated businesses. In September 2022, the Government of India announced an investment of USD 30 billion for digital transformation in rural areas that will ensure good quality, high-speed data connectivity across the nation.

Request for a sample of this research report @ https://www.gminsights.com/request-sample/detail/5442

The public cloud market from the IaaS segment is poised to exceed USD 350 billion by 2032. Soaring digitization and a growing inclination toward cloud-based business operations are increasing the demand for IaaS, which offers inexpensive delivery of IT infrastructure such as computing and storage. Recently, in February 2023, Tencent Cloud, a technology solutions provider, signed an MoU with Saudi Arabia-based integrated service expert Mobily to provide IaaS products, such as cloud virtual machines, storage, and network solutions, in the Kingdom of Saudi Arabia.

The large enterprises segment held more than 40% of the public cloud market share in 2022, as a result of the huge amount of less-sensitive data possessed by large businesses that needs adequate storage. The public cloud is affordable and supports the integration of novel measures into business models. In February 2023, computer technology behemoth Oracle announced a seven-year cloud partnership with Uber, a mobility services firm. Oracle Cloud Infrastructure agreed to aid Uber in boosting innovation, storing client data, and modernizing its infrastructure.


Make an inquiry for purchasing this report @ https://www.gminsights.com/inquiry-before-buying/5442

The public cloud market from the media and entertainment applications segment will expand at over 12% CAGR through 2032. The adoption of public cloud is growing in the media & entertainment sector as it improves the effectiveness and quality of content streaming compared to on-premises infrastructure, which hampers content quality and demands significant maintenance. To cite an instance, in July 2022, Comcast Technology Solutions announced a deployment deal for its Cloud TV Suite with Deutsche Telekom. The company expanded its cloud solution portfolio for Deutsche Telekom's Magenta TV business landscapes.

The Europe public cloud market will account for over 20% of market revenue by 2032, as major companies are investing heavily in technical innovations for regional infrastructure. To quote an instance, in August 2021, tech leader Google announced an investment of more than USD 1 billion to establish a new Google Cloud region in Frankfurt. The company aims to accelerate digitalization in Germany and build a sustainable economy with advanced infrastructure and clean energy.

Top participants operating in the public cloud market are Alibaba Group Holding Limited, SAP SE, Oracle Corporation, IBM Corporation, Tencent Cloud, and Nutanix. These firms are expected to focus on advancements in product capabilities and engage in promising collaborations. For instance, in May 2022, IBM Corporation, an advanced technology firm, inked a collaborative agreement with Amazon Web Services Inc., an IT service management company, to enable SaaS software on AWS. This software supports automation, security capabilities, and AI & data, runs cloud-native on AWS, and is built on Red Hat OpenShift Service. This deal enabled users to access IBM SaaS software within the AWS Marketplace and integrate it with AWS services.

Partial chapters of report table of contents (TOC):

Chapter 2 Executive Summary
2.1 Public cloud 360 synopsis, 2018-2032
2.2 Business trends
2.2.1 Total Addressable Market (TAM), 2023-2032
2.3 Regional trends
2.4 Deployment model trends
2.5 Organization Size trends
2.6 Application trends

Chapter 3 Public Cloud Market Industry Insights
3.1 Introduction
3.2 Impact of COVID-19
3.2.1 North America
3.2.2 Europe
3.2.3 Asia Pacific
3.2.4 LATAM
3.2.5 MEA
3.3 Russia-Ukraine war impact
3.4 Industry ecosystem analysis
3.4.1 Platform providers
3.4.2 Service provider
3.4.3 System integrators
3.4.4 Distribution channel analysis
3.4.5 End-users landscape
3.4.6 Profit margin analysis
3.4.7 Vendor matrix
3.5 Technology & innovation landscape
3.6 Patent analysis
3.7 Key initiative and news
3.8 Regulatory landscape
3.8.1 North America
3.8.2 Europe
3.8.3 Asia Pacific
3.8.4 LATAM
3.8.5 MEA
3.9 Industry impact forces
3.9.1 Growth drivers
3.9.1.1 Increasing integration of big data, AI and ML with cloud
3.9.1.2 Increasing public cloud spending
3.9.1.3 Growing adoption of cloud computing solutions in developing countries
3.9.1.4 Rising deployment of IaaS and PaaS in SMEs
3.9.1.5 Cost effective and scalable
3.9.2 Industry pitfalls & challenges
3.9.2.1 Data privacy and information security concerns
3.9.2.2 Cloud wastage
3.10 Growth potential analysis
3.11 Porter's analysis
3.12 PESTEL analysis

Browse our Reports Store - GMIPulse @ https://www.gminsights.com/gmipulse

Browse Related Reports:

Multi-Cloud Security Market Size By Offering (Solution, Service), By Security Type (Data and Storage Security, Identity and Access Management (IAM), Disaster Recovery, Governance, Compliance), By Organization Size (Large Enterprises, SMEs), By End-Use (IT & Telecommunication, BFSI, Healthcare, Retail, Manufacturing, Government & Public Enterprises), COVID-19 Impact Analysis, Growth Potential, Regional Outlook, Competitive Market Share & Forecast, 2023-2032

https://www.gminsights.com/industry-analysis/multi-cloud-security-market

Open-Source Intelligence (OSINT) Market Size By Security (Human Intelligence, Content Intelligence, Big Data Security, AI Security, Data Analytics, Dark Web Analytics, Link/Network Analytics), By Technology (Big Data Software, Video Analytics, Text Analytics, Visualization Tools, Cybersecurity, Web Analysis, Social Media Analysis), By Application (National Security, Military & Defense, Private Sector, Public Sector), COVID-19 Impact Analysis, Growth Potential, Regional Outlook, Competitive Market Share & Forecast, 2023-2032

https://www.gminsights.com/industry-analysis/open-source-intelligence-osint-market

About Global Market Insights Inc.

Global Market Insights Inc., headquartered in Delaware, U.S., is a global market research and consulting service provider, offering syndicated and custom research reports along with growth consulting services. Our business intelligence and industry research reports offer clients penetrative insights and actionable market data specially designed and presented to aid strategic decision-making. These exhaustive reports are designed via a proprietary research methodology and are available for key industries such as chemicals, advanced materials, technology, renewable energy, and biotechnology.

See the original post:
Public Cloud Market to hit $1 Tn by 2032, Says Global Market Insights Inc. - Yahoo Finance