Category Archives: Cloud Storage

What Are The Risks Of Cloud Storage? – Geek Vibes Nation

Cloud storage is safe as long as you use a secure network. However, that's not the case for everyone, because there are some dangers associated with cloud storage.

Cloud storage carries a handful of risks, such as data security and privacy issues due to leaks or hacks, outages, and a lack of control over your data. The severity of these risks depends on the sensitivity of the data stored in the cloud and the measures a cloud service provider takes to protect it.

Today's post identifies the possible threats and vulnerabilities that businesses face when transferring applications or data to the cloud.

Security of Data

In some scenarios, unauthorized third parties may access sensitive or private data.

People are frequently duped by cloud services that offer hundreds of gigabytes of free storage. Then their data inexplicably vanishes from the service, or they start getting inundated with suspicious/spam emails.

Nonetheless, you would be wrong to think that scam websites or emails are the only sources of user privacy problems. Cloud storage is also offered by large organizations such as Microsoft and Google.

Large firms like Google and Microsoft are notorious for the volume of data they collect from consumers using their cloud services. This puts you at risk of having your data accessed by unauthorized third parties.

Less Control of Your Data

When storage is moved to the cloud, businesses and individuals lose some visibility and control over their data, as it is stored offsite by someone else.

When leveraging external cloud services, the cloud service provider (CSP) assumes responsibility for some of your data and may decide what to expose and what not to. This could adversely affect your organization through loss of revenue, mistrust from your clients, or missed opportunities.

Unreliable Cloud Service Providers

Cloud service providers may have poor infrastructure, leading to weak security, database failures, timeouts, and overflows. Some CSPs don't have strategies to prevent data loss.

It may be inconvenient or impossible for a client to personally check that the vendor is following the contract's terms, forcing the customer to rely on third-party audits, including certifications, rather than simply trusting the vendor.

Here are important factors customers should consider:

Moving Data is Difficult

Unreliable CSPs are very common. It's tough to move data to another cloud storage provider once you've signed up and started using one.

This phenomenon is known as vendor lock-in. It is a significant issue that affects medium-to-large enterprises that store large volumes of data with a single cloud provider.

If you have issues with that service, switching to another provider may be difficult due to the sheer volume of data and the effort involved in migrating it.

Data Loss

Beyond malicious hacks, data saved in the cloud can also simply be lost. The hosting company can permanently lose customer data if it inadvertently deletes the data or suffers a physical disaster, such as a fire or earthquake.

Additionally, a cloud storage user may lose access to their data if they encrypt it before transferring it to the cloud but lose the encryption key. This risk can be mitigated by keeping backup copies of your data on other storage, such as a SAN, NAS, or DAS.
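To illustrate the point about encryption keys, here is a minimal sketch (not taken from the article) of encrypting a file locally before it ever reaches a cloud provider, using Python's cryptography library. The file paths, and the idea of copying the key to separate backup media, are illustrative assumptions rather than a specific recommendation.

```python
# Minimal sketch: encrypt a file locally before uploading it to any cloud
# storage service, and keep the key backed up outside that provider
# (e.g. on a NAS or DAS). Losing the key means losing the data.
# Assumes: pip install cryptography; paths below are placeholders.
from cryptography.fernet import Fernet
from pathlib import Path

def encrypt_for_cloud(plain_path: str, enc_path: str, key_backup_path: str) -> None:
    key = Fernet.generate_key()             # symmetric key for this file
    Path(key_backup_path).write_bytes(key)  # back the key up OUTSIDE the cloud provider
    token = Fernet(key).encrypt(Path(plain_path).read_bytes())
    Path(enc_path).write_bytes(token)       # only this ciphertext gets uploaded

def decrypt_from_cloud(enc_path: str, key_backup_path: str) -> bytes:
    key = Path(key_backup_path).read_bytes()
    return Fernet(key).decrypt(Path(enc_path).read_bytes())

if __name__ == "__main__":
    encrypt_for_cloud("report.pdf", "report.pdf.enc", "/mnt/nas-backup/report.key")
```

The design point is simply that the ciphertext and the key live in different places, so a lost key (or a compromised provider) does not have to mean lost data.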

Final Thoughts on Cloud Storage

If cloud storage is fault-tolerant, it can be more efficient and reliable. It is difficult, however, to create a cloud that is fault-free.

It is important to always analyze the risks of using cloud services for confidential information and, where those risks are unacceptable, to fall back on alternative storage options.

Caroline is completing her degree in IT at the University of South California but is keen to work as a freelance blogger. She loves to write about the latest developments in IoT, technology, and business, and shares her ideas and experience with her readers.

Read more from the original source:
What Are The Risks Of Cloud Storage? - Geek Vibes Nation

From 16 days to 16 minutes, HYCU Simplifies Ransomware Protection and Recovery Across On-Premises and Public Cloud Environments Through New Protégé…

New Workload Support, Federal Gov Cloud, Security and Assessment Certification Program Designed to Eliminate Insidious Nature of Growing Cybersecurity Threats Crippling Midsize Enterprises

Boston, Massachusetts, April 26, 2022 (GLOBE NEWSWIRE) -- HYCU, Inc., a pioneering enterprise SaaS company specializing in on-premises and public cloud data backup and recovery as a service, today announced significant enhancements to HYCU Protégé designed to counter the rise of ransomware attacks facing organizations across both on-premises and public cloud environments. With ransomware preparedness, requirements for WORM-enabled and immutable storage along with enhanced data protection for greater availability and recoverability, the latest enhancements to HYCU Protégé are designed to provide an as-a-service foundation for data migration, protection and recovery, today and well into the future.

HYCU Protégé was introduced at ITPT 31 in June 2019. Since launching, the solution has been recognized as an honorable mention in the Gartner Magic Quadrant for Data Center Backup and Recovery Solutions, published July 2020, along with several other industry accolades. HYCU Protégé is the industry's only application-aware, lightweight cross-cloud migration and DR solution for enterprise applications and workloads and delivers application consistency, protected migration, 1-click DR and consolidated management across clouds.

Highlights of the latest enhancements to HYCU Protégé include:

Additional data protection for Dell EMC PowerScale OneFS: HYCU now supports native Isilon REST APIs, including the snapshot-change-list API, for efficient and fast incremental backups. HYCU customers can now easily scale across Isilon clusters with minimalist deployment, improving overall backup SLAs and Recovery Point Objectives (RPOs). Support extends from direct backups to on-prem or cloud-based object storage with no performance penalty.

Enhanced protection for public cloud: As security and compliance become more mainstream in the shared responsibility model with public cloud providers, HYCU now supports direct backups and archives to WORM-enabled object storage on Azure (Blob) and Google Cloud (Google Cloud Storage, GCS). This builds on existing support for performing backups to AWS S3 WORM-enabled object storage and eliminates the possibility of a backup being impacted. This also allows for quicker recovery on a remote site/cloud from an offsite copy (a generic sketch of WORM-style retention follows these highlights).

Simplified security credential management: Building on industry-leading support for Multi-factor Authentication (MFA), network segmentation, in-transit encryption, black-box architecture and multitenancy, HYCU customers can now centralize, simplify, and secure their application and system credentials in CyberArk Conjur Credential Store, an external credential store, for both organization users and tenants.

Advanced security and compliance for GovCloud use: To leverage solutions in GovCloud, HYCU customers need to deploy and manage them themselves or use a FedRAMP-authorized SaaS offering. As there are no such FedRAMP SaaS offerings available today, HYCU now provides single-click migration of on-premises workloads onto Azure Gov Cloud for a seamless, application-consistent lift-and-shift experience. This builds on support for Azure Gov Cloud Blob to store HYCU backups.
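As a generic illustration of the WORM-style object storage mentioned in the public-cloud highlight above (and not HYCU's own implementation), the following sketch sets a retention policy on a Google Cloud Storage bucket with the official google-cloud-storage Python client. The bucket name and retention window are placeholders.

```python
# Hedged sketch: a bucket-level retention policy makes objects effectively
# write-once-read-many (WORM) for the retention window, which is what keeps
# backup copies safe from tampering or premature deletion.
# Assumes: pip install google-cloud-storage, application default credentials,
# and a placeholder bucket name.
from google.cloud import storage

def enable_worm_retention(bucket_name: str, retention_days: int) -> None:
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    bucket.retention_period = retention_days * 24 * 3600  # seconds
    bucket.patch()
    # Optional and irreversible: locking prevents the policy from ever being
    # shortened or removed, the strictest guarantee GCS offers.
    # bucket.lock_retention_policy()

if __name__ == "__main__":
    enable_worm_retention("example-backup-bucket", retention_days=30)
```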

"The statistics are staggering. A ransomware attack occurs every 11 seconds. On average, it takes 16 days to recover from a ransomware attack. The estimated cost of ransomware attacks exceeds $50 billion since 2018. And, this is just the data from attacks that have been reported," said Simon Taylor, Founder and CEO, HYCU Inc. "At HYCU, we have focused our effort on supporting customers and partners to keep the lifeblood of their organizations, data and applications, protected, safe, secure and recoverable in the event of human error or malicious actor attack. With these latest updates to HYCU Protégé and extended support for our Federal customers with GovCloud, we are keeping our commitment to provide the right solution to address not only today's rising needs but those of future needs as well. These updates are also the culmination of work around R-Score, the public service initiative we created to help identify what needs to be done in order to be best prepared to recover in the event of a ransomware attack, along with tight collaboration from our engineering team and key customers and partners. I could not be more proud of the results of these efforts."

In addition, with the introduction of the CRN 5-Star PACE (Partners Accelerating Cloud Environments) Global Partner Program, HYCU laid the foundation for new ransomware assessment and recovery services for partners (see related news: "HYCU Ransomware Services and Certification Program Now Available to MSPs, MSSPs and CSPs Through Global Partner Program PACE").

"HYCU's growth has been extremely impressive over the past three years," said Philippe Nicolas, Founder and Principal, Coldago Research. "When you factor in tangible results from the approach they take with ransomware recovery, and the advancements they continue to make, it is no surprise to hear that customers have been able to recover from a ransomware attack from 16 days to 16 minutes. Today's announcements only illustrate the impact that HYCU is making with customers and partners to better manage, protect and recover mission critical workloads across on-premises to the public cloud. Bravo Simon and the HYCU team."

To learn more about the latest enhancements to HYCU Protégé, visit our blog. For more information on HYCU and how the company delivers solutions to protect and recover from ransomware, visit: https://www.hycu.com/, or follow @hycuinc and connect with HYCU on LinkedIn.

###

About HYCU

HYCU is the fastest-growing leader in the multi-cloud backup and recovery as a service industry. By bringing true SaaS-based data backup to both on-premises and cloud-native environments, the company provides unparalleled data protection, migration and disaster recovery to more than 3,100 companies worldwide. HYCU's award-winning, purpose-built solutions eliminate the complexity, risk and high cost of legacy-based solutions, providing data protection simplicity in a hyper-connected, multi-cloud world. Customers experience frictionless, cost-effective data backup and recovery, no matter where their data resides. Based in Boston, Mass., the company employs 300 people across the globe. Learn more at http://www.hycu.com.

Read this article:
From 16 days to 16 minutes, HYCU Simplifies Ransomware Protection and Recovery Across On-Premises and Public Cloud Environments Through New Protégé...

Internxt is valued at €40 million and plans to become the Spanish Web3 Google Drive – Novobrief

Internxt is an open-source, blockchain-based cloud storage service advocating for user privacy rights. This Valencia-based Web3 startup provides safe, secure, and GDPR-compliant digital storage designed to compete with those currently provided by Google, Microsoft, and Amazon.

During 2021 alone, Internxt's revenue and user base grew by more than 1000%. The Valencian startup aims to consolidate similar growth figures this year: it has already generated more revenue in Q1 2022 alone than during the whole of 2021. Internxt plans to generate a total of €4 million in revenue in 2022 with an impressive 90% gross margin.

The Web3 startup has recently been valued at €40 million after securing equity investment from Balaji Srinivasan, former CTO of Coinbase and former general partner at Andreessen Horowitz (a16z), and Wayra, the innovation hub from Spanish telecom giant Telefonica (Europe's biggest telecom company and the 5th biggest in the world).

Srinivasan is no stranger to tech startups. And no wonder he is investing in a Web3 cloud storage provider like Internxt. He is the co-founder of Earn.com (acquired by Coinbase), Counsyl (acquired by Myriad), Teleport (acquired by Topia), and Coin Center. He held the position of General Partner at Andreessen Horowitz and served as the CTO of Coinbase until 2019 before moving on to become one of the most prominent angel investors in tech and Web3.

Internxt now counts among its shareholders names as relevant as Juan Roig, Telefonica, Balaji Srinivasan, TheVentureCity, ESADE BAN, and Internxt's founder, Fran Villalba Segarra. The startup's team includes former talent from companies like CERN, ProtonMail, NordVPN, and CoverWallet, among others.

Web3 applications have become progressively more valued and publicly accessible over the last few years. At the same time, users are realizing how little personal data security current services afford. As such, projects like Brave, Signal, DuckDuckGo, and Ethereum are gaining popularity and support. Internxt is also part of that cohort. All of its services are designed for Web3, harnessing blockchain and end-to-end encryption to safeguard user privacy.

Millions of users are now switching to services built on technologies that enhance user privacy and data sovereignty. That is why the future ahead is incredibly promising, both for Internxt and the potential of Web3.

More:
Internxt is valued at €40 million and plans to become the Spanish Web3 Google Drive - Novobrief

Realtimecampaign.com Promotes the Advantages of Cloud Protection – Digital Journal

Many companies set up business services that connect workers via virtual or on-site offices. The designs require file and data storage that is accessible to all workers. Cloud services and storage options are efficient and convenient for organizations. Cloud protection is essential for all businesses that use cloud-based services or storage.

Improved Availability of Cloud Services

Security schemes improve the availability of cloud services to businesses and their workers. User accounts are available for all workers, and the administrator applies permissions to each account according to the employee's security clearance and role in the organization. Advanced security blocks breaches and attacks so that users can keep accessing cloud services. Availability of cloud services is of the utmost importance to companies that operate via a network, according to realtimecampaign.com.
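As a rough sketch of how permissions might be mapped to roles and clearance levels (a generic illustration, not any specific vendor's access-control model), consider:

```python
# Generic sketch of role/clearance-based permissions for cloud accounts.
# The roles, clearance levels and actions below are illustrative assumptions.
from dataclasses import dataclass

ROLE_PERMISSIONS = {
    "analyst":  {"read"},
    "engineer": {"read", "write"},
    "admin":    {"read", "write", "share", "delete"},
}

@dataclass
class Account:
    user: str
    role: str
    clearance: int  # e.g. 1 = public data only, 3 = confidential data

def is_allowed(account: Account, action: str, data_clearance: int) -> bool:
    """Permit an action only if the role grants it and the user's clearance
    is at least as high as the data's classification."""
    return action in ROLE_PERMISSIONS.get(account.role, set()) \
        and account.clearance >= data_clearance

if __name__ == "__main__":
    alice = Account("alice", "engineer", clearance=2)
    print(is_allowed(alice, "write", data_clearance=2))   # True
    print(is_allowed(alice, "delete", data_clearance=1))  # False: role lacks delete
```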

Enhanced Data Security

Cloud services provide storage for company files and data. The internet-based storage allows workers to access files and data via their user accounts. No one who doesn't have clearance is permitted to access, view, or alter the data. Data security is critical for all network and data systems. Cloud services require the same security schemes to protect the data that protect the network. According to "Cybersecurity Mesh: IT's Answer to Cloud Security," the security schemes must meet all IT standards and current regulations.

Advanced Threat Detection

Cloud storage and services must have threat detection to stop cyber attacks. Businesses must monitor the data system to prevent unauthorized access to the information. Customer data must be protected effectively. Federal regulations and IT standards enforce strict penalties if a customer incurs a financial loss after a breach. Internet-based services and storage require data encryption. Business owners can contact their providers for more information about cloud services and threat detection.

Compliance with Standards and Federal Regulations

Businesses choose cloud-based services to get IT support and off-site administration services. The vendor that provides the services applies all current IT standards to the business services. Administrators monitor the services and create a log of all unauthorized attempts to access or use the business services. Vendors such as Radware offer guidance about cloud services and features. The service providers can clarify what IT standards or regulations apply to cloud-based designs.

Authorization and Authentication for All Users

Each time a worker signs into the service, a new entry appears in the software's log. Administrators assess any entries that come from unauthorized parties, and off-site workers review these entries and follow measures to stop future attempts. The log entry may provide the user's IP address, which the administrators can then block.

These measures protect the data and stop unauthorized use of cloud services. Administrators connect remote workers to the cloud services, which are available through an internet connection, so remote workers need safe ways to use them.
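A minimal sketch of the log-review workflow described above, grouping failed sign-in attempts by IP address so an administrator can block repeat offenders; the log format and threshold are assumptions made for illustration:

```python
# Sketch: scan a sign-in log for repeated failed attempts per IP address.
# Assumed log line format: "<timestamp> <user> <ip> <SUCCESS|FAILURE>".
from collections import Counter

def ips_to_block(log_lines, threshold: int = 5) -> list[str]:
    failures = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) == 4 and parts[3] == "FAILURE":
            failures[parts[2]] += 1
    return [ip for ip, count in failures.items() if count >= threshold]

if __name__ == "__main__":
    sample = [
        "2022-04-26T09:00:01 bob 203.0.113.7 FAILURE",
        "2022-04-26T09:00:05 bob 203.0.113.7 FAILURE",
        "2022-04-26T09:01:00 ann 198.51.100.2 SUCCESS",
    ]
    print(ips_to_block(sample, threshold=2))  # ['203.0.113.7']
```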

Cloud services are efficient and easy-to-use business services. Workers connect to the services via a user account. Administrators set up user accounts based on security clearances. All services must have cloud protection and robust security schemes. To find out more about cloud protection, contact a vendor now!

Media Contact
Company Name: Realtimecampaign.com
Contact Person: Media Relations
Email: Send Email
Phone: 407-875-1833
Country: United States
Website: Realtimecampaign.com

Read the original post:
Realtimecampaign.com Promotes the Advantages of Cloud Protection - Digital Journal

Why You Need To Start Saving All Your Important Memories, Documents, Pictures And Videos To The Cloud Right Now – Africa.com

Our mobile devices have become our portal to the digital world. Whether it's taking photographs with friends and family, recording videos of what will become treasured memories, or simply writing down important thoughts, everything is saved on our trusty sidekicks. But have you considered what would happen to these digital stories if your device were to get lost?

This is where a service like HUAWEI Mobile Cloud becomes a lifesaver. Simply put, the cloud is here to stay and it is an invaluable resource in helping you protect your most cherished memories. Using this freely available app, you can store everything in the cloud and keep your treasured photos, videos, documents and other files safe. Not only is this a safeguard against loss, but it also makes importing data to a new device as easy as logging into the HUAWEI Mobile Cloud storage service.

Even better, when using HUAWEI Mobile Cloud, you can automatically synchronise your content across all your devices. By using your HUAWEI ID on your smartphone, tablet, or PC, you instantly gain access to everything, anytime, anywhere.

Of course, cloud storage extends beyond just your digital memories. With HUAWEI Mobile Cloud, you can easily back up third-party app data. This means the data from all your favourite apps are saved in the cloud as well. Whether it is your CV, confidential documents or social media logins (the list is virtually endless), all of these documents are uploaded for fast retrieval whenever and wherever you need them.

Huawei is even throwing in 5GB of cloud storage for free. You heard right! You get enough space for the equivalent of 2,500 high-definition photographs or 50,000 documents just by downloading the HUAWEI Mobile app and creating a HUAWEI ID.

HUAWEI Mobile Cloud also has a very handy location tracking feature, so you can quickly locate your devices by just logging into the HUAWEI Mobile Cloud app. You can activate your ringtone to find your phone in the house or even erase all the data on your mobile remotely if it should ever fall into the wrong hands.

If you need more convincing, Huawei is currently running two exciting campaigns around the HUAWEI Mobile Cloud service. Pay with Ozow on Huawei and you could win a R1500 Takealot voucher; for more information, click here.

In the HUAWEI Cloud Pocket Your Treasure campaign, you can stand the chance to win 1 (one) of 10 (ten) Huawei prizes. All you need to do to enter is subscribe to a 50GB monthly plan at the 50% discounted rate of R7.50 and you can potentially win one of two HUAWEI MateBook Pro Laptops, one of five HUAWEI GT3 Watches, or one of three HUAWEI nova 8i smartphones. Ts and Cs apply.

Embrace HUAWEI Mobile Cloud and have peace of mind that your digital memories and documents will be kept safe and secure no matter what comes your way.

Original post:
Why You Need To Start Saving All Your Important Memories, Documents, Pictures And Videos To The Cloud Right Now - Africa.com

Spring Cleaning for Legal Teams: The Cloud and Defensible Deletion of Data – JD Supra

Due in large part to unlimited cloud storage and the ever-evolving list of new cloud-based tools and applications, organizations are creating more data than ever before. This can create risk, increase cost, and weigh down eDiscovery workflows. However, many leaders fear the risk of accidentally deleting required information or being unable to defend their deletion practices. Developing a sound data retention and deletion program and getting key stakeholders' buy-in to do so is critical for any successful data governance program. Bill and Rob are joined by Erika Namnath, Executive Director of Global Advisory Services at Lighthouse, to discuss key steps for defensible deletion of data and how to ensure participation and adherence throughout an organization.

Go here to see the original:
Spring Cleaning for Legal Teams: The Cloud and Defensible Deletion of Data - JD Supra

Carbon and the cloud: Why data may be part of the climate change problem – VentureBeat

Efforts to achieve carbon neutrality are on the rise, with a global consensus that climate change is an urgent concern that requires international action.

While carbon-intensive industries like manufacturing, transportation and agriculture are used to being on the receiving end of public criticism for their ecological shortcomings, data has emerged as another culprit.

In the spirit of Earth Day, it may be time to reconsider more sustainable solutions when it comes to data storage.

We're generating more data than ever before, which, of course, means that data centers are also growing to accommodate increasingly complex IT infrastructures.

But these data storage centers, both on premises and in the cloud, consume massive amounts of energy and electricity. With new devices connecting to the internet every second, according to a recent McKinsey study, data centers account for 1.8% of electricity use in the United States and contribute to 0.3% of all global CO2 emissions.

So if data is flowing like water trying to get out, eventually, it'll be nearly impossible to stay ahead of the dam breaking. Luckily, there are steps enterprises can take to plug the holes early and become truly green in the near future.

In the Exponential Climate Action Roadmap laid out by the World Economic Forum, digital technologies could help reduce global carbon emissions by up to 15%, one-third of the 50% reduction required by 2030.

On the surface, this sounds like a lofty goal. But, reducing your environmental footprint first requires understanding how your data storage is currently managed, both from an IT perspective and across other departments within your organization. This involves more than purchasing energy-efficient hardware.

Recently, Mendix, a low-code software development platform, and Atos, an IT service and consulting company, expanded their global partnership to drive digital decarbonization with their enterprise low-code solutions.

Atos is already in the process of building industry-specific solutions based on Mendix's platform to monitor, report and track real-time energy consumption and carbon emission across 1,800 locations globally.

At a time when the tech industry is booming, the global software developer shortage is predicted to reach 85.2 million workers by 2030. This developer drought necessitates a paradigm shift in the way companies develop and maintain software applications.

Low-code tools are designed to manage updates through automation, helping to bridge the skills gap by enabling non-programmers to build products without having to learn complex computer languages. This not only frees up IT teams for other vital tasks, but also drastically reduces time-to-market for their products.

Johan den Haan, CTO of Mendix, notes, "The goal of low-code is to deliver maximum value using minimal resources. If you want to decarbonize, you need to digitize. And if you want to digitize, you need to democratize your software with low-code solutions."

In addition to low-code, the term cloud-native computing has become somewhat of a catchall for the various systems and strategies required by software developers to create, deploy and manage modern software applications on cloud infrastructure.

Research from Veritas Technologies shows that 99% of organizations plan to increase workloads in cloud environments over the next three years, with 98% attributing the shift to sustainability strategies. However, storing data in the cloud takes energy to power their servers, storage equipment and cooling infrastructure, which largely comes from burning fossil fuels.

Eric Seidman, senior director at Veritas Technologies, says that evolving backup strategies to include advanced deduplication can help organizations meet their environmental impact goals and shrink their carbon footprints.

At a high level, deduplication refers to a process that eliminates redundant data. Most enterprises create multiple copies of the same datasets mainly to ensure that a backup exists at all times in the event of a hardware failure or security breach.

Though replication provides many benefits, U.S. Grid Emissions Factor data suggests that storing just 1 petabyte of unoptimized backup data in the cloud for one year could create as much as 3.5 metric tons of CO2 waste.
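To make the deduplication idea concrete, here is a simplified sketch of content-based deduplication, in which identical chunks of backup data are stored only once. Real backup products use far more sophisticated, variable-length chunking, so treat this purely as an illustration:

```python
# Simplified sketch of backup deduplication: split data into fixed-size chunks,
# hash each chunk, and store a chunk only the first time its hash is seen.
# Fixed-size chunking is an illustrative simplification.
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB chunks (arbitrary choice for the sketch)

def dedupe_backup(data: bytes, chunk_store: dict[str, bytes]) -> list[str]:
    """Return the ordered list of chunk hashes that describe `data`,
    adding only previously unseen chunks to `chunk_store`."""
    recipe = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in chunk_store:      # duplicate chunks cost no extra storage
            chunk_store[digest] = chunk
        recipe.append(digest)
    return recipe

if __name__ == "__main__":
    store: dict[str, bytes] = {}
    dataset = b"A" * CHUNK_SIZE * 3
    dedupe_backup(dataset, store)          # first backup copy
    dedupe_backup(dataset, store)          # second copy adds nothing new
    print(f"chunks stored: {len(store)}")  # 1, not 6
```

The effect is that keeping multiple backup copies of largely identical data no longer multiplies the storage, and therefore the energy, those copies consume.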

"In order to make data storage more efficient and less harmful to the climate, the lifecycle of the data needs to be intelligently managed," said Seidman.

To achieve this, organizations need to determine what data can be eliminated, compressed and/or optimized for storage in the cloud.

Doing so will result in conserved storage capacity, a reduction in idle time and significant cloud storage savings.

The more digitized our economy becomes, the more we'll continue to rely on data storage centers to support it.

But, the environmental implications behind data creation and storage can no longer be ignored. Key technical decision-makers must find environmentally sustainable ways to bring value to a company without moving at a glacial pace.

See more here:
Carbon and the cloud: Why data may be part of the climate change problem - VentureBeat

Pure Storage wants to work with data gravity, not against it – Blocks and Files

Pure Storage CEO Charles Giancarlo expressed two noteworthy views in an interview with Blocks & Files: that hyperconverged infrastructure doesn't exist inside hyperscaler datacenters, and that data needs virtualizing.

He expressed many noteworthy views, actually, but these two were particularly impressive. Firstly, we asked him if running applications in the public cloud rendered the distinction between DAS (Direct-Attached Storage) and external storage redundant. He said: "In general the public cloud is designed with disaggregated storage in mind, with DAS used for server boot drives."

The storage systems are connected to compute by high-speed Ethernet networks.

It's more efficient than creating virtual SANs or filers by aggregating each server's DAS in the HCI (hyperconverged infrastructure). HCI was a good approach generally, in the 2000 era when networking speeds were in the 1Gbit/s area, but now, with 100Gbit/s and 400Gbit/s coming, disassociated elements can be used and this is more efficient.

HCI's use is limited, in Giancarlo's view, by scaling difficulties, as the larger an HCI cluster becomes, the more of its resources are applied to internal matters and not to running applications.

Faster networking is a factor in a second point he made about data virtualization: "Networking was virtualized 20 years ago. Compute was virtualized 15 years ago, but storage is still very physical. Initially networking wasn't fast enough to share storage. That's not so now." He noted that applications are becoming containerized (cloud-native) and so able to run anywhere.

He mentioned that large datasets at petabyte scale have data gravity; moving them takes time. With Kubernetes and containers in mind, Pure will soon have Fusion for traditional workloads and Portworx Data Services (PDS) for cloud-native workloads. Both will become generally available in June.

What does this mean? Fusion is Pure's way of federating all Pure devices (on-premises hardware/software arrays and, off-premises, software in the public cloud) with a cloud-like hyperscaler consumption model. PDS, meanwhile, brings the ability to deploy databases on demand in a Kubernetes cluster. Fusion is a self-service, autonomous, SaaS management plane, and PDS is also a SaaS offering for data services.

We should conceive of a customers Pure infrastructure, on and off-premises, being combined to form resource pools and presented for use in a public cloud-like way, with service classes, workload placement, and balancing.

Giancarlo said datasets will be managed through policies in an orchestrated way, with one benefit being the elimination of uncontrolled copying.

He said: "DBMSes and unstructured data can be replicated 10 or even 20 times for development, testing, analytics, archiving and other reasons. How do people keep track? Dataset management will be automated inside Pure."

Suppose there was a 1PB dataset in a London datacenter and an app in New York needed it to run analysis routines? Do you move the data to New York?

Giancarlo said: "Don't move the [petabyte-level] dataset. Move the megabytes of application code instead."

A containerized application can run anywhere. Kubernetes (Portworx) can be used to instantiate it in the London datacenter. In effect, you accept the limits imposed by data gravity and work with them, by moving lightweight containers to heavyweight data sets and not the inverse. You snapshot the dataset in London and the moved containerized app code works against the snapshot and not the original raw data.

When the app's work is complete, the snapshot is deleted and excess data copying is avoided.
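Pure has not published this workflow as code; as a generic, hedged Kubernetes sketch of the same pattern (clone a volume from a snapshot in the London cluster and run the containerized analytics job there, instead of shipping the dataset to New York), here is one way it could look using the official Kubernetes Python client, with placeholder names and no Pure-specific APIs:

```python
# Generic sketch, not Pure's implementation: in the cluster that already holds
# the data (London), create a PVC cloned from a CSI VolumeSnapshot and run the
# containerized analytics job against it. All names/images are placeholders.
from kubernetes import client, config

def run_job_next_to_data(namespace: str = "analytics") -> None:
    config.load_kube_config(context="london")  # assumed kubeconfig context

    pvc = {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": "dataset-clone"},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "resources": {"requests": {"storage": "1Pi"}},  # sized for the example dataset
            "dataSource": {  # clone from an existing snapshot of the dataset
                "apiGroup": "snapshot.storage.k8s.io",
                "kind": "VolumeSnapshot",
                "name": "dataset-snap",
            },
        },
    }
    client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace, pvc)

    job = {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": "analytics-run"},
        "spec": {
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [{
                        "name": "analytics",
                        "image": "example.com/analytics:latest",  # the "megabytes of code"
                        "volumeMounts": [{"name": "data", "mountPath": "/data"}],
                    }],
                    "volumes": [{
                        "name": "data",
                        "persistentVolumeClaim": {"claimName": "dataset-clone"},
                    }],
                },
            },
        },
    }
    client.BatchV1Api().create_namespaced_job(namespace, job)
    # When the job completes, the clone PVC and the snapshot can be deleted.
```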

Of course data does have to be copied for disaster recovery reasons. Replication can be used for this as it is not so time-critical as an analytics app needing results in seconds rather than waiting for hours as a dataset slowly trundles its way through a 3,500-mile network pipe.

Giancarlo claimed: "With Pure Fusion you can set that up by policy and keep track of data sovereignty requirements."

He said that information lifecycle management ideas need updating with dataset lifecycle management. In his view, Pure needs to be applicable to the very large-scale dataset environments, the ones being addressed by Infinidat and VAST Data. Giancarlo referred to them as up-and-comers, saying they were suppliers Pure watched, although it didn't meet them very often in customer bids.

Referring to this high-end market, Giancarlo said: "We clearly want to touch the very large scale environment that our systems haven't reached yet. We do intend to change that with specific strategies." He gave no more detail about that. We asked about mainframe connectivity and he said it was relatively low on Pure's priority list: "Maybe through M&A but we don't want to fragment the product line."

Pure's main competition is from incumbent mainstream suppliers such as Dell EMC, Hitachi Vantara, HPE, IBM, and NetApp. "Our main competitive advantage," he said, "is we believe data storage is high-technology and our competitors believe it's a commodity. This changes the way you invest in the market."

For example, it's better to have a consistent product set than multiple, different products to fulfill every need. Take that, Dell EMC. It's also necessary and worthwhile to invest in building one's own flash drives and not using commodity SSDs.

Our takeaway is that Pure is bringing the cloud-like storage consumption and infrastructure model to the on-premises world, using the containerization movement to its advantage. It will provide data infrastructure management facilities to virtualize datasets and overcome data gravity by moving compute (apps) to data instead of the reverse. Expect announcements about progress along this route at the Pure Accelerate event in June.

See the rest here:
Pure Storage wants to work with data gravity, not against it - Blocks and Files

How tech-driven disruptions are driving the future of Indian logistics – Times of India

The Indian logistics sector is one of the biggest in the world, growing at an average of 10.5% every year and valued at around $215 billion. However, the infrastructure is still quite a bit behind in terms of implementing technological advancements. The Covid-19 outbreak has further exposed the impact of fragmented networks and poor infrastructure, and made fixing them more urgent.

Companies that were not too keen on investing in technology are now seeing its advantages and necessity. The technological advancements being made in the logistics sector can drive further disruptions, which would push the sector's efficiency higher. The logistics sector is undergoing a transformation and consolidation, organizing a highly fragmented industry. Resilience and agility are the need of the hour, and companies are adopting digital technologies to remain competitive in the face of disruptions.

Cloud computing

Cloud computing can help provide access to a shared pool of storage, applications and networks. This can help optimize asset utilization. With road transportation in India still fragmented, vehicles tend to remain idle or return empty. With cloud computing, logistics providers can collaborate with others to share fleets. Service providers will be able to share information using the cloud platforms and coordinate with each other to deliver and pick up goods. This reduces the amount of time vehicles remain idle, making the entire ecosystem more efficient.

Collaborative Execution: Preparing today's supply chains

Ever since the Covid pandemic hit the country, businesses have been facing unpredictable out-of-stock situations, stranded trucks, and a lack of visibility into their supply chains. The modern supply chain poses a host of new challenges for leaders to grapple with: increasing costs, a slew of disruptions, and the complexities posed by new distribution and sales channels all add to the difficulty of managing operations efficiently. There is an urgent need for improved demand and distribution visibility, integrating new channels, and improving control over quality and speed of delivery in both the first and last miles of the supply chain.

The next stage of productivity improvement is expected to be delivered by the deployment of technologies that facilitate collaborative execution, integrating processes across shipper and vendor organizations to provide end-to-end visibility and better control through transaction execution. With a tech-enabled supply chain, organizations can better allocate critical resources, and they have protocols and SOPs for their manufacturing facilities as well as field and office staff to continue operations and steer through disruptions.

Big Data analytics

Logistics providers generate a lot of data with every process and transaction. This data is usually fragmented and stored in various locations in the supply chain. But Big Data analytics can help parse through the data to find operational efficiencies.

Big Data can help with data mining and complex statistical analysis. This can surface key insights and trends in the supply chain that companies can use to optimize their processes. Companies can also develop algorithms to figure out which parts of the supply chain are inefficient and remove redundant costs. It can also help determine how much life remains in assets held by companies. All this information can help companies shape and implement a future strategy.

Cloud computing is also a great way to store the huge amount of data generated throughout the logistics supply chains. Companies will no longer need to invest in physical servers or even need hard drives, which can have a limited amount of storage space. With cloud storage, companies will also be able to access the data from anywhere. This allows them to monitor all their critical processes round the clock.

Machine learning and artificial intelligence

Machine learning is a technology being used by logistics providers to help automate the supply chain. Machine learning and artificial intelligence can streamline the process based on the data they've gathered related to tracking and internal functions. It brings down the turnaround time required to complete any process. It's a technology that helps reduce costs and improves the time in which orders are processed and delivered.

The Way Forward

It's important for Indian logistics players to invest in upgrading technologies to help their systems become smarter and more efficient, even if it means reinventing their processes. Adoption of process automation through robotics and artificial intelligence in transportation and warehouse management is increasing, which will ensure companies can reduce their reliance on human intervention. AI-driven automation, direct-to-consumer initiatives and the acceleration towards sustainability are also bringing complex technology solutions into focus.

With more disruption in technology set to come, companies should be ready and anticipate how their capabilities need to evolve. Companies can work on developing services that create demand rather than restricting themselves to meeting current demand. To do this, logistics companies will need to maintain strong relationships with customers and the market, so that they can be at the forefront of shaping the future.

Views expressed above are the author's own.

Read the original here:
How tech-driven disruptions are driving the future of Indian logistics - Times of India

2 Unstoppable Growth Stocks to Buy Now and Hold – The Motley Fool

Cloud computing is one of the trendiest, most transformative technologies of the past two decades. Organizations can now provision infrastructure and software services through the internet, eliminating the need for costly on-site hardware. To that end, cloud computing has fundamentally changed the way many businesses operate, making them more agile and efficient.

Going forward, research company Gartner believes that cloud spend will climb to $917 billion annually by 2025, implying 103% growth from $451 billion in 2021. With that in mind, Cloudflare (NET -3.32%) and DigitalOcean (DOCN -4.82%) look like smart stocks to buy and hold.

Here's why.

Image source: Getty Images.

Cloudflare is on a mission to make the internet faster. Its global network offers tremendous capacity, and its various data centers sit within 50 milliseconds of 95% of internet users worldwide, meaning it can move a lot of data very quickly. That infrastructure is a significant competitive advantage, and Cloudflare uses it to offer a range of cloud services that accelerate and secure corporate networks and applications.

To reinforce its edge, the company offers a free tier of service that has made its platform wildly popular -- Cloudflare's content delivery network powers 19% of the internet, more than every other cloud vendor combined. The company leverages that massive user base to try new products and collect data, and it leans on artificial intelligence to surface insights that accelerate and improve product development.

Not surprisingly, the company is growing like wildfire. Last year, Cloudflare increased its paying customer base by 26% to 140,000, and the average customer spent 25% more. In turn, revenue soared 52% to $656 million, and while the company is still unprofitable on a GAAP basis, it generated positive free cash flow of $8.6 million in the fourth quarter.

Going forward, Cloudflare has plenty of room to grow. Management puts its addressable market at $100 billion by 2024, and the company is an innovation machine. For instance, it recently added email security tools to its growing portfolio of zero-trust cybersecurity services, and the company is beta-testing a cloud storage service to support its application development platform.

In short, Cloudflare has already achieved a strong market presence, but shareholders have good reason to believe that trend will continue. That's why this high-growth tech stock is a buy.

DigitalOcean is democratizing cloud computing. It offers a growing number of infrastructure and platform services, including computer and networking solutions, managed databases, and software development tools. But the unifying theme is simplicity. Its platform enables clients to deploy cloud services in minutes, even if they lack robust IT support. That differentiates DigitalOcean from vendors like Amazon and Microsoft, both of which tailor their products to larger enterprises.

To reinforce its niche, DigitalOcean also provides 24/7 technical and customer support, and it has created an extensive library of developer tutorials and community-generated questions and answers. To that end, DigitalOcean is a lifeline for small businesses, allowing them to harness the power of the cloud to build and scale applications. That value proposition has generated strong demand.

Last year, the company grew its customer base by 6% to 609,000, and the average customer spent 16% more. In turn, revenue climbed 35% to $429 million, and the company generated positive free cash flow of $24 million, up from a loss of $58 million in the prior year. Better yet, shareholders have good reason to believe that momentum will continue or even accelerate.

DigitalOcean currently tailors its services to start-ups and other small businesses, but the company is adding new products and features at a good clip. For instance, it recently launched solutions for database migration, application development, and infrastructure monitoring. As its portfolio continues to evolve, DigitalOcean should scale alongside the small businesses on its platform, meaning its total addressable market (TAM) should continue to climb in the future.

On that note, management puts its TAM at $145 billion by 2025, and in light of its differentiated business model, this growth stock looks like a smart way to invest in cloud computing.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis, even one of our own, helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Read the rest here:
2 Unstoppable Growth Stocks to Buy Now and Hold - The Motley Fool