Category Archives: Cloud Storage
A Man Backing Up Millions Of GB Of Porn May Have Ended … – Indiatimes.com
There was a time when CDs and flash drives were a godsend, letting us digitally store anything we liked for later perusal. Sure, they're still used to some extent, but cloud storage has largely overtaken those formats in day-to-day use. We remotely save and access data every day from our computers and smartphones.
While many cloud storage providers offer a subscription-based unlimited plan, one user set out to find just how unlimited those plans actually are. More specifically, how much would Amazon's unlimited data plan let you hoard?
Reddit user beaston02 seemed determined to discover the upper limits of Amazon's cloud storage, whose unlimited plan was killed off earlier in June. In fact, others on Reddit speculate he may be a major reason the plan was discontinued. That's because beaston02 was hoarding a massive cache of porn in his cloud storage account. Specifically, one petabyte (1 million GB) worth of it.
To achieve the feat, beaston02 wrote code that would record public shows from a number of adult webcam sites, including CamSoda, Chaturbate, and MyFreeCams. "It is nearly entirely porn," he posted on Reddit. "Ever since I got into computers, I found myself learning more, and faster, when it was something more interesting. Call me crazy, but women interest me more than most other things on the internet and there is a huge amount of data being created daily, so it was a good fit for the project."
Beaston02 eventually ditched the project when his interest waned a couple of months ago, stopping just shy of the 1.8-petabyte mark and making his scripts available on GitHub. Other Redditors have now joined forces on what they're calling the Petabyte Porn Project, continuing to archive these public cam shows on both Amazon's and Google's cloud storage.
So just how much porn does that add up to? An average 720p video works out to about 2GB per hour. However, most cam websites stream at significantly lower quality, so about 480p, or roughly 0.7GB per hour, is a fairer average. At that rate, beaston02's 1.8 petabytes amount to approximately 293 years' worth of porn.
No matter the situation, no one can ever be that bored.
See the article here:
A Man Backing Up Millions Of GB Of Porn May Have Ended ... - Indiatimes.com
Person tests Amazon’s "unlimited" cloud storage by uploading 1.8 petabytes of porn – Boing Boing
A fellow who goes by the handle beaston02 wanted to see how unlimited Amazon's "unlimited" cloud storage plan was, so he uploaded 293 years' worth (nearly 1.8 million gigabytes, or 1.8 petabytes) of recorded adult webcam streams to his account.
From Motherboard:
Beaston02 told me he stopped recording streams simply because his interest in it waned. "I know plenty of people have labeled me some huge pervert or someone with a huge porn addiction, but that's really not me at all," he said. "I have more of a problem with collecting or hoarding data than I do with porn." He said he used the exercise to learn Python, SQL databases, and how to handle that much data. "The project ran its course, I got the knowledge I was hoping to get, and I just had no interest in it anymore."
While he's no longer running the scripts that collected the porn, he made them available on Github. Another Redditor, -Archivist, took up the cause with the "Petabyte Porn Project," recruiting other hoarders to help continue recording live public cam sessions all day every day. -Archivist told me in a Reddit message that this represents "upwards of 12 terabytes per day." Those helping hoard are close to two petabytes now, stored on Amazon's cloud and mirrored on Google Drive. Amazon did not respond to Motherboard's request for comment.
Amazon canceled its unlimited storage offering in June.
Continued here:
Person tests Amazon's "unlimited" cloud storage by uploading 1.8 petabytes of porn - Boing Boing
Hedvig Advances Private, Hybrid and Multi-cloud Storage with New Integrations, Security, and All-flash Capabilities – insideBIGDATA
Hedvig, the company modernizing storage and accelerating enterprise adoption of private and hybrid clouds, announced the availability of the Hedvig Distributed Storage Platform version 3.0, a new release of its software-defined storage (SDS) platform. Innovations in 3.0 include end-to-end integrations, advanced security capabilities and enhancements to its comprehensive suite of caching technologies. These 3.0 capabilities enable customers to store and protect virtualized, containerized, and backup workloads from a single platform.
"We've seen a sea change in customer requirements since we released the Hedvig Distributed Storage Platform two years ago. Enterprises require a platform that significantly simplifies their IT infrastructure and operations," said Avinash Lakshman, CEO and founder of Hedvig. "As more mainstream enterprises adopt software-defined storage, they seek technology that plugs into their existing architecture, natively protects against growing cybersecurity and compliance mandates, and future-proofs their infrastructure with innovations in flash technology."
As more businesses adopt software-defined storage and multi-cloud infrastructure, the flexibility to accommodate primary and secondary data in a single platform becomes critical. New certifications, plugins, encryption, auditing, multitenancy, and flash-caching capabilities found in Hedvig Distributed Storage Platform 3.0 offer organizations an elastic storage system that runs on-premises, in the public cloud, or both.
New CloudScale Plugins simplify integration of SDS into existing infrastructure and operations
Hedvig adds to its existing Docker and OpenStack CloudScale Plugins with new and improved plugins for Veritas, VMware, and Red Hat. CloudScale Plugins provide pretested, validated options to ensure SDS is easier to use and operate. Hedvig's CloudScale Plugin family now spans Docker, OpenStack, Veritas, VMware, and Red Hat.
New security features including Encrypt360 reduce the risk of hacking and ransomware
Enterprise cyber and malware attacks are on the rise, resulting in a growing need for enhanced security and encryption built directly into storage infrastructure. Hedvig Encrypt360 delivers a native, in-software approach to protecting data throughout its entire lifecycle, encrypting data that's in use, in flight, and at rest. Based on Hedvig's distributed systems architecture, data encryption starts at the host level and carries all the way through the back-end, distributed cluster nodes. Encrypt360 supports a variety of key management systems, including AWS, and enables customers to select a 256-bit AES encryption policy on a per-volume basis. Also new to the 3.0 release are advanced auditing and multitenancy access control mechanisms that, combined with Encrypt360, ensure customers can securely meet IT compliance and regulations while adopting hybrid and multi-cloud architectures.
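Hedvig does not publish the internals of Encrypt360, but the general idea of a per-volume 256-bit AES policy, where data is encrypted at the host before it ever reaches the distributed cluster nodes, can be illustrated with a rough sketch. The class, method, and volume names below are hypothetical, and the example uses the Python cryptography library with locally generated keys rather than an external key management system such as AWS KMS.

```python
# Minimal sketch of per-volume AES-256 encryption at rest.
# Names and key handling are illustrative only; a real platform would
# fetch keys from a key management system rather than generate them locally.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class VolumeEncryptor:
    def __init__(self):
        self._keys = {}  # volume name -> 256-bit AES key

    def enable_encryption(self, volume: str) -> None:
        """Opt a volume into a 256-bit AES encryption policy."""
        self._keys[volume] = AESGCM.generate_key(bit_length=256)

    def write_block(self, volume: str, plaintext: bytes) -> bytes:
        """Encrypt a data block at the host before it is sent to the cluster."""
        nonce = os.urandom(12)                      # unique per block
        ciphertext = AESGCM(self._keys[volume]).encrypt(nonce, plaintext, None)
        return nonce + ciphertext                   # store the nonce alongside the data

    def read_block(self, volume: str, blob: bytes) -> bytes:
        """Decrypt a block read back from the cluster."""
        nonce, ciphertext = blob[:12], blob[12:]
        return AESGCM(self._keys[volume]).decrypt(nonce, ciphertext, None)

enc = VolumeEncryptor()
enc.enable_encryption("vm-datastore-01")
stored = enc.write_block("vm-datastore-01", b"guest OS block data")
assert enc.read_block("vm-datastore-01", stored) == b"guest OS block data"
```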
FlashFabric improvements lower the cost and complexity of adopting an all-flash data center
The Hedvig FlashFabric is a comprehensive suite of flash caching technologies that optimize performance in Hedvig clusters. In 3.0, Hedvig adds improvements to the platform's all-flash caching, including more advanced auto-tiering and read cache capabilities. Customers architecting all-flash data centers, or those interested in adding all-flash resources in public clouds, can take advantage of NVMe, 3D XPoint and other flash innovations without having to add complex, proprietary hardware. Simply add new commodity servers or public cloud instances configured with the latest flash technologies, and Hedvig will automatically consume the resources and optimize performance based on application needs.
Why Storage Companies Are Partnering with Microsoft Azure for … – Enterprise Storage Forum
These days, everyone who is anyone in data storage appears to be partnering with one or more of the big three of the cloud: Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure. Of late, Azure seems to be in the ascendancy. The likes of DataCore, Panzura, Nasuni, Zadara, Veritas, Caringo, Datapipe and many others have announced deals with Azure to harness its cloud platform as part of their storage offerings.
"Storage and data protection vendors need to have a point of presence in and with the cloud providers, such as Azure, for their customers," said Greg Schulz, an analyst with StorageIO Group. "If vendors don't add support on and for those cloud platforms, somebody else will do it and their customers or prospects may look elsewhere."
Another way to look at it is that cloud platforms such as Azure are another tier of storage. But more than that, their underlying compute platforms have local cloud instances of storage backed by elastic block, file and object storage. Storage vendors can deploy their own storage software on this foundation. This gives users options and flexibility regarding where and how they want their storage software deployed.
"Azure continues to expand, including with the new Azure Stack, which enables hybrid deployments," said Schulz.
Some data storage vendors simply connect to cloud service providers via Application Programming Interfaces (APIs) or a gateway. Others port their storage software to run on a cloud instance (VM) or emerging Virtual Private Servers (VPS) and Dedicated Private Servers (DPS). "In this latter case, instead of the storage vendor's functionality running on a local hardware appliance, VM or container, it runs in the cloud, using a mix of instances and persistent cloud storage," said Schulz. The advantage, he added, is that when vendor software is running on Azure or another cloud instance, its technology is now in the cloud and using cloud resources.
Examples include Caringo Swarm on Microsoft Azure, which brings file storage into the cloud using Azure's worldwide network of managed data centers. The Zadara Storage Cloud delivers block, file and object storage for applications leveraging Azure. By leveraging a Zadara VPSA storage array on Azure, you can migrate applications such as SQL Server, Exchange, SharePoint and others to Microsoft Azure without having to reconfigure them. Similarly, Box and Microsoft are working together on Box cloud content management using Azure services. They are exploring ways to integrate artificial intelligence (AI) capabilities and add Azure regions outside the U.S. to Box Zones for in-region storage on Azure.
Nasuni is another company with a strong Microsoft Azure relationship. "Microsoft has a massive base of enterprise customers who, in many cases, already have Azure subscriptions," said Warren Mead, vice president of alliances and business development at Nasuni. "They are looking for ways to increase their use of Azure as a means of achieving digital transformation, and Azure object storage is a natural first or second foray into the cloud, especially given every enterprise is dealing with storage growth pains."
That's why, he said, it makes good business sense for storage software companies like Nasuni to partner with Microsoft. This provides an opportunity to help users transition from legacy, device-constrained NAS and SAN infrastructure to a scalable, Azure-based storage infrastructure. Mead believes cloud object storage is "the new disk," as it provides limitless capacity, high availability and geo-redundancy with cloud economics.
How does Nasuni add value? Without a file system, object stores are merely containers, and it is difficult for enterprises to tap their potential for file sharing and collaboration. Nasuni has developed a global file system that lives in and scales with the cloud. Nasuni enterprise file services, powered by the Nasuni UniFS global file system, leverages Azure object storage to provide primary NAS and archive capacity, backup/recovery, disaster recovery (DR) and global file access in one hybrid cloud. By storing all metadata and files in Azure first and using on-premises caching appliances to provide fast, secure access only to the active data, it is said to offer unlimited capacity on demand, pay-as-you-grow flexibility, rapid recovery points and recovery times, DR anywhere you have power and bandwidth, file access through any device, file collaboration without version conflict and centralized management.
DataCore, too, has made its DataCore Cloud Replication software available in the Microsoft Azure Marketplace to support DR efforts. Those with on-premises DataCore deployments can use the Azure cloud as an added replication location to safeguard high availability systems.
"For DataCore users seeking hybrid cloud storage, Azure offers a convenient and economically attractive way to quickly roll out and scale a secure remote replication site to safeguard and archive their data, especially when compared to setting up and maintaining a dedicated remote location at a branch office or data center for just this purpose," said Augie Gonzalez, director of product marketing at DataCore Software.
DataCore software replicates data from on-premises virtual storage pools to a remote pool in Azure cloud storage. The asynchronous remote copy travels over standard Internet links protected by a virtual private network (VPN). The on-premises configuration consists of either a software-defined storage infrastructure controlled by DataCore SANsymphony or DataCore Hyper-converged Virtual SAN. They communicate with other instances of DataCore software executing in Azure. Should the on-premises site suffer a major outage or have no space to keep a long-term archive, the copy stored in Azure may be accessed from anywhere.
Managed service providers (MSPs) are also leaping on the bandwagon. Datapipe, for example, has added Azure cloud storage into its portfolio. The company's offerings range from local disk storage all the way up to its own cloud storage solution.
"Being able to include products like StorSimple for Azure allows our clients to keep investments they already have in one of our storage solutions and expand into Azure," said Tim Campbell, product manager at Datapipe. "Start with an easy project like server backup to Azure storage as a way to get backups off site in a cost-efficient manner and gain familiarity with cloud storage technologies."
Speaking of backup, Veritas and Microsoft have inked a multi-year partnership to link Azure (cloud storage, compute, analytics and machine learning) with new management and governance capabilities from Veritas. This includes collaboration to sell hybrid cloud storage jointly to mutual customers.
As part of the partnership, Veritas NetBackup 8.0 now supports storage tiering to Azure. This is said to improve data lifecycle management by optimizing the movement of data to Azure cloud storage. This reduces or eliminates the need to deploy additional storage with a separate point product. For small and midsized businesses, Veritas Backup Exec 16 supports the movement of backup data to Azure. Veritas is also leveraging Azure to drive greater efficiencies for its own workloads. The company has selected Azure as the cloud backend on which to run its Enterprise Vault.cloud service, which provides policy-based information retention to streamline eDiscovery and helps Office 365 subscribers to find archived information quickly.
"Enterprises are increasingly looking to Microsoft to help power their digital transformation as they adopt Azure cloud services and Office 365," said Chris Mancebo, global strategic alliances, Veritas. "Such hybrid cloud storage scenarios will enable customers to take advantage of the cloud (or multiple clouds) to reduce their storage costs while increasing business agility, and to extract greater value from their data."
See the article here:
Why Storage Companies Are Partnering with Microsoft Azure for ... - Enterprise Storage Forum
Microsoft Unveils Cost-Cutting Archival Cloud Storage Option – eWeek
Microsoft kicked off a public preview of its Azure Archive Blob Storage service this week, offering customers a lower-cost cloud storage solution for rarely accessed data.
Storage tiering is hardly a new concept, although it may be coming to an end thanks to the advent of solid-state drives and innovations in the storage and hardware space. For now, it's common for organizations to place their critical data on expensive, high-performance storage arrays and then move it down the line as it winds up being accessed less frequently over time.
Eventually, older data ends up on comparatively cheap and slower storage mediums like tape. Naturally, the tradeoff for lower storage bills is leisurely data retrieval times.
Enterprises can employ a similar strategy to contain their cloud storage costs.
Amazon Web Services (AWS), for example, introduced its Glacier archival cloud storage service in 2012 for the low price of a penny per gigabyte (GB) per month at the time. Today, Glacier storage goes for as low as 0.4 cents per GB in some regions.
Last year, Microsoft introduced its own Azure Cool Blob storage option, which also cost customers a penny per GB per month in some areas. Now, users have another, lower-cost option called Azure Archive Blob Storage, along with new Blob-Level Tiering data lifecycle management capabilities, announced Kumail Hussain, a senior program manager at Microsoft Azure.
"Azure Archive Blob storage is designed to provide organizations with a low cost means of delivering durable, highly available, secure cloud storage for rarely accessed data with flexible latency requirements (on the order of hours)," Hussain wrote. "Archive access tier is our lowest priced storage offering. Customers with long-term storage which is rarely accessed can take advantage of this."
According to Microsoft's pricing page, Azure Archive Blob costs 0.18 cents per GB per month when the service is delivered through its cloud data center in the Eastern U.S. Customers can expect a 99 percent availability SLA (service level agreement) when the service makes its way out of the preview stage, said Hussain.
Complementing the new service is a new Blob-level Tiering feature that will allow customers to change the access tier of storage objects, or "blobs" as Microsoft calls them, among Hot, Cool or Archive.
Also in preview, it enables users to match costs to usage patterns without moving data between accounts, Hussain said. Microsoft will upgrade the accounts of customers who have large amounts of data parked in their General Purpose storage accounts, allowing them to capitalize on the new feature when it reaches general availability.
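To make that concrete, once Blob-Level Tiering is generally available, demoting an individual blob to the Archive tier could look something like the minimal sketch below, written against a current azure-storage-blob Python SDK (v12-style API); the connection string, container, and blob names are placeholders, not values from the article.

```python
# Minimal sketch: move a rarely accessed blob down the Hot -> Cool -> Archive
# cascade with the azure-storage-blob Python SDK (v12-style API).
# The connection string, container and blob names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
blob = service.get_blob_client(container="backups", blob="2016/q4-archive.tar.gz")

# Demote the blob to the lowest-cost Archive tier; reads will then take
# hours (rehydration) rather than milliseconds until the blob is moved back
# to the Hot or Cool tier.
blob.set_standard_blob_tier("Archive")

print(blob.get_blob_properties().blob_tier)  # expected: Archive
```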
Adaptability appears to be the guiding principle behind Microsoft's cloud storage strategy.
Earlier this month, the company announced the availability of Imanis Data on Azure. The solution integrates with Azure Blob Storage and Data Lake Store, and uses the HDInsight platform from Microsoft along with software from Imanis Data (formerly Talena) to provide data migration and recovery services for big data applications.
See the rest here:
Microsoft Unveils Cost-Cutting Archival Cloud Storage Option - eWeek
Back up everything with lifetime 2TB cloud storage for under $50 – Popular Science
Backing up your most precious files is always a smart move. But local backups are always at risk of fire, flood and hardware failure. You can save your data from these disasters with Zoolz Cloud Storage. This digital time capsule has enough space to back up all your files, with lifetime 2TB storage now just $49.99 at the Popular Science Shop.
Backing up your entire hard drive online would normally be very expensive. Zoolz makes it affordable by using cold storage technology. This provides a long-term home for your files, while still allowing you to download your data within 3-5 hours. You can keep two computers backed up with one subscription, and Zoolz works across Mac and PC.
The desktop app is stacked with great features. The Smart Selection tool lets you choose which files to back up, while bandwidth throttling is useful on slower connections. You can also control file retention, meaning you can revert to a previous version of any document. For added peace of mind, Zoolz keeps all your data backed up in multiple data centers.
This much lifetime storage would normally cost $3,600, but you can order now for just $49.99 to start backing up.
Go here to see the original:
Back up everything with lifetime 2TB cloud storage for under $50 - Popular Science
A Redditor Archived Nearly 2 Million Gigabytes of Porn to Test Amazon’s ‘Unlimited’ Cloud Storage – Motherboard
What does "unlimited" data storage mean, really? Unlimited plansoffered by many cloud storage and backup companies as promotions for their storage capabilitiesare almost never truly limitless. Most come with stipulations about data usage limits if you start hoarding too hard, and the majority of users will never come close to needing that much storage.
Reddit user beaston02 was determined to find the true ceiling of Amazon's cloud storage plan, which was killed off in June. He decided to push its limits with a petabyte of porn. (Some people credit beaston02 for Amazon's decision to cancel the unlimited storage offering, but he denies that rumor.) For reference, a petabyte is one million gigabytes.
To gather this much data, beaston02 wrote scripts that recorded public webcam shows from a variety of adult cam sites, including CamSoda, Chaturbate, and MyFreeCams.
"I have more of a problem with collecting or hoarding data than I do with porn"
"It is nearly entirely porn," he told me in a Reddit message. "Ever since I got into computers, I found myself learning more, and faster when it was something more interesting. Call me crazy, but women interest me more than most other things on the internet and there is a huge amount of data being created daily, so it was a good fit for the project."
He said it took five or six months to collect one petabyte of porn, and he stopped collecting just shy of 1.8 petabytes.
How long would it take one to consume 1.8 petabytes of porn? 1.8 petabytes is about 23.4 years of HD-TV video, but webcam streams are nowhere near that quality. A few good folks crunched the numbers: 720p is about two gigabytes per hour, and at 900,000 hours, that's 102 years of straight calendar time. If the videos are even lower quality, say, 480p, that's around 0.7 gigabytes per hour, or 293 years and six months. Better get to watching.
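Those back-of-the-envelope figures are easy to reproduce. Here is a quick sketch of the arithmetic in Python, assuming (as the article does) that a petabyte is 1,000,000 gigabytes and that the streams play back in real time:

```python
# Back-of-the-envelope check of the viewing-time figures quoted above.
# Assumes 1 petabyte = 1,000,000 GB and continuous, real-time playback.
ARCHIVE_GB = 1.8 * 1_000_000          # ~1.8 petabytes
HOURS_PER_YEAR = 24 * 365.25

def years_to_watch(gb_per_hour: float) -> float:
    """Calendar years needed to play the whole archive end to end."""
    return (ARCHIVE_GB / gb_per_hour) / HOURS_PER_YEAR

print(f"720p (~2.0 GB/hour): {years_to_watch(2.0):.0f} years")   # roughly 103 years
print(f"480p (~0.7 GB/hour): {years_to_watch(0.7):.0f} years")   # roughly 293 years
```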
Beaston02 told me he stopped recording streams simply because his interest in it waned. "I know plenty of people have labeled me some huge pervert or someone with a huge porn addiction, but that's really not me at all," he said. "I have more of a problem with collecting or hoarding data than I do with porn." He said he used the exercise to learn Python, SQL databases, and how to handle that much data. "The project ran its course, I got the knowledge I was hoping to get, and I just had no interest in it anymore."
While he's no longer running the scripts that collected the porn, he made them available on Github. Another Redditor, -Archivist, took up the cause with the "Petabyte Porn Project," recruiting other hoarders to help continue recording live public cam sessions all day every day. -Archivist told me in a Reddit message that this represents "upwards of 12 terabytes per day." Those helping hoard are close to two petabytes now, stored on Amazon's cloud and mirrored on Google Drive. Amazon did not respond to Motherboard's request for comment.
Recording live streams and downloading them for later viewing is technically legal, but morally questionable. Some say "these people chose to be in the public eye" since they're recording from free, public streams, but most free cam models are paid on tips from viewers during live shows.
I asked Charley Hart, a cam model at CamSoda, what she thought of people downloading streams without her knowledge. "Part of me is ok with some of it," she said. "It's one thing if I do it. I think that's the whole thing, it's all about consent." She said that some women use live streams because it keeps their identities slightly safer than a static video that's uploaded to a website for repeated viewing. It could also be good advertising for a model to have her work spread widely, but only if viewers seek her out specifically instead of only seeking the next free video.
Some of her viewers download her shows while they watch, but always with her consent, and usually with the understanding that if they capture a good GIF or clip, they'll share with her so she can promote herself with it.
"That's why we ask guys to pay for our porn, because instead of going to production companies now, you're pretty much going specifically to the girl," Hart said. "The fact that some guys are trying to get that fantasy for free? I feel a little taken advantage of. I'm already giving away so much for free ... We choose this job and I understand that, but people should be compensated for what they do."
Beaston02 said that although people request he add features for recording private shows or ones that have geolocation requirements (viewable only by those within a geographic area the model or site has specified), he doesn't deliver them. "We all have different places where we draw the line morally, and that happens to be one of my lines, at least as far as publicly releasing code goes," he said. "Although I have no control over what any captured content is used for later, I also have asked any content gathered by the scripts not be shared or sold on websites."
Google Cloud: How to Reduce Data Storage Costs and Maximize Performance – Enterprise Storage Forum
The Google Cloud Platform (GCP) has gained serious momentum in recent months. Data storage companies such as NetApp, Veritas, Cohesity, MapR, Cloudian and Nutanix are partnering with Google in an effort to broaden the appeal of their offerings. But the Google Cloud itself is a vast universe of services, storage tiers, speeds, feeds and price points.
"Google Cloud Platform offers a diverse portfolio that ranges from flexible and unified storage classes to scalable and secure databases," said Chris Talbott, head of cloud storage product marketing at Google. "It's designed to handle mission-critical workloads and connect data to any application."
As such, it consists of compute, storage, databases, machine learning, analytics, networking, big data, internet of things, developer tools, management tools and security features. With so many facets to understand, how can users use GCP to reduce storage costs, maximize performance and gain competitive advantage?
Some storage and IT managers have a carte blanche from management to implement cloud-first strategies. They are under no strain to get it right the first time. They are blessed with all the time in the world to figure out the best way to learn from their mistakes and eventually arrive at the right cloud architecture for their organizations. But they represent the lucky few.
In most cases, storage managers are under the gun to show some immediate return. Within a few months, they know they will be called on the carpet to show tangible results in terms of lower storage costs and smaller budgets.
The best way to achieve that, suggested Talbott, is to look for the low-hanging fruit. One likely area, for example, is tape backup. Anyone who is going to have to invest yet again in tape hardware or an upgrade to the latest tape platform should consider the cloud. As well as offering the potential for cheaper storage, the cloud opens the door to doing something with the data (such as analytics) rather than letting it gather dust in a vault.
"Think about underutilized data sets and easy wins," said Talbott. "Many organizations are currently backing up and archiving data to tape, requiring costly infrastructure to maintain and providing little value outside of a recovery event. Not only can you reduce the effort and costs of maintaining that on-premises, but you can also expose those datasets to other technologies in GCP, like data analytics and machine learning."
Talbott said that most users are smart enough to assess current needs. They take the time to figure out how they can use cloud storage to fulfill ongoing requirements. But not enough companies look ahead to gain some idea of how their needs may evolve in the years to come. Although it is impossible to predict such things accurately, it is wise to attempt some kind of projection of storage needs at least a couple of years into the future. That might save some embarrassment when you discover in a year's time that your plan for cloud storage was hopelessly inadequate.
Storage managers are advised not to rush headlong into cloud storage decisions. They should consult other areas of IT as well as line of business heads before making any firm commitments. After all, storage is just one part of a much larger IT ecosystem. It has to be implemented sensibly in full alignment with other components and enterprise objectives.
"While the cloud can reduce the cost of storage and increase the accessibility of data, choosing the right combination of systems is essential in realizing the benefits of cloud storage," said Talbott.
In addition, Talbott recommended using the cloud to take advantage of storage tiers and make information lifecycle policies work for you. As users begin to move data to the cloud, they should try to understand what data needs to be accessed and when, so they can take advantage of different storage tiers. For example, GCP offers Nearline and Coldline archival storage. Nearline is best for data that is accessed a few times a year, but if you have data you don't need on a yearly basis, you could move it to Coldline to reduce costs.
"To optimize the storage of an object throughout its life, it will likely spend time in each tier of storage, from multi-regional to cold," said Talbott. "Object lifecycle policies can automate that cascade."
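As a rough illustration of such a cascade, lifecycle rules can be attached to a bucket with the google-cloud-storage Python client. The bucket name and age thresholds below are illustrative placeholders, not recommendations from the article.

```python
# Minimal sketch of an object lifecycle "cascade" on a GCS bucket:
# objects older than 90 days drop to Nearline, older than a year to Coldline,
# and anything past seven years is deleted. Bucket name and thresholds are
# placeholders chosen for illustration only.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("example-archive-bucket")

bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=90)
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=365)
bucket.add_lifecycle_delete_rule(age=7 * 365)

bucket.patch()  # push the updated lifecycle policy to GCS
for rule in bucket.lifecycle_rules:
    print(rule)
```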
It's one thing to dump data into a cloud repository as a means of reducing costs. But the real value comes in how that data is managed strategically.
Various data and storage management solutions are available from vendors such as StorEdge, Red Hat, NetApp and others. While functionality varies markedly from one tool to another, in general, they are designed to help organizations maintain control over their data, where it resides, how quickly they can access it and how they can harness it in their business.
"For organizations who are looking to understand how their GCP cloud investments are meeting business expectations, NetApp's data management solutions provide them with clear visibility and insight into cost, performance and data placement to better understand the impact of IT decisions," said Michael Elliott, cloud advocate at NetApp. "Organizations can also address the challenges associated with regulatory, data security and sovereignty requirements by maintaining control of their data across its entire lifecycle."
Part of the reason so many storage businesses are partnering with Google is that it provides easy access to innovation and expanded markets.
"Collaborate closely with Google, as they are very responsive when it comes to integrating with their services," said Patrick Rogers, head of marketing and product at Cohesity. "APIs may work differently across different cloud providers, so what is possible with one provider may not exactly be the same with another."
"Greed is good," said Gordon Gekko in the movie Wall Street. Google has paraphrased that to "green is good." Accordingly, the company believes those journeying to the cloud should look beyond cost savings, which are regarded almost universally as the typical measure of success for those adopting cloud storage.
In many cases, cost savings is just a start. This year Google will reach 100 percent renewable energy for all its operations, data centers included. This means users can reduce the environmental impact of their data storage and operations by taking advantage of data centers which use 50 percent less energy than a typical data center.
"With the exponential increase in data being stored, it becomes increasingly important to consider how green the electrons are that power its storage," said Talbott.
See the rest here:
Google Cloud: How to Reduce Data Storage Costs and Maximize Performance - Enterprise Storage Forum
Enterprises Encounter Cloud Storage Cost and Management Challenges – Enterprise Storage Forum
Although it seems that enterprises are flocking to the cloud for their IT needs, data storage in particular, a new survey from DataCore Software suggests that a good number of organizations are running into trouble during the transition.
"Challenges and false starts with technologies have introduced reluctance in the industry to fully commit to software-defined, hyperconverged or a hybrid data storage infrastructure," wrote Paul Nashawaty, product evangelist and director of Technical Marketing at DataCore Software, in a blog post. "Until recently, the promise of cloud, ease of use, and faster application performance have fallen short of expectations."
Some of those expectations include storage services that don't break the budget.
Despite claims by vendors that cloud storage is cheaper than on-premises solutions, the opposite is often true. Nearly a third (31 percent) of the 426 IT professionals quizzed for the company's State of Software-Defined Storage, Hyperconverged and Cloud Storage survey (registration required) said that, rather than slashing costs, their move to cloud storage actually increased them.
Storage management was a key factor in why many organizations are having a tough time containing cloud costs. Twenty-nine percent of respondents said management proved to be more difficult on the cloud.
Continuing the theme of data storage disappointments, DataCore discovered that speedy flash storage also fell short for some organizations. Sixteen percent said the technology did little to accelerate their applications.
In terms of hyperconverged infrastructure (HCI), systems that typically integrate storage, compute, networking and virtualization, more than a third (34 percent) are strongly considering the technology. Forty-one percent define HCI as a hardware-agnostic solution that is tightly integrated with a hypervisor, while 27 percent expect an integrated appliance in which hardware and software are tightly interlocked.
Generally, enterprises are considering HCI for their database workloads (34 percent) and their data center consolidation projects (28 percent). Another 28 percent of respondents are eyeing HCI to power their enterprise applications.
Meanwhile, IT executives remain wary of hybrid-cloud deployments.
More than half of the survey takers cited sensitive data and security as main reasons to avoid the cloud. Forty-seven percent said that they had no plans to move any type of application to the cloud, of either the public or hybrid variety.
A third (33 percent) said they expected to move some enterprise applications to the cloud, and 22 percent said the same of data analytics workloads. Cloud databases and virtual desktop infrastructure (VDI) services were appealing solutions for 21 percent and 16 percent of respondents, respectively.
Pedro Hernandez is a contributing editor at Enterprise Storage Forum. Follow him on Twitter @ecoINSITE.
Excerpt from:
Enterprises Encounter Cloud Storage Cost and Management Challenges - Enterprise Storage Forum
Cisco Validates SwiftStack’s hybrid cloud storage software – Read IT Quik
SwiftStack, a leading provider of hybrid cloud storage solutions for enterprises, has announced that Cisco has validated its hybrid cloud storage software for the Cisco UCS S3260 storage server and integrated UCS Manager. It is the only object storage solution with solution bundle part numbers on Cisco's global price list, and the only such solution with the option of Cisco Solution Support. The combination of ordering efficiency, end-to-end solution support and validated designs makes for an ideal consumption experience for enterprises.
This Cisco Validated Design (CVD) status is the result of successful completion of comprehensive testing and documentation of SwiftStack by Cisco engineers to facilitate faster, more reliable and more predictable deployments for customers. CVDs provide the foundation for systems design based on common use cases or current engineering system priorities. They incorporate a broad set of technologies, features and applications to address customer needs. With SwiftStack software running on Cisco UCS Storage Servers, organizations can leverage the benefits of the public cloud while retaining the control and level of protection typically associated with private data centers running behind enterprise firewalls and security.
SwiftStack innovations power hybrid cloud storage for enterprises. Enabling the freedom to move workloads between private data centers and public clouds like Amazon or Google, SwiftStack software delivers universal access to petabytes of unstructured data in a single namespace. Corporate data remains under the management control of internal IT, served by infrastructure that starts small and scales seamlessly within and across geographic regions. The result is pay-as-you-grow consumption of IT-managed resources on-premises and in the cloud.
"Enterprises are looking for the benefits of public cloud storage as they modernize their business workflows and leverage hybrid cloud solutions," said Don Jaworski, CEO of SwiftStack. "As a Cisco Preferred Solution Partner, we have collaborated to establish a joint solution that leverages the high density and industry-leading networking capabilities of Cisco UCS Storage Servers in a solution that enterprises can easily and confidently consume."
SwiftStack brings the fundamental attributes of public cloud resources into the enterprise infrastructure: scalability, agility, and pricing based on consumption. Legacy applications can access and consume storage via file services, and cloud-native applications via object. Users gain the freedom to move workloads between private data centers and public clouds like Amazon or Google, and from cloud to cloud, according to administrative policies. Whether on-premises or in the public cloud, data remains under the management control of internal IT, residing wherever it is needed by users and applications.
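To make the object path concrete: SwiftStack clusters can expose an S3-compatible API alongside the native Swift API, so a cloud-native application might read and write objects roughly as in the sketch below. The endpoint URL, credentials, bucket and key names are placeholders, not details from the article, and the example assumes the cluster's S3-compatible access layer is enabled.

```python
# Minimal sketch of a cloud-native application writing to object storage
# through an S3-compatible endpoint (such as one exposed by a SwiftStack
# cluster). Endpoint URL, credentials and names are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://swiftstack.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.create_bucket(Bucket="lab-results")
s3.put_object(Bucket="lab-results", Key="2017/run-042.json", Body=b'{"status": "ok"}')

# List what was stored, demonstrating object (rather than file) access.
for obj in s3.list_objects_v2(Bucket="lab-results").get("Contents", []):
    print(obj["Key"], obj["Size"])
```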
See the original post here:
Cisco Validates SwiftStack's hybrid cloud storage software - Read IT Quik