Category Archives: Cloud Storage
Cloud Storage Software Market 2020 Services, Demand, Size, Growth, Trends, Business Opportunities, Industry Analysis, Top Players and Forecast to 2025…
The research report on the Cloud Storage Software market offers a comprehensive study of market share, size, growth aspects, and major players. In addition, the report contains brief information about the regional competitive landscape, market trends and drivers, opportunities and challenges, distributors, sales channels, risks and entry barriers, as well as a Porter's Five Forces analysis. The main objective of this report is to offer a detailed analysis of how these market aspects may influence the future of the Cloud Storage Software market. The report also provides a comprehensive analysis of both the established manufacturers and the new entrants.
Request sample here : https://www.orbisresearch.com/contacts/request-sample/2358428
In addition, this report presents the price, revenue, market share, and production of the service providers with accurate data. The global Cloud Storage Software report focuses mainly on current developments, new possibilities, advancements, and dormant traps. Furthermore, it offers a complete analysis of the current situation and the growth prospects of the Cloud Storage Software market across the globe. The report analyses substantial key components such as production, capacity, revenue, price, gross margin, sales revenue, sales volume, growth rate, consumption, import, export, technological developments, supply, and future growth strategies.
Moreover, the Cloud Storage Software report offers a detailed analysis of the competitive landscape by region and highlights the major service providers, covering the market overview, business strategies, financials, recent developments, and product portfolios within the Cloud Storage Software market. Likewise, this report comprises significant data about market segmentation on the basis of type, application, and regional landscape. It also provides a brief analysis of the market opportunities and challenges faced by the leading service providers. This report is specially designed to deliver accurate market insights and market status.
The key players covered in this study
Amazon Web Services, Microsoft, IBM, HPE, Oracle, Dell EMC, NetApp, Google, VMware, CA Technologies, Rackspace Hosting, Red Hat, Hitachi Data Systems, Huawei Technologies
Market segment by Type, the product can be split into
Private Cloud, Public Cloud, Hybrid Cloud
Market segment by Application, split into
BFSI, Government & Education, Healthcare, Telecom & IT, Retail, Manufacturing, Media & Entertainment, Others
Market segment by Regions/Countries, this report covers
United States, Europe, China, Japan, Southeast Asia, India, Central & South America
Get the DISCOUNT on this report : https://www.orbisresearch.com/contacts/discount/2358428
The study objectives of this report are:
To analyze global Cloud Storage Software status, future forecast, growth opportunity, key market and key players.
To present the Cloud Storage Software development in United States, Europe and China.
To strategically profile the key players and comprehensively analyze their development plan and strategies.
To define, describe and forecast the market by product type, market and key regions.
In this study, the years considered to estimate the market size of Cloud Storage Software are as follows:
History Year: 2013-2017; Base Year: 2017; Estimated Year: 2018; Forecast Year: 2018 to 2025
Major Points From Table of Content:
Chapter One: Report Overview
Chapter Two: Global Growth Trends
Chapter Three: Market Share by Key Players
Chapter Four: Breakdown Data by Type and Application
Chapter Five: United States
Chapter Six: Europe
Chapter Seven: China
Chapter Eight: Japan
Chapter Nine: Southeast Asia
Chapter Ten: India
Chapter Eleven: Central & South America
Chapter Twelve: International Players Profiles
Chapter Thirteen: Market Forecast 2018-2025
Chapter Fourteen: Analysts' Viewpoints/Conclusions
Chapter Fifteen: Appendix
Browse the complete report : https://www.orbisresearch.com/reports/index/global-cloud-storage-software-market-size-status-and-forecast-2018-2025
Orbis Research (orbisresearch.com) is a single point aid for all your market research requirements. We have vast database of reports from the leading publishers and authors across the globe. We specialize in delivering customized reports as per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients to map their needs and we produce the perfect required market research study for our clients.
Hector Costello
Senior Manager, Client Engagements
4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.
Phone No.: +1 (972)-362-8199; +91 895 659 5155
HiveIO Introduces Hive Fabric 8.0 with Added Business Intelligence Capabilities and New Cloud Storage Offerings – Citybizlist Real Estate
HOBOKEN, N.J.--(BUSINESS WIRE)--HiveIO today announced version 8.0 of Hive Fabric, an intelligent virtualization solution that provides high-performing, scalable technology that removes complexity in the data center and delivers a seamless IT experience. The new version provides protection for virtual machines (VMs) and user data with its Disaster Recovery (DR) capability by seamlessly integrating with cloud storage. Hive Fabric 8.0 also incorporates advanced business intelligence (BI) tools into Hive Sense, a capability that proactively notifies HiveIO of issues within a customer environment and provides valuable insight into application and resource utilization.
"With 8.0, we deliver a powerful disaster recovery solution capable of replicating VMs and user data to cloud-based storage such as Amazon S3, providing even more flexibility to our customers," said Toby Coleridge, HiveIO Chief Product Officer. "Other enhancements seen in 8.0 were a direct result of user feedback, as more and more of our customers are requesting business insights and automation. With 8.0, we can meet that market need."
The new features of Hive Fabric 8.0 include:
Hive Fabric 8.0 continues to build upon the artificial intelligence-ready solution that enables organizations to deploy virtualization technology without unnecessary vendor complexity or the need for costly specialists.
"Moving our virtual server and desktop environment from VMware to Hive Fabric dramatically simplifies the administration for our IT team. The single web-based interface allows us to gain insight into how our users work and what applications they are running," said Dave Gartside, IT Director at Solihull College & University Centre. "We are especially excited to see the integration with cloud storage, which provides us with more agility and a cost-effective solution for Disaster Recovery."
Hive Fabric enables users to deploy virtual desktops, virtual servers, and software-defined storage in a single install, eliminating the need for a multi-vendor and multi-contract approach. To learn more about Hive Fabric 8.0, visit the HiveIO website.
About HiveIO Inc.
HiveIO empowers IT users with intelligent virtualization technology that helps their organization thrive. We are the only provider of a virtualization stack (virtual desktops, virtual servers, and software-defined storage) that runs on an AI-ready, zero-layer, hardware-agnostic architecture that any technologist, no matter their skill set, can manage. Our platform reduces cost and complexity for the organization and delivers superior performance, freedom to scale, and remote access so users can achieve more of the work that matters. HiveIO is based in Hoboken, New Jersey, and serves customers globally. For more information, visit http://www.hiveio.com, or follow HiveIO on LinkedIn, Twitter, or Facebook.
The three keys to a successful, data-driven 2020 – Gigabit Magazine – Technology News, Magazine and Website
It won't come as a surprise to anyone that data growth is on the up, but what may be less widely known is that the places where data is being generated are starting to change rapidly. Businesses will start to see more of their data being produced in the cloud or at the edge, rather than in traditional data centres. The main impact of this will be on how these businesses analyse and understand their data in these locations, and then how they manage it there, without the need to use a data centre as a go-between.
For 2020, there are three main areas that are predicted to experience the most change when it comes to data:
Data at the edge needs data management at the edge
Edge buildout is already happening, and its pace is accelerating with trends like IoT, self-driving cars, and biometrics. IDC predicts that, by 2023, 50% of all enterprise IT infrastructure will be deployed at the edge, up from 10% today. More apps are generating much more data at the edge, raising the question of why data should not be better understood and managed directly at the edge. Imagine if you could analyse data, figure out what data is useful and needs to be brought back to the data centre, and directly process the rest of the data at the edge itself without having to first move it all. This is why edge data management will rise in importance over the next few years.
Cross-cloud data management as a requirement
Most enterprises using the public cloud already have a hybrid or multi-cloud strategy, and enterprises are increasingly choosing to diversify their cloud strategy across two or more clouds, giving customers more freedom and choice. As this trend continues, enterprises need a simple way to understand and manage their data sprawl across clouds and the hybrid enterprise, leading to greater demand for cross-cloud data management solutions that are not tied to any particular cloud or storage vendor.
Analytics-driven AI takes centre stage
For the last couple of years, Artificial Intelligence (AI) and Machine Learning (ML) have been a major theme, and this trend continues to grow. While these have initially been more marketing buzzwords than reality, the potential of AI in data management is clear: how can you manage something you don't understand? By using analytics to drive AI/ML, analytics-driven data management solutions can leverage their understanding of data to drive better management. This year, businesses will see more exciting developments in this space that leverage deep analytics and data lakes across different storage repositories to help them better manage their data.
As the new decade gets well underway, the amount of data produced, stored and analysed continues to rise. Because of this, businesses in every industry will need to work towards being able to manage this data wherever it resides, from a data centre all the way to the edge or even up in the cloud.
By Krishna Subramanian, COO at Komprise
Electronic health record giant Epic Systems has been notifying customers that it no longer plans to pursue integrations with Google Cloud, and will instead focus on integrating into platforms offered by Amazon Web Services and Microsoft's Azure service, according to a report from CNBC.
That report notes that "insufficient interest" from Epic customers in Google is behind the decision to focus efforts instead with those cloud competitors.
Epic's Vice President of Research and Development Seth Hain told CNBC that Epic invests "substantial time and engineering effort in evaluating and understanding the infrastructure Epic runs on."
He added that "scalability, reliability, and security are important factors we consider when evaluating these underlying technologies," and said the company prioritizes "infrastructure the Epic community uses today and is likely to use in the future."
A report in The Wall Street Journal earlier this month noted that Cerner, Epic's chief rival in the digital medical records space, which has also chosen AWS for its cloud storage needs, made the choice despite being offered "$250 million in discounts and incentives" from Google.
Cerner has since expanded its relationship with AWS, designating it as a preferred artificial intelligence and machine learning provider and aiming to migrate its core applications to AWS.
Google has also been on the defensive for months following the news that the company had teamed up with Ascension Health in a data sharing deal that raised industry eyebrows and data privacy and security concerns.
Google Health VP Dr. David Feinberg pushed back publicly at the Startup Health Festival this past week, defending the companys partnership.
"The press has made this into something that it's not," Feinberg said. "This is not us mining somebody's records to sell ads, to learn from it, to do machine learning, to develop products. We developed this on de-identified data. We brought this to Ascension. We're piloting with them."
While Google apparently signed a business associate agreement with Ascension, and the scope of the data sharing appears to be in line with HIPAA allowances, there are still many questions about how the patient information is being put to use.
Read the original here:
Epic tells customers it will stop Google Cloud integrations, says report - Healthcare IT News
Ask a line-of-business manager about disaster recovery, specifically what data they need to recover and by when, and the answer is likely to be: all of it, and now.
As businesses depend increasingly on data to function, how to recover from an IT systems failure becomes increasingly important. Businesses and their customers are less and less tolerant of downtime, and of data loss.
So, IT departments are being forced to look again at how quickly, and how comprehensively, the organisation can get back on its feet.
That pressure has forced IT systems architects and DR teams to work to shorter recovery time objectives (RTOs) and recovery point objectives (RPOs).
In practice, this means organisations backup more data, more frequently, and need to restore it more swiftly. The RPOs and RTOs agreed with the business in turn determine the type of technology selected for recovery and business continuity.
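As a rough illustration of how an agreed RPO constrains backup frequency, the following sketch (hypothetical figures, not from the article) computes the worst-case data-loss window for a given backup schedule and checks it against a target RPO:

```python
from datetime import timedelta

def worst_case_data_loss(backup_interval: timedelta,
                         backup_duration: timedelta) -> timedelta:
    """Worst case: failure strikes just before a backup completes, so the
    newest usable copy is one full interval old plus the time the
    in-flight backup would have taken."""
    return backup_interval + backup_duration

def meets_rpo(backup_interval: timedelta,
              backup_duration: timedelta,
              rpo: timedelta) -> bool:
    """True if the schedule keeps worst-case data loss within the RPO."""
    return worst_case_data_loss(backup_interval, backup_duration) <= rpo

# Hourly backups that take 10 minutes, measured against a 2-hour RPO.
loss = worst_case_data_loss(timedelta(hours=1), timedelta(minutes=10))
print(loss)  # 1:10:00
print(meets_rpo(timedelta(hours=1), timedelta(minutes=10),
                timedelta(hours=2)))  # True
```

The same arithmetic explains why shrinking RPOs pushes organisations from nightly backups towards continuous replication: at some interval the worst-case loss simply cannot fit inside the agreed window.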
As Phil Goodwin of analyst firm IDC points out, the trend is for businesses to move towards high availability rather than disaster recovery.
Developments in virtualisation and cloud computing have made that goal more realistic for a wider range of organisations.
The gold standard of business continuity and high availability is synchronous mirroring. For effective disaster recovery, the mirror needs to be offsite, in the business's own secondary or backup datacentre, at a colocation site, or at a disaster recovery supplier's location, such as 4SL or Sungard AS.
The location will depend on the risks and threats the organisation faces. Physical threats, such as extreme weather or terrorism, will mean the secondary site will need to be further away. But this puts more pressure on network infrastructure, and invariably increases costs. Companies will need high bandwidth, and possibly redundant, links between the two mirrors.
"This is the standard for business-critical applications and services, but it's not cheap because you need two infrastructures and at least one robust, appropriately-sized link between them," says Barnaby Mote, CEO at managed service provider 4SL.
Synchronous mirroring is the favoured option in industries that have a very low tolerance for downtime and very short recovery time objectives. These include financial services, as well as some areas of government.
Businesses often choose to manage synchronously mirrored datacentres in house, because primary and backup sites must be kept tightly aligned. This pushes up costs and reduces flexibility, as the technology platforms have to stay in sync.
Some businesses opt for a lower performance setup at their secondary sites in order to save money, but this will depend on how performance-sensitive applications are.
IT teams could also save money by only deploying synchronous mirroring for their most critical applications. In practice, organisations tend to lift and replicate entire environments, including all storage and data. This is because of the work needed to separate out critical applications, and the risk that by dividing up the infrastructure something would be missed and cause the copy to fail.
Physical-to-virtual failover is a lower cost and potentially more flexible way of providing real time or near real time backup, and quick restoration.
The most obvious use case is for environments that are already virtualised. Companies have the choice to run in-house tools to replicate VMs, or use a service provider. VMware's vSphere Replication offers single- and multi-site data protection, as does Microsoft's System Center Data Protection Manager.
One advantage of failing over to a virtual environment is to be able to use shared infrastructure for the backup site, to cut costs and reduce management overheads.
Another trend is for suppliers to offer platform-agnostic VM replication services. This allows firms to run backups on alternative infrastructure, and to failover a heterogeneous system to a single backup platform. This also opens up the option to use the cloud for replication.
The model is best suited to organisations already running virtualised environments. Technologies exist for physical-to-virtual (P2V) recovery, which restores an entire backed-up environment to a virtual machine, but backup copies need to be created regularly and moved offsite. Nor is failing over to a virtual environment suitable for companies that need continuous access to their data, such as those trading in financial markets.
On the plus side, most suppliers now provide bare metal recovery for virtual backups, which will help bring the business back online quickly. IT can also store local backups of VMs for quick recovery (for example, if local hardware fails) at the same time as staging copies to offsite storage for true DR.
The choice of dedicated or shared failover environments will again depend on the business's RPO/RTO requirements and its budgets.
And, while backup to a virtual environment is a good option to save costs and cut complexity, for businesses that run highly-virtualised production environments it can be the only practical option.
Conventional backup and recovery technology does not fit well with virtualised systems because of their shared infrastructure and especially, shared storage. A VM-specific backup system is the best way to avoid bottlenecks and ensure safe recovery of the VMs.
The cloud really extends options for businesses to backup their data, and their production environments.
Suppliers that offer backup services for VMs increasingly offer cloud storage too. Acronis, for example, provides platform-agnostic VM backup, while Microsoft Azure can back up Azure VMs, SQL, and a business's local VMware machines. Products from suppliers such as Veeam and Commvault also support replication to the cloud.
Businesses can save money by only spinning up virtual servers when they invoke their DR plan, but they will of course have to pay for storage.
"Recovery into the cloud is feasible primarily for virtual environments; physical machines are problematic," says 4SL's Mote. A range of RTOs/RPOs is possible depending on the technology.
Cloud services also work well for smaller businesses that might not have the skills and staff to run backup hardware. Companies can start with simple, online file storage or lower-end data backup services and scale up to application or VM backup as they grow.
"The IT used to assist in DR has matured greatly over the course of the past decade," says Freeform Dynamics' Tony Lock.
Options exist at hardware level to copy or snapshot and replicate data between similar platforms, sometimes even over distances that previously would have been either impossible or prohibitively expensive. At the same time, software tools have been developed to deliver similar capabilities but between different hardware platforms.
Backing up to hybrid environments is more complex than either straightforward physical-to-physical backup, or replication of an entire (virtualised) environment to the cloud.
On paper, a hybrid approach allows the business to decide which workloads should be mirrored, backed up to a virtual environment, or replicated to the cloud. The challenge is to decide which data and which workloads go where, and to maintain consistency.
Recovery from a hybrid environment will also be more complex. IT teams need to ensure all elements can be recovered from each platform after a full-scale disaster. But they also need a plan to deal with operational recovery, such as deleted files. Data egress charges levied by cloud providers are a cost that is easy to overlook.
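Those egress charges can dominate the cost of a large restore. As a back-of-the-envelope sketch (the rates below are placeholders; real cloud egress pricing is tiered and varies by provider and region), the one-off cost of pulling a full backup out of a cloud provider might be estimated like this:

```python
def restore_egress_cost(data_gb: float, egress_per_gb: float,
                        request_count: int = 0,
                        per_request: float = 0.0) -> float:
    """Estimate the one-off cost of a full restore from cloud storage:
    data transferred out plus any per-request API charges. All rates
    are illustrative inputs, not a provider's actual price list."""
    return data_gb * egress_per_gb + request_count * per_request

# Restoring 50 TB at a placeholder $0.09/GB egress rate.
print(round(restore_egress_cost(50_000, 0.09), 2))  # 4500.0
```

Running this kind of estimate for each workload, rather than only for the whole estate, is one way to decide which data belongs in which tier of a hybrid DR design.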
It's official. Microsoft is no longer offering any support for Windows 7. Support ended on Jan. 14, 2020.
This means Windows 7 will no longer have support specialists backing it up or updating it, and in order to have a safe computer, you'll need to upgrade your OS to Windows 10. Tap or click here to learn how to get Windows 10 for free.
But what if you really like the look of Windows 7, and Windows 10 has features you don't want? Well, if you're willing to learn a few tech tricks, we can help you make your computer look and act like it still has Windows 7, starting with the Start menu.
The Windows 10 Start menu introduces a new tile format in addition to the traditional, alphabetical, menu list. If you hate this layout, you can get the Windows 7 style back in two ways.
The first involves the removal of pinned apps. It's not a perfect recreation of Windows 7, but it's the simpler way to get the feel of the old Start menu back. It also serves as a great visual transition from 7 to 10.
To unpin applications from the Start menu:
The second way to get the Windows 7 look is to download the third-party app Open Shell, once known as Classic Shell. This way is much more technical, but it'll get you the exact Windows 7 Start menu look you want.
To get and use Open Shell:
If you miss the aesthetic of Windows 7, as well as the Start menu functionality, we recommend learning how to use Open Shell. If you just want a simpler menu that's similar to Windows 7, the unpinning method should do the trick.
File Explorer is only slightly different in Windows 10, but there are still new features like the Ribbon. As above, there's a way to get the feel of the Windows 7 version back, and a way to make things look exactly the same with a third-party program.
To get the basic Windows 7 File Explorer structure:
If you hate the ribbon from the Windows 10 File Explorer, and want the Windows 7 look back entirely:
Note: Ribbon Disabler makes a backup of the original File Explorer program files, but it also replaces them. This means the app modifies your computer's core files and should be used with extreme caution.
Windows 10 really pushes Cortana (its built-in AI helper), its bottom search bar, and OneDrive, its cloud storage system. Tap or click here to learn about other great cloud storage programs.
If you have no interest in these new features, you can get them out of your way pretty easily, and this time you can do it without third-party software. Let's start with clearing up your view.
To hide things in the task bar:
If you have a Cortana menu instead of Search, click Hidden to make that button go away, and just deselect Show Task View button to make that disappear.
To disable Cortana, and not just hide her, do the following:
To hide OneDrive:
To disable OneDrive:
So to all the Windows 7 lovers: you're going to be OK. You can get back more of the classic aesthetic by downloading Windows 7 wallpapers and playing with Open Shell skins.
Updating your OS is important for your computer's health. Tap or click here to learn more about why that is. It doesn't mean losing your favorite features, though, so long as you're a little tech savvy.
Alibaba Cloud Named First Public Cloud Vendor in the World to Obtain Trusted Partner Network (TPN) Certification – Business Wire
HANGZHOU, China--(BUSINESS WIRE)--Alibaba Cloud, the data intelligence backbone of Alibaba Group, has today announced that it is the first public cloud vendor in the world to obtain the prestigious Trusted Partner Network (TPN) certification, an achievement that validates the entertainment industry's confidence in its robust security and trustworthiness as a cloud service provider. With viewing consumption habits changing rapidly, broadcast platforms constantly evolving and new production techniques emerging globally, Alibaba Cloud is successfully helping the entertainment industry to revolutionise how it works in order to respond to and embrace these changing dynamics by offering a highly secure, dependable, flexible and scalable cloud-based platform.
The TPN is a joint venture between two major entertainment industry associations: the Motion Picture Association of America (MPAA) and the Content Delivery & Security Association (CDSA). The TPN's goal is to help companies ensure content security and prevent leaks, breaches, and hacks of movies and TV shows before they are released, by creating a single, central global directory of trusted partner vendors. The certification is significant because it satisfies content producers that Alibaba Cloud's use of industry best practices ensures that its solutions, facilities, people and workflows are secure, as certified by experienced industry evaluators.
To secure this highly prestigious accolade, which distinguishes Alibaba Cloud's solutions, the company had to undergo very stringent auditing and evaluation processes. A number of Alibaba's solutions that are suited to the entertainment industry were tested, including: Object Storage Service, an encrypted and secure cloud storage service which stores, processes and accesses massive amounts of data from anywhere in the world; Express Connect, an easy-to-use network service that enables high-bandwidth, reliable, secure, and private connections between different environments; Cloud Storage Gateway, which uses OSS for cloud-based storage at the back end and supports standard file and block storage protocols; and Key Management Service, which facilitates the creation, deletion and management of encryption keys. All of Alibaba Cloud's audited solutions passed TPN's demanding tests.
"By completing the on-premises TPN audit process successfully, Alibaba Cloud demonstrates its mature abilities in securing media content with its facility and infrastructure capabilities. As the future of media productions is shifting to public cloud platforms, it is essential for vendors like Alibaba Cloud to pioneer innovations that will propel the advancement of the entertainment industry in a digital era," said Drew Branch, Senior Security Consultant at Independent Security Evaluators.
Today, world-class production houses, including Animal Logic and Territory Studio, have already been assured of Alibaba Cloud's ability to meet their demands and have embraced its solutions. They are enjoying the benefits of using more cloud computing technologies to drive new efficiencies in media production, ranging from faster decision making to easier collaboration between remotely-located artists and developer teams.
Commenting on the certification, Yuanbin Zheng, Head of Security Compliance and Privacy at Alibaba Cloud Intelligence, said: "With such high value and sensitive assets to protect, production houses are naturally drawn to the solutions that offer the highest levels of security. Not only does the TPN certification recognise the effort that Alibaba Cloud has made to deliver industry-leading levels of security, it also acknowledges the dependability, flexibility and scalability of our cloud-based platform. Furthermore, as the first public cloud vendor to be accredited with the TPN certification, the accolade further reinforces Alibaba Cloud's market-leading position, as well as its solutions' now-proven ability to meet the needs of the entertainment industry."
About Alibaba Cloud
Established in 2009, Alibaba Cloud (www.alibabacloud.com), the data intelligence backbone of Alibaba Group, is among the world's top three IaaS providers, according to Gartner, and the largest provider of public cloud services in China, according to IDC. Alibaba Cloud provides a comprehensive suite of cloud computing services to businesses worldwide, including merchants doing business on Alibaba Group marketplaces, start-ups, corporations and government organisations. Alibaba Cloud is the official Cloud Services Partner of the International Olympic Committee.
We've already looked at open-source alternatives to several major Google apps and services in this series, but there are still a few categories left to go over. Now it's time to check out the open-source equivalents to Google Drive, the company's cloud storage product.
Thankfully, the feature gap between Google Drive and the alternatives isn't massive: all of them have clients for desktop and mobile, easy file sharing, and other features. Depending on what hardware you have on hand, these options might not even cost you anything.
Why does open-source matter?
Free and open-source software (FOSS) has a number of advantages, but to most people, the main benefit is privacy. All the code is out in the open, so anyone with programming knowledge can go through it and see exactly what an app is doing. Proprietary apps can sometimes feel like black boxes, where you don't really know what's going on behind the scenes. That's rarely the case with FOSS.
I say 'rarely' because there's technically nothing stopping open-source apps from spying on you, but that behavior is seldom seen. If a developer is doing something they're not supposed to be, like spying on users or bundling malware, they probably wouldn't announce it to the world.
Many people simply prefer open-source apps out of principle, in the same way that some people prefer shopping at locally-owned stores instead of Walmart or Target. These apps are often created by individuals or small groups in their spare time, as opposed to large companies with income generated from advertising or venture capital.
NextCloud is widely regarded as the gold standard for hosting your own cloud. It goes far beyond simply hosting files: there are plugins for adding a task manager, a calendar, collaborative document editing (akin to Google Docs), video conferencing tools, notes, and much more. While the Android app only supports managing files, there are some Android clients for NextCloud plugins. For example, the NextCloud News app allows you to synchronize RSS feeds with the RSS plugin, giving you a self-hosted RSS reader service.
Setting up a NextCloud installation is the tricky part: you either need an always-on PC that you can run the server software from (even a $35 Raspberry Pi will do the job), or you can use a hosting service like Webo, CiviHosting, or Hostio. Unless you have experience with Linux servers, I'd recommend just paying for a hosting service.
As previously mentioned, the NextCloud Android app is primarily for mobile file management. You can upload/download/share files, sync files and folders to your device for offline access, and even auto-upload photos to your server. That last feature makes NextCloud a possible Google Photos replacement too, as long as you have enough storage for all your pictures.
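Under the hood, NextCloud exposes files over WebDAV, which is how clients upload, download, and sync. As a small sketch (the server name and user are placeholders), the WebDAV URL for a given file can be built like this; an authenticated HTTP PUT to such a URL stores the file:

```python
from urllib.parse import quote

def nextcloud_webdav_url(server: str, user: str, remote_path: str) -> str:
    """Build the WebDAV endpoint NextCloud exposes for a user's files.
    quote() percent-encodes special characters but leaves '/' intact,
    so nested folder paths survive as path segments."""
    return (f"{server.rstrip('/')}/remote.php/dav/files/"
            f"{quote(user)}/{quote(remote_path.lstrip('/'))}")

print(nextcloud_webdav_url("https://cloud.example.com", "alice",
                           "Photos/2020/trip.jpg"))
# https://cloud.example.com/remote.php/dav/files/alice/Photos/2020/trip.jpg
```

This is also why generic WebDAV clients and scripts can talk to a NextCloud server without any NextCloud-specific tooling.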
The NextCloud web interface
You can try a demo of the NextCloud web interface here. If you save the temporary username and password to your Chrome passwords (or other password manager), you can also use it to test out the Android app.
I'm including OwnCloud here mostly because I didn't want to have just two options for this article. NextCloud was originally based on OwnCloud, and while NextCloud has flourished in the years since it became a separate project, OwnCloud has seen slower development. Much of the OwnCloud development team has moved on to NextCloud, including the original founder, Frank Karlitschek.
OwnCloud functions almost identically to NextCloud: you have to set up your own server (or pay for a hosting service), there are many plugins available, and so on. While there are a few plugins on OwnCloud that aren't available on NextCloud, the vast majority of them are enterprise-focused and probably not anything you would care about.
OwnCloud mostly remains popular because of its existing userbase, but if you're starting fresh, it's probably a better idea to go with NextCloud.
If you really don't want to deal with setting up a server, or paying someone to host a server for you, you might like Syncthing. It's not a cloud service: it uses technology based on BitTorrent to synchronize files across all your devices using peer-to-peer data transfer. While this has the advantage of being completely free, you can't share files like you can with NextCloud or OwnCloud.
The Syncthing app is extremely bare-bones: it syncs your files to your phone or tablet... and that's it. All data is saved in the actual Android file system, so you don't have to manually export anything you want to open on your phone. This also makes apps like Moon Reader easier to use, since they can scan the Syncthing folder for new files on their own.
If you don't need the advanced sharing and collaborative access features that NextCloud and OwnCloud offer, Syncthing is a free and simple alternative worth checking out. It's certainly great from a privacy perspective: your files never leave your own devices.
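Syncthing's actual Block Exchange Protocol is more involved, but the core idea behind its peer-to-peer sync (hash each block of a file and transfer only the blocks that differ) can be sketched roughly like this. The block size and function names here are illustrative, not Syncthing's real internals:

```python
import hashlib

# Syncthing negotiates variable block sizes; a fixed 128 KiB is used
# here purely for illustration.
BLOCK_SIZE = 128 * 1024

def block_hashes(data: bytes, block_size: int = BLOCK_SIZE):
    """Split a file's bytes into fixed-size blocks and hash each one."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

def blocks_to_transfer(local: bytes, remote: bytes):
    """Indices of blocks that differ between two copies and must be sent."""
    local_h, remote_h = block_hashes(local), block_hashes(remote)
    length = max(len(local_h), len(remote_h))
    return [i for i in range(length)
            if i >= len(local_h) or i >= len(remote_h)
            or local_h[i] != remote_h[i]]

old = b"a" * (3 * BLOCK_SIZE)
new = old[:BLOCK_SIZE] + b"b" * BLOCK_SIZE + old[2 * BLOCK_SIZE:]
print(blocks_to_transfer(new, old))  # only the changed middle block: [1]
```

Because only changed blocks cross the network, editing a small part of a large file doesn't mean re-uploading the whole thing, which is what makes this approach practical over home connections.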
Public cloud has served as a catalyst to nearly every successful enterprise. It brought into being a plethora of startups. For small teams with great ideas, the public cloud's cost model and convenience made it possible to build a business. Public cloud providers unlocked innovations that otherwise would have taken much longer or never would have seen the light of day.
For that, the cloud deserves much credit.
Charging startups only for the storage used, public cloud providers not only enabled their customers' growth but powered their own amazing expansion, too. Forrester projects that, even in a supposed slowed growth phase, between 2018 and 2022 the revenue from public cloud infrastructure, platforms, and apps will have a compound annual growth rate of 21%, reaching $411 billion. It's a given these days that some applications are public-cloud exclusive; it's what they're built for, and they never migrate out. Enterprises that contributed to this remarkable growth found a way to do more with less by consolidating workloads thanks to moving to the cloud.
The company I work for, Seagate Technology, is among the many beneficiaries. Like many businesses, we used to have data scattered in silos. Operational challenges consumed too much of our staff's time. The heterogeneity was hard to scale. When we first migrated one challenging workload, our Hadoop analytics, to the cloud, we saw a 40% reduction of costs.
But public cloud hasn't turned out to be cloud nine.
While we did see benefits of consolidation in the cloud initially, later on we experienced challenges. The initial euphoria over CapEx reductions gave way to gradual creep in our monthly cloud bill. Because of how we access our data, our total monthly expenses became unpredictable.
As many thriving businesses have learned, the initial comfort public data centers offer tends to dissipate, giving way to bill shock. Once enterprises reach scale, too many CIOs have to walk into their CEO's office, hat in hand, and ask for an additional 20% to cover the bills. Moves like migrating some data out of the cloud can incur penalties.
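The unpredictability described above comes from the fact that a cloud storage bill is a function of access patterns, not just capacity. A minimal sketch, using made-up illustrative rates rather than any provider's actual pricing, shows how two months with identical stored capacity can produce very different bills:

```python
# Illustrative rates only; real providers price storage class, region,
# egress tiers, and request types in far more detail.
ILLUSTRATIVE_RATES = {
    "storage_per_gb_month": 0.023,   # flat and predictable
    "egress_per_gb": 0.09,           # scales with how often data leaves
    "requests_per_million": 0.40,    # scales with access pattern
}

def monthly_bill(stored_gb, egress_gb, requests_millions,
                 rates=ILLUSTRATIVE_RATES):
    """Total monthly cost of one workload under the assumed rates."""
    return (stored_gb * rates["storage_per_gb_month"]
            + egress_gb * rates["egress_per_gb"]
            + requests_millions * rates["requests_per_million"])

# Same 100 TB stored both months; only the access pattern changes.
quiet = monthly_bill(stored_gb=100_000, egress_gb=1_000, requests_millions=50)
busy = monthly_bill(stored_gb=100_000, egress_gb=40_000, requests_millions=800)
print(f"quiet month: ${quiet:,.2f}")  # storage dominates
print(f"busy month:  ${busy:,.2f}")   # egress and requests dominate
```

Under these assumed rates, the busy month costs more than two and a half times the quiet one with not a single extra byte stored, which is exactly the budgeting problem a capacity-only pricing meter would remove.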
The very things that earned the public cloud its loyal following (pricing transparency, predictability of costs, scalability, and latency savings) now pose challenges.
Former cloud success stories, Dropbox and Snapchat, are examples of this.
In 2015, Dropbox decided to cut costs by migrating its users onto its own infrastructure and software. The resulting savings amounted to $75 million. Snap Inc., the parent of Snapchat, spent over $1 billion on cloud-computing servers over the last two years and has been attempting to walk back its cloud commitments in order to stop bleeding so much cash.
Meanwhile, as more and more data is generated at the edge (30% of it will soon need processing near where it's created), the edge is also pulling data away from the public cloud. We're entering a multicloud and edge-core world. In this new world, the convenience of public cloud needs to be complemented with predictability in cost, latency, and growth. This is why enterprises are choosing to add private and hybrid clouds into the mix. According to IDC, by 2022, 70% of enterprises will integrate cloud management across their public and private clouds by deploying unified hybrid/multicloud management technologies, tools, and processes.
Not that any of it threatens hyperscale cloud providers. The Data Age is still emergent, bound to generate 175ZB of data by 2025. The proliferation of data guarantees those providers more success. According to Gartner, in 2019 alone, public cloud revenue is set to grow 17.5%. An IDC report predicts that public cloud spending will soar from $229 billion in 2019 to almost $500 billion by 2023.
In spite of this success (or perhaps because of it), cloud companies have the option to pass benefits on to customers at scale.
On behalf of CIOs and practitioners who are proponents of public cloud, I'd like to make a recommendation to cloud providers that will benefit everyone. Why not use this position of scale to help the very customers that the cloud helped mature? Think about it: data will only grow. It will create more business. There's enough value, created and shared, to go around. Why not invest in lifelong customers of choice by offering a predictable experience? Instead of pricing tiers and fences that limit the amount of data stored and activated, cloud providers can break open the market with simple, flat, and predictable pricing built on one pricing meter only: capacity.
Storing hundreds of exabytes of data will unlock new use cases and analytics that help create higher-order revenue streams, which would not exist if the workloads repatriated away from public cloud.
Ravi Naik is Seagate Technology's Chief Information Officer and Senior Vice President of Corporate Strategy.
[Find out about guest-posting for VentureBeat.]
Formulus Black’s Forsa Software Named Finalist for TechTarget’s 2019 Storage Product of the Year – Citybizlist Real Estate
JERSEY CITY, N.J.--(BUSINESS WIRE)--In-Memory Storage and Virtualization innovator Formulus Black today announced that it has been selected as a finalist in the Storage System and Application Software category of the 18th annual Storage magazine and SearchStorage.com Products of the Year awards for its revolutionary Forsa 3.0 software, which enables any application to run cost efficiently in memory without modification.
The 2019 Storage magazine and SearchStorage Products of the Year award recognizes winners in five categories: Backup and Disaster Recovery Hardware, Software and Services; Cloud Storage; Disk and Disk Subsystems; Hyper-converged and Composable Infrastructures; and Storage System and Application Software. All the enterprise storage products were judged based on criteria of performance, innovation, ease of integration, ease of use and manageability, functionality and value. Winners will be announced February 17.
Formulus Black earned its finalist placement with its enterprise-hardened version of Forsa, which sets the bar for enabling larger databases, I/O-hungry HPC jobs, artificial intelligence and machine learning model training to run cost efficiently in memory on commodity hardware.
"Along with predictive analytics and AI, finalists in this category were clearly mindful of modern enterprise data storage needs beyond simply meeting high-performance and low-latency requirements," according to the finalist announcement. With 3.0, Formulus Black gave Forsa numerous updates to enable applications to take advantage of memory channel performance. Forsa 3.0 enables any application to run databases and other I/O-intensive applications entirely from memory, without modification, using commodity server hardware.
Forsa software utilizes fast DRAM and Intel Optane DC persistent memory as in-memory storage, along with a patented BitMarker data encoding algorithm that identifies patterns in data to increase the effective storage capacity of memory. Forsa enables applications to dramatically improve data processing speed while protecting against data loss via advanced features such as high availability and BLINK backup and restore. With Forsa, large, multi-terabyte database and analytics workloads can easily persist and run in memory on commodity server hardware; benchmark tests show Forsa-provisioned persistent memory can deliver up to 2.7x more TPS at roughly one quarter the latency of NVMe SSDs for mixed read/write database workloads.
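The patented BitMarker algorithm itself is not publicly specified, so the sketch below does not reproduce it. It only illustrates the general idea the press release describes: spotting repeated patterns in stored data and replacing them with short markers so the same physical memory holds more logical data. All names here are hypothetical:

```python
from collections import Counter

def encode(blocks):
    """Replace repeated data blocks with short integer markers.

    Generic dictionary encoding, shown only to illustrate pattern-based
    capacity amplification; this is NOT the BitMarker algorithm.
    """
    counts = Counter(blocks)
    # Assign a small marker to every block pattern seen more than once.
    table = {blk: i for i, (blk, n) in enumerate(counts.most_common())
             if n > 1}
    encoded = [table.get(b, b) for b in blocks]
    return encoded, {i: blk for blk, i in table.items()}

def decode(encoded, table):
    """Invert the encoding: markers back to their original blocks."""
    return [table[x] if isinstance(x, int) else x for x in encoded]

# Three 4 KiB zero pages collapse to one stored pattern plus markers.
blocks = [b"\x00" * 4096, b"\x00" * 4096, b"db-page-1", b"\x00" * 4096]
enc, table = encode(blocks)
assert decode(enc, table) == blocks  # lossless round trip
```

The trade-off is the same one the benchmark numbers hint at: some CPU cycles spent encoding and decoding in exchange for a larger effective in-memory working set.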
"While we can extol the virtues of Forsa forever, having a highly respected industry establishment like TechTarget echoing our attributes makes for a much more compelling case for enterprises considering implementing Forsa into their business environment," said Jing Xie, Chief Operating Officer at Formulus Black. "Being named a finalist in the Storage magazine and SearchStorage.com Product of the Year awards is a great honor and recognition of all of the hard work by our team here at Formulus Black. We look forward to seeing how we place among our fellow finalists when the awards are given next month."
The complete list of Storage magazine and SearchStorage.com Products of the Year finalists is available at https://searchstorage.techtarget.com/feature/Enterprise-data-storage-2019-Products-of-the-Year-finalists. Additional information about how Forsa supercharges the performance of I/O-intensive applications is available at https://www.formulusblack.com.
About Formulus Black
Formulus Black develops FORSA, a software technology that is unlocking the power of in-memory compute for all applications by enabling server memory to be easily and efficiently used as a high-performance storage media. FORSA can be used to power the most demanding application workloads and for developers seeking to minimize latency, maximize throughput, and scale without performance loss. For more information and to trial our software, please visit: https://www.formulusblack.com