Category Archives: Cloud Storage
The One Best REIT To Own In 2020 – Forbes
As my regular readers know, I am a huge proponent of diversification, as I consider it one of the best ways to sleep well at night.
However, as we roll into the New Year, I am fielding requests from readers inquiring about the best REIT to own in 2020.
So, while I fulfil that request, I must provide the following disclosure: one of the best ways to mitigate risk is to carefully diversify your portfolio. Diversifying provides insurance that if one stock blows up, it will not severely impact your nest egg.
Digital Realty: A Data Center REIT To Own For Decades
Digital Realty (DLR) is one of the oldest data center REITs in the world, having gone public in 2004; as of Q3-19 it owned 211 data centers in 35 cities across 14 countries. The company has global exposure represented in North America (77%), Europe (13%), Asia Pacific (7%) and Latin America (3%).
The data center REIT's primary business model is focused on hybrid cloud offerings; specifically, it provides colocation services, meaning its facilities host a mix of private (in-house) and public storage solutions. Each facility houses numerous small data racks and servers, which customers provide themselves.
Recently Digital Realty announced an agreement to combine with InterXion Holding, a Netherlands-based information technology services company. While strategic in nature (this $8.4 billion transaction will create a leading global provider of cloud and carrier-neutral data center solutions with an enhanced presence in high-growth major European metro areas), integration risk provides uncertainty related to execution and short-term dilution.
However, as we have observed Digital Realty over the years, we have witnessed successful M&A deals and very consistent earnings and dividend growth. Since going public in 2004, the company has generated CAGR dividend growth of 11 percent, while maintaining an impressive payout ratio (currently 73 percent).
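To see what an 11 percent dividend CAGR compounds to, a short calculation helps. The $1.00 starting dividend and the 14-year horizon below are hypothetical round numbers chosen for illustration, not DLR's actual per-share history:

```python
def compound(initial, rate, years):
    """Grow an initial value at a fixed annual rate."""
    return initial * (1 + rate) ** years

# A hypothetical $1.00 annual dividend compounding at 11% over 14 years
final = compound(1.00, 0.11, 14)
print(f"${final:.2f}")  # roughly $4.31 - more than a fourfold increase
```

The point of the sketch is simply that low-double-digit growth, sustained, multiplies the income stream several times over within a holding period of a decade or so.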
We're confident the company can restore its long-term growth trajectory (7% forecast in 2021) and its ability to generate impressive total returns. We consider the InterXion deal transformative, one that should provide long-term growth attributes as the combined companies capitalize on emerging global digitization trends.
Digital shares are trading at $118.18 with a dividend yield of 3.7 percent. Analysts forecast FFO per share growth of 4 percent in 2020 and 8 percent in 2021. We have a Buy rating on the company with expected returns targeted at 15 percent in 2020. Given the exponential growth in 5G and cloud storage, we believe this high-quality (rated BBB by S&P) REIT could provide long-term value for decades.
I own shares in DLR
More here:
The One Best REIT To Own In 2020 - Forbes
From retail to robotics, Jeff Bezos is betting big on technology – ETBrandEquity.com
Amazon Inc, the world's largest online retailer, is these days better known as a technology company, and rightly so.
Technology is at the core of whatever Amazon does, from algorithms that forecast demand and place orders from brands, and robots that sort and pack items in warehouses, to drones that will soon drop packages off at homes.
At its new Go Stores, for instance, advances in computer vision have made it possible to identify the people walking in and what products they pick up, helping add them to their online shopping carts.
Jeff Bezos, the founder of Amazon and the world's richest man, is always pulling new rabbits out of his hat, like next-day or same-day shipping and cashierless stores. Then there is Blue Origin, the aerospace company privately owned by Bezos, which is on a mission to make spaceflight possible for everyone.
Be that as it may, a lot more disruption aimed at reaching the common man is on the anvil.
The most far-reaching and impactful technologies being developed today are for Amazon's own use, but some others have the potential to disrupt every sector.
The technology marvels that Amazon Web Services, the largest profit-driving unit in Bezos' stable, is working on could jolt several industries, including in India, in the same way that Amazon once disrupted retail. "In retail, while things like the size of the catalogue, advertising and other stuff might play a role in success, at Amazon, I think success is largely technology driven," said Chief Technology Officer Werner Vogels.
The ecommerce giant is using advances in technology to disrupt several sectors outside of retail: medicine, banking, logistics, robotics, agriculture and much more. Interestingly, some of that work is happening in India.
Initially, the thinking was around allowing enterprises in these sectors to grow by using its cloud storage and computing capabilities.
Now, Amazon's reach has become more nuanced and it has moved up the value chain. For example, Amazon is no longer just offering banks a place to securely store information; it is going further by offering tools to detect fraud, making it unnecessary for lenders to build expensive data science teams in-house.
It is a similar story in other industries, made possible due to the massive amounts of data that Amazon collects and processes.
"We give people the software capability, so they no longer need to worry about that side of things. Most of our services are machine learning under the covers (and) that's possible mostly because there's so much data available for us to do that," Vogels said.
Artificial Intelligence/Machine Learning
Amazon is moving up the value chain, offering services backed by artificial intelligence and machine learning to automate repetitive tasks done by human beings. Enterprise customers will simply be able to buy into these services with minimal customisation and without a large data science and artificial intelligence team.
In December, AWS launched its Fraud Detector service, which makes it easy to identify potentially fraudulent activity online, such as payment fraud and the creation of fake accounts. Even large banks in India have struggled to put together teams to build machine learning models for fraud detection, but with such a service they can train their systems easily.
CodeGuru is another service that uses machine learning to do code reviews and produce application performance recommendations, giving specific suggestions to fix code. Today, this is largely done manually, with several non-technology companies struggling to build great software for themselves due to bad code.
Transcribe Medical is a service that uses Amazon's voice technology to create accurate transcriptions of medical consultations between patients and physicians. Medical transcription as a service is a big industry in India, and India's IT services giants hire thousands of people for this work.
These services are expected to replace mundane manual tasks, freeing up resources for sophisticated tasks, and could lead to disruption in several sectors in the country.
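The kind of statistical model such a fraud-detection service trains can be sketched in miniature. The snippet below is a deliberately crude, hypothetical illustration (a z-score outlier test, not anything AWS actually ships): transactions far from the typical amount get flagged for review.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Flag transaction amounts more than `threshold` standard
    deviations from the mean - a toy stand-in for the trained
    models a managed fraud-detection service provides."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [abs(a - mu) / sigma > threshold for a in amounts]

# Seven ordinary card payments and one suspicious outlier
txns = [20, 25, 22, 19, 23, 21, 24, 500]
print(flag_anomalies(txns))  # only the 500 is flagged
```

A production system would learn from labelled fraud history and many more features; the value of a managed service is precisely that customers do not have to build and tune such models themselves.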
Medicine
Hospitals in the United States have to save imaging reports for years. Earlier these were stored on tapes, since doing so digitally cost millions of dollars. The advent of cheaper cloud storage meant new scans could be saved digitally, making them accessible to doctors on demand. "Now, doctors could refer to a patient's earlier CT scan and compare that with the new one to diagnose an ailment," said Shez Partovi, worldwide lead for healthcare, life sciences, genomics, medical devices and agri-tech at Amazon.
The power of the cloud and AWS's own capabilities in medical technology have only expanded since. Healthcare and life sciences form rapidly scaling units of AWS, which is building a suite of tools that allow breakthroughs in medicine: hospitals using the tools for process modelling or operational forecasting, refining the selection of candidate drugs for trial, or delivering diagnoses through computer imaging.
Developed markets will be the first to adopt such technologies, but AWS is seeing demand surge from the developing world, including India. "Not everyone is within a mile of a radiologist or physician, so diagnostics through AI could solve for that. Further, there's a lack of highly trained people, but when all you have to do is take an image, it requires a lot less training," said Partovi.
Space
Bezos, in his private capacity, is now looking to connect remote regions with high-speed broadband. He is building a network of over 3,000 satellites through Project Kuiper, which will compete with Elon Musk's SpaceX and Airbus-backed OneWeb.
The bigger bet is in outer space, though. His rocket company Blue Origin has already flown commercial payloads on New Shepard, the reusable rocket that competes with SpaceX's Falcon 9. The capsule atop New Shepard can carry six passengers, which Bezos looks to capitalise on for space tourism, a commercial opportunity most private space agencies are eyeing. The company is also building a larger reusable rocket, New Glenn, named after John Glenn, the first American to orbit the Earth, which can carry payloads of as much as 45 tonnes to low Earth orbit.
Bezos' aim, however, is to land on the Moon. His Blue Moon lander can deliver large infrastructure payloads with high accuracy to pre-position systems for future missions. The larger variant of Blue Moon has been designed to land a vehicle that will allow the United States to return to the Moon by 2024.
Robotics
Amazon's take on robotics is ground-up. The company has been part of an open-source network that is developing ROS 2, or Robot Operating System 2, which will be commercial-grade, secure, hardened and peer-reviewed in order to make it easier for developers to build robots. "There is an incredible amount of promise and potential in robotics, but if you look at what a robot developer has to do to get things up and running, it's an incredible amount of work," said Roger Barga, general manager, AWS Robotics and Autonomous Services, at Amazon Web Services.
Apart from building the software that robots will run on, AWS is also making tools that will help developers simulate robots virtually before deploying them on the ground, gather data to run analytics on the cloud and even manage a fleet of robots. While AWS will largely build tools for developers, as capabilities such as autonomous navigation become commonplace, the company could look to build them in-house and offer them as a service to robot developers, Barga said.
With the advent of 5G technology, more of the processing capabilities of robots will be offloaded to the cloud, making them smarter and giving them real-time analytics capabilities to do a better job. For India, robot builders will be able to get into the business far more easily, with all the tools within reach, overcoming the barrier of a lack of fundamental research in robotics.
Enterprise Technology
AWS might be a behemoth in the cloud computing space, but cloud still makes up just 3% of all IT in the world. The rest remains on-premise. While a lot of it will migrate to the cloud, some will not. To get in on the on-premise market, Amazon has built services that run in a customer's data centre, offering capabilities as if the data were stored in the cloud.
With Outposts, which was announced last month, AWS infrastructure, AWS services, APIs and tools will be able to run in a customer's data centre. Essentially, this will allow enterprises to run services on data housed within their own data centres, just as they would if the data were stored on AWS. The other big problem AWS is looking to solve is not having its own data centres close enough to customers who require extremely low-latency computing. For this, the company has introduced a new service called Local Zones, in which it deploys its own hardware close to a large population, industry or IT centre where no AWS Region exists today. Both these new services could be valuable in India, given the lower reach of cloud computing among enterprises as well as stricter data localisation requirements.
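Why physical proximity matters can be shown with a back-of-the-envelope calculation: even at the speed of light in fiber, distance alone sets a floor on round-trip latency. The two distances below are hypothetical, picked only to contrast a far-away cloud region with nearby on-premise or Local Zone hardware.

```python
def round_trip_ms(distance_km, speed_km_per_ms=200.0):
    """Best-case network round trip. Light in optical fiber covers
    roughly 200 km per millisecond (about two-thirds of c in vacuum)."""
    return 2 * distance_km / speed_km_per_ms

# A region 2,000 km away vs. hardware 50 km away (hypothetical figures)
print(round_trip_ms(2000))  # 20.0 ms, before any processing time
print(round_trip_ms(50))    # 0.5 ms
```

No amount of server-side optimisation removes that physics-imposed floor, which is the rationale for moving the hardware to the customer rather than the data to the region.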
Read this article:
From retail to robotics, Jeff Bezos is betting big on technology - ETBrandEquity.com
From Boombox To Wireless Speakers: How Tech Evolved Over The Last Decade – Tech Revolution In 2010s – Economic Times
Updated: 31 Dec 2019, 11:27 AM IST
To show you just how major and significant these changes have been, we have compiled a list of gadgets which shows how tech looked back in 2010.
Fast forward to 2019, and the iPhone has changed for good. It no longer comes with a home button, a small screen, latches or chins. The iPhone 11 Pro comes with a 12 MP triple rear camera and a 12 MP front camera. Add to that the powerful iOS 13, 4 GB of RAM, 64 GB of storage and the Apple A13 Bionic chip, and there you have it: a phone that is miles ahead of the old iPhone 4.
The once elaborate setup of dangling wires and a pocket device has now been replaced by the simple, minimalist design of the AirPods, with rich, high-quality sound and quick access to Siri, the personal mobile assistant.
Isn't it interesting how far technology has come in the last 10 years?
In 2008, India entered the world of 3G, which provided a superfast speed of 3.1 Mbps. In 2012, Airtel launched 4G services and dongles with a maximum speed of 21 Mbps.
Finally, with the advent of 5G, an application should be able to transmit data at the rate of 10 gigabits per second, as per the ITU. While the deployment of 5G has begun in some countries as of 2019, some people have also raised concerns about the adverse effects 5G can have on health.
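To make those speeds concrete, here is a quick best-case estimate of how long a 100 MB download would take at each generation's rate (taking the 5G figure as 10 gigabits per second, the ITU target commonly cited, and ignoring protocol overhead):

```python
def download_seconds(size_megabytes, speed_mbps):
    """Time to move a file at a given link speed.
    Speeds are in megabits per second; 1 byte = 8 bits."""
    return size_megabytes * 8 / speed_mbps

# A 100 MB file over each generation's headline speed
for label, mbps in [("3G (3.1 Mbps)", 3.1),
                    ("4G (21 Mbps)", 21),
                    ("5G (10 Gbps)", 10_000)]:
    print(f"{label}: {download_seconds(100, mbps):.2f} s")
```

Roughly four minutes on 3G, under forty seconds on 4G, and a fraction of a second on 5G, which is the jump the decade delivered.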
Smartwatches not only allow us to track the number of calories burned during a workout but also help us stay connected while we perform various activities. From receiving calls and messages instantly to getting social media notifications, smartwatches perform many of the tasks that a smartphone does.
What's more, several tech giants have ventured into the world of smartwatches and whipped up amazing gadgets.
The last decade, however, saw a lot of scandals, with private information leaked from the cloud, and serious concerns were raised about security. Nevertheless, with security measures in place, cloud storage has become one of the easiest and most-used ways of storing information.
However, as the world moved on from huge boomboxes to lightweight speakers, its love for music stayed the same. As they say, the more things change, the more they stay the same.
Year in review: Best Android apps of 2019 – The South African
There are thousands of apps on the Google Play Store, all trying to grab your attention. And truth be told, 2019 was a slow year for apps, despite some of the big names still dominating the space.
In this article, we'll take a look at some of the most popular Android apps of 2019.
Google Drive is Android's cloud storage solution, and it has never let me down (to date). Android users get 15 GB of storage space free when signing up, with premium packages available to those who need more.
The Google Drive suite of apps also includes Google Docs, Google Sheets, Google Photos, Gmail, Google Calendar and Google Keep. The latter is so amazing, it gets its own section.
Google Maps can be used for so much more than just navigating from point A to point B. It is the hub for all my travel plans; it's where I get traffic data and find places of interest such as coffee shops, accommodation, etc.
You can create lists; I have lists for places and landmarks I want to visit. I also created a checklist for interesting accommodation spots. Google Maps now also comes with AR walking directions.
I used Podcast Republic for years (before switching over to Spotify Premium, that is) but I'd still highly recommend it for non-Spotify people.
Podcast Republic allows you to manage your podcasts, radio stations, audiobooks, YouTube channels, SoundCloud channels and RSS news/blog feeds, all within a single app.
LastPass is a password manager which keeps all your login details secure in one place. The cross-platform app can be accessed from your PC, mobile device, and tablet.
You can also grab LastPass Authenticator to go along with it for added security.
One of Google's most under-used apps, without a doubt. This little note-taking app sure packs a punch. Sure, there are other, more complex note-taking apps, but Keep's simplicity is the thing that truly hooked me.
It's effortless to use; you can search for notes easily, which helps when you have more than 200 notes. Search by theme, by assigned colour, by keywords and hashtags; search any way you like.
In addition, it works well with other apps too: set reminders, use Google Assistant to save notes, sync Keep notes with Google Docs, or use Google's web apps with Keep.
Also read Year in review: Best Apple apps of 2019
Go here to see the original:
Year in review: Best Android apps of 2019 - The South African
Amazon’s second act, Microsoft’s revival and red-hot IPOs highlighted the decade of the cloud – CNBC
Microsoft CEO Satya Nadella and Salesforce CEO Marc Benioff in 2014.
Source: Microsoft
There were a lot of tech trends in the 2010s, from mobile computing and web-delivered content to the technology-powered gig economy. But no story was more powerful and pervasive than the emergence of the cloud.
Dropbox and Slack became household names during the past decade, Salesforce gained enterprise ubiquity, and Microsoft and Adobe revitalized their businesses by shifting from packaged software to cloud-based subscriptions, lifting their stock prices to record highs.
Formerly a side project, Amazon Web Services now generates $35 billion in annual revenue by allowing clients to offload their storage and computing needs to a third party, while ServiceNow, whose technology helps IT managers improve productivity, joined the S&P 500 last month after its market cap topped $50 billion.
In the past, companies, schools and government agencies operated their own data centers and bought expensive licenses to use software on their equipment, adding in hefty maintenance and update fees. The cloud changed all that, switching the applications that employees use every day as well as all the underlying databases, servers and communications equipment into services that can be delivered remotely to a host of devices over powerful networks. Customer service was upgraded, with a focus on user feedback, to keep clients from quitting their subscriptions and moving to rivals.
For investors, the paradigm shift presented an opportunity to put money into older companies positioned to make the transition, as well as a whole new crop of start-ups poised to take market share from the legacy providers. Slack, Twilio, Zoom and Okta were all founded in 2008 or later and are each now valued at over $10 billion on the public market. A bunch more are in the $5 billion range, and more still are filling up the IPO pipeline for 2020 and beyond.
Brad Gerstner, founder of Altimeter Capital, counts Salesforce as one of his top holdings and was a venture investor in Okta and Twilio. In a TV interview this month alongside Okta co-founder Frederic Kerrest, Gerstner told CNBC that the big bet has paid off.
"It really comes down to something that we talked about nearly a decade ago," said Gerstner, whose firm oversees more than $5 billion in assets, referring to his initial conversations with Okta. "We have a once in probably our lifetime rearchitecture of the entire enterprise stack into the cloud."
According to Synergy Research, 2019 revenue from enterprise software-as-a-service (SaaS) will exceed $100 billion, up from less than $4 billion in 2009. Adding up all layers of the stack, from the underlying infrastructure to the applications, research firm Gartner says cloud revenue will end the year at $214.3 billion, jumping to $331.2 billion by 2022.
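Going from under $4 billion to over $100 billion in a decade implies a compound annual growth rate of roughly 38 percent, which a couple of lines confirm:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

# Enterprise SaaS revenue: ~$4B in 2009 to ~$100B in 2019
print(f"{cagr(4, 100, 10):.1%}")  # roughly 38% per year
```

That sustained rate is what the next paragraph's contrast with low single-digit IT market growth makes so striking.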
In a $3.7 trillion global IT market with low single-digit expansion, annual cloud growth of greater than 15% is leading investors to bid up the cloud standouts in both the public and private markets.
If you're looking for the poster child of the cloud evolution, you may find it on the outskirts of Seattle.
In 2014, facing sluggish growth and disappointing investor returns, Microsoft turned to Satya Nadella to succeed Steve Ballmer as CEO, the first change at the top in 14 years. Nadella, who had previously run Microsoft's cloud and enterprise group, told employees on day one of his tenure, "Our job is to ensure that Microsoft thrives in a mobile and cloud-first world."
Weeks later Nadella announced that Office apps were coming to Apple iPads, giving customers more flexibility and showing that it was a new day at Microsoft. Office 365, the cloud version of Microsoft's flagship product, was launched in 2011, but the Apple integration was critical in bringing Word, Excel, PowerPoint and SharePoint to people who were choosing a competitor's hardware.
By 2017, commercial revenue for Office 365 had exceeded Office license revenue.
"I think the biggest event of the decade was Microsoft launching Office 365," said Todd McKinnon, CEO and co-founder of Okta who previously spent five years at Salesforce. "It was very clear that the largest software company in the world is saying, 'Cloud is good, Cloud will work, Cloud is sanctioned.' It changed the mindset of the IT industry."
Okta conference hosts Facebook VP of Platform Partnerships Sean Ryan, Slack VP of Product April Underwood, Okta CEO Todd McKinnon, Box CEO Aaron Levie, Zoom CEO Eric Yuan and moderator Brad Stone
Source: Harriet Taylor
McKinnon saw the movement firsthand. His company provides identity management software so businesses can securely control all of the cloud applications that employees are using.
"Companies of every size and every industry that we'd been having conversations with for years came back to us and said, 'This is real, this is happening, we need a real identity story,'" McKinnon said.
Meanwhile, Microsoft was also building Azure, its cloud infrastructure service that would eventually become the clear No. 2 to AWS, attracting as customers large retailers, health-care providers, banks and the U.S. Department of Defense along the way. Microsoft doesn't disclose Azure revenue, but it does report growth, which reached 59% in the third quarter.
Since the end of 2009, Microsoft's stock has jumped 417%, beating the S&P 500's 189% gain. This year it became the third company to reach a $1 trillion market capitalization.
Amazon isn't far behind at $889 billion, as of Monday's close. Much of Amazon's 1,233% stock surge over the last decade can be attributed to AWS, which in the latest quarter accounted for 71% of its parent company's operating income and 13% of revenue. Analysts at Jefferies said in a November report that AWS could be worth about 40% of the company's market cap, and the unit has gotten so big that it's now reportedly attracting antitrust scrutiny.
Salesforce, the company most synonymous with SaaS, has also taken advantage of investments made by the infrastructure players. In 2016, Salesforce said it would use AWS to expand its Sales Cloud and Service Cloud internationally and has since announced plans to use some services from Google and Microsoft's cloud.
While Salesforce is the biggest company that was born in the cloud, Adobe is the largest software maker to transition the majority of its business to the new model. Investors have rewarded the company, pushing the stock up ninefold since the beginning of the decade.
In 2009, subscriptions represented 3% of revenue. Two years later, Adobe introduced Creative Cloud, ushering in monthly and annual plans for access to apps like Photoshop, along with cloud storage. Now subscriptions account for about 90% of sales, and the company is growing at rates not seen since 1991.
"What we were able to do in terms of moving to this new way of delivering software was unshackle our product teams from the burdens of delivering products every 12 or 18 months and they could deliver at the pace at which they could innovate," CEO Shantanu Narayen said at Adobe's financial analyst meeting in November. "We were able to attract new customers to the platform, we were able to price these products globally differently."
Autodesk was founded in 1982, just like Adobe. It's undertaken a similar endeavor, moving its popular design and architecture software to the cloud. Carl Bass, Autodesk's CEO from 2006 to 2017, said in 2013 that the company "can get pretty close to subscriptions being the vast majority of our business."
He was proven right. In the most recent quarter, subscriptions accounted for 85% of sales, pushing total revenue up 28% from a year earlier. The stock has gained 620% since the end of 2009.
As Microsoft, Adobe and Autodesk were revamping their businesses, new venture-backed SaaS vendors were popping up by the month, unbundling the old software suites with targeted applications and solutions. The attrition rate has been high, but there are notable successes.
Videoconferencing company Zoom, which went public this year, reported revenue growth of 85% in the most recent quarter to $166.6 million. Twilio, a provider of communications infrastructure that went public in 2016, generated growth of 75% to $295.1 million in the third quarter. Newly public companies Elastic, Smartsheet and Coupa each reported growth in excess of 50%.
They're among the top performers in the BVP Nasdaq Emerging Cloud Index, a group of public companies that get most of their revenue from cloud products and services. Venture capital firm Bessemer Venture Partners launched the index in 2013 to bring more attention to cloud companies and provide metrics so private cloud companies could better understand public markets.
The index has risen 458% since it was formed, topping the Nasdaq's 146% jump over that stretch. In September, asset manager WisdomTree launched the WisdomTree Cloud Computing Fund, making it possible for people to bet on the group.
"We'd get tweets every week of, 'How can I trade this? How can I trade this?'" said Byron Deeter, who invests in cloud at Bessemer and sits on Twilio's board.
Rob Bernshteyn, CEO of Coupa, has been tracking cloud software since its infancy. While working at Siebel Systems in the early 2000s, he met Salesforce co-founder Marc Benioff and was skeptical of whether the company could provide cloud-based technology for many different purposes without extensive customization, even though Salesforce was already winning deals against Siebel.
"It wasn't really definitively clear to me that it could really work," Bernshteyn said in an interview at Coupa's Silicon Valley headquarters, where the server closets are filled with beanbags that employees use as chairs.
Over time, Bernshteyn said Salesforce fixed its technical issues. He considered joining the company but went to a younger cloud software provider called SuccessFactors, which was later acquired by SAP.
Bernshteyn left in 2009, in the middle of the financial crisis, and joined a small start-up that was helping companies track their spending to make sure they weren't being fleeced by vendors. That company, Coupa, is now worth over $9 billion and generating revenue of over $100 million a quarter.
But not all cloud stocks have delivered for investors.
Dropbox is 15% below its IPO price from 2018. Growth at the one-time venture darling has slowed amid competition from Google and Microsoft in the cloud storage and collaboration market.
Business intelligence software company Domo is up just 10% from its IPO in mid-2018 and way below where it was valued in the private markets before the offering. Yext, whose service helps businesses keep information like their addresses and hours up to date on Google and Amazon Alexa, is up 32% since its debut in 2017, underperforming the major indexes.
At 22% and 30% sales growth, respectively, Domo and Yext are expanding at a slower pace than many of their cloud counterparts, while still racking up big losses. It's a tough recipe for investors.
Yext CEO Howard Lerman is bullish on the broader sector. "Obviously at some point over the next decade, spending on cloud software will surpass licensed software," he said.
For venture investors, there's also plenty of money still to be made, assuming the public markets are on board. Deeter of Bessemer Venture Partners said there are 66 private cloud companies worth more than $1 billion.
"That's your future IPO pipeline," he said. "You're going to see this cloud index explode."
WATCH: Coupa CEO says there is a $50 billion addressable market in cloud expense management
See original here:
Amazon's second act, Microsoft's revival and red-hot IPOs highlighted the decade of the cloud - CNBC
The 3 biggest storage advances of the 2010s – ZDNet
I've been looking at storage technology for over 40 years, beginning with choosing mass storage for the original Apple ][ I bought in 1978. $800 for a 140KB floppy, or $50 for a Panasonic cassette deck? Yep, I bought the cassette deck, which taught me the meaning of random access storage.
Every decade since, the pace of change in storage has accelerated, and the 'teens were no exception. Driving that change are three key technologies.
The first time I wrote about flash memory for ZDNet, one reader complained that he was expecting to hear about Adobe Flash, the now obsolete graphics standard. No one makes that mistake today!
Flash was invented in the 1980s by Toshiba, whose storage division was spun off last year as Kioxia. Flash was slow to take off because, as a semiconductor, it took decades to build the volumes that allowed costs to drop. In 1991, I paid $400 for a 10MB Compact Flash card - $40/MB - for my favorite notebook of all time, the HP Omnibook 300.
Flash enabled the iPhone, and replaced 8mm magnetic video tapes - I still have a drawer full - and pretty much killed 35mm film cameras. As the industry invested billions in new fabs, the price has continued to decline, making flash the dominant solid state storage in the world today.
It wasn't until circa 2005 that flash became as cheap as DRAM, and that's when designers woke up to its potential in the data center, where Fusion-io's fast but costly PCIe SSDs were popular. But the disastrous floods in Thailand in 2011, which destroyed almost half of the world's hard drive production capacity, forced a spike in HDD prices that suddenly made flash SSDs look relatively affordable.
While Apple led the charge to SSDs, offering them in the first MacBook Air back in 2008, it was the spike in HDD prices that made many people realize that despite their high price per GB, SSDs really improved their system performance. With their widespread adoption in even entry level machines, the market for client side HDDs collapsed, with volumes dropping for the last five years.
But fear not, HDDs, like tape, will continue to have a home in data centers for decades to come. They are still significantly cheaper than flash SSDs per GB, which is what will keep the cloud vendors buying them.
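The per-gigabyte gap is easy to quantify. The prices below are hypothetical round numbers chosen only to illustrate the comparison, not actual market quotes from the period:

```python
def cost_per_gb(price_usd, capacity_gb):
    """Street price divided by capacity gives the $/GB figure
    cloud buyers compare when choosing media."""
    return price_usd / capacity_gb

# Hypothetical end-of-decade prices, for illustration only
hdd = cost_per_gb(250.0, 10_000)   # a 10 TB data-center HDD
ssd = cost_per_gb(800.0, 4_000)    # a 4 TB enterprise SSD
print(f"HDD ${hdd:.3f}/GB vs SSD ${ssd:.3f}/GB")
```

Under these assumed prices the HDD comes out about eight times cheaper per gigabyte, which is why bulk and cold storage tiers stay on spinning disks even as performance tiers go all-flash.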
The biggest storage story of the teens was the advent of cloud storage, along with cloud infrastructure. Cloud has dramatically reduced the business and technology friction of acquiring and managing IT infrastructure.
Cloud has had knock-on effects in several areas:
- Cloud has taken the wind out of the once-robust storage array market. EMC, once the behemoth of data storage, sold itself to Dell. The remaining independents, especially startups, tend to focus on either direct sales to the cloud giants or selling cloud-like infrastructure to enterprises.
- Cloud has made storage advances largely proprietary and hidden. When you are operating at 10,000x the scale of major enterprises, employ armies of PhDs, and control your entire stack, there is no reason to share tech discoveries.
- Likewise, suppliers make investments, such as in 100GB Blu-ray discs, that seem unjustified by consumer demand. I call this the shadow IT market.
Cloud has made data infrastructure a utility. Just as we don't know where the power in our homes comes from, neither do we know where much of our computing takes place, especially for mobile devices. Cloud won't replace on-premises systems entirely, but it is highly competitive.
As vast as these changes have been, the 20's will dwarf them. More on that later. Happy New Year!
Comments welcome!
Read more here:
The 3 biggest storage advances of the 2010s - ZDNet
Big Data Professionals Give 11 Predictions for Cloud’s Evolution in 2020 – Database Trends and Applications
The cloud was on everyone's mind this past year, with so many questions arising, from how to secure cloud environments to which type of cloud is best for the organization.
Cloud computing has revealed countless new dimensions to IT. There are public clouds, private clouds, distributed clouds, and hybrid, multi-cloud architectures.
A true hybrid cloud will allow workloads large and small, critical and casual, to be seamlessly transitioned between on-premises private cloud infrastructure and any public cloud an organization employs, based on whatever criteria a customer architects. The current wave of new technologies has this space exploding with possibilities.
Here, executives of leading companies offer 11 predictions for what's ahead in 2020 for cloud.
The Cloud Disillusionment blossoms because the meter is always running: Companies that rushed to the cloud finish their first phase of projects and realize that they have the same applications they had running before, ones that do not take advantage of new data sources to become supercharged with AI. In fact, their operating expenses have actually increased, because the savings in human operators were completely overwhelmed by the cost of cloud compute resources for applications that are always on. Ouch. These resources were capitalized on-premises before, but now they hit the P&L. - Monte Zweben, CEO, Splice Machine
Multi-cloud strategies increase the demand for application management tool adoption: Multi-cloud strategies are here to stay. Companies are increasingly adopting more than one platform, either for financial leverage or to create a time-to-market or feature race between the platforms. To remain competitive, public cloud providers must offer unique features or capabilities differentiating them from competitors. This has created an upsurge in new and more complex technologies, increasing the need for application performance management tool adoption. 2020 will bring an ever-increasing demand for APM tools and services. - David Wagner, senior manager, product marketing application management, SolarWinds
The Rise of the Hybrid Cloud Infrastructure -- Putting the Right Data in the Right Place: Today, when people refer to the cloud, they usually mean the public cloud. In 2020, the term cloud might become more nuanced as private clouds rise in popularity and organizations increasingly pursue a hybrid cloud storage strategy. Organizations with large-scale storage needs, such as those in healthcare, scientific research, and media and entertainment, face unique challenges in managing capacity-intensive workloads that can reach tens of petabytes. Private clouds address these challenges by providing the scale and flexibility benefits of public clouds along with the performance, access, security and control advantages of on-premises storage. In 2020, we'll see more organizations taking advantage of private clouds in a hybrid cloud infrastructure, storing frequently used data on-prem while continuing to utilize the public cloud for disaster recovery. - Jon Toor, CMO, Cloudian
Best-of-breed cloud is coming under the name of Hybrid: Public cloud vendors have extortionately high prices. The public cloud makes sense for small and medium-sized businesses; those businesses don't have the scope to amortize their engineering spend. Public clouds don't make sense for technology companies. Companies like Bank of America have gone on record as saving $2 billion per year by not using the public cloud. A best-of-breed architecture envisions building blocks within the technical stack, then selects not from a single cloud vendor but from a variety of service providers. Assumptions that a given cloud provider has the lowest or best prices, or that the cost of networking between clouds is prohibitive, become less and less true. - Brian Bulkowski, CTO at Yellowbrick Data
Organizations will grapple with scaling multi-cloud, hybrid, edge/fog and more: In 2020, in-memory computing will disrupt both NoSQL and traditional database technologies, and streaming analytics will emerge as the preferred approach for data integration. Low-latency in-memory platforms for streaming will define a new paradigm for performance in this space, further disrupting traditional approaches. Multi-cloud will also emerge as the preferred strategy to build and integrate applications. In response, enterprises will increasingly need to support and scale multi-cloud, hybrid cloud and edge/fog, and turn to new approaches to achieve real-time machine learning at enterprise scale. - John DesJardins, VP of solution architecture & CTO, Hazelcast
More enterprises will have production cloud data lakes: With the maturation of the technology stack overall and more ML frameworks becoming mainstream, the cloud data lake trend, which began a few years ago, will continue to accelerate. We'll see more enterprises with production data lakes in the cloud running meaningful workloads for the business. This trend will put more pressure on the data privacy and governance teams to make sure data is being used the right way. - Okera CTO and co-founder, Amandeep Khurana
The biggest advantage presented by modern cloud technology is the ability for small to mid-size companies to level the playing field: Thanks to the cloud, organizations no longer require the assets previously needed to implement enterprise solutions and technology: large budgets, massive server farms, and a workforce dedicated to maintenance. Typically, when organizations want to implement new tech, they analyze the associated infrastructure cost to determine what is fiscally possible. Instead, organizations that want to harness the benefits provided by the cloud should start by defining strategic objectives and recognize that the cloud is going to provide access to solutions and new technology at a fraction of the on-premises cost. Don't let infrastructure costs be the impeding factor to implementing new tech. What the cloud now does is lower the bar of access to, and drive adoption of, new technology. This is why cloud growth has been exponential, not linear. So, in 2020 and beyond we can expect the cloud to be a huge asset that will allow small to mid-size businesses to access the same solutions, information, and data that were previously available only to large enterprises. - Himanshu Palsule, chief product & technology officer, Epicor
Cloud data warehouses turn out to be a Big Data detour: Given the tremendous cost and complexity associated with traditional on-premises data warehouses, it wasn't surprising that a new generation of cloud-native enterprise data warehouses emerged. But savvy enterprises have figured out that cloud data warehouses are just a better implementation of a legacy architecture, so they're avoiding the detour and moving directly to a next-generation architecture built around cloud data lakes. In this new architecture data doesn't get moved or copied; there is no data warehouse and no associated ETL, cubes, or other workarounds. We predict 75% of the Global 2000 will be in production or in pilot with a cloud data lake in 2020, using multiple best-of-breed engines for different use cases across data science, data pipelines, BI, and interactive/ad-hoc analysis. - Dremio's CEO Tomer Shiran
IT will begin to take a more methodical approach to achieving cloud native status: Running cloud native applications is an end goal for many organizations, but the process of getting there can be overwhelming, especially because many companies believe they have to refactor everything at once. More IT departments will realize they don't need to take an all-or-nothing approach, and that a process founded on baby steps is the best way to achieve cloud native goals. In other words, we'll start to see more IT teams forklift applications into the cloud and then implement a steady, methodical approach to refactoring them. - Chris Patterson, senior director of product management, Navisite
Major Cloud Providers Will Find a Bullseye on Their Backs: As more and more organizations move their critical systems and data to the cloud for efficiency, scalability, and cost reduction, cloud provider infrastructure will increasingly become a high-payoff target; a target that, if compromised, could have devastating effects on the economy and national security. In 2020, we believe state adversaries will redouble their efforts to attack cloud systems. Whether the defenses in place will withstand the attacks remains to be seen. - Greg Conti, senior security strategist, IronNet Cybersecurity
A Meteoric Rise: Cloud Security Adoption to Accelerate in 2020: The coming year will usher in even greater adoption of cloud security, with a material change in attitude and organizations fully embracing the cloud. As organizations increasingly access enterprise applications like Box, Salesforce, etc., it's no longer practical for them to VPN back to the stack to remain secure while accessing these services in the cloud. With this move to the cloud come countless security risks. Not only will we see more companies jump on the bandwagon and shift their applications and operations to the cloud, but we will also see the security stack move to the cloud and more resources dedicated to securing the cloud, such as cloud councils. - Kowsik Guruswamy, CTO, Menlo Security
Google Drive vs OneDrive: Which is better? – ValueWalk
There is no shortage of cloud storage services. You can choose from Google Drive, OneDrive, Apple iCloud, Amazon Drive, Dropbox, and many others. But the two most popular services are Google Drive and Microsoft's OneDrive. Both Google and Microsoft have deeply integrated their cloud offerings with other services to give you a better user experience. If you can't decide which one to opt for, this Google Drive vs OneDrive comparison should help you decide.
Google Drive and OneDrive have become platform-agnostic. You can use them on Android, iOS, Windows, Mac, and other platforms without any issues. They let you access your files across devices. Both services also have a bunch of collaboration tools to let you share files and collaborate with others.
Both Google and Microsoft have a free plan with a limited amount of cloud storage. The free plans are good enough for most users who use the cloud only to store or back up photos and documents. Google Drive offers 15GB of free storage, significantly more than OneDrive's 5GB. But the Google Drive allotment is shared across all of Google's services, including Gmail. That means you could run out of storage faster than you expect.
If you need more storage, Google charges $2 per month or $20 per year for 100GB of cloud storage. If you want 200GB, it's going to cost $3 per month or $30 per year. Google's 2TB plan costs $10 per month or $100 per year. The 10TB plan costs $100 per month, and the 20TB plan will set you back $200 per month. It's worth pointing out that you can extend Google's storage to other people in your Google Family.
OneDrive is relatively more expensive, mainly because Microsoft uses a different pricing strategy. The Redmond-based software giant has bundled OneDrive into the Microsoft Office subscription. Office 365 Home costs $100 per year and gives you access to Word, PowerPoint, Outlook, Excel, Access, and Publisher for PC, along with 1TB of OneDrive storage.
You can also share the Office 365 Home plan with five of your family members, each of whom will get their own 1TB of cloud storage. For those who want a personal plan, Microsoft has Office 365 Personal for $70 per year. It gives you 1TB of cloud storage along with the Office tools.
If you only want OneDrive storage without any Office tools, you can get 100GB of cloud storage for $2 per month. 1TB of OneDrive storage costs $7 per month or $70 per year, and 6TB of cloud storage will set you back $10 per month or $100 per year.
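Taking the listed prices at face value, a quick back-of-the-envelope script shows how the two services compare on cost per gigabyte. The plan names and monthly prices below are the ones quoted above; treat them as a snapshot, since pricing changes over time:

```python
# Rough cost-per-GB comparison using the monthly prices quoted above.
# Prices are a snapshot from the article and may no longer be current.
plans = [
    ("Google Drive 100GB", 100, 2.00),
    ("Google Drive 200GB", 200, 3.00),
    ("Google Drive 2TB", 2000, 10.00),
    ("OneDrive 100GB", 100, 2.00),
    ("OneDrive 1TB", 1000, 7.00),
    ("OneDrive 6TB (family)", 6000, 10.00),
]

for name, gb, usd_per_month in plans:
    cents_per_gb = usd_per_month / gb * 100
    print(f"{name:24s} {cents_per_gb:.2f} cents/GB/month")
```

Run this way, the per-gigabyte math makes the bundling effect obvious: the large plans on both services are far cheaper per GB than the entry tiers.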
Both cloud services allow you to access their file management features via a web browser. The user interface is intuitive and easy to navigate. The file management system is similar to that of the desktop file managers. They both have a variety of viewing options such as thumbnails and list, and give you quick access to your recent files.
The search function is much better in Google Drive. It shows search results live as you type each letter. It also has an advanced search option that you can toggle on, which lets you filter results by date, keyword, file type, and more. The search function in OneDrive is still in its infancy; you won't see any results until you hit the Enter key.
The file sharing system is similar on the two services. You can share a direct link or enter a person's email address to give them access to a file. Both services allow you to set permissions for anyone accessing the files you share. In Google Drive, free users can grant others View, Comment, or Edit access; the advanced permission settings are available only to paid users.
Microsofts OneDrive comes with block-level copying technology, which breaks files into smaller packages for uploading and saving to the cloud. If you make a change in your file, only the packages that have been modified are re-uploaded to the cloud. It speeds up the uploading process.
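The idea behind block-level copying can be sketched in a few lines: split the file into fixed-size chunks, hash each one, and re-upload only the chunks whose hashes changed. This is a minimal illustration of the general technique, not OneDrive's actual protocol; the chunk size and hash function are illustrative choices.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB; an illustrative size, not OneDrive's real one

def chunk_hashes(data: bytes) -> list:
    """Split data into fixed-size chunks and hash each chunk."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

def changed_chunks(old: bytes, new: bytes) -> list:
    """Return indices of chunks that must be re-uploaded."""
    old_h, new_h = chunk_hashes(old), chunk_hashes(new)
    return [
        i for i, h in enumerate(new_h)
        if i >= len(old_h) or old_h[i] != h
    ]

# Editing one byte in a 12 MB file dirties only the chunk containing it.
old = bytes(12 * 1024 * 1024)
new = bytearray(old)
new[5 * 1024 * 1024] = 0xFF             # the change falls in chunk index 1
print(changed_chunks(old, bytes(new)))  # -> [1]
```

With three 4 MB chunks, a one-byte edit means re-uploading 4 MB instead of 12 MB, which is exactly the speedup the paragraph describes.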
Both services send your files to the cloud via HTTPS encryption. Microsoft and Google encrypt files using their own keys. It makes it incredibly hard for hackers to decrypt files you have stored with Google or Microsoft even if they break into the servers. But it also means that if someone gains access to your email and password, they can access all your files.
Since Microsoft and Google hold the encryption keys to your files, they can decrypt them or give law enforcement agencies access to your files. Of course, you can use an end-to-end encryption (E2EE) tool or another service to encrypt all your Google Drive or OneDrive files yourself.
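As a sketch of what "encrypting your files yourself" means in practice, here is the idea using the third-party `cryptography` package's Fernet recipe: the file is encrypted locally with a key only you hold, so what lands in Google Drive or OneDrive is ciphertext. This is an illustrative approach, not a feature of either service; dedicated tools built on the same principle exist.

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Generate and keep this key yourself -- the cloud provider never sees it.
key = Fernet.generate_key()
f = Fernet(key)

plaintext = b"contents of a sensitive document"
ciphertext = f.encrypt(plaintext)  # this is what gets uploaded to the cloud

# Even with full access to your account, the provider (or an attacker)
# sees only ciphertext; decryption requires the locally held key.
assert f.decrypt(ciphertext) == plaintext
```

The trade-off is that you now manage the key: lose it, and the provider cannot recover your files for you.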
Google scans the files you upload to Google Drive to mine data that it could use for targeted advertising. It doesn't use that data for malicious purposes. Microsoft doesn't do that. Recently, the Redmond-based software giant introduced a feature called Personal Vault that adds an extra layer of security to your sensitive files.
Free OneDrive users can add only three files to their Personal Vault. The Office 365 subscribers can add as many files as they want, up to the storage limit of their plan. The Personal Vault is locked automatically upon 20 minutes of inactivity. You can access your Personal Vault within OneDrive using a PIN, fingerprint scan, facial scan, an authenticator app, or by entering the authentication code you get via SMS or email.
If you want maximum cloud storage for the price, Google Drive is the way to go. Its plans are cheaper, and Google Drive integrates well with other Google services. Microsoft's OneDrive is for people who use its Office tools such as Word, Excel, PowerPoint, and Access. The Personal Vault also gives OneDrive an edge over Google Drive in terms of security. You should try out the free plans of both to decide which one better fits your needs.
Follow this link:
Google Drive vs OneDrive: Which is better? - ValueWalk
VC Investments In Enterprise Tech And AI – Forbes
According to Toptal, the venture capital sector has grown by 12.1% annually since the financial crisis. The same source tells us that the amount of capital raised per year has grown by 100% over the decade.
Hundreds of venture capitalists are backing startups and entrepreneurs with billions of dollars each year. Many businesses rely on these VC investments, and entire economies depend on them.
When choosing which projects to back, most investors look for innovation, expertise, and profitable opportunities. I'm going to take a look at some of the top-tier venture capitalists and their investments in the field of enterprise tech and AI.
Companies with a focus on AI raised over $9.3 billion in the US during 2018. The number of venture capital investments keeps growing on a global scale, opening up new opportunities for startups and entrepreneurs who are looking for their golden ticket to the enterprise tech and AI space.
As stated on Kurtosys, venture capital deals ranged between $10 million and $25 million in the US ten years ago. Today, there is a trend of $50 million plus deals getting a greater share of total investment.
Top tier macro venture capitalists in the startup ecosystem include Benchmark, Index Ventures, Felicis Ventures, and Union Square Ventures.
Even micro and local venture capitalists such as Northstar Ventures and Base Ventures are hitting these large numbers. On the local micro VC side is Aybuben Ventures, the first pan-Armenian venture capital fund focused on Armenian tech entrepreneurs.
With a fund of over $50 million, Aybuben Ventures is not limited to people in Armenia only. On the contrary, the fund is open to Armenians all over the world who are engaged in enterprise tech business and development. Armenians live all over the world, and they are proud of their culture and don't want to lose their identity. Potentially this creates a huge global pool of entrepreneurs, professionals, capital, companies and knowledge which can be leveraged and scaled in any of the world's economies. That said, we welcome interest in our foundation from any organization and without regard to nationality, said Alexander Smbatyan, one of the founding partners of Aybuben Ventures.
Overall, the venture capital space keeps growing, providing technology startups with sufficient funding for growth and expansion. There is an innate disposition to develop companies that make extensive use of technologies such as artificial intelligence, machine learning, biotechnology and more, Smbatyan added, as one of the reasons why it is worth investing in the enterprise tech and AI space.
Big Data Predictions: What 2020 Will Bring – Datanami
(ju_see/Shutterstock)
With just over a week left on the 2019 calendar, it's now time for predictions. We'll run several stories featuring the 2020 predictions of industry experts and observers in the field. It all starts today with what is arguably the most critical aspect of the big data question: the data itself.
There's no denying that Hadoop had a rough year in 2019. But is it completely dead? Haoyuan "HY" Li, the founder and CTO of Alluxio, says that Hadoop storage, in the form of the Hadoop Distributed File System (HDFS), is dead, but Hadoop compute, in the form of Apache Spark, lives strong.
There is a lot of talk about Hadoop being dead, Li says. But the Hadoop ecosystem has rising stars. Compute frameworks like Spark and Presto extract more value from data and have been adopted into the broader compute ecosystem. Hadoop storage (HDFS) is dead because of its complexity and cost, and because compute fundamentally cannot scale elastically if it stays tied to HDFS. For real-time insights, users need immediate and elastic compute capacity that's available in the cloud. Data in HDFS will move to the most optimal and cost-efficient system, be it cloud storage or on-prem object storage. HDFS will die, but Hadoop compute will live on and live strong.
As HDFS data lake deployments slow, Cloudian is ready to swoop in and capture the data into its object store, says Jon Toor, CMO of Cloudian.
In 2020, we will see a growing number of organizations capitalizing on object storage to create structured/tagged data from unstructured data, allowing metadata to be used to make sense of the tsunami of data generated by AI and ML workloads, Toor writes.
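Toor's point about using metadata to make sense of unstructured data can be reduced to a toy sketch: each object in the store carries a key, an opaque blob, and a metadata dict, and queries run against the metadata alone. The structure and data below are hypothetical, but real object stores attach user metadata to objects in much the same way.

```python
# Toy object store: opaque blobs become searchable through attached metadata.
store = {}

def put(key, blob, **metadata):
    """Store an opaque blob alongside arbitrary key/value metadata."""
    store[key] = {"blob": blob, "meta": metadata}

def find(**criteria):
    """Return keys whose metadata matches all the given criteria."""
    return [
        k for k, obj in store.items()
        if all(obj["meta"].get(field) == value for field, value in criteria.items())
    ]

put("scan-001.dcm", b"...", modality="MRI", patient="anon-17", year=2019)
put("scan-002.dcm", b"...", modality="CT", patient="anon-17", year=2019)
put("render-final.mov", b"...", project="trailer", codec="ProRes")

print(find(patient="anon-17", modality="MRI"))  # -> ['scan-001.dcm']
```

The blobs themselves stay opaque; all the "sense-making" happens on the metadata, which is why tagging at ingest time matters so much at petabyte scale.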
The end of one thing, like Hadoop, will give rise to the beginning of another, according to ThoughtSpot CEO Sudheesh Nair.
(Swill Klitch/Shutterstock)
Over the last 10 years or so, we've seen the rise, plateau, and the beginning of the end for Hadoop, Nair says. This isn't because Big Data is dead. It's exactly the opposite. Every organization in the world is becoming a Big Data company. It's a requirement to operate in today's business landscape. Data has become so voluminous, and the need for agility with this data so great, however, that organizations are either building their own data lakes or warehouses, or going directly to the cloud. As that trend accelerates in 2020, we'll see Hadoop continue to decline.
When data gets big enough, it exerts a gravitational-like force, which makes it difficult to move, while also serving to attract even more data. Understanding data gravity will help organizations overcome barriers to digital transformation, says Chris Sharp, CTO of Digital Realty.
Data is being generated at a rate that many enterprises can't keep up with, Sharp says. Adding to this complexity, enterprises are dealing with data, both useful and not, from multiple locations that is hard to move and utilize effectively. This presents enterprises with a data gravity problem that will prevent digital transformation initiatives from moving forward. In 2020, we'll see enterprises tackle data gravity by bringing their applications closer to data sources rather than transporting resources to a central location. By localizing data traffic, analytics and management, enterprises will more effectively control their data and scale digital business.
All things being equal, its better to have more data than less of it. But companies can move the needle just by using available technology to make better use of the data they already have, argues Beaumont Vance, the director of AI, data science, and emerging technology at TD Ameritrade.
As companies are creating new data pools and are discovering better techniques to understand findings, we will see the true value of AI delivered like never before, Vance says. At this point, companies are using less than 20% of all internal data, but through new AI capabilities, the remaining 80% of untapped data will be usable and easier to understand. Previous questions which were unanswerable will have obvious findings to help drive massive change across industries and societies.
Big data is tough to manage. What if you could do AI with small data? You can, according to Arka Dhar, the CEO of Zinier.
Going forward, we'll no longer require massive big data sets to train AI algorithms, Dhar says. In the past, data scientists have always needed large amounts of data to perform accurate inferences with AI models. Advances in AI are allowing us to achieve similar results with far less data.
(Drendan/Shutterstock)
How you store your data dictates what you can do with it. You can do more with data stored in memory than on disk, and in 2020, we'll see organizations storing more data on memory-based systems, says Abe Kleinfeld, the CEO of GridGain.
In 2020, the adoption of in-memory technologies will continue to soar as digital transformation drives companies toward real-time data analysis and decision-making at massive scale, Kleinfeld says. Let's say you're collecting real-time data from sensors on a fleet of airplanes to monitor performance, and you want to develop a predictive maintenance capability for individual engines. Now you must compare anomalous readings in the real-time data stream with the historical data for a particular engine stored in the data lake. Currently, the only cost-effective way to do this is with an in-memory data integration hub, based on an in-memory computing platform like Apache Ignite that integrates Apache Spark, Apache Kafka, and data lake stores like Hadoop. 2020 promises to be a pivotal year in the adoption of in-memory computing as data integration hubs continue to expand in enterprises.
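The integration-hub pattern described above can be reduced to a toy sketch: historical per-engine baselines are held hot in memory, and each streaming reading is checked against its engine's baseline without touching the data lake. This is a simplified stand-in for an Ignite/Kafka/Spark stack; the engine name, readings, and threshold are invented for illustration.

```python
from statistics import mean, stdev

# In-memory "integration hub": historical readings per engine, kept hot in RAM
# (a stand-in for an Ignite-style cache loaded from the data lake).
history = {
    "engine-42": [612.0, 615.5, 609.8, 613.2, 611.1, 614.0],  # exhaust temps, C
}

def is_anomalous(engine, reading, z_threshold=3.0):
    """Flag a streaming reading that deviates > z_threshold sigmas from history."""
    baseline = history[engine]
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(reading - mu) > z_threshold * sigma

# Simulated real-time stream: one normal reading, one spike.
for temp in (612.7, 689.4):
    print(temp, "anomalous" if is_anomalous("engine-42", temp) else "ok")
```

The point of the pattern is latency: because the baseline lives in memory beside the check, each streaming reading is scored without a round trip to HDFS or object storage.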
Big data can make your wildest business dreams come true. Or it can turn into a total nightmare. The choice is yours, say Eric Raab and Kabir Choudry, vice presidents at Information Builders.
Those that have invested in the solutions to manage, analyze, and properly action their data will have a clearer view of their business and the path to success than has ever been available to them, Raab and Choudry write. Those that have not will be left with a mountain of information that they cannot truly understand or responsibly act upon, leaving them to make ill-informed decisions or deal with data paralysis.
Let's face it: managing big data is hard. That doesn't change in 2020, which will bring a renewed focus on data orchestration, data discovery, data preparation, and model management, says Todd Wright, head of data management and data privacy solutions at SAS.
(a-image/Shutterstock)
According to the World Economic Forum, the amount of data we produce is predicted to reach a staggering 44 zettabytes by 2020, Wright says. The promise of big data never came simply from having more data from more sources, but from being able to develop analytical models to gain better insights on that data. All the work being done to advance analytics, AI and ML is for naught if organizations do not have a data management program in place that can access, integrate, cleanse and govern all this data.
Organizations are filling up NVMe drives as fast as they can to help accelerate the storage and analysis of data, particularly involving IoT. But doing this alone is not enough to ensure success, says Nader Salessi, the CEO and founder of NGD Systems.
NVMe has provided a measure of relief and proven to remove existing storage protocol bottlenecks for platforms churning out terabytes and petabytes of data on a regular basis, Salessi writes. Even though NVMe is substantially faster, it is not fast enough by itself when petabytes of data are required to be analyzed and processed in real time. This is where computational storage comes in and solves the problem of data management and movement.
Data integration has never been easy. With the ongoing data explosion and expansion of AI and ML use cases, it gets even harder. One architectural concept showing promise is the data fabric, according to the folks at Denodo.
Through real-time access to fresh data from structured, semi-structured and unstructured data sets, data fabric will enable organizations to focus more on ML and AI in the coming year, the Denodo team says. With the advancement in smart technologies and IoT devices, a dynamic data fabric provides quick, secure and reliable access to vast data through a logical data warehouse architecture, thus facilitating AI-driven technologies and revolutionizing businesses.
Seeing how disparate data sets are connected using semantic AI and enterprise knowledge graphs (EKG) provides another approach for tackling the data silo problem, says Saurav Chakravorty, the principal data scientist at Brillio.
An organization's valuable information and knowledge is often spread across multiple documents and data silos, creating big headaches for a business, Chakravorty says. EKG will allow organizations to do away with semantic incoherence in a fragmented knowledge landscape. Semantic AI and EKG complement each other and can bring great value overall to enterprise investments in data lakes and big data.
2020 holds the potential to be a breakout year for storage-class memory, argues Charles Fan, the CEO and co-founder of MemVerge.
With increasing demand from data center applications, paired with the increased speed of processing, there will be a huge push toward a memory-centric data center, Fan says. Computing innovations are happening at a rapid pace, with more and more compute tech, from x86 to GPUs to ARM. This will continue to open up new topologies between CPU and memory units. While architecture currently tends to be more disaggregated between the computing layer and the storage layer, I believe we are headed toward a memory-centric data center very soon.
We are rapidly moving toward a converged storage and processing architecture for edge deployments, says Bob Moul, CEO of machine data intelligence platform Circonus.
Gartner predicts there will be approximately 20 billion IoT-connected devices by 2020, Moul says. As IoT networks swell and become more advanced, the resources and tools that manage them must do the same. Companies will need to adopt scalable storage solutions to accommodate the explosion of data that promises to outpace current technology's ability to contain, process and provide valuable insights.
Dark data will finally see the light of day in 2020, according to Rob Perry, the vice president of product marketing at ASG Technologies.
(PictureDragon/Shutterstock)
Every organization has islands of data, collected but no longer (or perhaps never) used for business purposes, Perry says. While the cost of storing data has decreased dramatically, the risk premium of storing it has increased dramatically. This dark data could contain personal information that must be disclosed and protected. It could include information subject to Data Subject Access Requests and possibly required deletion, but if you don't know it's there, you can't meet the requirements of the law. Yet this data could also hold the insight that opens up new opportunities that drive business growth. Keeping it in the dark increases risk and possibly masks opportunity. Organizations will put a new focus on shining a light on their dark data.
Open source databases will have a good year in 2020, predicts Karthik Ranganathan, founder and CTO at Yugabyte.
Open source databases that claimed zero percent of the market ten years ago now make up more than 7%, Ranganathan says. It's clear that the market is shifting, and in 2020 there will be an increase in commitment to true open source. This goes against the recent trend of database and data infrastructure companies abandoning open source licenses for some or all of their core projects. However, as technology rapidly advances, it will be in the best interest of database providers to switch to a 100% open source model, since freemium models take a significantly longer period of time for the software to mature to the same level as a true open source offering.
However, 2019 saw a pull back away from pure open source business models from companies like Confluent, Redis, and MongoDB. Instead of open source software, the market will be responsive to open services, says Dhruba Borthakur, the co-founder and CTO of Rockset.
Since the public cloud has completely changed the way software is delivered and monetized, I predict that the time for open sourcing new, disruptive data technologies will be over as of 2020, Borthakur says. Existing open-source software will continue to run its course, but there is no incentive for builders or users to choose open source over open services for new data offerings. Ironically, it was ease of adoption that drove the open-source wave, and it is ease of adoption of open services that will precipitate the demise of open source, particularly in areas like data management. Just as the last decade was the era of open-source infrastructure, the next decade belongs to open services in the cloud.
Related Items:
2019: A Big Data Year in Review Part One
2019: A Big Data Year in Review Part Two
Excerpt from:
Big Data Predictions: What 2020 Will Bring - Datanami