Category Archives: Cloud Servers
Cloud storage keeps data out of reach of criminals – The Globe and Mail
As chief information security officer for Amazon Web Services, Stephen Schmidt is surprised by how many businesses still fail to see the dangers of storing information on computers and servers in their offices rather than in the cloud.
Not only is that data vulnerable to physical calamities such as fire or hardware failure, the onus is also on the businesses themselves to protect their hardware and networks against hackers and other online attacks.
Maintaining so-called on-premises storage is particularly risky for small and medium-sized companies, the bread and butter of Canada's economy, since they typically don't have large IT staffs or the resources to spend on countering these growing threats.
Outsourcing data security to a cloud service such as the Amazon.com Inc. subsidiary AWS, on the other hand, is safer and cheaper because it allows businesses to take advantage of significantly greater resources, Mr. Schmidt says.
"If you have a customer who thinks they're safer on-premises than they are on the cloud because they're behind a firewall that somebody installed, they should seriously re-evaluate their risk."
It's an expected position from Amazon, which competes against a number of big technology companies, including Google and Microsoft, in selling cloud services to other businesses, but recent converts tend to agree.
Jour de la Terre, for example, began moving its data online last year. The 15-person non-profit, which promotes Earth Day activities in Quebec and France, had previously stored its websites, e-mail, documents and streaming videos on separate servers in its Montreal offices.
Director Pierre Lussier says he was initially nervous about putting all of his figurative eggs into one cloud basket, but that was before he realized the precariousness of his existing situation.
Much of his organization's information was managed by a single person who ended up leaving the organization. Without his collected knowledge of where all the data was and how it could be accessed, there was disarray.
"We found out how vulnerable we were," Mr. Lussier says. "It was a total mess."
Jour de la Terre is now on track to finish migrating all of its information by June. Staff have to learn how to interact with the new system, but it's proving to be more convenient and secure for everyone involved.
"You have one gate and the knowledge that [employees] go through that gate," he adds. "I've gained so much."
Axia NetMedia, a fibre-optic Internet service provider based in Calgary, began its conversion to the cloud three years ago out of necessity when it acquired a new corporate customer.
The client required more data services than Axia could itself quickly deliver, so the ISP signed on to AWS to scale up. The company, which employs 150 people and counts the Alberta government, Sunterra Farms and the Post Hotel in Lake Louise, Alta., as customers, has been moving more and more of its business to the cloud since then.
Tie Hoekstra, manager of corporate IT controls and security, believes Axia's services are more secure now because he no longer has to worry about protecting customers' data himself. That responsibility has been shifted to Amazon and its deep pockets.
"You can't duplicate the tools that they're able to give you to maintain your levels of security without spending an enormous amount of money," he says.
Google echoes that sentiment. Like Amazon, Microsoft and other cloud service rivals, the search company runs most of its operations on its own custom-designed hardware, from the servers that data is stored on to the networks that connect them.
To that end, Google in March unveiled Titan, a specially designed microchip that adds cryptographic capabilities to servers.
Each of the cloud companies effectively resells to other businesses the same security it relies on for its own services.
"Our scale allows us to build in security from the ground up," says Niels Provos, a distinguished engineer at Google. "Many of the worries you have about securing your on-premises machines do not exist in the cloud."
That's not to say the cloud is a magic font of security for all businesses. While it does solve a number of problems, it also potentially introduces new ones.
Data sovereignty, where information is stored locally in a specific country rather than on the U.S.-based servers of the big tech companies, has been a growing issue since the U.S. National Security Agency spying revelations a few years ago.
Government clients, especially, are requiring cloud providers to base their data centres within their own borders, to keep their information from crossing over into other jurisdictions.
Those requirements were the main drivers for both Amazon and Google opening Canadian data centres in the Montreal region over the past few months.
Location, however, isn't the only determining factor behind data sovereignty; local laws can also come into play, and sometimes conflict.
Microsoft, for example, narrowly won a legal victory last January against the U.S. Department of Justice, which was trying to force access to customer data stored in Ireland.
Localized storage therefore isnt a silver bullet against unwanted search and seizure, which means that businesses with sensitive information will still need to seek out legal advice.
"Just because a company says you have data sovereignty, it doesn't necessarily mean that all the nations involved will agree," says Christopher Parsons, a research associate at the University of Toronto's Citizen Lab.
Mr. Parsons adds that storing data with Amazon, Google or Microsoft could also be awkward for any business that is looking to compete against those companies, a growing possibility given their respective sizes and scopes.
The companies will insist that their customers' data is sacred and that they will never access it, but the incentive will always be there.
Still, as long as businesses are aware of some of these new risks, the cloud can indeed provide capabilities they can't otherwise access.
"Google's [and other cloud providers'] security team will almost inevitably be better than what you can provide," Mr. Parsons says.
Follow Peter Nowak on Twitter: @peternowak
8 Steps to Evaluating Cloud Service Security the Right Way – CPAPracticeAdvisor.com
With the current breakneck pace of software and technology, we can often overlook the fact that "the cloud" is really just outsourcing. The term "cloud" is simply a catch-all for subscription-based services running on someone else's network. Evaluating the security of such services requires digging in and asking the provider some possibly uncomfortable questions. If you aren't currently doing this for each cloud opportunity, and thinking through how its failure would impact your firm and your clients, you are simply putting the firm at risk.
As an example, I recently had a Partner forward me some information about a potential cloud service that we could use to help our staff by easing their manual data entry tasks. The idea behind the service was straightforward. Their cloud service would aggregate a client's transactions and allow the transactions to be bulk downloaded into our chosen software. To accomplish this, we would need to have each client enter their financial institution credentials into this cloud provider's system.
Our use of a cloud application like this would necessarily mean asking the client to participate. And, even if not actually stated, the fact that we would use it and ask the client to use it conveys to the client that we "endorse" this software in some way. That means I had to ask the right questions before committing. If we ask our clients to participate in a cloud application, and then down the road that application is breached or found to be low quality, the client will be asking us the hard questions.
These are the questions I always ask any potential cloud vendor:
If you can't get satisfactory answers to these questions, deciding to do business with such a provider boils down to a decision about how much risk your firm is willing to take on to gain the potential benefits the service will provide. And, if this is an app for doing client work, you will also be passing that risk on to your clients. That has to be fully understood at the Partner level.
So, what do I consider "satisfactory" answers to the questions above?
Not answering one of the above questions doesn't necessarily shut the door on using the service, as long as the refusal to answer makes sense. For instance, a provider might tell you they definitely hash passwords stored in their database, but for security reasons they don't want to divulge which hashing algorithm they use. I'd be OK with that, as long as the rest of their answers seem competent and pass the "smell test."
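For readers unfamiliar with what "hashing passwords" actually entails, here is a minimal sketch of the store-and-verify pattern using PBKDF2, one widely used algorithm. The function names and iteration count are illustrative, not any particular vendor's implementation:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; PBKDF2-HMAC-SHA256 is one widely used choice."""
    if salt is None:
        salt = os.urandom(16)  # a unique random salt per password
    # 100,000 iterations is a demo value; production settings are tuned higher.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Recompute the hash and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

A provider that stores only the (salt, digest) pair never holds the cleartext password, which is the property this question is probing for.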
Unfortunately, you will run into many startups that refuse to give straightforward answers to these questions. It's not enough that an app works well or solves a problem. If the people running the service don't have enough experience running and protecting such a service reliably at large scale, it's up to us to identify that ahead of time before we commit the data of our firm or our clients into their hands.
-------
Dave Jones is the IT Manager for Pearce, Bevill, Leesburg, Moore, P.C. in Birmingham, AL. He has been a network and system administrator in the Birmingham area for 20 years and has been in the CPA technology field for 18 years. Email: dave@pearcebevill.com; LinkedIn: https://www.linkedin.com/in/daveajones.
European tech unicorn OVH opens APAC HQ in Melbourne – The Australian Financial Review
OVH vice-chairman Laurent Allard says the business is growing rapidly in the Asia-Pacific region.
In a boost to the Victorian tech sector, French cloud infrastructure unicorn OVH is setting up an Asia-Pacific region headquarters in Melbourne to take on the likes of Amazon Web Services and Microsoft Azure, and it intends to employ up to 80 locals within three years.
The infrastructure-as-a-service company, which has more than 20 data centres in Europe, Canada and the US, has also built centres in Sydney and Singapore, as part of the company's expansion to the region.
OVH vice-chairman Laurent Allard told The Australian Financial Review while in Australia for the launch of the Asia-Pacific hub that the company's expansion into the region and the US was part of a bigger vision to become the global leader in cloud infrastructure-as-a-service.
"I'm not from this type of company background where the focus is on small and medium-sized businesses. I'm used to dealing with large enterprises and big transformation projects," he said.
"It was clear to me that the OVH model worked well but it was important to become the global leader, whereas at the time [in 2015] it was the European leader. We started out with no account manager for large accounts and had a very tech-driven company with gaps in terms of how to drive the company's strategy, but I had that expertise to bring and I'm very pleased with our complementary skill sets."
Until February 2015, Mr Allard had been the group chief technology officer of global IT and business process services provider CGI and until 2008 he was chief information officer of AXA Tech.
OVH, which already has 5000 customers in the Asia-Pacific region that are predominantly Australian businesses, was founded in 1999 by Octave Klaba with only $4000 and has remained a family-owned business in Roubaix, France.
Mr Klaba and his family still own 80 per cent of the business, which was valued at more than €1 billion in 2016.
The milestone valuation came after a €250 million capital raising led by New York-based private equity firms KKR & Co and TowerBrook Capital Partners.
OVH provides businesses with either public or hosted private cloud infrastructure, as well as bare metal cloud servers. But it considers its point of difference to be its ability to provide businesses with a hosted private cloud that works like a public one, and can be scaled up within 10 minutes to provide additional capacity during expected or unexpected busy periods.
The company said it selected Melbourne to be its Asia-Pacific region hub because of its talent pool and liveability, and OVH APAC expansion adviser Emmanuel Goutallier said the state government had actively engaged with the company ahead of the move.
"The state government has been extremely supportive. We started to engage with them mid last year and they have helped us understand the benefits of being in Melbourne, and they've exposed us to something important: the education community here in Melbourne, since we want to hire locally," he said.
"We have a strong in-house education program, but it needs to be complemented by external programs."
As well as establishing a presence in Australia, OVH will bring with it its Digital Launch Pad program. This is expected to be up and running by the end of June 2017 and will provide a range of free resources to start-ups on application, including up to $100,000 of cloud infrastructure support.
It is also in discussions with LaunchVic on how to support the local start-up ecosystem.
OVH joins a growing list of international tech companies which have set up offices in Melbourne, including Hired, Square, Slack and Zendesk.
Part of its decision to set up shop in Australia was to provide a local service to its customers here, fitting with the company's decision to create a wholly owned subsidiary for its US operations earlier this year, allowing it to operate in a way best suited to the US market.
"Our business is global, but we have to have a global-local approach. Technology can be a great generic asset, but the business solutions require proximity," Mr Allard said.
"We believe cloud is the way to build business solutions, but at the end of the day you need people on the ground and you need that proximity."
OVH is also in discussions with major telecommunications providers and cloud application businesses about establishing local partnerships in the Asia-Pacific region.
In the 2016 fiscal year OVH reported revenue of €320 million, and by 2020 the company wants to hit €1 billion in revenue. By the end of 2017, the company also expects to have grown to 27 data centres, which will likely include more centres in the Asia-Pacific region.
The company has also pledged to invest €1.5 billion in its services over the next five years.
Dell Refreshes PowerEdge Line for First Time in 3 Years – Virtualization Review
News
It's part of an effort to make servers more cloud ready.
If Dell wants to keep being a leading hardware vendor in the cloud age, it needs servers that can keep up with the needs of more demanding infrastructure.
Cloud computing -- whether public, private or hybrid -- puts more strain on the underlying systems than the familiar, traditional datacenter model where everything stayed on-premises. Dell, recognizing that need, has updated its lineup to handle the changing model of computing that can have files, storage, networking and compute anywhere.
At the core of the company's new lineup of datacenter offerings, outlined this week at Dell EMC World in Las Vegas, is an upgraded version of the flagship Dell EMC PowerEdge servers, the first developed by the newly-merged company.
The company kicked off the datacenter portion of the conference with the launch of its PowerEdge 14G servers (due out this summer), which are tied to the release of Intel's next-generation Xeon processors, code-named "Skylake Purley." It's the first refresh of the PowerEdge server line in three years and, in keeping with any refresh, the new systems offer the typical boosts in feeds and speeds. And while the PowerEdge refresh will appeal to anyone looking for the latest servers, the release is also the key component of the entire Dell EMC converged and hyper-converged systems portfolio, as well as new purpose-built appliances and engineered systems.
In addition to a new line of tower and rack-based servers, the PowerEdge 14G will be the core compute platform for the forthcoming Azure Stack system and a new portfolio of datacenter tools, including a new release of its NetWorker data protection offering and upgrades to the VxRail 4.5, VxRack and XC Series engineered systems (Windows Server, Linux and VMware, among others). "This is our 14th generation of servers, which is actually the bedrock of the modern datacenter," said David Goulden, president of Dell EMC, during the opening keynote session.
The new PowerEdge 14G servers will be available for traditional datacenter applications as well as Web-scale, cloud-native workloads. Among the key upgrades Dell EMC will deliver in the new PowerEdge line are increased app performance and response times. The company claims the servers will offer a 19x boost in Non-Volatile Memory Express (NVMe) low-latency flash storage, single-click BIOS tuning that allows for simpler and faster deployment of CPU-intensive workloads, and the ability to choose from a variety of software-defined storage (SDS) options.
"We knew we had to accelerate the workloads. We had to reduce the latency to make sure we have handled the performance to transform peoples' businesses," said Ashley Gorakhpurwalla, president of the Server Solutions division at Dell EMC. The server's new automatic multi-vectoring cooling allows a greater number of GPU accelerators, which the company claims can increase the number of VDI users by 50 percent.
In addition to the performance boost, company officials are touting a more simplified management environment. The servers will support the new OpenManage Enterprise console and an expanded set of APIs, which Dell EMC said will deliver intelligent automation. The company described the new OpenManage Enterprise as a virtualized enterprise system management console with a simple user interface that supports application plugins and customizable reporting. A new Quick Sync feature offers server configuration and monitoring on mobile devices. It boasts a 4x improvement in systems management performance over the prior version and can offer faster remediation with its ProSupport Plus and Support Assist, which the company claims will reduce the time to resolve failures by up to 90 percent.
Dell EMC has also embedded some noteworthy new security capabilities in the hardware that offer new defenses. They include SecureBoot, BIOS Recovery, signed firmware and an iDRAC RESTful API that conforms to the Redfish standard. The servers also have better protection against unauthorized access-control changes, with a new System Lockdown feature and a new System Erase function that ensures all data is wiped from a machine when it is taken out of commission.
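Because Redfish is a DMTF standard, a conforming management controller can be queried with plain HTTPS and JSON. As a hedged illustration (the hostname and credentials below are placeholders, and real use would need TLS verification and error handling), this sketch builds, but does not send, an authenticated request against the standard /redfish/v1/Systems collection:

```python
import base64
import urllib.request

def build_redfish_request(host, path, user, password):
    """Build (but do not send) an authenticated GET against a Redfish service."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(f"https://{host}/redfish/v1{path}")
    req.add_header("Authorization", f"Basic {token}")  # Redfish also supports session tokens
    req.add_header("Accept", "application/json")
    return req

# Hypothetical controller hostname and credentials, for illustration only.
req = build_redfish_request("idrac.example.com", "/Systems", "root", "secret")
print(req.full_url)  # https://idrac.example.com/redfish/v1/Systems
```

The same request shape works against any Redfish-conformant endpoint, which is the interoperability point of adopting the standard.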
The new PowerEdge servers were among a number of other key datacenter offerings announced by the company this week. "Our new 14G servers will be built into our full Dell EMC product portfolio, bringing out our seventh generation of storage and data protection as well," Goulden said.
The servers will be offered with a variety of the company's new software-defined enterprise storage systems, including a new version of the Dell EMC ScaleIO software-defined storage (SDS) and upgrades to the company's Elastic Cloud Storage (ECS) platform. They include the ECS Dedicated Cloud Service for hybrid deployments of ECS, and ECS.Next, which will offer upgraded data protection and analytics; and its new Project Nautilus SDS offering, for storing and streaming IoT data. The servers will also power Dell EMC's new Ready Node portfolio, designed to transition traditional datacenters into cloud-scale infrastructure.
In addition to storage, Dell EMC said the PowerEdge 14G will power the company's new Open Networking switches, including a top-of-rack switch that the company claims offers more than twice the in-rack throughput of traditional 10GbE switches and a unified platform for network switching, as well as a new line for small and midsize organizations.
About the Author
Jeffrey Schwartz is editor of Redmond magazine and also covers cloud computing for Virtualization Review's Cloud Report. In addition, he writes the Channeling the Cloud column for Redmond Channel Partner. Follow him on Twitter @JeffreySchwartz.
Demands of IoT, Quantum and Cognitive Workloads Drive IBM’s Cloud Datacenter Expansion – Redmondmag.com
Datacenter Trends
The company is not only expanding its datacenters to meet growing individual demand, but to meet the computing burden of new and emerging technologies.
IBM just announced the opening of four new cloud datacenters in the U.S. -- two in Dallas, Texas, and two in Washington, D.C. The new facilities were designed to handle demanding cognitive workloads running on IBM's Bluemix cloud platform. Each facility has the capacity for thousands of physical servers and offers a range of cloud infrastructure services, including bare metal servers, virtual servers, storage, security services, and networking, the company said.
IBM's growing cloud datacenter network now extends across 19 countries and comprises 55 facilities.
Big Blue launched a strategic initiative to deploy its cloud datacenters in key local markets around the world about two years ago, and the new U.S. facilities are part of that strategy. Late last year, the company opened a cloud datacenter in Norway, the first in the Nordic region. It also opened cloud datacenters in Seoul, South Korea, and Chennai, India. The company reportedly plans to open four additional datacenters before the end of the second quarter, including two in London, one in Australia, and one in San Jose, Calif.
But a significant expansion effort planned for 2017 emphasizes enhancing the capabilities of existing facilities to accommodate growing demand for support of blockchain technology, quantum and cognitive computing, and the Internet of Things (IoT), said John Considine, GM of IBM's Cloud Infrastructure group, in a statement.
"This expansion is not about increasing the number of countries we operate in, but the capacities of the markets we're already in," explained Francisco Romero, VP of IBM's Cloud Infrastructure Operations group. "We're essentially growing with the demand for IBM's analytic and cognitive capabilities. As the number of clients leveraging those capabilities and the data sets grow, the demand for our infrastructure grows."
IBM is betting on growing demand for these technologies and the resulting demand for datacenters that can handle the higher-end workloads, which mitigate the costs of added hardware.
"We're doing a lot of work within the datacenter to support technologies that are very AI-friendly at scale," Romero said. "We already have CPUs available in those datacenters that cognitive workloads take advantage of. And we continue to work with our hardware vendors to incorporate more and more cognitive-specific capabilities into the datacenter, but doing it at scale, so the total-cost-of-ownership equation works as well as possible for the overall business."
This is a bet IBM appears to be winning. In April, the company reported a 33% increase in revenue from its cloud services during the last quarter, and total cloud revenues of $14.6 billion over the past 12 months.
About the Author
John has been covering the high-tech beat from Silicon Valley and the San Francisco Bay Area for nearly two decades. He serves as Editor-at-Large for Application Development Trends (www.ADTMag.com) and contributes regularly to Redmond Magazine, The Technology Horizons in Education Journal, and Campus Technology. He is the author of more than a dozen books, including The Everything Guide to Social Media; The Everything Computer Book; Blobitecture: Waveform Architecture and Digital Design; John Chambers and the Cisco Way; and Diablo: The Official Strategy Guide.
Cyberduck FTP 6.0 offers Cryptomator cloud encryption – The Stack
Cyberduck, the file transfer protocol client, now offers transparent, client-side encryption for all data stored on servers or in the cloud. The new encryption tool allows users to create secure vaults of data on any server or cloud storage program compatible with Cyberduck.
The new encryption tool, available at no charge to Cyberduck users, was created using Cryptomator's open-source AES encryption.
Users can now create a secure data vault on a physical server or in their cloud storage with a simple menu selection, which is then password protected. Files transferred to the secure vault are automatically encrypted when uploaded and decrypted when downloaded.
All file contents are encrypted using AES, as are directory names, and directory structures are automatically obfuscated. Each file is encrypted individually. The only part of the file that is not encrypted, in fact, is the date the file was created or modified; other than that, all aspects of uploaded files are encrypted and password protected.
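Cryptomator's actual name encryption uses AES and is reversible; Python's standard library has no AES, so the sketch below only illustrates the general idea of deterministic name obfuscation with a keyed hash. It is a one-way stand-in for illustration, not Cryptomator's real scheme:

```python
import base64
import hashlib
import hmac

def obfuscate_name(name, key):
    """Map a cleartext directory name to an opaque, filesystem-safe token.
    Illustrative only: a keyed hash is one-way, whereas Cryptomator's
    AES-based scheme lets the client recover the original names."""
    mac = hmac.new(key, name.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(mac).decode().rstrip("=")

key = b"\x01" * 32  # in a real vault this would come from the password-derived key
print(obfuscate_name("Invoices", key) == obfuscate_name("Invoices", key))  # True: same name, same token
print(obfuscate_name("Invoices", key) != obfuscate_name("Receipts", key))  # True: tokens reveal nothing
```

The property being demonstrated is that the storage provider sees only opaque tokens, never the directory names themselves.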
The vault passwords themselves are protected against brute force attacks using scrypt.
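scrypt is deliberately memory-hard, which is what makes large-scale brute forcing expensive. A minimal sketch of deriving a vault key with Python's built-in `hashlib.scrypt` (the cost parameters here are illustrative defaults, not Cryptomator's actual settings):

```python
import hashlib
import os

def derive_vault_key(password, salt):
    # n (CPU/memory cost), r (block size) and p (parallelism) trade unlock
    # speed against attack cost; these values are illustrative only.
    return hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)

salt = os.urandom(16)           # stored alongside the vault, not secret
key = derive_vault_key("vault password", salt)
print(len(key))  # 32
```

Because each guess forces the attacker through the same memory-hard derivation, dictionary attacks against the vault password scale far worse than against a plain hash.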
Uploads and downloads are encrypted transparently, and decrypted automatically when required. Transparent encryption means that users can work with their own encrypted files normally, without additional steps. While the vault containing encrypted data resides in the server or cloud, Cryptomator allows users to access files through a virtual hard drive that works like a USB flash drive.
Because no subscription is required, users need not worry about losing access to their encrypted data. Additionally, since the Cryptomator service is open source, clients can conduct their own independent audits to ensure the security of stored data.
Cyberduck supports all server hardware and most major cloud services, including Amazon S3, Microsoft Azure, or any OpenStack Swift, WebDAV or FTP interoperable hosting solution. The vaults created by users are also compatible with Cyberduck and Cryptomator mobile apps, representing an advancement over previous Cryptomator vaults which could only be used with a local hard disk. Because Cryptomator and Cyberduck are now sharing services, users are no longer required to store local copies of files uploaded to the cloud, as Cyberduck can retrieve stored files on demand.
Microsoft launches new database tools – The Seattle Times
Microsoft Build 2017: The new Microsoft tools are designed to make corporate database software, used to power everything from internal systems and records to public-facing websites, integrate more smoothly into the company's Azure platform.
Seattle Times technology reporter
Microsoft is rolling out new tools to entice businesses to store their data in the cloud, a bid to keep its units that sell on-demand processing power and data storage growing.
At its Build developer conference in Seattle on Wednesday, Microsoft launched a range of tools designed to make corporate database software, used to power everything from internal systems and records to public-facing websites, integrate more smoothly into the company's Azure platform.
Scott Guthrie, executive vice president of Microsoft's Cloud and Enterprise Group and a longtime developer, introduced a database migration service, designed to smoothly convert data stored on companies' own servers to Azure.
He also unveiled Azure Cosmos DB, a cloud-based database tool that Microsoft says can sort and organize business data around the world quickly.
The announcements are part of the latest Microsoft salvo aimed at Oracle, one of Microsofts rivals in selling business software.
A year ago, Microsoft said it would bring SQL Server, the companys popular database software, to the Linux operating system, a bid to go after a corner of the market dominated by Oracle and IBM. Previously, SQL Server had been available only on Windows.
Oracle is the largest seller of such database software. The company was slower than Microsoft to embrace cloud-computing, analysts say, but has recently been on a public relations and engineering push to build tools that work well in the cloud.
Cosmos DB, Microsoft says, comes with a pledge for consistency and reliability, a guarantee that a database supported by Microsoft servers in the U.S. will run the same, and in sync, when powered by one of the companys data centers in Japan, for example.
Jet.com, the e-commerce company and Amazon.com rival owned by Wal-Mart, uses the database to power internet transactions, Guthrie said. The company, which runs its entire web infrastructure on Azure, used the database tool to make 100 trillion queries during the Black Friday shopping rush, he said.
Cloud-computing providers like Microsoft and Amazon often use examples like Black Friday to pitch their services. Sudden changes in the use of company websites can overwhelm the corporate-owned servers that power them, a problem Microsoft says is eased by the companys superior capacity provided by its global network of data centers.
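The capacity argument can be made concrete with a toy sizing rule: estimate how many servers a traffic level requires and note how fast the number grows during a spike. The numbers and headroom factor below are invented purely for illustration:

```python
import math

def replicas_needed(requests_per_sec, capacity_per_server, headroom=0.25):
    """Toy capacity-planning rule: provision for observed load plus headroom."""
    required = requests_per_sec * (1 + headroom) / capacity_per_server
    return max(1, math.ceil(required))  # never drop below one server

# A Black Friday-style 10x traffic spike multiplies the fleet accordingly,
# which is the provisioning burden the cloud providers absorb on demand.
print(replicas_needed(1_000, 250))    # 5
print(replicas_needed(10_000, 250))   # 50
```

A company running its own hardware must buy and idle the peak-day fleet year-round; a cloud tenant rents it for the spike and releases it afterward.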
Cosmos DB is available starting Wednesday. The database migration services are available in preview versions.
Nvidia surges as AI drives deeper into the cloud – Morningstar.com
The company's data-center business enjoyed a massive bump as cloud-service providers and other companies looked to increase their computing power with chipsets for servers that have advanced deep-learning technology. Related coverage: "Nvidia's Gaming and Server Businesses Are Still Defying Predictions of a Big Slowdown"; "NVIDIA's Tesla GPUs Power Major Cloud Companies' AI Efforts"; "Inspur Unveils AGX-2 Ultra-High-Density AI Computing Server."
Electro Industries Releases Enhanced Security and Data Push for Shark Meters to Cloud Servers – PR Newswire (press release)
WESTBURY, N.Y., May 10, 2017 /PRNewswire/ -- Electro Industries/GaugeTech (EIG) announces important new features for the INP100S Ethernet card and INP300S IEC 61850 Protocol Server card used by its Shark 200/270 meters. Both cards now provide enhanced security through the creation of a whitelisted Exclusive Client. When the Exclusive Client is communicating with the Network card, all other communication to the card is suspended. This protects the meter from unauthorized access or tampering during meter programming.
In addition, the INP100S Ethernet card now supports data push of meter readings to cloud servers. The meter can push up to 15 readings on a programmed interval to cloud servers using the popular JSON structure. Cloud server support, as well as the enhanced security, is a critical capability for customers of the Shark 200 and Shark 270 advanced revenue meters, giving them the ability to integrate energy management into their cloud-based building management systems, such as Lucid's BuildingOS.
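EIG's announcement does not spell out the exact JSON schema, so the field names below are hypothetical; the sketch just shows the general shape of packaging up to 15 readings into a JSON push payload:

```python
import json
import time

MAX_READINGS = 15  # the card pushes up to 15 readings per programmed interval

def build_push_payload(meter_id, readings):
    # Field names here are hypothetical; the real schema is defined by the meter firmware.
    if len(readings) > MAX_READINGS:
        raise ValueError(f"at most {MAX_READINGS} readings per push")
    return json.dumps({
        "meter": meter_id,
        "timestamp": int(time.time()),
        "readings": readings,
    })

payload = build_push_payload("shark-270-01", {"kWh": 48213.7, "volts_an": 120.1, "amps_a": 14.2})
print(json.loads(payload)["meter"])  # shark-270-01
```

On the receiving side, a cloud building-management system can ingest this with any stock JSON parser, which is the point of pushing in a standard structure rather than a proprietary protocol.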
Better Buy: Twilio Inc vs. Nutanix Inc. – Madison.com
Cloud computing stocks can be great growth plays, but they can also quickly collapse on concerns about slowing sales growth, widening losses, and lofty valuations. That's exactly what happened to two recent cloud IPOs -- Twilio (NYSE: TWLO) and Nutanix (NASDAQ: NTNX).
Twilio went public at $15 per share last June, soared to nearly $70 three months later, then fell back to the mid-$20s. Nutanix went public at $16 per share last September, peaked in the mid-$40s in early October, then stumbled back to the mid-teens.
Image source: Getty Images.
Let's discuss what happened to these two recent IPOs, and whether or not investors should consider them potential turnaround plays at current prices.
Twilio's cloud service delivers voice calls, SMS messages, videos, and other content for mobile apps. If developers want users to call or text each other from within their apps, they subscribe to Twilio's service and integrate its API into their apps. This is generally cheaper and more scalable than creating comparable features from scratch.
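As a rough illustration of how lightweight that integration is, sending an SMS through Twilio's REST API of this era boiled down to one authenticated POST to the account's Messages endpoint. The sketch below only assembles the request rather than sending it, and the account SID and phone numbers are placeholders.

```python
# Sketch of the HTTP request behind a Twilio send-SMS call.
# The SID and numbers are placeholders; a real call also needs the
# account's auth token for HTTP basic authentication.
ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"

def build_sms_request(to, from_, body):
    """Return the URL and form parameters for Twilio's Messages endpoint."""
    url = ("https://api.twilio.com/2010-04-01/Accounts/"
           f"{ACCOUNT_SID}/Messages.json")
    params = {"To": to, "From": from_, "Body": body}
    return url, params

url, params = build_sms_request("+15551230001", "+15551230002", "Your code is 1234")
```

The appeal for developers is clear from the shape of the request: no telecom carrier relationships or SMS gateways to manage, just an HTTP call billed per message.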
Facebook (NASDAQ: FB), for example, uses Twilio's API to enable WhatsApp and Messenger users to add other users via phone numbers. Twilio has also been gradually adding additional video, security, and enterprise administration features to this platform to boost its revenues per user.
Nutanix is the market leader in hyper-converged infrastructure (HCI) appliances and software-defined storage solutions. These products bundle traditional silos of server, storage, networking, virtualization, and data center management into a single turnkey solution.
Nutanix claims that collapsing all those product categories into a single "converged" enterprise cloud platform will gradually make data center infrastructure "invisible." This can be a cost-effective solution for younger companies which haven't installed on-site infrastructure yet, and it can help older companies pivot away from on-site private cloud models toward more flexible "hybrid" cloud models that straddle both the private and public clouds.
Twilio's sales surged 66% to $277.3 million in fiscal 2016. Its "base" revenue -- which excludes revenue from "Variable Customer Accounts" (large customers that haven't signed 12-month minimum revenue commitment contracts) -- jumped 79% to $245.5 million.
However, Twilio expects just 28%-31% sales growth this year, partly due to waning business from its top customer, Uber. In its first-quarter earnings report, Twilio disclosed that Uber -- which contributed 12% of Twilio's sales during the period -- plans to use other internal or third-party platforms for its calls and texts in the future.
That bombshell caused Twilio shares to plummet 26% on May 3. The development was also troubling because another Twilio customer, Lyft, recently announced that it would start testing Vonage's (NYSE: VG) Nexmo platform as an alternative to Twilio's service.
Nutanix's sales soared 85% to $444.9 million in fiscal 2016 on growing demand for HCI solutions. Wall Street expects sales to rise another 66% this year. Those numbers look solid, but a slowdown in sequential growth last quarter indicates that sales could peak this year. Moreover, Nutanix faces tough questions regarding Hewlett-Packard Enterprise's (NYSE: HPE) recent acquisition of its rival SimpliVity.
That $650 million buyout is troubling because it makes HPE -- which already has a massive presence in enterprise hardware, software, and services -- the second largest player in the HCI market after Nutanix. HPE will inevitably bundle SimpliVity's services with its other enterprise products -- which could render Nutanix obsolete.
The short of it is that both businesses could lose customers in the near future.
Twilio and Nutanix are both unprofitable, and analysts don't see them achieving either non-GAAP or GAAP profitability anytime soon. That's because the cost of running cloud servers, securing new customers, and offering competitive prices doesn't leave much room for profits -- unless the companies scale up dramatically.
Twilio posted a non-GAAP net loss of $0.16 per share in 2016. Due to the gradual loss of Uber's business, the company expects that loss to widen to $0.27-$0.30 per share this year. That bottom-line decline won't be good for its cash flow -- the company's cash and equivalents already dropped 61% sequentially to just $118.4 million last quarter. This raises the troubling possibility of another secondary offering in the near future.
Nutanix's net loss in fiscal 2016 is unknown. But the company's quarterly net loss nearly tripled last quarter, and it's expected to post a non-GAAP net loss of $1.49 per share this year. On the bright side, Nutanix's cash cushion remains strong, with its cash and equivalents staying nearly flat sequentially at $226 million last quarter.
Twilio trades at 4.8 times sales, which is slightly lower than its industry average of 5.6. Nutanix's P/S ratio of 3.6 is also lower than the industry average of 5.5. Nonetheless, investors probably shouldn't consider either stock "cheap" due to their top line challenges and lack of profitability.
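Those price-to-sales multiples can be sanity-checked with a quick back-of-envelope calculation. Since P/S is market cap divided by sales, the article's ratios imply market caps when applied to the fiscal 2016 revenue figures quoted above; this is only an approximation, as the author may have used trailing-twelve-month sales rather than the fiscal-year totals.

```python
# Back-of-envelope check: implied market cap = P/S ratio * sales.
# Sales figures (in millions) are the fiscal 2016 numbers from the article.
def implied_market_cap(ps_ratio, sales_millions):
    return ps_ratio * sales_millions

twilio_cap = implied_market_cap(4.8, 277.3)   # ~ $1.33 billion
nutanix_cap = implied_market_cap(3.6, 444.9)  # ~ $1.60 billion
print(f"Twilio: ~${twilio_cap:,.0f}M, Nutanix: ~${nutanix_cap:,.0f}M")
```

Even at below-average multiples, both implied valuations price in years of rapid growth from companies that are still losing money, which is the author's point about neither stock being "cheap."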
Both stocks remain highly speculative plays, but I'd still pick Twilio over Nutanix because it dominates a valuable niche service and faces less direct competition. Nutanix is well positioned to profit from growing demand for HCI solutions, but I have doubts that it can survive competition from HPE and other tech giants which are all aggressively expanding into the same market.
Leo Sun owns shares of Twilio. The Motley Fool owns shares of and recommends Facebook. The Motley Fool recommends Twilio. The Motley Fool has a disclosure policy.
Follow this link:
Better Buy: Twilio Inc vs. Nutanix Inc. - Madison.com