Category Archives: Cloud Servers
Druva Announces Record Growth in Cloud Server Data Protection – Marketwired (press release)
Druva Affirms Leadership in Cloud Server Data Protection Market with Phoenix Revenue Growth of over 300%
SUNNYVALE, CA--(Marketwired - May 9, 2017) - Druva, the leader in cloud data protection and information management, today announced record growth across its server data protection business. With over 300 percent year-over-year revenue growth across this line of business, the company announced it has added more than 150 marquee brands to the Druva Phoenix customer list. With cloud data protection and management becoming an IT imperative, total Druva Cloud deployments now expand to over 4,000 enterprise customers, including 10 percent of the world's Fortune 500 companies, such as Continental, Emerson, Flex, Fujitsu and Lockheed Martin.
"Enterprises are realizing the power, simplicity and evident value of the cloud. The rapid expansion of Druva's customer base and continued revenue growth is reflective of this shift taking place across the industry," said Jaspreet Singh, CEO of Druva. "Replacing legacy point backup solutions with modern data protection platforms is a strategic requirement for organizations to keep pace with accelerating recovery and protection needs. Druva is leading the way in this shift by delivering a cloud-first approach that simplifies business data protection."
Accelerating Demand for Modern Data Protection
Data growth is surging, and modernizing the protection capabilities of sensitive business information has become a strategic IT initiative. Enterprises in today's increasingly complex hybrid environment are looking to modern, cloud-based solutions to reduce the risk, cost and effort of managing data. According to Allied Market Research, data protection is expected to grow to $28B in 2022 for both cloud-based and on-premises servers.1
According to leading industry analyst firm Gartner, storage managers find backup delivered as a service is becoming attractive for a portion of their organization's data. Organizations are actively considering SaaS and cloud-delivered solutions to protect data in remote office/branch office (ROBO) environments, as well as test-and-development data and endpoints, including desktop, laptop and tablets. Some companies are evaluating these delivery models for regional areas and at least a subset of their primary data centers.2
Businesses face growing external threats to data, such as advanced ransomware, as well as changing data regulations. At the same time, data is increasingly being stored and managed outside of the core data center, bypassing legacy controls. Independent analyst firms have projected that, by 2020, over 50 percent of all corporate data will reside outside of the corporate data center.
"After exploring the market options, we selected Druva for our cloud server protection needs due to the unique capabilities of delivering a true SaaS solution for server data protection," said JP Saini, Chief Information Officer at TRC Companies, Inc. "Druva provides seamless access to protected information and delivers against our complex and distributed protection requirements."
Druva Cloud Meets Today's Data Protection Challenges
Druva delivers a SaaS solution that brings together backup, archival and disaster recovery to harness the ease of use and scalability of the public cloud. Druva Phoenix delivers modern data protection and management by centralizing and streamlining the processes of backing up, and making immediately available, data from physical and virtual servers across an organization's distributed infrastructure.
"Carahsoft is helping federal organizations modernize their datacenters with a comprehensive hybrid cloud architecture, and Druva is a critical solution partner in this stack," said Alex Whitworth, Director of Druva Solutions at Carahsoft. "With Druva Phoenix, we can empower our reseller partners and public sector customers to deliver strategic value while reducing risk, complexity and cost."
With Druva Phoenix, data can be cached locally for fast backup and recovery when required, helping companies meet their recovery time and recovery point objectives. Additionally, data can automatically be moved to cold storage after a specified period of time for increased cost savings.
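That age-based tiering can be sketched in a few lines of code. This is a hypothetical illustration of the policy described above, not Druva's implementation; the 90-day threshold is an assumption.

```python
from datetime import datetime, timedelta

# Hypothetical age-based tiering policy: recent backups stay in warm
# storage for fast restores, older ones move to cheaper cold storage.
# The 90-day cutoff is an assumed value, not a Druva default.
COLD_TIER_AGE = timedelta(days=90)

def storage_tier(backup_time: datetime, now: datetime) -> str:
    """Return the tier a backup belongs in, given its age."""
    return "cold" if now - backup_time > COLD_TIER_AGE else "warm"
```

In practice the threshold would be set per retention policy, trading restore latency against storage cost.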
To learn more about how Druva Phoenix can improve your business, visit https://www.druva.com/products/phoenix/.
1 Allied Market Research, "Data Protection as a Service (DPaaS) Market," January 2017
2 Gartner, "Modify Your Backup/Recovery Plan to Improve Data Management and Reduce Costs," Dave Russell, February 14, 2017
About Druva
Druva is the leader in cloud data protection and information management, leveraging the public cloud to offer a single pane of glass to protect, preserve and discover information -- dramatically increasing the availability and visibility of business critical information, while reducing the risk, cost and complexity of managing and protecting it.
Druva's award-winning solutions intelligently collect data, and unify backup, disaster recovery, archival and governance capabilities onto a single, optimized data set. As the industry's fastest growing data protection provider, Druva is trusted by over 4,000 global organizations and protects over 25 PB of data. Learn more at http://www.druva.com and join the conversation at http://www.twitter.com/druvainc.
Better Buy: Twilio Inc vs. Nutanix Inc. – Motley Fool
Cloud computing stocks can be great growth plays, but they can also quickly collapse on concerns about slowing sales growth, widening losses, and lofty valuations. That's exactly what happened to two recent cloud IPOs -- Twilio (NYSE:TWLO) and Nutanix (NASDAQ:NTNX).
Twilio went public at $15 per share last June, soared to nearly $70 three months later, then fell back to the mid-$20s. Nutanix went public at $16 per share last September, peaked in the mid-$40s in early October, then stumbled back to the mid-teens.
Image source: Getty Images.
Let's discuss what happened to these two recent IPOs, and whether or not investors should consider them potential turnaround plays at current prices.
Twilio's cloud service delivers voice calls, SMS messages, videos, and other content for mobile apps. If developers want users to call or text each other from within their apps, they subscribe to Twilio's service and integrate its API into their apps. This is generally cheaper and more scalable than creating comparable features from scratch.
Facebook (NASDAQ:FB), for example, uses Twilio's API to enable WhatsApp and Messenger users to add other users via phone numbers. Twilio has also been gradually adding additional video, security, and enterprise administration features to this platform to boost its revenues per user.
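To make the integration concrete, here is a rough sketch of what sending a message through Twilio's REST API involves: the app builds a form-encoded request against Twilio's documented Messages endpoint rather than running its own telephony stack. The account SID, phone numbers, and message body below are placeholders.

```python
from urllib.parse import urlencode

# Twilio's 2010-04-01 REST API base; apps POST form-encoded payloads here
# with HTTP basic auth (account SID + auth token) to send an SMS.
TWILIO_API = "https://api.twilio.com/2010-04-01"

def build_sms_request(account_sid: str, to: str, from_: str, body: str):
    """Return the URL and form payload for Twilio's send-message endpoint."""
    url = f"{TWILIO_API}/Accounts/{account_sid}/Messages.json"
    payload = urlencode({"To": to, "From": from_, "Body": body})
    return url, payload

# Placeholder credentials and numbers, purely for illustration:
url, payload = build_sms_request(
    "ACxxxx", "+15551230001", "+15551230002", "Your driver is arriving"
)
```

A real app would POST this payload with its auth token attached; the point is that two-line integration versus building SMS delivery from scratch.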
Nutanix is the market leader in hyper-converged infrastructure (HCI) appliances and software-defined storage solutions. These products bundle traditional silos of server, storage, networking, virtualization, and data center management into a single turnkey solution.
Nutanix claims that collapsing all those product categories into a single "converged" enterprise cloud platform will gradually make data center infrastructure "invisible." This can be a cost-effective solution for younger companies which haven't installed on-site infrastructure yet, and it can help older companies pivot away from on-site private cloud models toward more flexible "hybrid" cloud models that straddle both the private and public clouds.
Twilio's sales surged 66% to $277.3 million in fiscal 2016. Its "base" revenue -- which excludes revenue from "Variable Customer Accounts" (large customers which haven't signed 12-month minimum revenue commitment contracts) -- jumped 79% to $245.5 million.
Image source: Uber.
However, Twilio expects just 28%-31% sales growth this year, partly due to waning business from its top customer, Uber. During its first quarter earnings report, Twilio disclosed that Uber -- which contributed 12% of Twilio's sales during the period -- plans to use other internal or third-party platforms for its calls and texts in the future.
That bombshell caused Twilio shares to plummet 26% on May 3. The development was also troubling because another Twilio customer, Lyft, recently announced that it would start testing Vonage's (NYSE:VG) Nexmo platform as an alternative to Twilio's service.
Nutanix's sales soared 85% to $444.9 million in fiscal 2016 on growing demand for HCI solutions. Wall Street expects sales to rise another 66% this year. Those numbers look solid, but a slowdown in sequential growth last quarter indicates that sales could peak this year. Moreover, Nutanix faces tough questions regarding Hewlett-Packard Enterprise's (NYSE:HPE) recent acquisition of its rival SimpliVity.
That $650 million buyout is troubling because it makes HPE -- which already has a massive presence in enterprise hardware, software, and services -- the second largest player in the HCI market after Nutanix. HPE will inevitably bundle SimpliVity's services with its other enterprise products -- which could render Nutanix obsolete.
The short of it is that both businesses could lose customers in the near future.
Twilio and Nutanix are both unprofitable, and analysts don't see them achieving either non-GAAP or GAAP profitability anytime soon. That's because the cost of running cloud servers, securing new customers, and offering competitive prices doesn't leave much room for profits -- unless the companies scale up dramatically.
Twilio posted a non-GAAP net loss of $0.16 per share in 2016. Due to the gradual loss of Uber, the company expects that loss to widen to $0.27-$0.30 per share this year. That bottom line decline won't be good for its cash flow -- the company's cash and equivalents already dropped 61% sequentially to just $118.4 million last quarter. This raises the troubling possibility of another secondary offering in the near future.
Nutanix's net loss in fiscal 2016 is unknown. But the company's quarterly net loss nearly tripled last quarter, and it's expected to post a non-GAAP net loss of $1.49 per share this year. On the bright side, Nutanix's cash cushion remains strong, with its cash and equivalents staying nearly flat sequentially at $226 million last quarter.
Twilio trades at 4.8 times sales, which is slightly lower than its industry average of 5.6. Nutanix's P/S ratio of 3.6 is also lower than the industry average of 5.5. Nonetheless, investors probably shouldn't consider either stock "cheap" due to their top line challenges and lack of profitability.
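For readers wondering where those multiples come from, price-to-sales is simply market capitalization divided by revenue. The numbers below are illustrative, pairing Twilio's reported 2016 revenue with an assumed market cap, not the companies' exact figures.

```python
# Price-to-sales ratio: market capitalization / trailing revenue.
def price_to_sales(market_cap: float, revenue: float) -> float:
    return market_cap / revenue

# Illustrative only: an assumed ~$1.33B market cap against Twilio's
# reported $277.3M of fiscal 2016 revenue (both in $ millions).
ratio = price_to_sales(1_330, 277.3)
```

A lower multiple than the industry average can mean "cheap," but as noted above it can also just price in slowing growth.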
Both stocks remain highly speculative plays, but I'd still pick Twilio over Nutanix because it dominates a valuable niche service and faces less direct competition. Nutanix is well-poised to profit from growing demand for HCI solutions, but I have doubts that it can survive competition from HPE and other tech giants, which are all aggressively expanding into the same market.
Leo Sun owns shares of Twilio. The Motley Fool owns shares of and recommends Facebook. The Motley Fool recommends Twilio. The Motley Fool has a disclosure policy.
You may now kiss the server-side: Dell EMC marries storage software to PowerEdge 14Gs – The Register
Dell EMC World: Dell EMC is updating three storage software products, previewing a fourth, and refreshing its Ready Nodes storage software and server bundles.
And we're told the Dell PowerEdge servers to run this software are getting updated over the next few months to the 14G generation, with NVMe flash and NVDIMM options, and can be expected to provide more flash-enhanced performance than the current 13G models.
ScaleIO.Next gets inline compression, enhanced snapshot capabilities, granular thin provisioning and seamless volume migration. It also gains support for VMware Virtual Volumes.
Elastic Cloud Storage ECS.Next cloud-scale software gets enterprise-class data protection and management capabilities, and advanced analytics support.
An ECS Dedicated Cloud Service is hosted in Virtustream data centers. It's a dedicated, single tenant offering with, Dell EMC says, "private cloud control with the hands-off operations and agility of the public cloud." It's said to enable customers to use ECS through a hybrid cloud model.
Project Nautilus is software for storing and analyzing high volumes of streaming IoT data. This data is stored in Isilon systems or on ECS ones, and the software is intended to bring real-time data processing capabilities (analytics) to Dell EMC's unstructured storage platforms.
IsilonSD Edge, Isilon software for remote offices, gets deployable on PowerEdge 14G servers. This software also gets support for vSphere v6.5, and deployment using virtual storage platforms like ScaleIO and VMware vSAN.
There are three new Ready Nodes: ScaleIO Ready Nodes, VMware vSAN Ready Nodes and Microsoft Storage Spaces Direct Ready Nodes.
ECS.Next and ScaleIO.Next have planned global availability in the second half of 2017. ECS Dedicated Cloud Service and IsilonSD Edge have planned global availabilities in the second quarter of 2017.
ScaleIO Ready Nodes and VMware vSAN Ready Nodes are available globally today and have planned availability on new PowerEdge 14G servers in mid-2017.
Microsoft Storage Spaces Direct Ready Nodes have planned global availability in June 2017 and planned global availability on new PowerEdge 14G servers in mid-2017.
Panda Free Antivirus – TechRadar
Free security software Panda Free Antivirus uses cloud processing to protect your PC, eliminating the need for hefty software updates and demanding scans. This collective intelligence should lead to more harmful programs being detected while using a fraction of the system resources.
According to the antivirus testers at AV-Test, Panda Free Antivirus is right on the industry average in terms of virus detection, sitting at around the 98% mark for zero-day attacks and a hair under 100% for established threats.
Those are good numbers, and Panda is good software, providing you remember to deny it permission to take over your browser's home page and search facility upon installing. Its process monitor is very useful, it scans quite quickly, and it's simple enough in its presentation for even the most technophobic user to find their way around.
You might not realise at first, but Panda Free Antivirus begins scanning as soon as it's installed to ensure your PC is protected immediately. You can also perform three types of manual scan: full, critical areas, and custom.
The suite's other tools include a process monitor, which lets you see active processes and whether they are using a secure connection. Processes that don't are blocked, but you can undo this manually if you're confident it's safe.
There's also a handy vaccination tool that checks attached USB drives for malicious software. We recommend changing the settings so it runs automatically to pick up any threats as soon as possible.
If one of your PCs has been locked by a malicious program, Panda Free Antivirus can create an emergency rescue USB drive that you can use to scan the affected machine. This feature uses Panda Cloud Cleaner a specialist scanner that detects viruses and malware other scanners might miss.
All of these options are presented in a clear dashboard, with moveable tiles that you can customize to suit your preferences or remove if you never use a particular feature.
Unfortunately, in our tests, Panda Free Antivirus still had a significant impact on system performance, despite its cloud-based approach. Its emergency cleaner and USB vaccination tools are well worth investigating, but if you're looking for a free antivirus tool that will have a minimum impact on other system processes, you might want to look elsewhere.
NVIDIA’s Tesla GPUs Power Major Cloud Companies’ AI Efforts – Market Realist
What Makes NVIDIA the Semiconductor Company of the Future? PART 8 OF 16
NVIDIA (NVDA) has continued to witness strong growth in the gaming sector as eSports, virtual reality, and its Pascal-based GPUs (graphics processing unit) have driven demand.
However, the key highlight of its fiscal 2017 was its Data Center segment, which grew more than threefold from just $97 million in fiscal 4Q16 to $296 million in fiscal 4Q17, making it the second-largest business segment after Gaming.
The growth was driven by the increasing adoption of NVIDIA's Tesla P100 GPUs and the DGX-1 supercomputer by CSPs (cloud service providers) and other high-performance computing providers. NVIDIA's Tesla GPUs enable CSPs to boost processing power fivefold and reduce costs by 60%.
AI (artificial intelligence) was first adopted by hyperscale data center operators such as Microsoft (MSFT), Facebook (FB), and Google (GOOG), which used AI for image recognition and voice processing. AI is now moving to the enterprise space as companies in healthcare, retail, and finance begin to use deep learning to solve problems and automate processes.
The increasing adoption of AI has encouraged several CSPs to offer AI-as-a-service.
Google is offering deep-learning capabilities on the Google Cloud Platform using NVIDIA's Tesla K80 GPUs. Users can attach up to eight GPUs to their deep-learning workloads for an hourly charge of $0.70 per GPU in the United States and $0.77 per GPU in Asia and Europe.
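Some quick arithmetic on that pricing shows what a run costs at the quoted per-GPU rates; the 24-hour job length below is just an example.

```python
# Cost of renting cloud GPUs by the hour at the rates quoted above.
def gpu_training_cost(gpus: int, hours: float, rate_per_gpu_hour: float) -> float:
    return gpus * hours * rate_per_gpu_hour

# A hypothetical 24-hour training run on the maximum of eight K80 GPUs:
us_cost = gpu_training_cost(8, 24, 0.70)  # US rate
eu_cost = gpu_training_cost(8, 24, 0.77)  # Asia/Europe rate
```

At those rates, a day on eight GPUs runs roughly $134 in the US versus about $148 in Asia and Europe, which is the kind of pay-as-you-go math driving AI-as-a-service adoption.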
Even Amazon (AMZN) Web Services offers deep-learning capabilities, allowing users to use up to 16 of NVIDIA's Tesla K80 GPUs. Microsoft's Azure offers similar support for up to four of NVIDIA's slightly older GPUs.
China's web search engine Baidu (BIDU) is also offering deep-learning capabilities on the Baidu Cloud using NVIDIA's Tesla P40 GPUs and deep learning software.
Users use this service for both training and inference acceleration for open-source deep learning frameworks such as TensorFlow and PaddlePaddle.
China's Tencent Cloud will soon offer deep learning capabilities on its public cloud platform using NVIDIA's Tesla P100, P40, and M40 GPU accelerators and deep learning software. The chip supplier stated that the cloud servers would integrate up to eight GPU accelerators in 1H17.
NVIDIA and Microsoft have developed the hyperscale GPU accelerator framework HGX-1, which will feature eight Tesla GPUs and will be able to connect to the CPU (central processing unit) depending on the workload. The companies plan to make the open-source, scalable HGX-1 design the standard architecture for AI cloud computing.
IBM (IBM) will soon offer GPU support on its Bluemix cloud, allowing its users to add two NVIDIA Tesla P100 GPUs. This will provide up to 4.7 teraflops of double-precision performance and 16 gigabytes of memory. IBM is also working with NVIDIA in the supercomputer space. IBM's Power8 CPUs and NVIDIA's Tesla P100 GPUs will be used in two new supercomputers for the U.S. Department of Energy.
Next, we'll see how NVIDIA is supporting the adoption of AI by enterprises and supercomputers.
IBM to Provide Enterprises with Precise Placement Control of their Cloud Workloads – PR Newswire (press release)
ARMONK, N.Y., May 9, 2017 /PRNewswire/ -- IBM (NYSE: IBM) today announced plans to release a new virtual server offering, called Dedicated Hosts, that provides enterprises with more precise placement control over their cloud workloads. A Dedicated Host is a physical server with workload capacity entirely dedicated to a single client's use. The introduction of this offering will enhance IBM's existing suite of virtual server offerings and is ideal for enterprises transitioning to a public cloud environment who require exact control and visibility into where their data and workloads reside to meet compliance or security needs.
Enterprises largely want to be able to access the cost savings and speed benefits of the cloud, but 53 percent of organizations surveyed by IDG Enterprise cite concerns over where data is stored as a key challenge to implementing a cloud computing strategy[1]. Single tenant cloud offerings such as Dedicated Hosts are designed to enable enterprises in highly regulated industries to capture the benefits of the public cloud while helping to meet complex challenges.
The introduction of Dedicated Hosts will take IBM's current single-tenant virtual server offerings a step further by providing clients with complete control over how each of their virtual servers are provisioned on a Dedicated Host. Additionally, clients will be able to select where their Dedicated Host is located within an IBM Cloud data center. This level of control can help clients in heavily regulated industries overcome barriers to cloud adoption. For example, clients in the health care industry may require precise information on where data is stored so that they can meet their compliance standards. With Dedicated Hosts, clients will be able to provide auditors with more precise information on how and where their workloads are running.
With Dedicated Hosts, clients will benefit from three distinct features:
"Enterprises turn to IBM because they want cloud infrastructure that is flexible and easy to scale, while providing higher levels of control and visibility to help as they meet their regulatory and compliance requirements," said John Considine, general manager, cloud infrastructure services, IBM. "Dedicated Hosts are another example of IBM's commitment to providing enterprise-strength capabilities on our cloud platform so that clients can do more with their data."
IBM plans to make Dedicated Hosts available in June 2017. Dedicated Hosts will be available to order monthly or hourly in IBM Bluemix. The initial offering will be available in the following configuration:
Dedicated Host Configuration: 56 cores (vCPU), 242 GB RAM, 1.4 TB local storage (SSD)
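To illustrate how a fixed-capacity, single-tenant host constrains placement, here is a toy capacity check against the configuration listed above. This models the concept only; it is not IBM's provisioning API, and the VM sizes are invented.

```python
# The Dedicated Host's capacity envelope as announced: a single tenant
# gets the whole physical server and carves VMs out of it.
HOST = {"vcpu": 56, "ram_gb": 242, "ssd_tb": 1.4}

def fits(host_free: dict, vm: dict) -> bool:
    """True if the requested VM fits in the host's remaining capacity."""
    return all(vm[k] <= host_free[k] for k in vm)

# Place a hypothetical 16-vCPU / 64 GB VM and track what remains:
free = dict(HOST)
vm = {"vcpu": 16, "ram_gb": 64}
if fits(free, vm):
    for k in vm:
        free[k] -= vm[k]
```

Because the whole envelope belongs to one client, placement decisions like this stay visible and auditable, which is the compliance point the announcement stresses.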
About IBM Cloud: For more information, visit: https://www.ibm.com/cloud-computing/.
[1] IDG Enterprise: 2016 IDG Enterprise Cloud Computing Survey, November 2016
Media Contact: Sarah Murphy, IBM Media Relations, srmurphy@us.ibm.com, 336-337-7584
To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/ibm-to-provide-enterprises-with-precise-placement-control-of-their-cloud-workloads-300454089.html
SOURCE IBM
BOXX Expands Product Line and Services by Acquiring Cirrascale – GlobeNewswire (press release)
May 08, 2017 08:00 ET | Source: BOXX Technologies
Austin, TX, May 08, 2017 (GLOBE NEWSWIRE) -- BOXX Technologies, the leading innovator of high-performance computer workstations, rendering systems, and servers, today announced the acquisition of Cirrascale Corporation, a premier developer of multi-GPU servers and cloud solutions designed for deep learning infrastructure. The acquisition enables BOXX to add Cirrascale's deep learning hardware to its line of multi-GPU solutions and solidifies BOXX as the leader in multi-GPU computer technology. Cirrascale Cloud Services, a BOXX subsidiary, will continue to expand its growing business and provide GPU-as-a-Service along with other professional services.
"Cirrascale is instantly recognizable as a leader in deep learning infrastructure, cloud services, and, like us, a strategic partner of NVIDIA, so naturally, we're excited to welcome them to the BOXX family," said Rick Krause, BOXX CEO. "We now have a complete solution of world-class deep learning servers, development workstations, and cloud services for data scientists, researchers, and other professionals."
Cirrascale Cloud Services offers a dedicated, bare-metal cloud service with the ability for customers to load their own instances of popular deep learning frameworks such as TensorFlow, Caffe, MXNet, and Theano. This provides user access to the raw horsepower of a modern multi-GPU system and is highly attractive to customers with various deep learning and HPC applications. BOXX will manufacture the high performance rackmount systems featuring up to eight NVIDIA Quadro or Tesla graphics cards.
"With expertise in the development and manufacturing of high-performance systems, BOXX will now deliver deep learning solutions to customers worldwide while providing services and support to meet their needs," said PJ Go, CEO, Cirrascale Cloud Services. "This enables our team to continue to expand our cloud services, which have grown exponentially over the past year. Together, our companies will further accelerate the ever-growing momentum of machine learning and artificial intelligence."
BOXX began primarily in media & entertainment, producing the fastest and highest quality hardware solutions for VFX, animation, and motion media applications. However, over the past decade, the hardware manufacturer has expanded to also develop products specific to architecture, engineering, and other markets which rely on professional 3D design applications. Multi-GPU workstations have always represented a significant portion of BOXX business, but as the company continues to add enterprise customers like broadcast networks and organizations focused on deep learning, the acquisition of Cirrascale is a natural fit. The acquisition comes as BOXX, owned by Dallas-based private equity firm Craftsman Capital, is experiencing record growth.
"BOXX is an incredible brand built on delivering the highest possible performance to the customer," said Barrett Dean, Partner at Craftsman Capital. "With an expanded resource and knowledge base, we have complete confidence that consumers will view BOXX as a one-stop shop for deep learning infrastructure and cloud services. The highly experienced BOXX management team has done an outstanding job of building BOXX as a premium solution provider, so the addition of Cirrascale further expands the BOXX brand into new markets."
For further information and pricing on multi-GPU servers, contact BOXX at 1-877-877-2699. Learn more about multi-GPU servers, APEXX workstations, BOXX rendering solutions, BOXX finance options, and how to contact worldwide resellers, by visiting http://www.boxx.com. For further information on Cirrascale Cloud Services or to rent the latest x86 and POWER configurations, call (888) 942-3800 or visit http://www.cirrascale.cloud.
About BOXX Technologies
BOXX is the leading innovator of high-performance workstations and rendering systems for visual effects, animation, product design, engineering, architectural visualization, and more. Combining record-setting performance, speed, and reliability with unparalleled industry knowledge, BOXX is the trusted choice for creative professionals worldwide. For more information, visit http://www.boxx.com.
About Cirrascale Cloud Services Cirrascale Cloud Services is a premier provider of dedicated public and private, multi-GPU cloud solutions enabling deep learning. The company offers the latest cloud-based solutions for large-scale deep learning operators, service providers, as well as HPC users. To learn more about Cirrascale Cloud Services and its unique dedicated, multi-GPU cloud solutions, please visit http://www.cirrascale.cloud or call (888) 942-3800.
Attachments:
A photo accompanying this announcement is available at http://www.globenewswire.com/NewsRoom/AttachmentNg/2d54d8d7-1b2b-46ae-91ac-e24fe9930074
Servers as pets or cattle was 2012. Now it’s McMansions or Hotels … – The Register
Remember pets and cattle? CERN's 2012 metaphor to describe on-premises servers you name and care for lavishly versus virtualized cloud servers you never name, run in a herd and snuff out without a second's thought?
Well the metaphor's evolved: VMware and Pivotal are now talking about McMansions and hotels to explain how they will bring virtual networks into the world of DevOps.
The companies feel that most DevOps work is taking place either with cloud-native applications or on the margins of a big organisation's software fleet. Core applications remain largely untouched in the push to continuous deployment, largely because even small changes to older code in the heart of a business require detailed security and compliance oversight. That slows things down because large organisations have silos to take care of those things. Between politics and governance, it's hard to get close to continuous delivery.
That state of affairs got the two companies thinking about the surprise breakout use case for NSX. VMware first imagined NSX as a control plane for networks comprised of different vendors' hardware. It's turned out to be more immediately useful for microsegmentation: the practice of creating virtual networks to link a small set of resources, often tied to a specific workload. Because these virtual networks are only required to do certain things, they are defined by policies that don't let them do anything else. If behaviour not defined in policy is detected, microsegmented networks either isolate themselves or make red lights and klaxons go off down in Ops.
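The policy idea can be modeled in a few lines: a microsegment is a default-deny whitelist of allowed flows, and anything outside it triggers the alarms described above. This is a conceptual sketch, not NSX's actual policy language; the tier names and ports are invented.

```python
# A microsegment policy as an explicit whitelist of (source, dest, port)
# flows. Anything not listed is treated as a policy violation.
ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),
    ("app-tier", "db-tier", 5432),
}

def check_flow(src: str, dst: str, port: int) -> str:
    """Default-deny: traffic outside the defined policy is flagged."""
    if (src, dst, port) in ALLOWED_FLOWS:
        return "allow"
    return "alert-and-isolate"
```

The point is the default: a web tier talking straight to the database, even on a legitimate port, is outside the policy and gets isolated rather than waved through.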
At Dell EMC World the companies will explain how NSX will be integrated with Pivotal Cloud Foundry so that when developers work on stuff that touches compliance-and-security-sensitive applications, they'll do so inside virtual networks that reflect all the worries security and audit teams want taken into account. Instead of creating compliance-friendly new development environments (a McMansion with a room for everyone and every household activity), they'll check into a hotel with just the services needed for a short stay, but a policy-enforced empty minibar.
These development environments will have their own IP and MAC address spaces and, while they may run on shared hardware, will be logically discrete from production environments and from other testbeds.
The two companies think this approach will be especially attractive to developers building containerised systems on top of core applications, because spawning containers, chaining them, and then destroying both the containers and the connections between them sets off alarms among compliance pros. Those folks are accustomed to being able to trace transactions with great granularity, a task that is possible with containers but made harder by the fact containers are treated as even more disposable than cattle. Showing governance types that all of this whacky work happens within virtual environments that adhere to policy makes for greater comfort.
We're not sure at this stage exactly what Pivotal and VMware will announce, but The Register understands this is a day two announcement. Michael Dell, David Goulden and Intel's Diane Bryant are the day one speakers at the show.
Transforming the datacenter – ZDNet
The biggest fear early cloud adopters overcame was the fear of disruption. How would they move everything from their existing data center to a cloud service without disrupting business operations? The answer was, they didn't.
Smart companies realized that the safest way to transition from on-premises systems to the cloud would be to first assess all their applications, workloads, and other data assets to determine which would be most amenable to the cloud transition. They began with low-criticality workloads and slowly worked their way up the list. Some are still working through that process.
For those organizations, a new concern emerged. How would they manage an environment that basically consisted of two network cores: one still on-premises and the other now in a cloud facility? Would they need to maintain two separate environments? Two management consoles? Two sets of protocols? What would happen when they needed to combine resources from both environments to accomplish something?
The alignment of Active Directory for Windows Server with Azure Active Directory meant that data center managers could create a unified, seamless system that combined both on-premises and cloud-resident systems under one AD forest. This enabled IT to use one set of protocols, one set of user credentials, one set of security standards, and one unified data environment.
It also facilitated the smooth movement of more applications, workloads, and other data assets from the on-prem network to the cloud. Once the two are aligned, it becomes almost effortless to provision new virtual machines and new resource stacks, and move them across the network.
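The benefit of that single AD forest can be illustrated with a small conceptual sketch in plain Python (this is not the Azure AD API; the `Directory` and `Service` classes are hypothetical stand-ins): both on-premises and cloud-resident services delegate authentication to the same directory, so one set of credentials works everywhere.

```python
# Conceptual sketch -- not the Azure AD API. A single directory
# (standing in for the unified AD forest) is consulted by services
# in both environments.

class Directory:
    """Stands in for the unified AD forest."""
    def __init__(self):
        self._users = {}

    def add_user(self, username, password):
        self._users[username] = password

    def authenticate(self, username, password):
        return self._users.get(username) == password


class Service:
    """Any workload, on-prem or cloud, delegates auth to the one directory."""
    def __init__(self, name, directory):
        self.name = name
        self.directory = directory

    def login(self, username, password):
        return self.directory.authenticate(username, password)


forest = Directory()
forest.add_user("alice", "s3cret")

on_prem_app = Service("erp-onprem", forest)
cloud_app = Service("web-azure", forest)

# The same credentials work against both environments,
# and a bad password fails in both.
assert on_prem_app.login("alice", "s3cret")
assert cloud_app.login("alice", "s3cret")
assert not cloud_app.login("alice", "wrong")
```

The point of the sketch is the single source of truth: because neither service keeps its own user store, there is nothing to keep in sync when a workload moves from one environment to the other.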
Gartner recently predicted that, by 2020, the "cloud shift" will affect more than $1 trillion in IT spending. That means that more organizations will be migrating more of their IT operations to the cloud. Some will still need to port, modify, or rewrite some applications before they can be migrated. Microsoft's recent embrace of open source clearly signals Redmond's commitment to assist with this challenge. One way the company is facilitating cloud migration is by steadily enhancing Azure services to accommodate a broader variety of platforms and environments.
Many resources combine to provide a complete Azure service for your applications. These include virtual machines (VMs), your storage account, web apps, databases, and your virtual network (VNet). Originally, when you managed an Azure cloud environment, you created, deployed, and managed each of these separately. When adjustments needed to be made, you went to the required consoles manually. When you wanted to replicate your solution in another Azure environment, you had to recreate everything.
The introduction of Azure Resource Manager (ARM) in 2014 changed all that, mainly by providing resource groups, which are defined by Microsoft as "a container that holds related resources for an Azure solution. The resource group can include all the resources for the solution, or only those resources that you want to manage as a group. You decide how you want to allocate resources to resource groups based on what makes the most sense for your organization." Now all the required resources can be managed as a group, which saves tremendous time and effort.
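The resource-group idea can be sketched in a few lines of plain Python (again not the Azure SDK; the `ResourceGroup` class and the resource names are illustrative): related resources live in one container, so provisioning or tearing down the whole solution is a single operation rather than one per resource.

```python
# Conceptual sketch of a resource group -- not the Azure SDK.
# Related resources (VM, storage, database, network) are held in one
# container and created or deleted as a unit.

class ResourceGroup:
    def __init__(self, name, location):
        self.name = name
        self.location = location
        self.resources = {}          # name -> kind

    def add(self, kind, name):
        self.resources[name] = kind

    def delete(self):
        # Deleting the group removes every resource it holds in one step.
        removed = list(self.resources)
        self.resources.clear()
        return removed


solution = ResourceGroup("my-web-app", "eastus")
for kind, name in [("vm", "web-vm-1"), ("storage", "appstorage"),
                   ("database", "app-db"), ("vnet", "app-net")]:
    solution.add(kind, name)

assert len(solution.resources) == 4
# One call tears down the whole solution.
assert solution.delete() == ["web-vm-1", "appstorage", "app-db", "app-net"]
assert solution.resources == {}
```

In real Azure the same grouping also lets you replicate a solution elsewhere by redeploying the group's template to a new region, rather than recreating each resource by hand.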
As the cloud revolution marched on, it became clear that cloud customers enjoy better IT services at a lower cost. It made little sense to own and operate your own servers when a cloud provider could deliver a professionally operated data service, while you pay only for what you use.
Now, all of the tools required to ease the path from on-premises to cloud facilities are emerging very quickly. Companies can move from on-premises to robust co-existence, and then complete the journey to the cloud on their own schedule with virtually no disruption.
Read more:
Transforming the datacenter - ZDNet
The Storm Platform | Liquid Web
Storm Servers are the next generation of server hosting, providing the power of dedicated server hosting with the flexibility of cloud hosting. Our Storm Dedicated Servers have all the functionality of cloud servers, but run on hardware dedicated specifically to your infrastructure. To make Storm Servers' powerful features available to applications at every level, we also offer Storm VPS Cloud Servers, which are customizable servers of several different sizes and configurations in a shared cloud environment.
Available with a full selection of Linux and Windows operating systems, our Storm Platform is engineered to meet any hosting need you could imagine. Explore our additional performance options and tools ranging from solid state drives, to private cloud functionality, to both block and object storage, then let us know how we can customize the perfect hosting environment for you.
Our Storm Servers Dashboard puts you in complete control of your server and provides you immediate access to many powerful features. From here you can monitor your cloud infrastructure, reboot a server, clone a copy of your server or even add new cloud products to your account.
The mobile version of our management site gives you peace of mind with full visibility into your cloud environment no matter where you are.
Our Storm Dedicated Servers are cloud servers on your own dedicated hardware. To make Storm Servers' powerful features available to applications at every level, we also offer Storm Cloud Servers, which are servers in a shared cloud environment. In addition to these two levels, you can add solid state storage in the form of Storm SSD -- available in a number of packages -- to achieve speed and performance levels that represent the future of hosting, now!
Read more here:
The Storm Platform | Liquid Web