Category Archives: Cloud Servers
IBM Launches New Cloud Facilities to Satisfy Enterprise Demands – MarTech Advisor
IBM has launched four new IBM Cloud data centers in the US in an effort to meet rising demand for cloud-based infrastructure. The initiative accounts for a significant share of the company's investment in cloud infrastructure expansion in 2017.
IBM Cloud's global network comprises cloud data centers positioned in strategic local markets across six continents and 19 countries. These data centers let clients run their applications and data locally, helping them meet monitoring and performance needs. The newly established facilities will give clients the framework to manage, and derive actionable insights from, their accumulated data. In addition, the centers will let them leverage IBM's cognitive services along with IBM Watson.
Demand for hybrid and public cloud infrastructure is at an all-time high, as more and more enterprises depend on AI to derive value from their data. In these circumstances, the new IBM cloud facilities can help companies digitize their operations and drive cognitive innovation.
"IBM is making major investments to expand our global cloud data centers in 2017 and provide the infrastructure necessary for enterprises to run their cognitive, big data, blockchain, and IoT workloads," said John Considine, general manager for cloud infrastructure at IBM, in a release. Emphasizing the value of these strategic cloud infrastructure expansions, he further highlighted that the company's growing global footprint in the sector will give enterprises the flexibility and capability to execute their most intricate workloads, wherever and whenever required.
All the new US cloud data facilities can support thousands of physical servers and offer an assortment of cloud infrastructure services, including storage, virtual servers, bare metal servers, networking, and security services. These IBM-administered centers deploy services on demand and offer full remote access. Leveraging these infrastructures, enterprises can develop highly scalable hybrid, private, or public cloud environments.
Read more:
IBM Launches New Cloud Facilities to Satisfy Enterprise Demands - MarTech Advisor
Why Might Enterprise IT Spending Be Lower Than Expected? The Cloud – BizTech Magazine
IT spending worldwide is going to be lower than expected this year, according to Gartner, thanks in part to the continued shift away from legacy infrastructure to the cloud.
Earlier this month, Gartner roughly halved its forecast for global IT spending growth. The research firm now projects IT spending will reach $3.5 trillion in 2017, which would represent a 1.4 percent increase from 2016 but is down 1.3 percentage points from the firm's fourth-quarter forecast of 2.7 percent growth.
"The strong U.S. dollar has cut $67 billion out of our 2017 IT spending forecast," John-David Lovelock, research vice president at Gartner, says in a statement. Gartner expects these currency headwinds to be a drag on earnings of U.S.-based multinational IT vendors through 2017.
However, currency fluctuations are not the only thing driving down Gartner's IT spending forecast.
The data center system segment of the IT market is expected to grow at a rate of 0.3 percent in 2017, which would be up from a shrinking market in 2016. However, Gartner says the segment is experiencing a slowdown in the server market.
"We are seeing a shift in who is buying servers and who they are buying them from," Lovelock says. Enterprises are moving away from buying servers from the traditional vendors and instead renting server power in the cloud from companies like Microsoft and Google, he notes.
That shift has led to lower spending on servers, which is impacting the larger data center market, he says.
Gartner expects the external controller-based storage segment to decline, but at a slower rate than before, according to the forecast. Additionally, "the ECB segment, although still suffering from long-term structural challenges from cloud and alternative storage architectures, is benefiting from strong demand and component constraints in the solid-state array segment in particular," the forecast says.
Globally, Gartner expects the enterprise software market will grow by 7.3 percent in 2017, reaching $394.8 billion in constant dollars, a slight increase over the previous forecast. However, in U.S. dollars, enterprise software is expected to grow 5.5 percent ($351 billion) in 2017, slower than the firm's forecast from the fourth quarter of 2016.
This is predominantly due to a stronger U.S. dollar against the euro, though the impact of exchange rate shifts is affecting multiple regions, Gartner says. "Through 2021, we expect the market to grow at a 7.2% [compound annual growth rate] in constant currency. This breaks down as an 8.8% CAGR for enterprise application software, and a slower 5.8% CAGR for infrastructure software, in constant currency."
Gartner says a change in its expectations for the IT operations management market, particularly in the automation tools and IT service support management tools subsegments, resulted in a new five-year CAGR of 7.8 percent.
This change "is focused entirely in North America where, driven by the availability of lower-cost cloud offerings, the number of organizations expected to move to the cloud within the forecast period has increased," Gartner says.
Read the original here:
Why Might Enterprise IT Spending Be Lower Than Expected? The Cloud - BizTech Magazine
Taskworld Unveils Exclusive Enterprise Plan with Private Cloud … – IT Business Net
April 26, 2017 --
New York (PRWEB) April 26, 2017
Leading project management app Taskworld today announced the launch of a new Enterprise plan with unlimited usage, private cloud servers, and discounts for educational institutions and non-profits. The app consolidates all files, tasks, messages and work-related analytics in one secure place, and businesses can now access significant benefits from the new options.
1) NEW VIRTUAL PRIVATE CLOUD
In addition to savings and added value, the new Enterprise plan with unlimited usage now offers access to a private cloud. With this plan, businesses may choose from 16 geographic regions worldwide, including Los Angeles and New York (USA), Frankfurt (Germany), London (UK), Singapore, Seoul (Korea), Tokyo (Japan), Beijing (China) and São Paulo (Brazil). With these dedicated data centers, enterprises experience even faster speeds, and data is encrypted and fully backed up every hour. The Enterprise plan also includes personalized live demos and onboarding, with priority access to customer support.
"Agility and velocity are essential for businesses looking to scale their success. It's in that spirit that Taskworld offers its Enterprise plan," said Fred Mouawad, CEO of Taskworld. "It is the ultimate solution for enterprises, offering not only the tools for innovation, but added value and room to grow."
Taskworld's Enterprise Plan gives you the opportunity to use Taskworld like never before.
2) DISCOUNTS FOR EDUCATIONAL INSTITUTIONS AND NON PROFITS
Taskworld's new pricing plans include generous discounts for educational institutions and non-profits.
Educational institutions can now subscribe to an unlimited package that allows schools and universities to add as many students, teachers and admin staff to a workspace for $399/month.
"I have tried many other project management systems, but I settled on Taskworld because of the user-friendly interface, their features and their willingness to listen and improve." - Isa Dunaway, Web Administrator - Trinity Episcopal School
Non-profits can now receive a flat 30 percent off on all annual plans.
"Taskworld makes work seamless by allowing everyone to see what needs to be done, how, by whom and by when. I would definitely recommend Taskworld to other managers and non-profit organizations. It made our work life much easier. We have gained efficiency by saving hours of work every day." - Sean Callaghan, Operations Director - 28TooMany.org
Taskworld's pricing starts at $8 per month for a single person after a 15-day free trial. Downloading the mobile app is free. The platform is available in English, German, French, Italian, Japanese, Korean, Mandarin, Portuguese, Spanish, Standard Chinese, Bahasa and Thai. For more information about Taskworld, visit http://www.taskworld.com or follow the company on LinkedIn, Facebook and Twitter.
ABOUT TASKWORLD
Taskworld Inc., founded in New York in 2012 by serial entrepreneur Fred Mouawad, makes cloud-based task and project management software used by thousands of teams and solo users in more than 80 countries.
Read the full story at http://www.prweb.com/releases/2017/04/prweb14274698.htm.
Source: PRWEB.COM Newswire. All Rights Reserved
See original here:
Taskworld Unveils Exclusive Enterprise Plan with Private Cloud ... - IT Business Net
UTC aims to move IT services and storage to cloud – Education Dive
Dive Brief:
Some educational institutions face issues when trying to upend their entire system at once, overhauling both the back-end storage systems and the interfaces students and staff use, which Davidson College CIO Raechelle Clemmons says can be a heavy financial load. Given the expense, beginning with cloud storage may be an easier sell to universities. Universities could also see cost savings by keeping support and repair staff and systems off-site; for example, as the transition from campus cards to mobile apps takes shape, it is becoming clear that contracting off-campus can sometimes reduce costs.
Still, there are legitimate concerns over how secure cloud storage can be; the potential pitfalls of a hacked server can be enormous, from malware, to theft of research or harassment of students on their private school accounts. Susan Grajek, the vice president of data, research and analytics at EDUCAUSE, noted last fall that there have been escalations in security breaches on cloud servers used by colleges, but that is likely because the number of colleges employing cloud storage is growing exponentially. She said that with the increase in focus on the cloud, there is also an increase in attention paid to securing that data.
Read the original here:
UTC aims to move IT services and storage to cloud - Education Dive
Galaxy Control Systems embracing the cloud – Security Systems News (blog)
At ISC West 2017 earlier this month, Galaxy Control Systems solidified its commitment to cloud-based solutions with the announcement of its new Cloud Concierge, a cloud-based access control and monitoring solution. Galaxy also released a new whitepaper this week, "Understanding Cloud Services for Access Control," further exemplifying the company's commitment to its customers by providing education and awareness.
"Galaxy Control Systems recognized early on that cloud services had the potential to deliver new flexibility and benefits for the security market," Lukas Le, director of Cloud Services at Galaxy Control Systems, told Security Systems News. Le said that Galaxy has been working with cloud and hosted solutions for years to develop the experience and expertise needed to provide leadership and strong support for its customers.
"We have also been watching the market to gauge the increasing level of interest, knowledge and acceptance of this new approach," he explained. "Today, we see that the current mature state of the necessary technologies will let us leverage the recent release of our mobile applications while delivering on a business model that is mutually beneficial for the integrator, end user and Galaxy alike."
"Security professionals, both channel and end users, are recognizing the operational and cost benefits of a hosted collaborative access control solution," Rick Caruthers, executive vice president of Galaxy Control Systems, said in the announcement. "With Cloud Concierge, we've reduced the complexity to allow resellers to comfortably sell a cost-effective solution to their base without having to assume responsibilities associated with system monitoring and management."
He continued, "By joining the Cloud Concierge program, integrators who specialize and excel in their specific areas can focus on what they do best rather than trying to provide an end-to-end solution themselves. Best of all, by working with trusted partners to deploy, administer and maintain systems, end users can devote their time and energy to their core business and operations."
In a traditional access control system, door locks and controllers are connected to on-premise access control servers that store the system settings, enrollees and rules, and run the access control software. With the new Galaxy Control Systems cloud-based access control solution, those servers are located remotely.
"Door locks and controllers communicate with the remote cloud servers through an encrypted Internet connection," said Le. This arrangement offers security and maintenance advantages, and lets system administrators and service staff access the system from anywhere they can reach the Internet using a standard browser, with no special software required on their devices.
Le said that there is an up-front cost savings for customers who are installing new systems, while for customers with existing systems the timing of a move might depend on the age of the on-premise hardware, the availability of service and maintenance staff, and similar factors. "Initially, we will focus our efforts on our current dealers, and we expect that continued education using webinars, workshops and similar hands-on methods will continue to increase adoption," he noted.
"Although there are many customers who understand the cloud, cloud service adoption is primarily about educating our customers about the benefits," Le said. "For those already familiar with the benefits of cloud services, it's an easy discussion; these customers can't wait to move their access control into the cloud. For those that are less familiar or less comfortable with cloud services, they will need a bit more education to prepare them to make this move."
A good example of Galaxy's ongoing commitment to educate and inform is the publishing of its new whitepaper, which provides insights into the growing realm of cloud-based services and how they are changing the way access control systems can be deployed and used. In the paper, readers will find an overview of cloud technology and learn how and where access control cloud services can be applied for maximum economic and security benefit. The whitepaper also touches on the decision factors to assess when considering a move to cloud services, each of which helps inform the decision-making process.
"We recognize that not everyone in the security industry has the time or resources to stay current with new technologies," Caruthers said in the whitepaper's release. "We developed our new whitepaper to provide information for our community, to help accelerate their understanding of the benefits of access control cloud services."
To access the full whitepaper, click here.
Read the original post:
Galaxy Control Systems embracing the cloud - Security Systems News (blog)
IBM Opens Four New Cloud Data Centers in the United States to Support Growing Demand for Cognitive Capabilities – PR Newswire (press release)
ARMONK, N.Y., April 26, 2017 /PRNewswire/ --IBM (NYSE: IBM) today announced the opening of four new IBM Cloud data centers in the United States to support growing enterprise demand for cloud infrastructure that can provide access to services like IoT, blockchain, quantum computing and cognitive.
IBM Cloud's global network includes cloud data centers in key local markets around the world so clients can run their data and applications locally to meet performance and regulatory needs. With data centers across 19 countries and six continents, enterprises can provision cloud infrastructure when and where they need. The new cloud data centers in the U.S. can provide clients with infrastructure to manage and gain insight from their data while also taking advantage of IBM's advanced cognitive services with Watson on the IBM Cloud.
The opening of two new facilities in Dallas, Texas, and two new facilities in Washington, D.C., is a key part of IBM's investment to expand its global cloud footprint in 2017.
As enterprises increasingly turn to AI to generate value from their data, demand for public and hybrid cloud infrastructure will continue to grow. According to IDC, worldwide spending on public cloud services and infrastructure will reach $203.4 billion by 2020, a 21.5% compound annual growth rate, nearly seven times the rate of overall IT spending growth. [1]
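Compound annual growth works multiplicatively rather than additively, so a 21.5% CAGR more than doubles a market in four years. A minimal sketch of the arithmetic (the $100 billion starting figure is hypothetical, chosen only to illustrate; it is not IDC's actual baseline):

```python
def project_cagr(start_value, annual_rate, years):
    """Project a value forward under a compound annual growth rate."""
    return start_value * (1 + annual_rate) ** years

def implied_cagr(start_value, end_value, years):
    """Recover the CAGR implied by a start value, end value, and horizon."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical example: $100B growing at 21.5% per year for 4 years.
projected = project_cagr(100.0, 0.215, 4)   # roughly 217.9 (billions)

# Recovering the rate from the two endpoints gives back ~21.5%.
rate = implied_cagr(100.0, projected, 4)
print(round(projected, 1), round(rate, 3))
```

The same two functions explain the article's other growth figures, such as Gartner's 7.2% constant-currency software CAGR through 2021.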
"IBM is making major investments to expand our global cloud data centers in 2017 and provide the infrastructure necessary for enterprises to run their cognitive, big data, blockchain and IoT workloads," said John Considine, general manager for cloud infrastructure, IBM. "IBM's growing global cloud footprint gives enterprises the flexibility and scale to run their most complex workloads when and where they need."
The new U.S. facilities can help enable companies to digitize business and operations and drive cognitive innovation through the IBM Cloud. Clients of all sizes, including Bitly and Halliburton, are already taking advantage of the benefits of the IBM Cloud.
IBM's Expanding Global Cloud Footprint
IBM now has more than 55 global cloud data centers in 19 countries spanning six continents to help enterprises manage and gain insight into their data no matter where it resides. The opening of additional facilities in Dallas, Texas, and Washington, D.C., brings IBM to 22 data centers across the U.S.
The news reinforces IBM's commitment to expand its cloud presence around the world in 2017 and builds on strong global momentum from 2016. In 2016, IBM opened the industry's first cloud data center in the Nordics as well as a new cloud data center located outside of Seoul in South Korea. Additionally, IBM announced in November that it is tripling its cloud data center capacity in the U.K. with four new facilities.
Each of the four new facilities now open in the U.S. has the capacity for thousands of physical servers and offers a full range of cloud infrastructure services, including bare metal servers, virtual servers, storage, security services and networking. With services deployed on demand and full remote access and control, customers can create their ideal public, private or hybrid cloud environments.
IBM's cloud infrastructure is cognitive at the core and geared for big data workloads. IBM operates a large fleet of bare metal servers, which are ideal for high performance cloud applications. IBM also offers the latest NVIDIA GPU accelerators including the Tesla P100, Tesla K80 and the Tesla M60 to help enable enterprises to quickly and efficiently run compute-heavy workloads, such as AI, deep learning and high performance data analytics.
About IBM Cloud:
For more information, visit: http://www.ibm.com/cloud-computing.
[1] IDC: Worldwide Semiannual Public Cloud Services Spending Guide, February 20, 2017
Media Contact:Sarah Murphy IBM Media Relations 336-337-7584 srmurphy@us.ibm.com
To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/ibm-opens-four-new-cloud-data-centers-in-the-united-states-to-support-growing-demand-for-cognitive-capabilities-300446207.html
SOURCE IBM
Read the original post:
IBM Opens Four New Cloud Data Centers in the United States to Support Growing Demand for Cognitive Capabilities - PR Newswire (press release)
How to Help Your Customers Combat Public Cloud Bill Shock – Talkin’ Cloud
Consumption-based pricing and the ability to spin cloud servers up and down as needed have opened organizations up to a whole new way of doing business, one that requires a lot less upfront capital. But something that still eludes many organizations is how to monitor the costs of their cloud usage before they get out of hand.
Lynn LeBlanc, CEO and founder of hybrid IT service provider HotLink, says that public cloud is typically more of a "black hole" than on-premise infrastructure. "One advantage of on-premise infrastructure is you buy it, and it's yours, and it doesn't cost you any more to use it up," she says.
"If you're on a pay-as-you-go plan of public cloud, while you get that upfront benefit of not having to extend all that capital, really managing that consumption is a new problem that people aren't used to having to solve," she tells Talkin' Cloud in an interview.
To address this issue for its own corporate IT customers, HotLink launched managed services last year, in conjunction with its hybrid HotLink Cloud-Attach Platform, that provide AWS cost optimization and load optimization. These services were born out of the company's own experience dealing with what LeBlanc calls "bill shock" at the end of each month.
"I would get these [AWS] bills at the end of the month and honestly I never knew how much they were going to be, and they weren't really organized in a way that we could deconstruct them," she says.
According to LeBlanc, while an Amazon bill shows her how much data was transferred to and from the cloud, how much storage she consumed and other metrics, "it will not tell me who was doing what, what were they doing with it. So when it came to trying to figure out, wow, suddenly we got this monster bill that, by the way, was way more than we budgeted for, why? What were we doing?"
"I know that our engineering VP always dreaded when the bill came in because he knew there was going to be this fire drill to try and figure out what it was," she says.
For engineering it was particularly problematic because it was impossible to figure out how many cloud resources individual developers were using. So the company developed its own tools to monitor cloud usage, some of which use the Amazon API.
"What we found was the reporting we created for ourselves could be really useful for our customers," LeBlanc says. "We gave them some of those tools, but then we found there wasn't that much discipline that you really have to have if you care about managing cost in the cloud."
"There are a lot of tools that do all kinds of things related to cost, but it still depends on somebody really actively managing it. We found that when it came to corporate IT, and particularly the upper mid-market, I just don't think there was the discipline around 'you've got to watch this stuff every day' if you really want at the end of the month to have the economic benefit that you envisioned when you started to use this resource."
HotLink started offering managed services around its various products last year, including disaster recovery as a service and general cloud management, but it wasn't until August 2016 that the company productized its cloud cost optimization and load optimization and made them part of its managed services offerings.
HotLink monitors AWS cloud costs on behalf of its customers with its cloud cost optimization module. LeBlanc explains: "We find out from them their budget for public cloud usage for each month, we monitor it daily to see what's powering on, what's not. Perhaps an extra-large instance is powered on but there's almost no CPU usage. This means somebody should have just powered it off."
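At its core, the idle-instance check LeBlanc describes is a comparison of recent CPU utilization against a threshold. A minimal, self-contained sketch of that logic, using hypothetical instance records and thresholds rather than HotLink's actual implementation (in practice the utilization figures would come from a monitoring API such as CloudWatch):

```python
# Each record pairs an instance with its average CPU utilization (percent)
# over a recent window, as a billing monitor might collect.
instances = [
    {"id": "i-web-01", "size": "xlarge", "running": True,  "avg_cpu": 1.2},
    {"id": "i-db-01",  "size": "large",  "running": True,  "avg_cpu": 64.0},
    {"id": "i-ci-01",  "size": "xlarge", "running": False, "avg_cpu": 0.0},
]

def idle_instances(records, cpu_threshold=5.0):
    """Flag running instances whose average CPU sits below the threshold --
    the ones that, in LeBlanc's words, somebody should have powered off."""
    return [r["id"] for r in records
            if r["running"] and r["avg_cpu"] < cpu_threshold]

print(idle_instances(instances))  # ['i-web-01']
```

Run daily against a budget, a report like this turns the end-of-month "monster bill" surprise into a routine check.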
For companies that are just getting started in monitoring cloud costs, LeBlanc says it can be hard to figure out how to budget. As with a household budget, it is best to have some number of months of usage history before creating a realistic budget. It also requires a bit of a behavioral change, she says.
"There's a bit of corporate IT that's sort of opaque. They don't necessarily know what all these applications are doing, so let's say, for example, they're using the public cloud for disaster recovery. They don't necessarily know how often a given application goes through a major update cycle for its database. They just never paid attention to that because they don't manage things at the application level for the most part; they're looking at it more holistically."
Not being application-aware from an operational point of view is a change in behavior that is necessary when you have a new unit of financial measure in the public cloud, LeBlanc says. When you don't know the application, there is a tendency to overprovision, she adds. HotLink makes recommendations to customers about right-sizing the pre-defined instances based on the performance and utilization of the resources, she says.
With its load optimization module, HotLink provides load balancing recommendations for on-premise data transfer; intelligent scheduling for bandwidth management; and AWS account configuration for fewer bottlenecks and faster throughput, the company says.
"[W]hat you're trying to do is keep from saturating the network that is being used for a lot of purposes. We haven't had any customers where they just didn't have enough bandwidth, but if it's not being accurately managed they will have bottlenecks," she says.
See the article here:
How to Help Your Customers Combat Public Cloud Bill Shock - Talkin' Cloud
Amihan Global Strategies Launches AMIHAN CLOUD BLOCKS, a Cloud-Native Infrastructure to Accelerate Enterprise … – PR Newswire (press release)
LOS ANGELES, April 25, 2017 /PRNewswire/ -- Amihan Global Strategies announces the immediate availability of AMIHAN CLOUD BLOCKS, a cloud-native infrastructure solution that accelerates digital transformation by augmenting the existing IT systems of large enterprises with the agility of remote servers in the cloud.
AMIHAN CLOUD BLOCKS merges the best of two worlds: world-class technology, based on Google's pioneering Kubernetes Platform, and regional expertise catering to the specific needs of the largest enterprises in the ASEAN region. Designed as a fully managed service solution and a 100% OPEX subscription model, AMIHAN CLOUD BLOCKS is the only cloud solution that is optimized for Southeast Asia.
INFRASTRUCTURE FOR INNOVATION
"The cloud is step one of digital transformation," said Winston Damarillo, Executive Chairman of Amihan. "It's the infrastructure for innovation that allows you to experiment, grow, and adapt in ways that are necessary to keep up with the pace of customers' digital appetites."
AMIHAN CLOUD BLOCKS enables companies to deploy new services faster, harmonize legacy and next-generation apps in a unified ecosystem, and perform complex data analytics, all in a single platform. It aims to transform the IT infrastructure of legacy companies, starting with the data center: the nucleus that powers business applications and houses business intelligence.
AMIHAN CLOUD BLOCKS is based on Kubernetes, a container management tool that was initially developed by Google and is now managed by the Cloud Native Computing Foundation (CNCF), of which Amihan is a member. Furthermore, AMIHAN CLOUD BLOCKS is built on the NEC DX 2000 hardware platform, which delivers world-leading density of compute, memory and storage.
"NEC is excited to collaborate with Amihan to deliver a state-of-the-art Cloud Native Platform in Southeast Asia," says Tatsunori Shibata (Head of Go-to-market, Bigdata & Cloud platform business, NEC Corporation). "The NEC DX 2000 is a perfect fit for the workloads of the fastest growing digital companies in the region."
A SIMPLER DIGITAL TRANSFORMATION, ON DEMAND
Digital transformation is a complex process that involves drastic changes in culture, business practices and IT systems. AMIHAN CLOUD BLOCKS simplifies this by minimizing the time spent on compatibility issues, unifying data sources and new-system investments, and ensuring the latest cybersecurity and data privacy standards.
AMIHAN CLOUD BLOCKS enables a deliberate transformation of a company's existing IT systems to the cloud through its virtualization platform based on OpenStack, which houses legacy Linux and Windows servers; Acaleph Storage, a data-secure enterprise-wide storage platform; and Kubernetes-managed Docker to accelerate application development. In addition, AMIHAN CLOUD BLOCKS can seamlessly federate with Google Cloud Platform to enable a scalable hybrid cloud across all its services.
"Enterprises are using open source technologies like Kubernetes to deploy cloud native architecture models that support fast, agile application development," said Dan Kohn, Executive Director of The Cloud Native Computing Foundation. "We are pleased to support our members in their efforts to deliver engineered solutions for the modern enterprise."
Media Contact: Rexy Josh Dorado Phone: 216.526.7842 Email: rdorado@agsx.net
About Amihan:
Amihan Global Strategies is a digital transformation accelerator that advises and partners with some of the largest institutions in the ASEAN region. Amihan helps companies build a digital roadmap, access ideal technology, and act on their vision to become future-ready organizations. Amihan has offices in Manila, Cebu, Singapore and Los Angeles.
Related Files
CLOUDBLOCKS Data Sheet.pdf
Related Images
image1.jpg
image2.png
image3.png
Related Links
AMIHAN CLOUD BLOCKS Product Page
Amihan Homepage
This content was issued through the press release distribution service at Newswire.com. For more info visit: http://www.newswire.com.
To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/amihan-global-strategies-launches-amihan-cloud-blocks-a-cloud-native-infrastructure-to-accelerate-enterprise-digital-transformation-300445731.html
SOURCE Amihan
IBM-Nvidia Servers Achieve High-Performance Computing Milestone In Oil Industry – Forbes
Each server, dubbed Minsky, is equipped with two Power8 central processing units (CPUs) and four Nvidia Tesla P100 graphics processing units (GPUs). With the servers running on IBM's cloud, Stone Ridge's simulation took less than two hours, Forbes reports.
See the article here:
IBM-Nvidia Servers Achieve High-Performance Computing Milestone In Oil Industry - Forbes
How just 30 machines beat a warehouse-sized supercomputer to set a new world record – ZDNet
An IBM/Nvidia cluster completed a one-billion-cell Echelon petroleum reservoir simulation in 92 minutes.
A high-performance computing record set by a cluster of more than 22,000 compute nodes has been shattered by just 30 machines.
The massive reduction in computing infrastructure needed to set a new record for simulating oil, water, and gas flow was made possible by tapping into the huge parallel processing ability of graphics-processing units (GPUs).
While the original record, set by ExxonMobil just a few months ago, used a cluster of more than 716,000 CPU cores to run the simulation, this new approach by IBM and Stone Ridge Technology relied on 30 IBM OpenPower servers equipped with 120 Nvidia Tesla P100 GPU accelerators.
The IBM/Nvidia cluster completed a one-billion-cell Echelon petroleum reservoir simulation in 92 minutes, faster than ExxonMobil's approach, using what IBM says is one-tenth the power and one-hundredth the space.
The IBM Power System S822LC machines used in this latest test pair two IBM POWER8 CPUs with four Tesla P100 GPUs, connected via the high-speed, bidirectional 40GB/s NVLink interconnect.
Sumit Gupta, IBM's vice president of high-performance computing and analytics, said the result demonstrates the strengths of this tightly coupled GPU/CPU architecture.
"The bottom line is that by running Echelon on IBM Power Systems, users can achieve faster runtimes using a fraction of the hardware," he said.
"One recent effort used more than 700,000 processors in a server installation that occupies nearly half a football field. Stone Ridge did this calculation on two racks of IBM machines that could fit in the space of half a ping-pong table."
GPUs excel at tasks that can be broken into subtasks, which are then handled in parallel by the thousands of power-efficient cores within each GPU. Each P100 has 3,584 cores available for parallel compute, giving the 120-GPU cluster about 430,000 cores to run the simulation.
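The decomposition described above, one large job split into many independent per-cell subtasks, can be sketched in ordinary Python. This is only an illustration of the principle using a thread pool; a real GPU kernel runs the same pattern across thousands of hardware cores, and the per-cell function here is a stand-in, not actual reservoir-simulation code:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_cell(cell_id):
    """Stand-in for the per-cell work in a reservoir simulation:
    each cell's update here depends only on its own inputs."""
    return cell_id * cell_id % 97

def run_parallel(n_cells, workers=8):
    # Split the grid into independent subtasks and farm them out.
    # A GPU maps each such subtask onto one of its thousands of cores.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_cell, range(n_cells)))

results = run_parallel(1000)
print(len(results))  # 1000
```

The key property is that the subtasks do not depend on one another, which is exactly what makes workloads like reservoir simulation a good fit for massively parallel hardware.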
While the types of tasks suited to the massively parallel processing offered by GPUs are still limited, Nvidia says the number continues to grow, and today includes computational fluid dynamics, structural mechanics, climate modeling, and other tasks related to manufacturing and scientific discovery.
Nvidia recently began pushing its GPUs as a technology suited to training machine-learning models, particularly the Tesla P100 GPU and its own DGX-1 server.
This month, IBM also added the Nvidia Tesla P100 GPU to the IBM Cloud, providing the option to equip individual IBM Bluemix bare-metal cloud servers with two P100 accelerator cards.
Here is the original post:
How just 30 machines beat a warehouse-sized supercomputer to set a new world record - ZDNet