
Amazon Unveils New Conference Call Service To Take on Skype – Top Tech News

A new video and phone conference service from Amazon aims to compete with the likes of Microsoft's Skype for Business. Amazon Web Services unveiled its new Amazon Chime service today to provide unified communications tools to enable companies to host or join meetings as well as chat online, while sharing content and screens across their devices.

"In a world where meeting attendees are often not in the same city, much less the same office building, unified communications has become increasingly more important," the company said in a statement. "Amazon Chime takes frustration out of meetings, delivering very high quality video, voice, chat, and screen sharing."

Push-Button Conferencing

Amazon is pitching the new service as a way to bring high-end quality to an aspect of enterprise technology often seen as a source of frustration for companies due to clunky or hard-to-use interfaces, poor audio and video quality, and complicated login procedures. Chime aims to deliver high-end video and audio quality while making the process of hosting or joining a meeting as simple as the push of a button.

Chime calls all participants listed for a meeting when it starts so joining is as easy as clicking a button in the app, according to the company. It also provides a visual roster of all attendees, taking the mystery out of knowing who is on the call.

The new service also comes with its own mobile and desktop apps that can be synchronized across multiple Android, iOS, Mac, and Windows devices. Amazon said Chime can also be integrated into existing corporate directories, allowing IT administrators to manage identities and control access across their organizations.

Easy Deployment

Perhaps most attractive for IT departments, Chime doesn't require deployment or upfront investments, since enterprises can just download the app. Additionally, Amazon said it costs about a third as much as competing solutions.

"It's pretty hard to find people who actually like the technology they use for meetings today. Most meeting applications or services are hard to use, deliver bad audio and video, require constant switching between multiple tools to do everything they want, and are way too expensive," said Gene Farrell, vice president, enterprise applications, AWS, in the statement.

The service is now available in three versions. Amazon Chime Basic is free and lets a user attend meetings, call another person using voice or video, and use its messaging and chat capabilities. Amazon Chime Plus adds user management, such as the ability to manage an entire e-mail domain, disable accounts, or configure Active Directory, as well as 1 GB per user of message retention, for $2.50 per user, per month.

And Amazon Chime Pro adds the ability to host meetings with screen sharing and video for up to 100 users and also includes support for mobile, laptop, and in-room video along with unlimited VoIP support for $15 per user, per month.
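Taken together, the tiers form a simple per-user price table. As a minimal sketch of what a monthly bill works out to (tier names and prices as quoted above; actual AWS billing terms, proration, and discounts may differ):

```python
# Per-user monthly prices quoted in the article (illustrative only).
PRICING = {"basic": 0.00, "plus": 2.50, "pro": 15.00}

def monthly_cost(tier: str, users: int) -> float:
    """Monthly bill for a team with every member on the same tier."""
    return PRICING[tier] * users

# A 40-person team on Plus vs. Pro:
plus_bill = monthly_cost("plus", 40)   # $100.00
pro_bill = monthly_cost("pro", 40)     # $600.00
```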

Image credit: Amazon.

See the original post:
Amazon Unveils New Conference Call Service To Take on Skype - Top Tech News

New Charity Focussed Cryptocurrency Launches – AllCoinsNews.com (blog)

Centurion, a new cryptocurrency, launched last week with a focus on ease of use and scalability, in addition to promoting children's charities. With a block size of 2 MB, Centurion can process and confirm transactions in under 6 minutes.

Ready-made merchant payment API libraries can be integrated into websites to enable the cryptocurrency to be used to buy products and services. The first adopter of Centurion is an online store selling more than 100 e-books and 50 videos on marketing, cryptocurrencies, internet tips and tricks, business, and more. These partnerships will be revealed soon, with more stores to follow. Centurion is already available for traders on the cryptocurrency exchange Excambiorex.

Mining pools for Centurion do not require miners to sign up and manually withdraw their accumulated share of cryptocurrency; instead, miners receive funds directly into their wallets. According to the company, this has been done to improve ease of use and also to reduce the risk of attacks on the mining pools' wallets. The auto payouts are set to execute every few minutes. To cater to users who are not technology experts, simple, pre-configured files are provided that can be downloaded to start CPU and GPU mining.

Centurion4Children is donating 5 million Centurion coins to well-established charity organizations. It is also raising funds within the community and through the website. The foundation is already represented in India, as well as Africa and Europe, with official charity partnerships to be revealed in March 2017. Centurion4Children is currently raising funds for: Support a Child and its Entire Family, Sponsor a Boy, Safe Water for Children in Developing Countries, and Sustainable Schools. To cover promotion costs and to kick-start the donations distributed by Centurion4Children, the coin has reserved 50 million of its tokens.

The cryptocurrency platform will soon embark on a marketing campaign in association with Cryptonetwork Ltd, a Dubai-based entity with a network of people spread across India, Germany, Italy, Spain and several other countries. They will be involved in various promotional activities, including the sale of products and services, for which they will receive rewards in Centurion and Bitcoin. An estimated 20 million Centurion tokens have been earmarked for these promotional purposes over a period of five years.

Centurion will donate 5 million of its reserved coins to charity; the remainder of the reserve will be used to reward early adopters, investors, related projects and talented individuals within the community who work to improve the Centurion cryptocurrency.

Specifications:
X11 Proof of Work (PoW); 3% Proof of Stake (PoS)
RPC port: 5555 / P2P port: 5556
1-minute blocks; block size 2 MB
Reward schedule:
Blocks 1-100: 0 CNT (for fair difficulty balancing)
Blocks 101-250,100: 100 CNT
Blocks 250,101-500,100: 75 CNT
Blocks 500,101-1,000,100: 60 CNT
Blocks 1,000,101-2,000,100: 50 CNT
Blocks 2,000,101-2,500,100: 25 CNT
Blocks 2,500,101-3,500,100: 10 CNT
Blocks 3,500,101-4,000,100: 5 CNT
Blocks 4,000,101-5,000,100: 2.5 CNT
Blocks 5,000,101-19,000,000: 1 CNT
Total coin production: 250 million; reserve: 50 million
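Summing the proof-of-work rewards in the schedule is a quick sanity check. Note that the mined total (about 165.25 million CNT) plus the 50-million reserve does not reach the stated 250 million; presumably proof-of-stake issuance accounts for the rest, though the source does not say:

```python
# The published PoW reward schedule, as (first block, last block, CNT per block).
SCHEDULE = [
    (1,         100,        0.0),   # fair difficulty balancing
    (101,       250_100,    100.0),
    (250_101,   500_100,    75.0),
    (500_101,   1_000_100,  60.0),
    (1_000_101, 2_000_100,  50.0),
    (2_000_101, 2_500_100,  25.0),
    (2_500_101, 3_500_100,  10.0),
    (3_500_101, 4_000_100,  5.0),
    (4_000_101, 5_000_100,  2.5),
    (5_000_101, 19_000_000, 1.0),
]

def total_mined(schedule):
    """Sum block rewards over each inclusive block range."""
    return sum((last - first + 1) * cnt for first, last, cnt in schedule)

mined = total_mined(SCHEDULE)  # 165,249,900 CNT from PoW rewards
```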

View original post here:
New Charity Focussed Cryptocurrency Launches - AllCoinsNews.com (blog)

Towards equal access to digital coins – Science Daily

Scientists at the Interdisciplinary Centre for Security, Reliability and Trust (SnT) of the University of Luxembourg have developed an important mathematical algorithm called "Equihash." Equihash is a core component for the new cryptocurrency Zcash, which offers more privacy and equality than the famous Bitcoin. Zcash came into operation as an experimental technology for a community-driven digital currency in late 2016.

Bitcoin is by far the most recognized and widely used digital currency. It was introduced in January 2009 and has garnered much attention since then. But it is not the only one of its kind: Wikipedia lists nearly one hundred cryptocurrencies with market capitalisations of more than one million US dollars.

One of the newest cryptocurrencies is "Zcash," which can be seen as an update to the Bitcoin protocols. In Bitcoin, the transfer of coins is recorded in a global ledger, the so-called blockchain. The validity of the latest transfers in the blockchain is verified about every ten minutes. Verifying the transfers and creating new blocks for the blockchain (the so-called mining) requires a lot of computing power, which is provided by distributed computers worldwide. The "miners" who allocate the processing power are rewarded with new coins.
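The mining described above reduces to a brute-force search for a nonce that drives a hash below a difficulty target. A toy sketch of the idea (Bitcoin itself hashes an 80-byte block header with double SHA-256 and retargets difficulty every 2,016 blocks):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 12) -> int:
    """Search for a nonce whose SHA-256 over the data falls below the
    target; expected work grows as 2**difficulty_bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(block_data: bytes, nonce: int, difficulty_bits: int = 12) -> bool:
    """Checking a claimed solution costs a single hash."""
    digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = mine(b"toy block")
```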

Zcash tries to resolve two main shortcomings of Bitcoin: its lack of privacy for transactions, and the centralization of transaction verification in the hands of a mere dozen miners who have invested in large amounts of specialized mining hardware. Bitcoin is prone to such centralization because the computational load of the Bitcoin mining algorithm can be split into many small tasks, which can be conducted in parallel. The algorithm is easy to implement in dedicated, energy-efficient and cheap microchips, but it is not well suited to standard hardware. Bitcoin mining today is therefore done on special-purpose supercomputers located in places with cheap electricity and/or cheap cooling. Such supercomputers are expensive, costing millions of euros, but they provide much more mining power than standard PC hardware of the same price.

Prof. Alex Biryukov, head of the research group "Cryptolux," and Dr. Dmitry Khovratovich at SnT have developed the algorithm "Equihash," which resolves this problem. Equihash is a so-called memory-hard problem, which cannot be split into smaller working packages. It can be calculated more efficiently on desktop-class computers, with their multiple processing cores and gigabytes of memory, than on special hardware chips. "If 10,000 miners with a single PC were active, in Zcash the investment to compete with them would be 10,000 times the price of a PC, while with bitcoin, the investment would be significantly smaller," says Khovratovich. This creates a more democratic digital currency by allowing more users to contribute to the mining process. Khovratovich adds: "The strength of a cryptocurrency comes from the fact that the ledger is globally distributed. Our Equihash algorithm reverses the situation back to this more ideal world."

Equihash was first presented at the Network and Distributed System Security Symposium last year -- one of the top five IT security events. Prof. Biryukov comments: "Since Equihash is based on a fundamental computer science problem, advances in Equihash mining algorithms will benefit computer science in general. Equihash is so far unique among all the mining algorithms: it is memory-hard on the one hand and very easy to verify on the other." In other words, mining new coins with Zcash/Equihash is comparatively expensive, because it requires large amounts of computer memory and hard computational work (and hence poses a smaller risk of monopolization), while checking that the new coins are genuine is memoryless, fast and cheap.
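Equihash is built on the generalized birthday problem. As a loose illustration of the solve/verify asymmetry, and emphatically not Equihash itself, here is a plain birthday-collision search: the solver must hold a growing table in memory, while verifying a claimed solution takes just two hashes:

```python
import hashlib

def find_collision(prefix_bytes: int = 2):
    """Find two distinct inputs whose SHA-256 digests share their first
    `prefix_bytes` bytes; the `seen` table is what costs memory."""
    seen = {}
    i = 0
    while True:
        h = hashlib.sha256(str(i).encode()).digest()[:prefix_bytes]
        if h in seen:
            return seen[h], i
        seen[h] = i
        i += 1

def verify_collision(a: int, b: int, prefix_bytes: int = 2) -> bool:
    """Verification is memoryless: hash both inputs and compare."""
    ha = hashlib.sha256(str(a).encode()).digest()[:prefix_bytes]
    hb = hashlib.sha256(str(b).encode()).digest()[:prefix_bytes]
    return a != b and ha == hb

a, b = find_collision()
```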

Understanding these advantages, the creators of Zcash chose Equihash as the algorithm for mining coins and verifying transfers. Equihash itself is not limited to use in Zcash and can be used in any cryptocurrency, including Bitcoin.

"With our contribution to Zcash, the Cryptography and Security lab (CryptoLux) has shown its strength in innovative research that has immediate applications in the financial technology industry," says SnT's director, Prof. Björn Ottersten. "We invite students to follow us in this promising field," adds Professor Biryukov. "There are still lots of challenging research problems to solve."

Story Source:

Materials provided by University of Luxembourg. Note: Content may be edited for style and length.

See the rest here:
Towards equal access to digital coins - Science Daily

Thinking Outside The Box: Why Cloud Storage Services Are A Bad Fit For Managing Brand Assets – Business Solutions Magazine

By Leslie Weller, Director of Marketing, Canto

The popularity of cloud-based storage and sharing tools has skyrocketed over the past five years. Businesses and consumers use Google Drive, Box, and Dropbox for sharing files, collaborating on documents, and coordinating projects in real time.

These services, however, are not one-size-fits-all. They might not be suitable for businesses in industries that adhere to more restrictive standards like finance and healthcare. Marketers and agencies could find them to be poor fits as well, especially when managing a growing number of brand assets (such as photos, videos, and graphics).

When it comes to organizing images, videos, and other media files, marketers eventually realize, sometimes after trial and error, that a service like Box can't meet their needs. Marketers need a digital asset management (DAM) system designed for managing visual content, not a tool that simply sends files to and from the cloud. Following are five reasons marketers should avoid Box, Dropbox, and other cloud storage tools for managing brand assets.

Bottom Line: DAM Supports Better Productivity For Managing Brand Assets

When all is said and done, your digital assets are valuable to your company for one main reason: they're created to help the company meet business goals by generating revenue through marketing, advertising campaigns, and more. The easier it is to manage and organize your creative assets, the easier it is to stay productive. While Box, Dropbox, and other cloud storage services are commonplace, they aren't designed for brand asset management (and may actually inhibit productivity). By relying on SaaS-driven DAM technology to manage, store, and use creative assets, marketing teams will benefit from increased productivity and streamlined workflows for asset management.

Leslie Weller, director of marketing for Canto, has a deep respect for the way technology connects people with the things they value most. She joined Canto to help marketers, brand managers, product managers, and content managers understand there is a better way to make use of their organizations' massive amounts of digital content. Weller earned a Master of Business Administration degree from California State University, San Marcos and a Bachelor of Science degree in sociology from Brigham Young University.

Continue reading here:
Thinking Outside The Box: Why Cloud Storage Services Are A Bad Fit For Managing Brand Assets - Business Solutions Magazine

Gormley’s Take: Cloud Computing With a Human Face – Wall Street Journal (subscription)

Gormley's Take: Cloud Computing With a Human Face
Wall Street Journal (subscription)
Health-care companies' shift to cloud computing is usually thought of as a technology story. There is also a human element. Cloud computing enables companies to quickly scale their computing power up and down. In health care alone, the cloud-computing ...

See more here:
Gormley's Take: Cloud Computing With a Human Face - Wall Street Journal (subscription)

Optimizing data center placement, network design to strengthen cloud computing – Science Daily

Telecommunication experts estimate that the amount of data stored "in the cloud," or in remote data centers around the world, will quintuple in the next five years. Whether it's streaming video or businesses' database content drawn from distant servers, all of this data is -- and will continue in the foreseeable future to be -- accessed and transmitted by lasers sending pulses of light along long bundles of flexible optical fibers.

Traditionally, the rate at which information is transmitted does not take into account the distance the data must travel, despite the fact that shorter distances can support higher rates. Yet as traffic grows in volume and consumes more of the available bandwidth, or capacity to transfer bits of data, researchers have become increasingly aware of some of the limitations of this mode of transmission.

New research from Nokia Bell Labs in Murray Hill, New Jersey may offer a way to capitalize on this notion and offer improved data transfer rates for cloud computing based traffic. The results of this work will be presented at the Optical Fiber Communications Conference and Exhibition (OFC), held 19-23 March in Los Angeles, California, USA.

"The challenge for legacy systems that rely on fixed-rate transmission is that they lack flexibility," said Dr. Kyle Guan, a research scientist at Nokia Bell Labs. "At shorter distances, it is possible to transmit data at much higher rates, but fixed-rate systems lack the capability to take advantage of that opportunity."

Guan worked with a newly emerged transmission technology called "distance-adaptive transmission," where the equipment that receives and transmits these light signals can change the rate of transmission depending on how far the data must travel. With this, he set about building a mathematical model to determine the optimal layout of network infrastructure for data transfer.

"The question that I wanted to answer was how to design a network that would allow for the most efficient flow of data traffic," said Guan. "Specifically, in a continent-wide system, what would be the most effective [set of] locations for data centers and how should bandwidth be apportioned? It quickly became apparent that my model would have to reflect not just the flow of traffic between data centers and end users, but also the flow of traffic between data centers."

External industry research suggests that this second type of traffic, between the data centers, represents about one-third of total cloud traffic. It includes activities such as data backup and load balancing, whereby tasks are completed by multiple servers to maximize application performance.

After accounting for these factors, Guan ran simulations with his model of how data traffic would flow most effectively in a network.

"My preliminary results showed that in a continental-scale network with optimized data center placement and bandwidth allocation, distance-adaptive transmission can use 50 percent fewer wavelength resources, or light transmission and reception equipment, compared to fixed-rate transmission," said Guan. "On a functional level, this could allow cloud service providers to significantly increase the volume of traffic supported on the existing fiber-optic network with the same wavelength resources."
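A toy model with assumed numbers shows where the savings come from: a fixed-rate network must provision every link at the rate its longest span can support, while a distance-adaptive one lets short links carry more per wavelength (the rate ladder and demands below are invented for illustration):

```python
import math

def rate_gbps(km: float) -> float:
    """Assumed per-wavelength rate ladder: shorter spans, higher rates."""
    if km <= 600:
        return 400.0
    if km <= 1200:
        return 200.0
    if km <= 3000:
        return 100.0
    return 50.0

# (link length in km, traffic demand in Gb/s)
links = [(500, 4000), (1000, 4000), (2500, 4000), (4000, 4000)]

# Adaptive: each link runs at the best rate its distance allows.
adaptive = sum(math.ceil(demand / rate_gbps(km)) for km, demand in links)

# Fixed: every link is provisioned at the longest span's rate.
floor_rate = min(rate_gbps(km) for km, _ in links)
fixed = sum(math.ceil(demand / floor_rate) for _, demand in links)

# adaptive == 150 wavelengths vs. fixed == 320: under half as many.
```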

Guan recognizes other important issues related to data center placement. "Other important factors that have to be considered include the proximity of data centers to renewable sources of energy that can power them, and latency -- the interval of time that passes from when an end user or data center initiates an action and when they receive a response," he said.

Guan's future research will involve integrating these types of factors into his model so that he can run simulations that even more closely mirror the complexity of real-world conditions.

Story Source:

Materials provided by The Optical Society. Note: Content may be edited for style and length.

Go here to read the rest:
Optimizing data center placement, network design to strengthen cloud computing - Science Daily

How to add cloud-based bookmarks to your Nextcloud server – TechRepublic

Image: Jack Wallen

Nextcloud is one of the most flexible and easy-to-use cloud servers. Not only is Nextcloud an incredibly simple cloud server to get up and running, but you can expand its capabilities through a user-friendly app store.

An especially useful feature that can be included in the Nextcloud server is the cloud-based Bookmarks app. With this feature, users can add and tag bookmarks to be saved in their personal Nextcloud account. This makes it easy for users to reach their work-based bookmarks from any browser that can access the Nextcloud server.

I'll walk you through the process of adding the Bookmarks app and then illustrate how easy it is to work with. I assume your Nextcloud server is set up and running, and that you have an administrative account on the server.

SEE: Video: The 5 trends that form the future of cloud computing (TechRepublic)

Nextcloud will download and install the app. Once it's installed, the app is ready to go; there are no settings to configure.

Click the Apps drop-down and select Bookmarks from the list. In the new window, you will see a warning that you have no bookmarks. Nextcloud makes it easy to add bookmarks: it includes a bookmarklet that you can drag to your browser's toolbar (Figure A).

Figure A

The Nextcloud bookmarklet is ready to be added to your browser.

Once you drag the bookmarklet to your browser, you can add bookmarks to Nextcloud by pointing your browser to the site you want to bookmark and then clicking the bookmarklet. In the window that appears, you can edit the bookmark's name, tag, and description (Figure B).

Figure B

Adding a TechRepublic Cloud bookmark to Nextcloud.

After the bookmark is added, you can return to your Nextcloud server, click the Apps drop-down, click Bookmarks, and view/edit your current listing of bookmarks. If there's a URL you want to visit, click the name (Figure C) and a new tab will open to that link.

Figure C

Bookmarks are easily accessed, edited, and tagged.

You can also add new bookmarks from within the Nextcloud Bookmarks app by typing the URL into the Address field and clicking the + button. Nextcloud will call out to the URL to get the name of the site and automatically add it to the bookmark.
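That call-out is essentially fetching the page and reading its title tag. A rough sketch of the idea in Python (this is not Nextcloud's actual code, which is PHP and more robust):

```python
import re
from urllib.request import urlopen

def page_title(html: str) -> str:
    """Pull the <title> text out of an HTML document; empty if absent."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return m.group(1).strip() if m else ""

def bookmark_name(url: str) -> str:
    """Fetch a page and return its title, as a default bookmark name."""
    with urlopen(url) as resp:  # network call
        return page_title(resp.read().decode("utf-8", errors="replace"))
```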

The Bookmarks app can import and export bookmarks, which means you can export a browser's bookmarks and then import them into Nextcloud, or vice versa. To get to this feature, click the Bookmarks app from the Apps drop-down, click the small gear icon in the bottom right of the window, and click either Export or Import (Figure D). The only drawback of the Import feature is that your bookmarks won't be tagged, so you'll have to do that manually.

Figure D

Importing and exporting bookmarks.

The Bookmarks app helps make your life a tiny bit easier. This feature may not change the way you work, improve your ROI, or secure your network, but it can extend your Nextcloud server offerings and help users get the most out of your company's hosted cloud. And with the help of the Nextcloud bookmarklet, adding bookmarks to a Nextcloud account is incredibly simple.

Read the rest here:
How to add cloud-based bookmarks to your Nextcloud server - TechRepublic

Microsoft Azure Cloud Gaining Traction in Business – Fortune

Use of Microsoft Azure and Google Cloud Platform is on the rise, according to The 2017 State of the Cloud Report compiled by cloud management company RightScale. That's good news for what are generally seen as the number two and three players in public cloud, respectively, following market leader Amazon Web Services.

A survey of just over 1,000 IT professionals on their usage of public cloud showed Microsoft Azure adoption up to 34% from 20% last year. Likewise, 15% of respondents used Google (goog) Cloud this year, up from 10% a year ago. Twenty percent of the survey respondents are RightScale customers and 61% hail from the U.S.

Meanwhile, AWS remains the market leader, with usage holding steady at 57% year-over-year. Penetration of IBM (ibm) rose slightly from 7% to 8%, while Oracle (orcl) Cloud adoption fell slightly from 4% to 3% over the same period.

Get Data Sheet, Fortune's daily tech newsletter.

Public cloud refers to a massive array of servers, storage, and networking aggregated and managed by one company, which then rents out those resources to many customers. This model is gaining traction as companies seek to augment or even replace their own data centers, which is why many legacy tech companies have jumped into a market AWS kicked off in 2006.

As Fortune reported earlier Wednesday, many big software companies, like Salesforce (crm), Workday (wday), Box (box), SAP (sap), and Infor, that typically stream their software from their own data centers are also using AWS, Microsoft, or other public cloud facilities to run some of their new products.

Among a subset of bigger "enterprise customers" RightScale surveyed, 59% said they use Amazon (amzn), up from 56% last year. Azure, meanwhile, saw a bigger jump, with 43% of respondents on Azure now, compared to 26% last year. Microsoft (msft), by virtue of its Windows-and-Office franchises, is already established in many of these accounts and is pushing them to embrace Azure as well. Companies in this enterprise category have 1,000 or more employees.

However, there's a not-so-silver lining to all this adoption. While public cloud is sold as a relatively inexpensive alternative to buying and running your own servers, which typically sit at partial capacity much of the time, the model leads to waste.

If a developer powers up a bunch of cloud servers to run a job and forgets to shut them down when it's done, the company pays for unused computing cycles. And customers are realizing this. More than half (53%) of the respondents said cost optimization is a top priority, and that percentage is higher (64%) for experienced cloud users who have been at it for a while.
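That kind of waste is straightforward to estimate. An illustrative calculation with invented rates:

```python
def wasted_spend(hourly_rate: float, hours_billed: float, hours_useful: float) -> float:
    """Dollars paid for compute cycles nobody used (illustrative)."""
    return hourly_rate * max(hours_billed - hours_useful, 0.0)

# A $0.50/hour instance left running all month (720 h) but only
# needed for an 8-hour workday (240 h):
waste = wasted_spend(0.50, 720, 240)  # $240.00 of idle spend
```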

For more on Microsoft Azure, watch:

For one thing, a lot of early cloud usage was initiated by a single team of developers trying to get a specific job done, often in a way that was not sanctioned by (or even known to) corporate IT. But all those one-off jobs add up, and if IT has no window into what's going on, costs can get funky, fast.

While respondents think they waste 30% of their cloud spending, RightScale says the real number is likely between 30% and 45%. RightScale sells services that help customers manage workloads running in different clouds and to help minimize waste. Experienced IT pros know the downside of bad cloud practices now and are striving to nip them in the bud.

Continued here:
Microsoft Azure Cloud Gaining Traction in Business - Fortune

Discover the pros, cons of bare-metal cloud services – TechTarget

In some cases, public cloud services don't offer admins full visibility and control, particularly around variable workload performance and security. Some providers address this challenge by offering bare-metal cloud.

What are bare-metal cloud services?

A bare-metal cloud service is a variation on infrastructure as a service (IaaS) that allows users to rent and configure single-tenant servers, typically without a virtualization layer. Bare-metal cloud promises the flexibility and scalability of public cloud, with the predictability, granularity and security of local servers.

Not all workloads run well on the virtualized cloud instances that make up many IaaS offerings. For example, legacy applications that demand access to physical hardware, or workloads that are extremely stable and require no scalability, might be better fits for bare-metal cloud.

Bare-metal cloud services are similar to other cloud services -- they're accessible like an Amazon Web Services (AWS) Elastic Compute Cloud (EC2) instance or an Azure D-series VM. The primary difference with bare-metal is that the service maps to a physical server rather than a VM. Since the server typically isn't virtualized, users get direct access to the entire server and all of its compute, storage and network resources. A bare-metal cloud instance is almost indistinguishable from a more traditional server, but typically uses the same on-demand rental model that public cloud providers use.

There are numerous providers in the bare-metal cloud market, including Oracle, IBM, baremetalcloud, Rackspace and Internap.

Major public cloud providers, including Azure and Google, do not have strong bare-metal cloud offerings. AWS comes close with its Dedicated EC2 Hosts service, which commits an entire physical server to the user.

What are the pros and cons of bare-metal cloud services?

The primary difference with bare-metal is that the service maps to a physical server rather than a virtual machine.

The most notable benefit of bare-metal cloud services is direct control of the server and its resources. This is a far cry from typical virtualized cloud instances, which intentionally obscure underlying hardware operations from the users. Additionally, without the virtualization layer, bare-metal cloud reduces the overhead of a hypervisor, which can increase performance.

Because most public IaaS environments are multi-tenant, organizations are concerned with security and compliance. Bare-metal cloud instances address both of these concerns because they provide a single-tenant hardware platform committed exclusively to a single user. However, bare-metal clouds do not guarantee security or compliance; organizations need to understand the legal obligations and industry best practices for proper security and regulatory posture.

Bare-metal cloud instances can also be more cost-effective than public cloud instances because users generally pay only for the underlying hardware rather than usage. However, organizations must weigh these costs against in-house hardware acquisition, deployment and operational expenses.

However, compared to virtualized instances, bare-metal cloud services can be limited. For example, when a physical server is virtualized, admins can provision a wide range of standardized VM types from the underlying hardware. But a bare-metal cloud instance is a complete server, so there is a limited number of instance sizes and types available.

What makes bare-metal cloud management unique?

There are few fundamental differences in managing bare-metal cloud instances versus typical virtual machine public cloud instances.

For example, bare-metal cloud providers like Rackspace and Oracle provide management interfaces, including a console and command-line interface. The Rackspace control panel offers a web-based interface to start the cloud server, access and view KPIs, schedule tasks like snapshots and access support functions. Admins can use control panels to perform tasks such as server resets and power cycling -- tasks unthinkable with common VM instances.

However, bare-metal servers generally require a more granular level of management and control than common cloud instances. For example, Oracle organizes bare-metal cloud services into entities called compartments, which provide isolated cloud instances to other business units or projects. This means it is possible to monitor, manage and track billing for resources by activity or group, and assign granular security and data protection characteristics to each individual user. Admins familiar with managing VM-based public cloud instances may face an additional learning curve when setting up and managing a bare-metal cloud environment.

Is bare-metal cloud a better option for higher-level services?

Bare-metal cloud services can be a good option when a workload's computing demands are relatively constant and don't require elastic scalability. Workloads that fit this model include those involved with big data analytics, backup and recovery cycles, media encoding tasks, machine learning, visual rendering projects and other I/O-intensive applications.

For example, a big data workload needs to ingest a large amount of data, transform and process that data and then pass results back to storage. Once completed, the server may remain unused for weeks or even months. This makes bare-metal cloud a viable option, rather than buying and owning servers permanently.
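The rent-versus-own trade-off in that example can be put into rough numbers. A break-even sketch with assumed costs (real TCO adds power, cooling, staff, and networking):

```python
def cheaper_to_rent(hourly_rate: float, hours_per_month: float,
                    server_cost: float, amortize_months: int = 36,
                    monthly_opex: float = 0.0) -> bool:
    """Compare on-demand bare-metal rental with an owned server
    amortized over its useful life (all figures illustrative)."""
    rent = hourly_rate * hours_per_month
    own = server_cost / amortize_months + monthly_opex
    return rent < own

# A batch job needing roughly one week (168 h) of compute per month:
burst = cheaper_to_rent(1.50, 168, 12_000, 36, 150.0)   # renting wins
# The same box needed around the clock (720 h per month):
steady = cheaper_to_rent(1.50, 720, 12_000, 36, 150.0)  # owning wins
```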

Find out if bare-metal cloud services are right for you

Evaluate the architectural, cost benefits of bare-metal

Do containers run better on bare-metal servers?

Link:
Discover the pros, cons of bare-metal cloud services - TechTarget

Welcome to the Era of Great Data Center Consolidation – Fortune

"Friends don't let friends build data centers," said Charles Phillips, chief executive officer of Infor, a business software maker, about two years ago.

His remark, which launched a flood of T-shirts, came at an Amazon Web Services conference in April 2014, when he announced that Infor would move all of its IT operations into AWS data centers, and out of its own.

That was a bold statement at the time, and it presaged a change in how big software companies are building and distributing their products.

Since then, a slew of these software companies, including Box (box), Salesforce, NetSuite, Tableau (data), Workday, and SAP (sap), have announced plans to use a public cloud (like AWS) as a deployment option for their own software.

A public cloud, referred to within the industry as infrastructure as a service (IaaS), consists of massive quantities of servers, storage, and networking owned and operated by one company and rented out to others.

Box and Workday, for example, are turning to AWS and IBM (ibm) SoftLayer to run specific types of software. Salesforce has signed on to AWS for its Internet of Things Cloud Suite and other new products. NetSuite named Microsoft Azure as its "preferred cloud partner." (Although now that NetSuite has been bought by Oracle (orcl), which has its own public cloud ambitions, that could change.) JDA Software, which offers supply chain management software, is using Google Cloud Platform.

None of these companies are going as far as Infor, at least not yet. But unless these announcements are 100% marketing hype, they indicate that much more software power will be concentrated in fewer, albeit bigger, data centers owned and operated by just a handful of cloud computing giants.

The arguments seem sound: let the big cloud providers pull together hundreds of thousands of servers, petabyte upon petabyte of data storage, and networking at their data centers around the world, and software companies can just piggyback on that. That lets the software maker concentrate on features and functions for customers, not on racks of servers and power supplies, whether those run in a dedicated corporate data center or in facilities owned by third parties like Equinix (eqix), Digital Realty Trust, or DuPont Fabros. Build or rent, that's a major expense.

And if a U.S. software company throws in with a cloud provider that has data centers around the world, it won't have to eat the expense of operating a data center in a country with strict data sovereignty laws requiring that local data stay in-country. That is no small benefit.


But isn't there concern about relying so heavily on a small handful of huge players, namely Microsoft, Amazon, and Google? Some software executives admit there is. Privately, they worry that their cloud provider might end up being a competitor as well.

Microsoft (msft), for example, offers its own cloud-based applications, which compete with offerings from many other companies. Amazon (amzn) keeps adding higher-level software to its infrastructure as well. Who knows where that will end?

Marc Benioff, chief executive of Salesforce (crm), downplays the concern, noting that there is no shortage of competition in the cloud. Aside from Amazon, Google, IBM, and Microsoft, Benioff cites the various telcos, like Deutsche Telekom, BT, and NTT, that also offer cloud services.

And, Salesforce is not leaving its own data centers behind. "Our infrastructure remains our primary focus as public cloud is still too expensive for many of the primary markets we are in," Benioff tells Fortune via email.

So while Salesforce is working with AWS in Canada and Australia, it will keep using its own data centers as well.

"This is not a one-size fits all solution for infrastructure," says Benioff.

David Clarke, senior vice president of technology development at Workday (wday), which specializes in financial and human resources software for big companies, agrees that expanding geographic reach is a big reason to consider the public cloud. Amazon, Microsoft Azure, and IBM SoftLayer maintain data centers all over the world.

"Our first production workload will go on AWS in Canada this year," Clarke tells Fortune. "Whether or not you like running data centers, it's expensive and being able to offer geographic diversity without having to build data centers everywhere is a benefit." Workday itself has three primary data centers, in Ashburn, Va.; Portland, Ore.; and Dublin, Ireland.


Another factor is that big businesses have grown more comfortable with the use of massive public clouds, and may push other software suppliers to put services in the customer's cloud of choice.

SAP offers versions of many of its products, like SuccessFactors human resources and Concur travel and expense accounting software, from various cloud partners. Right now, however, SAP's new public cloud-based SAP S/4Hana manufacturing and financial management software runs only on SAP's own public cloud infrastructure. But the company will consider "the usual suspects" among the giant public cloud players if demand warrants it, Darren Roos, president of the company's S/4Hana business unit, tells Fortune.

As many SAP customers already run some software in AWS or Azure, they may want to "co-locate" the new SAP software there as well, Roos suggests.

But some in the tech industry say there are plenty of companies of sufficient size, with specialized needs, that would be better off running their own data centers, but in a cloud-like manner. In this "private cloud" model, all of the computing gear is devoted to one company but still offers internal users the flexibility to add or remove resources as needed. And each department or user can be charged for the capabilities it uses.

"There are plenty of companies big enough to operate their own data centers," Bryan Cantrill, chief technology officer of Joyent, said recently. That is likely one big reason Samsung Electronics bought Joyent for its cloud expertise last year.

And then there is Dropbox, the popular file storage and sharing company that ran almost entirely on AWS until it quietly moved 90% of that workload into its own data centers over the course of a few years. Companies like Dropbox say they know their own needs better than a public cloud provider does, and that they're big enough to make the economics work. The argument is that once a company's workload gets big enough and stable enough not to need all that public cloud flexibility, it is cheaper to run your own data center gear.

Still, it seems that if the public cloud is a viable option, more corporate workloads will flow in that direction. Oracle co-chief executive Mark Hurd has said he expects 80% of corporate data centers to disappear by 2025. Even if he's a little off, that would still be a dramatic shift.

This shift could be generational. A company like Box, founded in 2005, a year before AWS launched its first cloud service, didn't have much of a choice but to provision its own data centers. If the content management company were starting out now, Box chief executive Aaron Levie says, that would likely not be the case.

When asked if it was hard to rely on a huge cloud provider that might end up competing with Box, Levie replies, "You know what's harder? Building data centers!"

Originally posted here:
Welcome to the Era of Great Data Center Consolidation - Fortune
