
People Love Talking About Bitcoin More Than Using It – Wall Street Journal (subscription)


Wall Street Journal (subscription)
People Love Talking About Bitcoin More Than Using It
Wall Street Journal (subscription)
A few years ago, a Spice Girl and an NFL star were accepting bitcoin to sell their wares. Comedian Drew Carey tweeted about trying to buy breakfast with it, and Federal Reserve Chairwoman Janet Yellen testified about bitcoin in front of Congress. Now ...

Read the original post:
People Love Talking About Bitcoin More Than Using It - Wall Street Journal (subscription)

Read More..

Bitcoin breakthrough? Russia moots cryptocurrency green light – RT

Published time: 11 Apr, 2017 11:56 | Edited time: 11 Apr, 2017 15:48

Cryptocurrencies may be recognized in Russia by 2018, according to Deputy Finance Minister Aleksey Moiseev.

Moiseev says monitoring cryptocurrencies could be an instrumental tool against money laundering, and Bitcoin and other digital currencies could be recognized by next year as the central bank works with the government to develop rules against illegal transfers.

"The state needs to know who at every moment of time stands on both sides of the financial chain," Moiseev said in an interview, as cited by Bloomberg. "If there's a transaction, the people who facilitate it should understand from whom they bought and to whom they were selling, just like with bank operations."

Last year, the Ministry of Finance and the Central Bank considered the idea of a national cryptocurrency, which would see all other virtual currencies banned in Russia. The idea had not been discussed by the Kremlin, however, according to Presidential Press Secretary Dmitry Peskov at the time.

Russian officials had previously opposed all virtual currencies, arguing that their cross-border nature, transaction anonymity and lack of a supervisory body make them the perfect vehicle for illegal transactions.

In 2014, Moiseev suggested a ban on cryptocurrencies could be introduced because of their use to fund illegal activities such as money laundering, the buying of illicit goods, rendering illegal services or funding terrorism.

Read the original here:
Bitcoin breakthrough? Russia moots cryptocurrency green light - RT

Read More..

Investing in Blockchain Technology Without Exposure to Cryptocurrency Volatility – Live Bitcoin News

A lot of people are looking to invest in blockchain technology. With so many startups around the world, the process becomes a lot easier. However, most blockchain-based projects only allow investments through bitcoin or other cryptocurrencies. This raises the question of whether an alternative solution is available to those looking to become a financial contributor to the future of blockchain development.

As it turns out, there are quite a few ways to invest in blockchain-based projects without buying bitcoin, Ethereum, or any other currency. Although some of these investment opportunities require a slightly different strategy, there are plenty of ways to contribute financially. While ICOs remain a valuable option, they usually require cryptocurrency up front.

That being said, investors do not necessarily expose themselves to volatility when buying a cryptocurrency and immediately spending it on a blockchain-based ICO. This approach also lets contributors diversify their cryptocurrency portfolio right away, as they receive native project tokens in return. Each of these tokens can eventually be exchanged back to cryptocurrency, although it is doubtful they will ever receive a fiat currency trading pair.

Two other popular options are available to investors all over the world as well. Buying stocks in a blockchain startup is an option many investors often explore. There are quite a few established names in the blockchain world already, and most companies are always looking for additional investment. Some companies are even publicly traded, which makes it even easier for investors to get involved in the world of blockchain technology. It is important to note some of the publicly traded companies can be found outside the United States as well, as some stocks are publicly traded in the UK, Australia, and Canada.

Last but not least, there is another way for blockchain startups to raise money. Rather than going through an ICO, some companies will use more traditional crowdfunding solutions. Early-stage funding rounds are well worth keeping an eye on, although not every blockchain project will be successful. All things considered, there are multiple ways for investors to invest in blockchain, either with or without going through the process of buying cryptocurrency first.

Header image courtesy of Shutterstock

About JP Buntinx

JP is a freelance copywriter and SEO writer who is passionate about various topics. The majority of his work focuses on Bitcoin, blockchain, and financial technology. He is contributing to major news sites all over the world, including NewsBTC, The Merkle, Samsung Insights, and TransferGo.

View all posts by JP Buntinx

Link:
Investing in Blockchain Technology Without Exposure to Cryptocurrency Volatility - Live Bitcoin News

Read More..

Dash Cryptocurrency Signs on with Kraken – Blockchain News

Dash, a potential competitor to bitcoin, has just announced it has signed on to the Kraken Digital Asset Exchange. The world's fifth most valuable cryptocurrency is now open for trading on the platform, with buy and sell pairings including DASH/EUR, DASH/USD, and DASH/BTC. The partnership between Dash and Kraken comes in the wake of a record surge for the cryptocurrency, which experienced a 6x increase in price ($11 to $72 USD) and a 10x increase in trading volume ($3 million to $30 million USD) across Q1.

Dash VP of Business Development, Daniel Diaz, said:

"Kraken is excited to offer Dash on their trading platform and our teams are working closely to ensure clients can begin trading the currency immediately. Kraken is an incredibly well established and well structured organization, and amongst the best in the exchange business. In terms of reputation, they represent the highest standard for client satisfaction. Dash is a project that has implemented very original ideas that resonate well with the market, and as a top tier exchange, Kraken's mission is to provide clients with access to digital currencies that are in demand and provide value."

Following several business partnerships around the world, the implementation of the Sentinel software upgrade, and the announcement of a revolutionary decentralized payments system called Evolution, Dash has been on a record-breaking trajectory. Its total market cap skyrocketed from $78 million USD (January 1st) to an all-time high of $835 million USD (March 18th), with new international markets unlocked alongside user demand.

Daniel Diaz continued:

"As the leading exchange in the Euro market, Kraken's global reach helps Dash successfully meet the needs of our users and investors. The entire integration experience was very positive and we have high expectations for the partnership going forward. This is a significant achievement for Dash because our ecosystem needs high quality and trustworthy exchanges like Kraken to thrive, and we know they will play an important role as a fiat gateway."

Founded in 2011, Kraken Digital Asset Exchange is one of the world's largest and oldest bitcoin exchanges, with the widest selection of digital assets and national currencies. Based in San Francisco with offices around the world, Kraken's trading platform is consistently rated the best and most secure digital asset exchange by independent news media. Trusted by hundreds of thousands of traders, institutions, and authorities, including Germany's BaFin-regulated Fidor Bank, Kraken is the first exchange to display its market data on the Bloomberg Terminal, the first to pass a cryptographically verifiable proof-of-reserves audit, and one of the first to offer leveraged margin trading. Kraken investors include Blockchain Capital, Digital Currency Group, Hummingbird Ventures, Money Partners Group, and SBI Investment.

Kraken is expected to offer Dash margin trading in the near future.

Read the rest here:
Dash Cryptocurrency Signs on with Kraken - Blockchain News

Read More..

PointStar Expands its Cloud Computing Investment in Indonesia … – Yahoo Finance

JAKARTA, April 12, 2017 /PRNewswire/ -- PointStar, a cloud technology consulting firm that empowers businesses across Asia Pacific to innovate and move beyond productivity, has moved to a new office to accommodate its rapid business growth in Indonesia.

PointStar's Jakarta team in their new cloud-managed Indonesia office

The new office is designed to promote employees' productivity in a collaborative environment. It comes with fully-equipped conference rooms, a silent room, and a 'Phone Booth' area for employees who wish for a quiet working environment. With these facilities, the team can focus on enhancing their core competencies and assisting companies to choose, adopt, implement, customise and manage their cloud solutions.

As one of the leading Google Partners in the APAC region, PointStar highlights in its conference rooms the use of Google Hangouts, which connects its team of more than 60 employees working in different geographical locations.

The new office allows the team to collaborate and achieve better, faster results for a growing list of clients of all sizes, including Gojek, Zalora, Matahari Mall, Indo Tambangraya Megah, and Parkland.

On May 7th 2015, PointStar announced the opening of a Cloud Center of Excellence in Indonesia. In less than 18 months, PointStar has outgrown its premises and experienced a multifold expansion in its employees as well as customers' portfolio.

As one of the pioneers in cloud solutions, PointStar adopted Google's advanced online enterprise-ready tool, G Suite, to achieve optimal communication. PointStar also utilises NetSuite's cloud accounting and CRM system and Cisco Meraki's cloud-controlled WiFi, routing, and security system to ensure a secure, scalable enterprise network, as well as Grandstream IP phones for a more stable telecommunication network.

"Our new Jakarta office will reinforce the strength and focus to bring our clients world-class services. The expansion reflects our commitment to, and the continued success of, our clients' businesses. We will continue to provide quality, innovative and affordable cloud solutions for companies to embrace digital transformation, and redefine the smarter way of working in the new digital era," said Justin Lee, CEO of PointStar.

To celebrate the opening of the new Jakarta office, PointStar has launched a 1-FOR-1 training package called TransformMyBiz. Its objective is to help small and medium businesses train their employees on getting the most out of G Suite (formerly Google Apps) at work. The training package, worth $900 per session, will be applicable to companies in Indonesia, Singapore and Malaysia, valid for the first 100 companies that sign up on a first-come, first-served basis.

"We live in a world of change, driven by technology, and it is changing the way companies and people work. As companies take on digital transformation, it simplifies the way they work, and they soon see themselves operating at maximum velocity," added Justin Lee.

About PointStar

PointStar was established in 2008 and has transformed hundreds of organisations and institutions, providing reliable and comprehensive cloud services that empower businesses to innovate and stay ahead of the competition.

As a cloud technology consulting firm, PointStar has deep, world-class expertise in consulting for businesses across various industries. PointStar is one of Asia's leading Google Cloud Partners, a preferred NetSuite Solution Provider, and an authorized partner for Cisco Meraki. PointStar is headquartered in Singapore with a presence in Jakarta and Malaysia. For further information, please visit http://www.point-star.com


Here is the original post:
PointStar Expands its Cloud Computing Investment in Indonesia ... - Yahoo Finance

Read More..

Don’t let your data get out of control: Six steps to cloud governance success – Cloud Tech

(c)iStock.com/erhui1979

Cloud usage is accelerating at a rapid rate, and it's showing no signs of slowing down. Right now, 90 percent of all companies use the cloud in some way, and, over the next 10 years, spending on public cloud alone is slated to multiply by a factor of five to $500 billion, according to recent studies by North Bridge and Wikibon.

Enterprises are launching new infrastructure at a furious pace, migrating workloads of all kinds and big chunks of data to public and private cloud environments. Many of these enterprises are generating tangible results: cutting costs, increasing agility and getting products to market faster.

But cloud initiatives don't come without growing pains. If they aren't managed closely, cloud projects can spiral out of control quickly, leaving a trail of data breaches, regulatory fines and cost overruns. What organizations need is a strong governance program to ensure that their move to the cloud starts smoothly and continues to perform at peak levels over time.

It should come as no surprise that some companies reap greater benefits from the cloud than others. In a recent survey of nearly 400 senior managers, CloudHealth identified a class of cloud leaders that are outperforming their peers based on a series of best practices they're deploying in the cloud. These companies grow revenue 2.3 times faster than their peers. They also move faster, display more agility and are more likely to see the cloud drive competitive advantage.

Governance is a key differentiator for companies that leverage their clouds efficiently. As companies grow their cloud environments, they have to strike a balance between agility and control. They want to be flexible, get to market quickly and innovate, but they need to institute processes to ensure secure, effective and efficient use of IT resources.

The best of the best accomplish this using centralized governance. According to the survey, cloud leaders are twice as likely to have instituted centralized governance plans for migrating workloads to the cloud. And once the workloads are up and running, the survey found, cloud leaders display a series of best practices that effectively govern their cloud operations and protect their organizations from risk.

What are the most important facets of a cloud governance program? Based on the cloud leaders survey and anecdotal evidence we've observed from working with some of the most active cloud users in the world, here are six best practices for governance success:

When your organization has hundreds, if not thousands, of cloud instances spread across departments and locales, it's not hard to lose track of how cloud services are being used. There needs to be accountability, and the easiest way to ensure this is to put one person in charge.

An organization can assign any number of names to this role: Cloud Optimization Manager, Cloud Governance Manager or, one we've heard repeatedly, Cloud Steward. The important thing is to be clear on this person's duties. The individual must have a business sense for the corporation and an ability to work across and influence departments and LOBs. The governance leader needs a strong technology sense for the corporation, with access to devops, monitoring, API and security-level expertise. The individual also will need to lead a cross-functional team that defines best practices for building functional business groups across cloud resources and assets.

This role is emerging. According to the survey, cloud leaders are 2.5 times as likely to have a role dedicated to cloud optimization and governance. Having this Cloud Steward designated and empowered gives an organization a head start toward getting a cloud program under control.

A strong governance strategy can help an organization keep costs under control and manage the financial risks of a cloud environment. Controlling costs is a complicated discipline, with many factors to consider. But one way to get started is to simply set up a system to stay current with cost structures in the market.

Across the board, cloud leaders showed a better understanding of pricing structures and could control costs (even when dealing with a variety of vendors, pricing plans and billing structures). The survey concluded that cloud leaders are 1.5 times as likely to leverage pricing strategies from cloud providers based on forecasted use by workload. They're also 2.5 times as likely to have a comprehensive, clearly articulated strategy for managing the cost of cloud deployments.

Proactive monitoring and alerting mechanisms will detect abnormal behavior. This helps explain why cloud leaders are nearly three times as likely to identify and mitigate cloud computing risks "somewhat or extremely fast."

Just as monitoring and alerting can help organizations manage security threats, they can help keep costs under control by flagging improper use of services or unexpected outages. Many unauthorized uses stem from the spread of shadow IT. According to a recent survey by North Bridge and market watcher Wikibon, nearly two thirds of large enterprises have a cloud governance policy in place to help control shadow IT. The same survey revealed that a third of cloud services are not allowed by IT, meaning they present a risk to the organization.

Clearly, the increased availability of cloud applications puts a strain on IT departments. Having clear governance practices that define who can use what resources and when can eliminate a lot of confusion and risk in the organization.

Knowing what is and what is not permitted in terms of usage prevents users from relying on inefficient methods that drive costs higher. For example, there can be a policy limiting the amount of time a development program can be run in the cloud, so that the development team doesnt run something over an entire weekend, wasting money. Guidelines can also govern factors such as maximum thresholds for memory and minimum thresholds for performance, with the goal of optimizing efficiency in the cloud and keeping costs in control.
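The guideline pattern described above can be expressed as a simple policy check. This is a minimal sketch: the policy fields, limits, and violation messages are all illustrative assumptions, not drawn from any particular cloud provider's API or governance product.

```python
from dataclasses import dataclass

@dataclass
class GovernancePolicy:
    """Hypothetical governance limits; all field names are illustrative."""
    max_dev_runtime_hours: float  # e.g. forbid weekend-long dev runs
    max_memory_gib: int           # memory ceiling per instance
    min_cpu_utilization: float    # flag under-used (wasteful) instances

def check_instance(policy, runtime_hours, memory_gib, cpu_utilization):
    """Return a list of policy violations for one cloud instance."""
    violations = []
    if runtime_hours > policy.max_dev_runtime_hours:
        violations.append("runtime exceeds policy limit")
    if memory_gib > policy.max_memory_gib:
        violations.append("memory above allowed threshold")
    if cpu_utilization < policy.min_cpu_utilization:
        violations.append("utilization below efficiency floor")
    return violations

policy = GovernancePolicy(max_dev_runtime_hours=12, max_memory_gib=64,
                          min_cpu_utilization=0.10)
# A dev box left running 60 hours over a weekend at 2% utilization
# trips both the runtime and the efficiency rules:
print(check_instance(policy, runtime_hours=60, memory_gib=32,
                     cpu_utilization=0.02))
```

In practice a check like this would run on a schedule against the provider's inventory API, with violations routed to the governance lead for follow-up.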

Perhaps the most overlooked aspect of governance is the fact that it's not a "set it and forget it" practice. Organizations need to regularly update policies to reflect changing services, cost structures and usage patterns for continuous optimization. Tellingly, cloud leaders are 4.6 times as likely to see cloud deployment optimization as a continual process, rather than a one-time exercise. They get it.

Cloud computing is the new normal. Adoption is growing rapidly, and the enterprises that are using cloud wisely are able to conduct business in dramatically different ways. To get the most out of the cloud, while keeping a lid on cloud complexity, organizations need to establish clear, concise governance practices. Doing a good job on cloud governance will set organizations on the right path to long-term success.

Go here to read the rest:
Don't let your data get out of control: Six steps to cloud governance success - Cloud Tech

Read More..

Watch Out Amazon and Alibaba! Huawei’s Coming to the Public Cloud – Barron’s (blog)

By Isabella Zhong

E-commerce giants Amazon (AMZN) and Alibaba (BABA) have been making an aggressive push into public cloud computing services, but smartphone maker Huawei now wants to give the pair a run for their money.

Huawei has announced plans for a new business that will focus on turning the Shenzhen-based company into a global provider of cloud computing services that will rival Amazon and Alibaba. The company has previously offered private cloud services. Sijia Jiang of Reuters has more on Huawei's cloud strategy.

The Shenzhen-based firm, which last month reported its slowest profit growth in five years, said it will expand in cloud computing with a dedicated division that will recruit 2,000 more people this year.

By expanding in cloud computing, hardware-focused Huawei hopes to continue developing software-based revenue at a time of slowing growth in smartphone sales and reduced spending on telecommunication infrastructure.

Public cloud computing is big business. Morgan Stanley analyst Grace Chen sees a $300 billion market opportunity today that could grow to around $340 billion by 2020. But the potential could be even greater if the public cloud becomes a transformational technology like electricity.

If public cloud is a transformational technology like electricity, usage broadens and workload growth accelerates. Two signposts support confidence in this bull case: (1) In markets with the highest public cloud penetration to date, like salesforce automation, we've seen close to 100% inflation of the forecasted market opportunities. (2) With the emergence of machine learning techniques, public cloud environments now facilitate the automation of entirely new areas of work, like image recognition, natural language processing, and decision making. As a public utility, electricity represents a $1.6 trillion market globally, multiples of our $300-plus billion base case cloud market forecast.

Cloud computing has already become a hotspot for some of China's largest companies. Aliyun, Alibaba's cloud computing unit, is the Internet giant's fastest growing business unit and has been posting triple-digit revenue growth. Social media and gaming behemoth Tencent (700.HK) also has a presence in cloud computing and last month stirred controversy by winning a cloud computing contract in Xiamen for just CNY0.01, while commercial real estate giant Dalian Wanda Group has recently joined forces with IBM (IBM) to roll out its own public cloud offering in China.

Here is the original post:
Watch Out Amazon and Alibaba! Huawei's Coming to the Public Cloud - Barron's (blog)

Read More..

AI Boom Boosts GPU Adoption, High-Density Cooling – Data Center Frontier (blog)

A row of eight NVIDIA graphics processing units (GPUs) packed into a Big Sur machine learning server at Facebook's data center in Prineville, Oregon. (Photo: Rich Miller)

The data center team at eBay is plenty familiar with high density data centers. The e-commerce giant has been running racks with more than 30 kilowatts (kW) of power density at the SUPERNAP in Las Vegas, seeking to fill every available slot in racks whenever possible.

But as eBay has begun applying artificial intelligence (AI) to its IT operations, the company has deployed more servers using graphics processing units (GPUs) instead of traditional CPUs.

"From a data center power and cooling perspective, they're a real challenge," said Serena DeVito, an Advanced Data Center Engineer at eBay. "Most data centers are not ready for them. These are really power-hungry little boxes."

The rise of artificial intelligence, and the GPU computing hardware that often supports it, is reshaping the data center industry's relationship with power density. New hardware for AI workloads is packing more computing power into each piece of equipment, boosting the power density (the amount of electricity used by servers and storage in a rack or cabinet) and the accompanying heat. The trend is challenging traditional practices in data center cooling, and prompting data center operators to adopt new strategies and designs.

All signs suggest that we are in the early phase of the adoption of AI hardware by data center users. For the moment, the trend is focused on hyperscale players, who are pursuing AI and machine learning at Internet scale. But soon there will be a larger group of companies and industries hoping to integrate AI into their products, and in many cases, their data centers.

Amazon Web Services, Microsoft Azure, Google Cloud Platform and IBM all offer GPU cloud servers. Facebook and Microsoft have each developed GPU-accelerated servers for their in-house machine learning operations, while Google went a step further, designing and building its own custom silicon for AI.

"AI is the fastest-growing segment of the data center, but it is still nascent," said Diane Bryant, the Executive VP and General Manager of Intel's Data Center Group. Bryant says that 7 percent of servers sold in 2016 were dedicated to AI workloads. While that is still a small percentage of its business, Intel has invested more than $32 billion in acquisitions of Altera, Nervana and Mobileye to prepare for a world in which specialized computing for AI workloads will become more important.

The appetite for accelerated computing shows up most clearly at NVIDIA, the market leader in GPU computing, which has seen its revenue from data center customers leap 205 percent over the past year. NVIDIA's prowess in parallel processing was seen first in supercomputing and high-performance computing (HPC), supported by facilities with specialized cooling using water or refrigerants. The arrival of HPC-style density in data centers is driven by the broad application of machine learning technologies.

"Deep learning on NVIDIA GPUs, a breakthrough approach to AI, is helping to tackle challenges such as self-driving cars, early cancer detection and weather prediction," said NVIDIA cofounder and CEO Jen-Hsun Huang. "We can now see that GPU-based deep learning will revolutionize major industries, from consumer internet and transportation to health care and manufacturing. The era of AI is upon us."

And with the dawn of the AI era comes a rise in rack density, first at the hyperscale players and soon at multi-tenant colocation centers.

How much density are we talking about? "A kilowatt per rack unit is common with these GPUs," said Peter Harrison, the co-founder and Chief Technical Officer at Colovore, a Silicon Valley colocation business that specializes in high-density hosting. "These are real deployments. These customers are pushing to the point where 30kW or 40kW loads (per cabinet) are easily possible today."

A good example is Cirrascale, a service provider that specializes in GPU-powered cloud services for AI and machine learning. Cirrascale hosts some of its infrastructure in custom high-density cabinets at the ScaleMatrix data center in San Diego.

"These technologies are pushing the envelope," said Chris Orlando, the Chief Sales and Marketing Officer and a co-founder of ScaleMatrix. "We have people from around the country seeking us out because they have dense platforms that are pushing the limits of what their data centers can handle. With densities and workloads changing rapidly, it's hard to see the future."

Cirrascale, the successor to the Verari HPC business, operates several rows of cabinets at ScaleMatrix, which house between 11 and 14 GPU servers per cabinet, including some connecting eight NVIDIA GPUs using PCIe, a configuration also seen in Facebook's Big Sur AI appliance and the NVIDIA DGX-1, a "supercomputer in a box."

Over the past decade, there have been numerous predictions of the imminent arrival of higher rack power densities. Yet extreme densities remain limited, primarily seen in HPC. The consensus view is that most data centers average 3kW to 6kW a rack, with hyperscale facilities running at about 10kW per rack.

Yet the interest in AI extends beyond the HPC environments at universities and research labs, bringing these workloads into cloud data centers. Service providers specializing in high-density computing have also seen growing business from machine learning and AI workloads. These companies use different strategies and designs to cool high-density cabinets.

A TSCIF aisle containment system inside the SUPERNAP campus in Las Vegas. (Photo: Switch)

The primary strategy is containment, which creates a physical separation between cold air and hot air in the data hall. One of the pioneers in containment has been Switch, whose SUPERNAP data centers use a hot-aisle containment system to handle workloads of 30kW a rack and beyond. This capability has won the business of many large customers, allowing them to pack more computing power into a smaller footprint. Prominent customers include eBay, with its historic focus on density, which hosts its GPU-powered AI hardware at the SUPERNAPs in Las Vegas.

For hyperscale operators, data center economics dictates a middle path on the density spectrum. Facebook, Google and Microsoft operate their data centers at higher temperatures, often above 80 degrees in the cold aisle. This saves money on power and cooling, but those higher temperatures make it difficult to manage HPC-style density. Facebook, for example, seeks to keep racks around 10 kW, so it runs just four of its Big Sur and Big Basin AI servers in each rack. The units are each 3U in height.
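The density figures quoted above can be sanity-checked with simple arithmetic. This sketch uses only numbers cited in the article; the per-server wattage is derived from them, not quoted directly.

```python
# Facebook's stated target: roughly 10 kW per rack, four AI servers per rack.
rack_budget_kw = 10.0
servers_per_rack = 4

per_server_kw = rack_budget_kw / servers_per_rack
print(per_server_kw)  # each AI server draws roughly 2.5 kW

# At "a kilowatt per rack unit" (Colovore's observation), a fully packed
# 40-rack-unit cabinet would approach:
kw_per_rack_unit = 1.0
rack_units = 40
print(kw_per_rack_unit * rack_units)  # 40 kW, consistent with the 30-40 kW cabinet loads cited
```

This is why hyperscale operators running warm cold aisles stop well short of a full cabinet of accelerators: the same rack that holds dozens of conventional servers can hit its power and cooling budget with just a handful of GPU boxes.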

Facebook's machine learning servers feature eight NVIDIA GPUs, which the custom chassis design places directly in front of the cool air being drawn into the system, removing preheat from other components and improving the overall thermal efficiency. Microsoft's HGX-1 machine learning server, developed with NVIDIA and Ingrasys/Foxconn, also features eight GPUs.

A custom rack in a Google data center packed with Tensor Processing Unit hardware for machine learning. (Photo: Google)

While much of the AI density discussion has focused on NVIDIA gear, GPUs aren't the only hardware being adopted for artificial intelligence computing, and just about all of these chips result in higher power densities.

Google decided to design and build its own AI hardware centered on the Tensor Processing Unit (TPU), a custom ASIC tailored for Google's TensorFlow open source software library for machine learning. An ASIC (Application-Specific Integrated Circuit) is a chip that can be customized to perform a specific task, squeezing more operations per second into the silicon. A board with a TPU fits into a hard disk drive slot in a data center rack.

"Those TPUs are more energy dense than a traditional x86 server," said Joe Kava, the Vice President of Data Centers at Google. "If you have a full rack of TPUs, it will draw more power than a traditional rack. It hasn't really changed anything for us. We have the ability in our data center design to adapt for higher density. As a percentage of the total fleet, it's not a majority of our (hardware)."

Tomorrow: We look at data center service providers focused on GPU hosting, and how they are designing for extreme density.

See the rest here:
AI Boom Boosts GPU Adoption, High-Density Cooling - Data Center Frontier (blog)

Read More..

Microsoft’s new software tool helps enterprises evaluate cloud move – PCWorld


IT professionals who want help getting a handle on a potential cloud migration have a new tool from Microsoft. The company is offering a Cloud Migration Assessment service that walks customers through an evaluation of the resources they currently use, in order to determine what a move to the cloud would cost.

Microsoft's cost calculation is driven in part by the Azure Hybrid Use Benefit, which lets customers apply their existing Windows Server licenses with Software Assurance to virtual machines running in Microsoft's cloud. That means customers only have to pay the base price for the compute resources they use.

Also starting Wednesday, all customers can invoke the discount from the Azure Management Portal. In the past, this type of deployment of discounted virtual machine images was limited to companies with enterprise agreements with Microsoft. Others had to use Azure PowerShell to configure the discounts.

The moves are part of Microsoft's overall push to get its enterprise customers to move more of their workloads from on-premises servers to the Azure public cloud. The tech titan has been emphasizing tools for running hybrid cloud configurations for quite some time.

"In the past year, we've seen lots of other vendors also starting to talk about hybrid and realizing that it's central to the vast majority of organizations' IT strategies," Julia White, Microsoft's corporate vice president for Azure marketing, said. "And this push here, whether it be the migration tools or in general, better amplifying and clarifying our hybrid capabilities, is all in the essence of recognizing that [hybrid] is the approach for most customers, and it needs to be done in a way that can be durable."

The Cloud Migration Assessment tool lets users manually enter the compute, networking and storage resources that they're already using, or import the same information from an Excel file that's either user-composed or generated by the Microsoft Assessment and Planning Toolkit.

Microsoft's tool takes that information and provides users with a graph that shows them a model for the costs of continuing to run a data center, along with how much they'll pay for running the same workloads in Azure. The tool offers a set of default assumptions about how much an on-premises deployment costs, but customers who have information about the costs associated with their environment can input those, instead.
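At its core, the comparison prices the same resource inventory two ways, under on-premises assumptions and under cloud rates. A minimal sketch of that model follows; the rates are hypothetical stand-ins for the tool's default assumptions.

```python
# Minimal model of the assessment's comparison: price one resource
# inventory under on-premises assumptions vs. cloud rates.
# Monthly rates below are hypothetical, not Microsoft's defaults.

ON_PREM_MONTHLY = {"vm": 120.0, "storage_tb": 40.0}  # incl. power, space, ops
AZURE_MONTHLY = {"vm": 90.0, "storage_tb": 25.0}

def monthly_cost(inventory, rates):
    """Price an inventory dict such as {'vm': 50, 'storage_tb': 10}."""
    return sum(count * rates[kind] for kind, count in inventory.items())

inventory = {"vm": 50, "storage_tb": 10}
print("On-prem:", monthly_cost(inventory, ON_PREM_MONTHLY))
print("Azure:  ", monthly_cost(inventory, AZURE_MONTHLY))
```

Users who know their real on-premises costs would override the default rate table, which is exactly the option the tool offers.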

In order to get access to the tool, users have to hand over their name, contact information, and the name of their company. Microsoft will use that to follow up with users about their experience, and will also work to connect those companies with partner businesses that can help with migration if that makes sense.

"Much like Microsoft in general, we remain very partner-led," White said. "And so, when we can match a great partner with a customer that needs them, that's what we aim to do."

On top of all this, Microsoft also announced that its Azure Site Recovery migration tool will be updated in the coming weeks so that users can more easily use AHUB discounts when migrating from other environments. When that update goes through, users will be able to tag Windows Server VMs that they're migrating for hybrid use discounts. That may entice people to move their Windows Server virtual machines from AWS and on-premises hardware into Azure by making it easier to do so.

Blair Hanley Frank is primarily focused on the public cloud, productivity and operating systems businesses for the IDG News Service.

Continued here:
Microsoft's new software tool helps enterprises evaluate cloud move - PCWorld

Read More..

Oracle CEO: We Can Beat Amazon and Microsoft Without as Many Data Centers – Fortune

Conventional wisdom in the public cloud market is that there are three leaders: Amazon Web Services, followed by Microsoft Azure and Google Cloud Platform.

Those companies assemble and sell massive arrays of servers, storage, and networking to businesses, most of which don't want to build more of their own data centers. Towards that end, those three cloud superpowers alone spent roughly $31 billion last year to extend their data center capacity around the world, according to the Wall Street Journal, which tabulated that total from corporate filings.

By comparison, Oracle, which is making its own public cloud push, spent about $1.7 billion. To most observers, that looks like a stunning mismatch.

But Mark Hurd, Oracle's co-chief executive, would beg to differ. In his view, there are data centers and then there are data centers. And Oracle's data centers, he said, can be more efficient because they run Oracle hardware and supercharged databases.

"We try not to get into this capital expenditure discussion. It's an interesting thesis that whoever has the most capex wins," Hurd said in response to a question from Fortune at a Boston event on Tuesday. "If I have two-times faster computers, I don't need as many data centers. If I can speed up the database, maybe I need one fourth as may data centers. I can go on and on about how tech drives this."
"Our core advantage is what we've said all along, which is that it's about the intellectual property and the software, not about who's got the most real estate," Hurd added. "We have spent billions over the past year, but in isolation, that's a discrete argument that I find interesting, but not fascinating."

Following up via email, Hurd said: "This isn't a battle of capex. This is about R&D, about technology, software, innovation and IP; and then the capex to make it work."

Oracle has said it runs its data centers on Oracle Exadata servers, turbocharged machines that differ fundamentally from the bare-bones servers that other public cloud providers deploy by the hundreds of thousands in what is called a scale-out model. The idea is that when a server or two among the thousands fail, as they will, the jobs get routed to still-working machines. It's about designing applications that are easily redeployed.

Oracle is banking more on what techies call a "scale-up" model in which fewer, but very powerful, computers take on big workloads; in Exadata's case, each machine has its own integrated networking and storage.
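The scale-out model described above depends on rerouting work away from failed machines. A toy sketch of that rerouting logic follows; the server and job names are made up for the example.

```python
# Toy illustration of scale-out failover: jobs assigned to a failed
# server are redistributed round-robin across the healthy machines.
# Server and job names are invented for the example.

def reroute(assignments, failed):
    """Reassign jobs from failed servers across the remaining healthy ones."""
    healthy = [s for s in assignments if s not in failed]
    rerouted = {s: list(jobs) for s, jobs in assignments.items() if s in healthy}
    displaced = [j for s in failed for j in assignments.get(s, [])]
    for i, job in enumerate(displaced):
        rerouted[healthy[i % len(healthy)]].append(job)
    return rerouted

cluster = {"s1": ["a", "b"], "s2": ["c"], "s3": ["d"]}
print(reroute(cluster, failed={"s2"}))
```

In a scale-up design, by contrast, there are fewer machines to fail over between, so resilience is engineered into each box (redundant storage and networking) rather than into the routing layer.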

Oracle execs, including executive chairman Larry Ellison, have argued that Oracle's big machines can actually work cheaper and more efficiently than the other public cloud configurations. Many industry analysts have their doubts, maintaining Oracle must spend much more to catch up with Amazon. Toward that end, in January, Oracle announced plans to add three new data center farms within six months, with more to come.

There are those who think that Fortune 500 companies relying on Oracle databases and financial applications give Oracle an advantage because they are loath to move those workloads to another cloud provider, despite AWS wooing them with promises of easy migrations and other perks.

In late March, AWS chief executive Andy Jassy claimed the company had converted 22,000 databases from other vendors to its own database services. AWS does not break out which databases those customers had been using.

Hurd took up that point as well: "How much database market will Oracle lose to [Amazon] Aurora? My guess is close to zero." (Aurora is one of several database options that AWS offers.)

"The third largest database in the world is IBM DB2, and it's been going out of business for 20 years," Hurd said in a characterization that IBM ( ibm ) would dispute. "If it was so easy to replace databases, DB2 market share would be zero."

That is because most databases, which companies rely on as the basis for core accounting and financial operations, run custom programming, which is hard to move.

Still, Oracle, like IBM, Microsoft, and virtually every legacy information technology provider, faces a huge challenge in Amazon. AWS is on track to log more than $14 billion in revenue this year.

Note: (April 12, 2017 12:55 p.m.) This story was updated to add an additional quote from Oracle's Mark Hurd.

Go here to see the original:
Oracle CEO: We Can Beat Amazon and Microsoft Without as Many Data Centers - Fortune

Read More..