Category Archives: Cloud Computing

Global Cloud Computing Market Report (2017-2023) – Almost All Sectors are Shifting Databases to Cloud – Research … – Business Wire (press release)

DUBLIN--(BUSINESS WIRE)--Research and Markets has announced the addition of the "Global Cloud Computing Market Insights, Opportunity Analysis, Market Shares and Forecast 2017 - 2023" report to their offering.

Cloud computing is an internet-based technology that serves a centralized data source to remote devices connected via the internet or an intranet. It is a network in which a program or application runs on a server and is shared across multiple devices such as personal computers (PCs), laptops, and mobile devices. The sole requirement is connecting to the server through a communication network such as the internet, an intranet, a Local Area Network (LAN) or a Wide Area Network (WAN). Earlier stages of virtualization were characterized by features such as integrated storage, CapEx and OpEx savings, and data security.

Cloud computing enables the use of data to support business decisions while keeping IT costs down. It also enables organizations to leverage PaaS for faster application deployment. Cloud computing underpins widely used applications from major companies, such as Dropbox Inc.'s Dropbox storage, Google Inc.'s Gmail and Google Auto Backup, Facebook, Evernote, Skype and many more.

Cloud computing services have a transformational effect on all sectors across the globe. Almost all sectors are shifting, or at least considering shifting, their databases into the cloud. Today several IT companies have moved to cloud computing, and other sectors such as healthcare and social media are among the most affected. There are several reasons why a company or an industry migrates to the cloud, but the major one is cost reduction: cloud services allow businesses to reduce their capital expenditure by eliminating the cost of IT infrastructure.

Key Topics Covered:

1. Overview Of Cloud Computing

2. Key Components Of Cloud Computing

3. Market Determinants

4. Sector Analysis

5. Competitive Landscape

6. Geographical Analysis

7. Company Profiling

Companies Mentioned:

For more information about this report visit http://www.researchandmarkets.com/research/hvqx62/global_cloud


The Doyle Report: The New Rules for Computing – MSPmentor

Depending on how you count, we are either in the third wave of technology transformation, the fourth or even the fifth.

The designations don't matter as much as the impact. Ross Brown, senior vice president of worldwide partners and alliances at VMware, simply calls the current era "the new wave of computing." It is having profound impacts on the channel and beyond. Take software development.

If something is readily available in the cloud, why not leverage it? Many technology buyers don't need or want ownership of basic capabilities. They simply want to move swiftly ahead with their digital objectives.

This shift in philosophy is foundational, says Brown. This is because the thinking now disconnects physical infrastructure from applications delivery, which, though admittedly wonky, is a big deal to CIOs and the partners that support them. For most of their careers, these IT professionals have prioritized things such as infrastructure, security, redundancy, etc. Now? Competitive pressures have them thinking more about functionality, ease-of-use and time-to-market.

Ross Brown, VMware SVP & Channel Chief

For partners who used to think in terms of resource optimization, five-nines reliability or bullet-proof security, this change in priorities is a radical shift. Speaking recently in San Jose at a tech event for channel leaders, Brown identified what he believes are the new rules of computing for 2017. They include the following.

These new realities have left many solution providers in a quandary, especially those that sell networking and storage solutions but do not touch applications. How can they stay relevant when the bulk of spending is going to line-of-business managers who are interested in business outcomes and not systems integration? It's a pressing question for thousands of channel companies.

So what does it all mean? Several things. For one, "your mess for less" outsourcing will suffer. IP assets sold as virtual services will grow increasingly more attractive than IT value delivered by physical labor.

In addition, Brown predicts, single-layer solution partners and VARs, no matter how large and capable, will face headwinds. The CIOs they sell to are growing weary of serving as project management offices (PMOs) that take orders from line-of-business executives and then break down these requests into discrete tasks doled out to third party contractors and in-house staffers. The normal model of enlisting a VAR or MSP to do a discrete task around a layer of IT is under direct attack, Brown says.

In the meantime, ISV and applications-specific IP will become a sustainable differentiator for many partners who change their business models.

Finally, appliances will go virtual and be designed by IP owners, not integrated by CIOs using the PMO approach.

When will all this happen? Brown says the channel can expect big changes from vendors in the next 24-36 months, which will lead to widespread customer shifts over the next five years and more.

"I'm teased internally that I'm always looking ahead three years out in a company that looks at things 90 days out," jokes Brown. "But that's the role of a good channel chief."


Microsoft will give Azure customers access to 10000 patents to fight lawsuits – GeekWire

Microsoft will help its cloud computing customers fight lawsuits from patent trolls.

As an added benefit to those paying for Microsoft Azure, the tech giant announced that it will now offer Microsoft Azure IP Advantage, a new program that helps customers hit with intellectual property lawsuits.

Microsoft President Brad Smith called it "the industry's most comprehensive protection against intellectual property (IP) risks."

Our goal is to help foster a community that values and protects innovation and investments in the cloud, Smith wrote in a blog post. We want software developers to be able to focus on coding, and businesses and enterprises to be able to respond to the changing needs of their customers with agility without worrying about lawsuits.

Smith noted that as more and more companies utilize cloud-based software services, there has been a rise in cloud-related IP lawsuits from patent trolls, which do not sell actual products but profit off patent litigation.

Microsoft's new program will allow customers to utilize 10,000 of the company's patents, free of charge, for lawsuits against their services that run on top of Azure. It will also expand an existing service that provides resources to those fighting patent lawsuits to include any open source technology that powers Microsoft Azure services, such as Hadoop used for Azure HD Insight.

Finally, the program assures Azure customers that if Microsoft transfers patents in the future to non-practicing entities, they can never be asserted against them.

The program is an added benefit for customers of Azure, which competes with Amazon Web Services, Google Cloud, and others. Reuters noted that it's designed to help an automaker, for example, that has auto-related patents but none for its mobile apps or connected car technologies that could put them at risk for litigation.

In its earnings report for the December quarter, Microsoft reached an annualized commercial cloud run rate of more than $14 billion, driven in part by its Azure cloud computing division.

AWS is still the market leader by a wide margin, although direct comparisons to Azure are difficult because Microsoft includes cloud applications such as Office 365 in its financial calculation.

In the December quarter, Microsoft's Intelligent Cloud segment posted revenue of $6.86 billion, up 8 percent, and operating income of $2.39 billion.

In a call with analysts after the earnings release, CEO Satya Nadella repeated his earlier vow that the company will reach $20 billion in annualized cloud revenue by the company's fiscal 2018, which ends June 30, 2018.


The future isn’t cloud. It’s multi-cloud – Network World

Network World | Feb 6, 2017 7:58 AM PT


Cloud computing was supposed to simplify IT environments. Now, according to a recent study by Microsoft and 451 Research, nearly a third of organizations work with four or more cloud vendors. It would seem multi-cloud is the future of cloud computing. But what is driving this trend?

Some organizations simply want to have more options: using multiple cloud providers to support different applications and workloads means they can use the solution best suited to their needs. For example, an organization's core applications may need to be resilient, able to keep running even if local power is lost or to expand or contract capacity depending on workload.

Other departments in the same organization may need customer management, data analytics and modelling tools available anywhere in the world to enhance productivity.

Settling on a single cloud model would create compromises for such an organization that would ultimately dilute its benefits and the business use case. It's inevitable that big companies with many divisions and their own agendas and vendor alliances will end up with multiple clouds.

According to a report by Ovum, a quarter of European firms are unhappy with their cloud service provider largely due to poor service performance, weak service-level guarantees and a lack of personalized support.

Organizations tend to prefer a multi-cloud strategy to get out of the "keeping all your eggs in one basket" problem that can leave them vulnerable to a variety of issues, such as cloud data center outages, bandwidth problems and vendor lock-in. A cloud application that consistently goes offline doesn't reflect well on a business and can ultimately lose it customers. If critical data and applications depend on a single cloud provider, the ability to negotiate through business disagreements and arbitrage compute and data storage pricing is also constrained.

Data sovereignty and compliance issues are also leading to a surge in multi-cloud as organizations, particularly in Europe, worry about how to comply with current rules and their exposure if they operate in areas where no rules governing cloud services yet exist. Storing data locally minimizes issues over data sovereignty whilst directing traffic to data centers closest to users based on their location is vital for latency-sensitive applications.

While organizations may want to deploy a multi-cloud strategy, the reality is that moving between clouds can be challenging. Unfortunately, no two IT environments are ever alike, and the cloud is no exception. While cloud providers do all they can to make it simple for their clients to move applications to their platforms, they don't want to make it easy to leave; after all, absolute portability would reduce their business to a price-sensitive commodity.

Many organizations are rightly concerned about the downtime involved in moving petabytes of data between cloud providers. Fortunately, the same patented Active Data Replication technology that all the major cloud vendors offer to make it simple for customers to move to the cloud can also be used to migrate data between the clouds.

The recent acquisition by Google of Orbitera, a platform that supports multi-cloud commerce, shows that Google recognizes that multi-cloud environments are the future. The ramifications of this are huge. While Amazon Web Services (AWS) remains the dominant player in the space, businesses wanting the freedom to juggle multiple cloud services and avoid vendor lock-in may well help the other players to catch up.

In a market estimated by Gartner to be worth $240 billion next year, multi-cloud creates a new front in the so-called cloud computing wars. This can only be good news for those businesses looking for flexibility, cost savings and ultimately better solutions.

This article is published as part of the IDG Contributor Network.

David Richards is co-founder, president and CEO of WANdisco. He has more than 15 years of experience as an executive in the software industry and sits on boards and advisory boards of Silicon Valley startups.



AWS CEO: Luck gave Amazon’s cloud-computing unit a boost – The Seattle Times

Speaking at the University of Washington, Andy Jassy, head of Amazon Web Services, said that luck played a part in AWS' huge growth but early decisions paved the way for the company's cloud-computing dominance.

The head of Amazon Web Services (AWS) said its dominance in cloud computing stems, in part, from luck but also from early decisions that proved critical to establishing a revolutionary business.

"There's always a fair amount that is luck," AWS CEO Andy Jassy said Tuesday in a speech at the University of Washington's department of computer science and engineering. "You have to have the right timing and some things have to break your way."

Jassy's comments come as AWS, Amazon's cloud-computing unit, which rents out computing power and storage to enterprises, governments and entrepreneurs, has become a $14 billion-a-year business, driven in recent years by mass migration of data from companies' private data centers to shared ones. It's a decade-old business that Amazon pioneered and a hugely profitable one.

In 2016 it brought home $3.1 billion in operating income, 32 percent more than Amazon's North America retail unit, the company's largest business by revenue. It's also, by far, the largest cloud provider.

AWS' sales growth rate, however, slowed to 47 percent in the fourth quarter of 2016 from 69 percent in the year-earlier period, amid stiffening competition from Microsoft, Google and other large tech firms.

Among the early decisions at the foundation of AWS' fortunes was the creation of "primitive building blocks," basic functions that customers could use and combine according to their needs, Jassy said.

Also key: AWS sold its services a la carte and based charges on usage, much like a public utility. That was a big departure from the expensive, multiyear contracts that technology providers typically charged. People gave us a lot of credit early on for the pricing model, Jassy said.

The next critical decision was the market AWS first went after.

AWS leaders very consciously targeted software developers and startups early on, even though they knew that enterprises and governments would eventually be the largest clients, according to Jassy.

"That turned out to be an extremely underserved segment," Jassy said. A lot of those developers were spending only a few bucks on AWS services, "but we didn't mind that," the executive said. "Some of those are going to be the next big enterprise in the next five to 10 years."

It was also necessary to keep innovating, quickly, to adapt to growing needs. Being quick and moving fast and being feature-poor to start with only works if you can deliver and iterate quickly, Jassy said.

Now the breadth and sophistication of AWS' offerings include artificial intelligence, voice computing, databases and machine learning tools, some of which draw on the innovations Amazon has deployed on its Alexa digital assistant and in its fulfillment centers.

As for the growing competition, Jassy said that the market for computing-related services is so big that there is room for a number of successful cloud providers. "But I don't think there's going to be 30, because scale really matters."


Strategic Focus Report – Cloud Computing – PR Newswire (press release)

LONDON, Feb. 8, 2017 /PRNewswire/ -- Summary: This strategic focus report analyses the current trends, drivers, and inhibitors impacting the cloud computing market. The report outlines the evolution of cloud computing technologies, and identifies and assesses the best performing vendors in the market. This report also presents a view of the revenue opportunities in the cloud computing market through to 2020, highlighting the market size and growth by technology, geography, sectors, and size band. Moreover, following in-depth ICT decision maker surveys, the report outlines enterprises' investment priorities in the cloud computing segment.

Key Findings
- With the emergence of big data platforms such as Hadoop and NoSQL, as well as the need to perform high-processing operations involving machine learning and predictive analytics, hybrid cloud appears to be the best option for enterprises.
- Containers as a service (CaaS) is emerging as an enhanced version of platform as a service (PaaS) offerings, as PaaS services are becoming increasingly commoditized, making it difficult for vendors to create any differentiation.
- The growing demand for disaster recovery services in the cloud environment has persuaded IT vendors to develop innovative cloud solutions with disaster recovery and business continuity as the central theme, now termed disaster recovery as a service (DRaaS).

Synopsis: Strategic Focus Report - Cloud Computing analyses the current trends, drivers, and inhibitors impacting the cloud computing market. The report outlines the evolution of cloud computing, and identifies and assesses the best performing vendors in the market. This report also presents a view of the revenue opportunities in the cloud computing market through to 2020, highlighting the market size and growth by technology, geography, sector, and size band. Moreover, following in-depth ICT decision maker surveys, the report outlines enterprises' investment priorities in cloud computing. This product covers the latest trends in the cloud computing market, coupled with insight into the vendor landscape and market size in the cloud computing domain.

In particular, it provides an in-depth analysis of the following:
- The latest trends impacting the cloud computing market.
- The market drivers (both supply-side and demand-side) that will facilitate the growth of the cloud computing market.
- The market inhibitors that may hinder the pervasive adoption of cloud computing.
- Identification of the top ICT vendors in the cloud computing market, coupled with an overview of the top 5 vendors.
- The primary findings from the view of revenue opportunities in the cloud computing market through to 2020, highlighting the market size and growth by technology, geography, sectors and size band.
- An identification of enterprises' investment priorities based on their budget allocations relating to cloud computing.

Reasons to Buy
- Understand the cloud computing landscape and the recent trends, drivers, and inhibitors shaping the cloud computing segment.
- Comprehend the cloud computing vendor landscape and track vendors' relative performance in the cloud computing market to gain a competitive advantage.
- Enhance your market segmentation by analyzing the revenue opportunity forecast figures in the cloud computing market from 2015 to 2020, spanning six regions, 14 verticals, and two size bands.
- Understand how organizations' cloud computing requirements are set to change in the next two years in order to prioritize your target market.

Download the full report: https://www.reportbuyer.com/product/4595288/

About Reportbuyer: Reportbuyer is a leading industry intelligence solution that provides all market research reports from top publishers. http://www.reportbuyer.com

For more information: Sarah Smith, Research Advisor at Reportbuyer.com. Email: query@reportbuyer.com. Tel: +44 208 816 85 48. Website: http://www.reportbuyer.com

To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/strategic-focus-report---cloud-computing-300404494.html


What MSPs Must Have for Customer Cloud Security – Talkin’ Cloud

In outsourced cloud computing services, public cloud Platform as a Service (PaaS) providers only ensure security on the outside of the cloud, not inside. For security inside the cloud, PaaS users have to take matters into their own hands. While that should concern all public cloud denizens, for managed service providers (MSPs) the issue gets magnified by the number of customers on their Software as a Service (SaaS) solutions.

"It's the most important practice for security practitioners to do everything to minimize risk of SaaS infrastructure security gaps within their own organizations first," says Chris Carter, CEO of Approyo, a global SAP solution provider. Some of the most important steps can help security managers tighten cloud security and keep their organizations safe by leveraging Cloud Access Security Brokers (CASB). These tools help executives find unauthorized applications and manage risk across all their clouds.

And with many MSPs responsible for a great and growing number of customer public cloud instances, it has become impossible to manually maintain security on them all. However, continually monitoring security and configuration vulnerabilities remains a mission-critical item to cross off the MSP checklist. How to accomplish it has yet to receive a definitive answer. Talkin' Cloud reached out to industry thought leaders to ask what they think. What follows remains anecdotal and does not purport to cover all aspects of the subject. If something significant got left out, leave a comment. Let's discuss it.

The In Crowd Source

With large public cloud PaaS providers like Amazon Web Services (AWS) and Microsoft Azure busy battling for control of the internet business of governments and Fortune 500 companies, they may have overlooked prospects of the MSP market and its ecosystem of startups. While inside-out-only security seems just fine for those large customers, who have their own legacy business IT departments to worry about internal security, it does not come close for MSPs and the Internet of Things (IoT) and billions of device events headed their way.

Even the most secure cloud providers only offer security of the cloud, says Matthew Fuller, co-founder, CloudSploit, provider of automated AWS security and configuration monitoring. The user is responsible for security in the cloud. As groups, roles and devices change, oversights and misconfigurations open vulnerabilities that can lead to outright hacks or financial DDoS.

To help solve this issue, continual monitoring of AWS instances can prove effective. For example, CloudSploit customers can run the tests they choose, or create their own, as frequently as desired, according to Fuller. If they find issues, CloudSploit alerts designees, keeping records of findings, detailed issue descriptions and likely resolutions.

Security experts from around the world contribute to CloudSploit, Fuller says. It is an open source project with the goal of increasing compliance with best practices to protect MSP infrastructure and customer information.
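
What such continual monitoring boils down to can be shown with a small configuration check against the AWS APIs. The following is a minimal sketch, not CloudSploit's actual implementation: it uses boto3 to flag security groups that leave SSH open to the whole internet, with the region and port chosen purely for illustration.

```python
# Minimal sketch of a configuration check in the spirit of the monitoring
# described above; not CloudSploit's implementation. Assumes boto3 is
# installed and AWS credentials are configured. Region and port are
# illustrative choices.
import boto3

def find_open_ssh_groups(region="us-east-1", port=22):
    """Return security group IDs that allow the given port from 0.0.0.0/0."""
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            from_port = rule.get("FromPort")
            to_port = rule.get("ToPort")
            # Skip rules that do not cover the port of interest
            # (rules with no FromPort cover all ports and are still checked).
            if from_port is not None and not (from_port <= port <= to_port):
                continue
            if any(r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])):
                flagged.append(group["GroupId"])
    return flagged

if __name__ == "__main__":
    for group_id in find_open_ssh_groups():
        print(f"Security group {group_id} allows SSH from anywhere")
```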

Benchmarks, Shared Responsibility and Control Planes

With MSPs overwhelmed by information technology (IT) application duties, and customers in many cases at best novices about cloud security, exactly how it must get done remains unclear. Agreement on the course of action, and on who has the onus for completion, must take priority before establishing SaaS defense, and before a cloud exploit comes into existence.

The cloud provider shared responsibility model places a security burden on enterprises consuming services, says Dave Ginsburg, vice president, marketing, Cavirin, provider of security and compliance across physical, public and hybrid clouds. But, in some cases, IT will not have the processes or expertise to properly mitigate risk. The result may be a breach that could have been prevented or reluctance to move critical applications to the cloud, creating competitive disadvantage.

What MSPs and cloud customers need are consensus benchmarks to properly share responsibility for their respective pieces of the security pie. The fact that cloud workloads now exist in constant flux, exacerbated by virtualization and containers, makes it even more critical, according to Ginsburg. But how do you gauge how well the parties divide the responsibility?

Therefore, enterprises require continuous visibility into their security postures and one set of tools designed to test against benchmarks that include NIST, CIS, PCI, HIPAA and FISMA, Ginsburg says. Tools deployed by the enterprise should support these and have full visibility into different AWS services via APIs. The same applies for Microsoft Azure, Google Cloud Platform and others.

Fortunately, many public PaaS providers including AWS have issued security best practices for hardening cloud instances that align with CIS. And MSPs can employ third party compliance solutions to help implement them. In addition, both AWS and Azure provide sophisticated tools to secure access to their control planes.

"A control plane compromise is generally worse than a server compromise, as control planes provide access to servers as well as direct access to the account," says Jarret Raim, head of strategy and operations at Rackspace Managed Security. Tools like CloudTrail from AWS will surface the changes made to the control planes and should be monitored for abuse.
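
As a rough illustration of monitoring the control plane the way Raim describes, the sketch below (an assumption-laden example, not Rackspace's tooling) uses boto3's CloudTrail client to pull recent console sign-in events for review.

```python
# Rough sketch of pulling recent control-plane events from AWS CloudTrail;
# illustrative only, not the tooling described in the article. Assumes boto3
# is installed and AWS credentials are configured; the event name is one
# common example of control-plane activity worth reviewing.
from datetime import datetime, timedelta
import boto3

def recent_console_logins(hours=24, region="us-east-1"):
    """Yield (time, user) for CloudTrail console sign-in events from the last `hours` hours."""
    cloudtrail = boto3.client("cloudtrail", region_name=region)
    paginator = cloudtrail.get_paginator("lookup_events")
    start = datetime.utcnow() - timedelta(hours=hours)
    pages = paginator.paginate(
        LookupAttributes=[{"AttributeKey": "EventName", "AttributeValue": "ConsoleLogin"}],
        StartTime=start,
    )
    for page in pages:
        for event in page["Events"]:
            yield event["EventTime"], event.get("Username", "unknown")

if __name__ == "__main__":
    for when, who in recent_console_logins():
        print(f"{when.isoformat()} console login by {who}")
```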

Points of Demarcation

Hand-in-hand with the cooperation that must exist between MSPs and customers when it comes to security, a boundary must delineate where each has total responsibility in cloud defense. And it cannot come as an afterthought. Security among MSPs and customers needs careful planning built in from the beginning, with proper resource liaising a must, according to cyber infrastructure experts.

Security management is a key prerequisite for driving a cloud strategy, says Steve Hanney, chief cloud officer, Presidio, an IT solutions provider focused on digital infrastructure, cloud and security solutions. And with vendor management from the outset paramount, there must be a demarcation of responsibility defined between MSP and customer.

After creating customer security rules of engagement, MSPs can craft secure and compliant environments that protect services and data seamlessly, end-to-end throughout the relationship lifecycle, according to Hanney. This secure network access control (secure infrastructure) establishes appropriate predefined traffic rules using firewalls, network policy engines and secure tunnels between customer on-premises data center environments and off-premises MSPs, paraphrasing Hanney, with a reference link provided by Cloudscene.

Blacklisting vs. Whitelisting

As the off-premises cloud solutions concept takes hold, the trust model of internet connection must change, in the view of some IT experts. Whereas a presumed-innocent-until-proven-guilty mindset that attempted to fingerprint black hats upfront has prevailed among many security experts until now, the explosion of links in cloud computing has made more cautious practitioners dissent and opt for a trust-but-verify stance to identify white hats in advance.

The cloud is becoming a set of computing utilities and will be as essential as the electricity grid, says Amir Sharif, co-founder, Aporeto, provider of comprehensive cloud-native security for deploying and operating cloud-native applications. Like any critical service, security needs to be part of cloud infrastructure and automatic. Protecting individual data assets in the cloud requires a whitelist security model, only allowing intended connections, instead of the existing blacklist model, where all links are implicitly allowed unless explicitly prohibited.

Implementation of this security model requires a robust policy regime where application and personal intention get captured, if possible, and described easily, according to Sharif.
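
A toy example makes the contrast concrete. The sketch below, with made-up service names and policy entries, permits only explicitly intended connections (the whitelist model Sharif describes) and shows, for comparison, the blacklist stance it replaces.

```python
# Rough sketch of a whitelist ("default deny") connection policy of the kind
# described above, contrasted with a blacklist model. The service names and
# policy entries are illustrative assumptions, not any product's real schema.
ALLOWED_CONNECTIONS = {
    # (source service, destination service, destination port)
    ("web-frontend", "orders-api", 443),
    ("orders-api", "orders-db", 5432),
}

def connection_allowed(src, dst, port):
    """Whitelist model: permit only connections that are explicitly intended."""
    return (src, dst, port) in ALLOWED_CONNECTIONS

# Blacklist model, shown only for contrast: everything is allowed unless
# explicitly prohibited, which is the stance the whitelist model replaces.
BLOCKED_CONNECTIONS = {("web-frontend", "orders-db", 5432)}

def connection_allowed_blacklist(src, dst, port):
    return (src, dst, port) not in BLOCKED_CONNECTIONS

print(connection_allowed("web-frontend", "orders-api", 443))            # True: intended
print(connection_allowed("web-frontend", "orders-db", 5432))            # False: not whitelisted
print(connection_allowed_blacklist("attacker-pod", "orders-db", 5432))  # True: blacklist lets it through
```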

Password Reuse and Brute Force Attacks

As many know, the internet has become increasingly hostile, with cyber criminals targeting poorly secured hosted services. For example, at MSPs and other hosted services, applications can come under attack from old-fashioned hacking attempts like password reuse and brute-force attacks. This poses particular problems for MSPs that use remote monitoring and management (RMM) solutions to administer customer accounts, according to MSP security experts.

"The most likely RMM compromise is a password reuse attack," says Ian Trump, global security lead at SolarWinds MSP. This scenario has led to account compromise in hosted services like GitHub and others. It is also the easiest attack to mitigate: simply enabling Two-Factor Authentication (2FA) protection of your RMM dashboard prevents account compromise in the event your password falls into the hands of the bad guys from a previous data breach.
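
The second factor Trump refers to is typically a time-based one-time password (TOTP). Below is a minimal sketch of how such a check works, assuming the third-party pyotp library; the secret and account name are placeholders, and a real RMM dashboard would layer this check onto its existing login flow.

```python
# Minimal sketch of TOTP-based two-factor verification, the kind of second
# factor described above. Assumes the third-party pyotp library; the secret
# is a placeholder generated at enrollment, and the account name is made up.
import pyotp

# At enrollment: generate and store a per-user secret, and show it to the
# user as a provisioning URI / QR code for their authenticator app.
user_secret = pyotp.random_base32()
totp = pyotp.TOTP(user_secret)
print("Provisioning URI:", totp.provisioning_uri(name="tech@example-msp.com",
                                                 issuer_name="RMM Dashboard"))

def second_factor_ok(stored_secret, submitted_code):
    """Return True only if the submitted code matches the current TOTP value."""
    return pyotp.TOTP(stored_secret).verify(submitted_code)

# At login: check the code only after the password check has already passed.
print(second_factor_ok(user_secret, totp.now()))  # True for a freshly generated code
print(second_factor_ok(user_secret, "000000"))    # almost certainly False
```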

Brute-force attacks that guess weak passwords on RMM accounts are the next most likely MSP exploit, according to Trump. But SolarWinds always offers 2FA options to customers to mitigate successful guesses by hackers.

And MSPs can prevent further hacking by banning the IP addresses from which brute-force attacks emanate, according to Trump.
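
One way to picture that mitigation is a counter of failed logins per source address with a ban threshold. The sketch below is an in-memory illustration with arbitrary threshold and window values; a real deployment would persist the state and push the banned addresses to a firewall or the RMM platform's own controls.

```python
# In-memory sketch of banning IPs that repeatedly fail login, as described
# above. The threshold and window are arbitrary illustrative values; a real
# system would persist state and push blocks to a firewall or edge device.
import time
from collections import defaultdict, deque

MAX_FAILURES = 5        # failures allowed per window before banning
WINDOW_SECONDS = 300    # sliding window length in seconds

failures = defaultdict(deque)   # source IP -> timestamps of recent failures
banned = set()

def record_failed_login(ip, now=None):
    """Record a failed login attempt; return True if the IP is (now) banned."""
    now = time.time() if now is None else now
    if ip in banned:
        return True
    recent = failures[ip]
    recent.append(now)
    # Drop failures that fall outside the sliding window.
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()
    if len(recent) >= MAX_FAILURES:
        banned.add(ip)          # in practice: add a firewall deny rule here
        return True
    return False

# Example: the fifth rapid failure from one address triggers a ban.
for attempt in range(5):
    blocked = record_failed_login("203.0.113.7", now=1000.0 + attempt)
print("203.0.113.7 banned:", blocked)  # True
```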


WTF is cloud computing? – TechCrunch


After more than a decade of being in the popular tech lexicon, people kind of get the idea of the cloud, but most probably only understand a bit of it. That's because the cloud isn't a single concrete thing so much as a concept that encompasses many ...


Cloud computing key to precision medicine but security concerns … – Healthcare IT News

Precision medicine promises to change the healthcare paradigm and create a powerful new model of care designed specifically for each individual, offering a much greater likelihood of effectiveness. But to realize much of this vision requires eliminating data silos and aggregating information from all sources (Internet of Things, patient surveys, genomic data, EHRs and more) into a central repository that gives clinicians worldwide access to this data.

Many believe the cloud will become the primary platform for data aggregation and harmonization. And that's the direction Nephi Walton, MD, informaticist and clinical geneticist at Washington University School of Medicine, is heading.

"People are misled a bit by the benefits of cloud computing in this domain," Walton said. "A lot of people are touting the advantage of data anywhere, which indeed is an advantage of cloud computing. But that is not the major role that the cloud will play in precision medicine. It's more related to the size of the data and the ability to analyze and access data quickly. And the ability to plug into cloud services of different types. There are certain things that lend themselves to cloud computing more than others, and in precision medicine it is more the large data sets involved."

A caregiver may have huge data sets, entire genomes, gigabytes on an individual. Historically, healthcare has placed all this data on huge servers; but when one has this large a data set on each patient, one really needs to use distributed computing, Walton explained.


When you think about who is doing it and allowing huge sets of analysis, you think of someone like Google; it is using database cloud computing technology, he explained. The advantage is setting up all of these large data sets in something similar to Google Big Table, a database structure different from standard relational databases, as it allows one to use a distributed model that enables rapid access to large amounts of data. Here you can quickly access lots of information and process it quickly.

That is where the power will be in terms of cloud computing and precision medicine, he added.

"You will have someone's genomic data in an accessible database where you can access all the people with certain conditions to do real-time analysis and apply new knowledge to large data sets quickly," he said.
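
To make that access pattern concrete, here is a minimal sketch assuming the google-cloud-bigtable Python client; the project, instance, table, column family and row-key scheme are all hypothetical illustrations rather than anything Walton described. The idea is simply a prefix scan over one patient's rows instead of a relational join.

```python
# Minimal sketch of the wide-column access pattern described above, using the
# google-cloud-bigtable client. Assumes credentials are configured; the
# project/instance/table names, the "calls" column family and the row-key
# scheme (patient#chromosome#position) are hypothetical illustrations.
from google.cloud import bigtable

client = bigtable.Client(project="example-precision-medicine")
instance = client.instance("genomics-instance")
table = instance.table("variants")

def variants_for_patient(patient_id, chromosome):
    """Scan every variant row for one patient and chromosome by key prefix."""
    prefix = f"{patient_id}#{chromosome}#".encode()
    # Bump the last byte of the prefix so the scan stops right after it.
    end = prefix[:-1] + bytes([prefix[-1] + 1])
    for row in table.read_rows(start_key=prefix, end_key=end):
        yield row.row_key.decode(), {
            qualifier.decode(): cells[0].value.decode()
            for qualifier, cells in row.cells.get("calls", {}).items()
        }

for key, call in variants_for_patient("patient123", "chr7"):
    print(key, call)
```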

Walton will be speaking on the benefits and challenges of using cloud computing with precision medicine at the HIMSS and Healthcare IT News Cloud Computing Forum in Orlando, Florida, on February 19, during the 2017 HIMSS Conference & Exhibition. Walton's session is entitled "Precision Medicine and the Cloud."

One of the big challenges with cloud computing and precision medicine is people's fear of data security, Walton said.

"The thing people do not realize is that the cloud is probably in some ways more secure than what a lot of people are doing now," he said. "I know of some organizations that are fearful of the security of putting the data out there in the cloud but that actually have serious gaping security holes that expose them to far more risk than would happen with cloud computing. Anytime you allow remote access to data, your weakest link in security is your employees' passwords. If you have any reasonable security, the weakest spot will be at the employee level."

Walton said most companies that provide cloud computing services have excellent reliability and security and can provide these things on a scale that would be difficult for smaller organizations and even challenging for larger organizations.

The issue is it is not cheap, he added. But when you look at all the people you employ for security and backup and maintenance and so forth, for smaller organizations it makes sense to turn to the cloud; for larger organizations, it depends.

In the end, healthcare organizations must understand why they wish to get into cloud computing before they actually do so, Walton advised.

"Understand if you are doing it for the right reasons, that you have done a good analysis of not just the real obvious things and are not just jumping into the ring without fully understanding why," he said. "If you do it from the perspective of you do not want to be the person who manages servers and worry about backups and data security, essentially what you are doing is putting off a lot of your IT expenses to someone else. You can build a cloud in-house. The question is can you do it more efficiently than someone whose job and mission it is to do that. You have to make sure you know why you are doing it and the benefits you will get from it."

HIMSS17 runs from Feb. 19-23, 2017, at the Orange County Convention Center.

This article is part of our ongoing coverage of HIMSS17. Visit Destination HIMSS17 for previews, reporting live from the show floor and after the conference.



The Best Cloud Computing Companies And CEOs To Work For In 2017 Based On Glassdoor – Forbes


Employees would most recommend Apprenda, DocuSign, Google, HyTrust, Mendix, M-Files, Mimecast, OutSystems, ProsperWorks and Zerto to their friends looking for a cloud computing company to work for in 2017. These and other insights are from an ...
