Category Archives: Cloud Hosting

Cloud Hosting Solutions – UK Cloud Hosting Provider Netcetera

The term 'cloud' is used to describe a number of very different products, but in our case, it refers to on-demand, scalable, virtualized servers accessible over the internet.

Cloud hosting is a service in which websites are hosted on virtual servers that draw their computing resources from a pool of physical web servers. It is an alternative to single-server hosting, and many more companies are now using it. It can be far more reliable because, instead of relying on a single server to host information, the load is spread over a large network, so if one server goes offline it has very little effect on the resource.

Cloud hosting can also provide better security, as the machines are spread across different sites, leaving the service less vulnerable to attack. If extra resources are required, they can be drawn from the range of servers available, something a single server cannot offer. It is also possible to scale the amount of resources allocated up or down depending on each customer's demand.

You gain a lot more flexibility and can adopt more innovative ways of working without worrying so much about where the resources to support them will come from, and you can move faster as well. There is therefore a strong case for using a cloud hosting service rather than a single-server one. It is well worth investigating as a solution for your company, to see whether you can not only save money but also find a better way of working.

See the original post:
Cloud Hosting Solutions - UK Cloud Hosting Provider Netcetera

Stay out of the hot seat with turnkey private cloud – CIO

sponsored

In the pursuit of digital transformation, most organizations will use a mix of Infrastructure as a Service (IaaS) and cloud-native platforms (PaaS) deployed across both public and private cloud, an approach known as a multi-cloud strategy. (See the CIO.com blog "Overcoming the Digital Dilemma" for more details.) A cloud operating model is supposed to reduce friction, i.e., make it easy to deliver the digital systems on which most organizations rely. The problem is that implementing and maintaining a private cloud can be challenging, and this introduces friction in the form of delays, costs and risk. As we know from physics, friction produces heat, and in this scenario IT pros whose projects incur undesirable delays, cost and risk find themselves in an uncomfortable place: the hot seat. Choosing a turnkey private cloud instead of pursuing a do-it-yourself (DIY) cloud implementation reduces the friction in your multi-cloud strategy, keeping you out of the hot seat.

Most organizations will use both private and public cloud computing for appropriate workloads. Steady-state workloads can be hosted on your private cloud at a lower cost than public cloud hosting while the cost of public cloud can be lower for shorter-lived and spike workloads (see the white paper The Cost of Using the Public Cloud.) A private cloud allows strict control over data placement within the requirements of an organization and therefore public cloud is more likely to be considered for less sensitive workloads, i.e., those with low concerns regarding data sovereignty or the risks of multi-tenancy. The use of the proprietary development and management tools of public cloud platforms creates barriers to exit which can be a crucial concern for systems with strategic value. Choosing the right toolset with which to build your strategic applications provides flexibility to develop and deploy in your choice of the most common private and public clouds.

There are two types of friction: static friction and dynamic friction. Static friction exists when an object is at rest and acts to resist the initiation of motion. Dynamic friction comes into play once an object is in motion. These two types of friction map well onto the challenges of a DIY private cloud. Static friction is created by the activities that must take place between the time a decision is made to implement a private cloud and the time it's actually up and running in production. Activities involved in standing up your DIY private cloud, such as designing, purchasing, installing, integrating and testing, all resist your move to a private cloud; they are the static friction preventing you from getting to production. Dynamic friction is experienced once your cloud is up and running. Maintaining, patching, upgrading and scaling your DIY private cloud can consume significant resources and create undesirable risk as they slow your velocity and generate heat, making things uncomfortable for all involved.

Adopting a turnkey private cloud reduces friction and accelerates results. (For a discussion of turnkey private cloud, see the IDC report "The Power of Hybrid Cloud.") A turnkey private cloud is one that is delivered to your organization ready to run. All the components are racked, stacked and cabled. All the software is installed, integrated and configured. In other words, compared with the DIY approach many organizations take to deploying a private cloud, a turnkey private cloud eliminates the static friction, accelerating results so that your private cloud is up and running in days instead of months. (See the Principled Technologies report "IT service transformation with hybrid cloud: buy or build?")

But what about dynamic friction? Once your private cloud is up and running, a different set of challenges must be faced; these are sometimes called "day 2" challenges. A turnkey private cloud dramatically reduces the challenges associated with its ongoing operation. It is built in a well-known configuration, whereas a DIY private cloud is almost always a one-off, bespoke configuration. Like snowflakes, no two DIY private clouds are the same. As a result, a turnkey private cloud can be patched and upgraded with known, validated packages, resulting in no-drama operations. A turnkey private cloud is supported as a single product, so there is no multi-vendor finger-pointing. With a turnkey private cloud, risk and costs are minimized in a way that dramatically reduces the dynamic friction caused by DIY private cloud operations.

When it comes to implementing a multi-cloud strategy, your approach to private cloud can have a dramatic effect on your organization's results. Compared with a DIY approach, implementing a turnkey private cloud will dramatically reduce the friction you experience. Less static friction means your strategy will be implemented faster, accelerating time-to-value. Less dynamic friction means ongoing cost and risk will be reduced, resulting in improved service levels and a better bottom line. Less friction means less heat. Keep yourself out of the hot seat and adopt a turnkey private cloud.

Dell EMC offers the following turnkey private clouds for both IaaS and cloud-native platforms: Dell EMC Enterprise Hybrid Cloud, Dell EMC Native Hybrid Cloud and Dell EMC Cloud for Microsoft Azure Stack. For more information on the full suite of turnkey private cloud options available from Dell EMC, visit dellemc.com/cloud.

As a Senior Consultant for Cloud Solutions Marketing at Dell EMC, Bob Ganley is working to speed the time-to-value for customers through the creation of outcome-focused solution content. In prior roles he has experience as a software engineer, product manager and sales professional spanning several generations of enterprise architectures. He is now leveraging that experience to bridge the gap that often exists between traditional product offerings and the real-world results that customers must achieve with information technology. You can follow him on Twitter at @ganleybob.

See the rest here:
Stay out of the hot seat with turnkey private cloud - CIO

Wal-Mart Orders Tech Partners to Get Off Amazon Cloud (MSFT, WMT) – Investopedia

Microsoft (MSFT) has a new ally in its efforts to displace Amazon (AMZN) as the leading cloud computing company: retailer Wal-Mart (WMT).

With competition fierce between Wal-Mart and Amazon, the former has started informing some technology partners that none of the applications they run for the retailer can be housed on Amazon Web Services. Tech executives told the Wall Street Journal that if they want to keep Wal-Mart's business, they have to look toward alternatives such as Microsoft's Azure cloud service.

While Wal-Mart maintains most of its data internally, a spokesman told the Wall Street Journal that in some instances the cloud apps it uses run on AWS. The spokesman declined to name which apps, but did acknowledge that the company has urged vendors to use other cloud providers in those cases.

"It shouldn't be a big surprise that there are cases in which we'd prefer our most sensitive data isn't sitting on a competitor's platform," the spokesman told the Journal, noting it's a small number of instances. The paper highlighted one example in which data warehousing service Snowflake Computing was asked by a Wal-Mart client to handle its business on the condition that it would run the services on Microsoft's cloud. Snowflake obliged and is developing a product for Azure. (See more: Analyst Downgrades Amazon Citing Need for Greater Operating Leverage.)

Wal-Mart's push to keep its data off of AWS may not do much to hurt Amazon's cloud hosting business, but it could help rivals who are closing the gap on its leadership position in that market. Late last week, Pacific Crest Securities said Azure could have more revenue than its main rival for the first time in 2017. In a research note covered by The Street, analyst Brent Bracelin said Microsoft becoming the biggest cloud provider for the first time in 10 years would transition the company "from a cloud laggard to a cloud leader." Bracelin said he came to this conclusion after conducting an analysis of the 60 biggest cloud computing companies. (See more: Microsoft Could Surpass Amazon in Cloud Computing This Year.)

"People jump through hoops to do business with Wal-Mart all the time," Robert Hetu, an analyst with the research firm Gartner, told the Wall Street Journal. "That should absolutely accelerate the competition from Azure." Microsoft is already the main cloud infrastructure provider for Jet.com, the e-commerce company Wal-Mart paid $3.3 billion for last fall, and Wal-Mart is currently one of Azure's biggest customers. The Journal reports that other big players who compete against Amazon are following suit and asking service providers to stop using AWS, which could prove to be a big boon for the Redmond, Washington-based software company.

The rest is here:
Wal-Mart Orders Tech Partners to Get Off Amazon Cloud (MSFT, WMT) - Investopedia

3 Reasons Why the Private Cloud is Here to Stay – MSPmentor

MSPs know that customers expect both scale and economics when it comes to the cloud.

For most, this means public cloud options like AWS, Google and Azure.

The subtitle for RightScale's 2017 State of the Cloud Report says it all: "Public cloud adoption grows as private cloud wanes."

Public cloud services dominate news cycles for enterprise IT, and on the surface, the numbers seem to align with this narrative: organizations are increasingly leveraging public and hybrid cloud, while private cloud use feels like part of a forgotten era.

However, in talking to CIOs and IT professionals, I tend to hear a different side of this story.

Private clouds, specifically object storage clouds, are alive, well, and becoming more popular.

Between the expenses associated with data access, security and privacy concerns, and the features being added to cloud services, it's clear why organizations succeed with private cloud models.

Below are three reasons why MSPs shouldn't expect the private cloud model to go anywhere anytime soon.

1. Public clouds are expensive, though at first glance the opposite seems true.

Public cloud services' main selling point is their low associated cost, which comes largely from organizations not needing to maintain IT infrastructure.

However, low baseline fees don't take into account data migration expenses, egress fees and the general discontent that comes with not being able to predict IT costs that are beyond your organization's control.

Although object storage clouds require infrastructure maintenance, they also cost less than a cent per gigabyte per month for data storage, compared with the public cloud's two to three cents per gigabyte.

In addition, while private cloud services offer opportunities for cost negotiation with MSPs and service providers, public cloud models are usually uncompromising.

After calculating public cloud costs for your particular system up front, you may decide it's not worth the price tag.
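
To make that comparison concrete, here is a rough, back-of-the-envelope sketch in Python using the per-gigabyte figures cited above; the capacity, egress volume and per-GB rates are illustrative assumptions rather than any provider's actual pricing.

```python
# Back-of-the-envelope monthly storage cost comparison using the per-GB figures
# cited above. All numbers are illustrative assumptions; real pricing varies by
# provider, storage tier, request volume and egress.

def monthly_storage_cost(total_gb: float, rate_per_gb: float) -> float:
    """At-rest storage cost per month, ignoring request fees."""
    return total_gb * rate_per_gb

capacity_gb = 500_000          # 500 TB of data (assumed)
private_rate = 0.008           # "less than a cent per gigabyte" (assumed)
public_rate = 0.025            # "two to three cents per gigabyte" (assumed)
egress_rate = 0.09             # assumed public cloud egress fee per GB
monthly_egress_gb = 50_000     # assume 10% of the data is read back each month

private_total = monthly_storage_cost(capacity_gb, private_rate)
public_total = monthly_storage_cost(capacity_gb, public_rate) + monthly_egress_gb * egress_rate

print(f"Private object storage: ${private_total:,.0f}/month")
print(f"Public cloud storage:   ${public_total:,.0f}/month")
```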

2. Object storage clouds bring innovation to the private cloud space.

Historically, the private cloud represented technologies such as OpenStack and CloudStack: complex, open source architectures favored by organizations without public cloud services on their roadmaps.

For MSP customers, object storage clouds make private cloud a strong alternative to the public cloud; they're scalable, inexpensive and powered by commodity hardware.

The technology is more than a decade old, but issues with the growing public cloud model have increased the focus on object storage.

An organization can use object storage as a private cloud building block that simplifies data storage while driving down pricing, maintaining cost predictability and guaranteeing a secure environment.

3. Private clouds create new opportunities for MSPs.

Roadblocks for public cloud services can become milestones for MSPs.

For organizations looking for the public cloud's flexibility, economics and scale, combined with the security and ease of use traditionally found on-premises, MSPs can use private clouds to deliver an ideal compromise.

By combining block storage and object storage in one environment without leveraging the public cloud, MSPs can offer a private cloud environment, expanded and enriched by personalized services, that delivers a managed hosting experience with better capabilities and lower costs than are often realized with public clouds.

Such services can also move organizations down the path toward their IT goals.

For example, private object storage is generally compatible with Amazon S3, leaving the door open for an extended hybrid cloud future.
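
As a sketch of what that S3 compatibility can look like in practice, the snippet below points a standard S3 client at a private, S3-compatible object store; the endpoint URL, bucket name and credentials are hypothetical placeholders, not any specific vendor's values.

```python
# Minimal sketch: the same S3 client code can target a private, S3-compatible
# object store or Amazon S3 itself, which keeps a hybrid path open.
# The endpoint URL, credentials and bucket name below are placeholders.
import boto3

private_store = boto3.client(
    "s3",
    endpoint_url="https://objects.internal.example.com",  # hypothetical private gateway
    aws_access_key_id="PRIVATE_ACCESS_KEY",
    aws_secret_access_key="PRIVATE_SECRET_KEY",
)

# Write and read an object exactly as you would against public S3.
private_store.put_object(Bucket="backups", Key="reports/2017-06.csv", Body=b"col1,col2\n1,2\n")
obj = private_store.get_object(Bucket="backups", Key="reports/2017-06.csv")
print(obj["Body"].read())
```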

As an MSP, it's always interesting to see how certain trends and technologies come and go in waves.

Although object storage isn't breaking news, if you look closely at the way organizations are using public and private cloud services, it's clear that private object storage clouds are bringing innovation to an area often assumed to be going away.

As private cloud and public cloud infrastructures continue to evolve, it'll be interesting to see how they both power the fully hybrid, multicloud IT future.

Ellen Rubin is CEO and co-founder of ClearSky Data, whose global storage network simplifies the entire data lifecycle and delivers enterprise storage as a fully managed service. Most recently, Rubin was co-founder of CloudSwitch, a cloud-enablement software company that was acquired by Verizon in 2011.

Go here to read the rest:
3 Reasons Why the Private Cloud is Here to Stay - MSPmentor

Cloud Hosting Leader Infinitely Virtual Now MSP for Duo Security – PRUnderground (press release)

In a bid to quickly and easily secure its customers via trusted multifactor authentication, Infrastructure-as-a-Service (IaaS) leader Infinitely Virtual today announced that the company has become a managed service provider (MSP) for Duo Security (www.duosecurity.com).

Duo Security verifies user identity with an easy-to-use two-factor authentication solution, which will enable Infinitely Virtual to enforce stronger user access policies. Two-factor authentication strengthens access security by requiring two methods to verify identity: something the user knows, plus something the user has. Duo then checks user devices for out-of-date software and missing security controls. Duo Security's device access policies will permit Infinitely Virtual to block risky devices from accessing data and apps at login, protecting client organizations against software vulnerabilities.
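
For readers unfamiliar with the pattern, here is a generic, minimal sketch of a "something you know plus something you have" check using a time-based one-time password; it is not Duo's API (Duo ships its own SDKs and push-based flows), and the function and account names are hypothetical.

```python
# Generic illustration of two-factor verification ("something you know" plus
# "something you have") using a time-based one-time password. This is NOT Duo's
# API; Duo provides its own SDKs and push-based flows. Names are hypothetical.
import pyotp

def verify_login(password_ok: bool, totp_secret: str, submitted_code: str) -> bool:
    """First factor: a password check done elsewhere. Second factor: a TOTP code
    generated by an authenticator app on the user's phone."""
    if not password_ok:
        return False
    totp = pyotp.TOTP(totp_secret)
    # valid_window=1 tolerates one 30-second step of clock drift between devices.
    return totp.verify(submitted_code, valid_window=1)

# Enrollment example: generate a shared secret for the user's authenticator app.
secret = pyotp.random_base32()
uri = pyotp.TOTP(secret).provisioning_uri(name="user@example.com", issuer_name="ExampleApp")
print("Provisioning URI for the authenticator app:", uri)
```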

Duo Security protects every application organizations deploy, whether on-premises or cloud-based, allowing clients to limit access to their applications based on type of user and device. Through Infinitely Virtual, Duo Security offers single sign-on (SSO), enabling users to log in only once to securely access all of their enterprise cloud applications.

"Duo Security's technology protects against data breaches by ensuring only legitimate users and appropriate devices have access to sensitive data and applications anytime, anywhere," said Adam Stern, founder and CEO, Infinitely Virtual. "Because there's no central authority to vouch for whether people are who they say they are, traditional security products have made it difficult and costly for providers like us to set policies across all of our customers' endpoints."

"By contrast, Duo has made it fast and easy for us to protect our customers and their end-users, all in one place," Stern said. "And because Duo Security is now a managed service through IV, our customers don't need to manage multifactor authentication themselves."

In April, Duo Security launched its Duo Managed Service Provider (MSP) program, giving MSPs the tools to better protect their customers from data breaches in a rapidly changing world of cloud applications and mobile devices. Pricing for Duo Security's Trusted Access product suite through Infinitely Virtual, including Duo's patented Push two-factor authentication technology, starts at $3 per user per month for Duo MFA, moving to $6 per user per month for the popular Duo Access and $9 per user per month for Duo Beyond.

For additional information, visit http://www.infinitelyvirtual.com.

About Duo Security

Duo Security is a cloud-based Trusted Access provider protecting thousands of the world's largest and fastest-growing organizations, including Dresser-Rand Group, Etsy, Facebook, K-Swiss, Paramount Pictures, Random House, SuddenLink, Toyota, Twitter, Yelp, Zillow and more. Duo Security's innovative and easy-to-use technology can be quickly deployed to protect users, data and applications from breaches, credential theft and account takeover. The Ann Arbor, Michigan-based company also has offices in San Mateo, California; Austin, Texas; and London. Duo Security is backed by Benchmark, Google Ventures, Radar Partners, Redpoint Ventures and True Ventures. Try it for free at http://www.duo.com.

About Infinitely Virtual

"The World's Most Advanced Hosting Environment." Infinitely Virtual is a leading provider of high-quality and affordable Cloud Server technology, capable of delivering services to any type of business via terminal servers, SharePoint servers and SQL servers, all based on Cloud Servers. Named to the Talkin' Cloud 100 as one of the industry's premier hosting providers, Infinitely Virtual has earned the highest rating of Enterprise-Ready in Skyhigh Networks' CloudTrust Program for four of its offerings: Cloud Server Hosting, InfiniteVault, InfiniteProtect and Virtual Terminal Server. The company recently took the #1 spot in HostReview's ranking of VPS hosting providers. Infinitely Virtual was established as a subsidiary of Altay Corporation, and through this partnership the company provides customers with expert 24/7 technical support. More information about Infinitely Virtual can be found at http://www.infinitelyvirtual.com, @iv_cloudhosting, or by calling 866-257-8455.

Here is the original post:
Cloud Hosting Leader Infinitely Virtual Now MSP for Duo Security - PRUnderground (press release)

DoD reexamining cloud policies to remove bottleneck for sensitive data – FederalNewsRadio.com

For more than two years, the Defense Department has had procedures in place that, at least on paper, allow its sensitive data to be housed in commercial cloud computing facilities. But migrations to the cloud have been relatively few and far between for anything besides public, unclassified data.

That's partially because, for impact levels 4 and above, not only do providers have to earn authorizations that go above and beyond the governmentwide FedRAMP process, any data they process also has to make its way through a DoD-provided Cloud Access Point (CAP).

The department is taking a fresh look at that latter point, saying its current CAP policies may be creating an unnecessary roadblock to DoD's cloud ambitions. As of now, only two access points exist: one run by the Defense Information Systems Agency and one by the Navy.

Dr. John Zangardi, the department's acting chief information officer, said he's asked his office to revisit the policy with an eye toward letting commercial cloud vendors provide a CAP-like capability on their own.

"It's my job to ensure the most effective IT support to the warfighter and to make best use of resources, so the question to my staff is, 'How can we do CAP better?'" he said last week at the Defense Cyber Operations Summit in Baltimore, Md. "Specifically, can it be provided as a service? It's a significant question, but if it is resolved, it should open opportunities for services and components to move more quickly to commercial cloud providers."

DoD's current policy on access points is laid out in the security requirements guide (SRG) it published in April 2015 and last updated in March of this year. It requires all network traffic moving between DoD systems and a commercial cloud provider to pass through government-operated monitoring systems, such as firewalls and other intrusion prevention systems, even when the cloud provider's system is operating entirely within a DoD facility.

The overall objective will remain the same, Zangardi said: giving some reasonable level of assurance that Defense networks can't be penetrated via their connections to cloud providers, since most commercial cloud facilities are connected to the public Internet in some fashion. He said the latest SRG will be updated to reflect any changes in DoD's thinking "when we get that far."

Cloud access points are among the issues likely to be raised later this week when DoD hosts an industry day to hash out the issues surrounding a final cybersecurity-focused contracting rule the department issued last October after nearly a year of public comment and revisions.

The final, updated version failed to address industry's concerns, and industry representatives have been asking for a face-to-face meeting ever since.

The final version of the update to the Defense Federal Acquisition Regulation Supplement sweeps in what had been two separate interim rules. One portion requires contractors to report any data breaches involving Defense information within 72 hours and to implement the National Institute of Standards and Technology's new guidelines for protecting controlled unclassified information by the end of 2017.

A second makes plain that vendors must comply with the controls in DoD's cloud SRG as a condition of their contracts, but goes a few steps further, including demanding that government personnel be allowed to physically enter cloud hosting facilities to conduct audits or inspections.

That's because, according to a 27-page FAQ the department issued earlier this year, its interpretation of the Federal Information Security Management Act dictates that it treat any IT system that's operated on DoD's behalf as though it were a government operation.

Both before and after the issuance of the final rule, industry officials have expressed confusion over how the new rule fits in with a host of other provisions the government added to the Federal Acquisition Regulation at about the same time, including one by the National Archives and Records Administration that set governmentwide definitions for what constitutes controlled unclassified information, and another new FAR provision that requires all federal contractors to comply with at least some of NIST's guidelines for protecting CUI.

"Our objective at this meeting is to clarify some foundational questions," Zangardi said. "What are the clauses? What is Covered Defense Information? How is it identified and marked? How does the rule work in the cloud computing environment? It should be a substantive, productive discussion."

See original here:
DoD reexamining cloud policies to remove bottleneck for sensitive data - FederalNewsRadio.com

Cloud Hosting Firm Contegix Acquires MSP BlackMesh – ChannelE2E

by Ashley Smith Jun 14, 2017

Cloud hosting provider Contegix has acquired BlackMesh Inc., an Ashburn, Virginia-based managed service provider (MSP) that specializes in web hosting, cloud services and secure compliance. Financial terms were not disclosed.

This is the third acquisition for St. Louis-based Contegix since November 2016, when the bulk of the company was sold to Strattham Capital, a private equity firm that invests in high-potential business IT companies.

"The addition of BlackMesh is a significant leap forward for Contegix and our customers," Contegix CEO David Turner said in a statement. "BlackMesh's advanced web content management capabilities, coupled with their federal security and compliance certifications, align very well with our existing portfolio of application management services designed for the DevOps community."

The combined company now serves more than 1,200 customers, both in the public and private sectors.

Founded in 2003, BlackMesh describes itself as an MSP that delivers secure, compliant and scalable application hosting platforms and cloud-based solutions to businesses of all sizes, government agencies and nonprofits.

The company manages and supports open-source web content management platforms such as Drupal, Magento and WordPress, and it offers end-to-end managed hosting, managing the application down to the infrastructure layer. It has three data center locations.

Jason Ford, CTO and co-founder of BlackMesh, will assume the role of CTO and CISO of Contegix and will help lead the integration.

"Together, Contegix and BlackMesh will enhance both companies' ability to serve clients in a secure, efficient, and scalable fashion," Ford said in a statement. "We are very excited to join the Contegix team and to continue to provide high-touch service and support to our customers while expanding the services we offer."

Less than a month after its investment from Strattham Capital, Contegix announced on November 29 that it had acquired Reading, Pennsylvania-based MSP Distributed Systems Services, Inc.

The move boosted Contegix to a $50 million-plus company with more than 700 clients and 190 employees.

Just over two weeks later, on December 13, Contegix acquired Admo.net Web Services, LLC, a cloud and managed services provider based in Kansas City, Missouri, adding another 60 clients to its roster.

Contegix provides application management to the DevOps community, as well as a suite of cloud and managed services to enterprise customers. The company calls itself a global leader in Atlassian hosting.

The firm has data centers in St. Louis and Kansas City, Missouri; Reading and Bethlehem, Pennsylvania; Dallas, Texas; and Amsterdam.

Read more:
Cloud Hosting Firm Contegix Acquires MSP BlackMesh - ChannelE2E

7 Tips for Securely Moving Data to the Cloud – Government Technology (blog)

A few years back, an unmistakable trend emerged that cloud computing was growing in both percentage of organizations adopting cloud solutions as well as the amount and type of data being placed in the cloud.

Earlier this year, I highlighted research that made it clear that trust and risks are both growing in government clouds. Since that time, many readers have asked for more specific guidance about moving more data to the cloud in the public and private sectors. I was asked: What are the right cloud questions?

Questions like: Where are we heading with our sensitive data? Will cloud computing continue to dominate the global landscape? These are key questions that surface on a regular basis.

The forecast for the computer industry is mostly cloudy. Here are some of the recent numbers:

Back at the end of last year, The Motley Fool reported "10 Cloud Computing Stats That Will Blow You Away," and the last three listed are especially intriguing to me. Here they are:

IoT, Other Trends and the Cloud

And while it is true that the Internet of Things (IoT) has taken over the mantle as the hottest trend in technology, the reality is that "the Internet of Things and digital transformation have driven the adoption of cloud computing technology in business organizations," according to U.S.-based cloud infrastructure firm Nutanix.

This article from CxO Today lays out the case that the cloud remains the most disruptive force in the tech world today. Why?

While premise-based IT software and tools have their own advantages, the global trend is for cloud based applications since they offer more connectivity and functionalities than legacy systems. Moreover, enterprises are naturally gravitating towards it as the technology is reasonably reliable, affordable, and provides them access to other new and emergent technologies as well as high end skills. The cloud boom is also propelled by the fact that enterprises are trying to improve performance and productivity over the long term. Looking at the tremendous response for cloud services, several IT companies are designing applications meant solely for pure cloud play.

Other experts say that several overlapping trends are colliding as "the edge is eating the cloud." These trends include:

Overcoming Fears in the Cloud

And yet, there are plenty of enterprises that continue to have significant concerns regarding cloud computing contracts. Kleiner Perkins' Mary Meeker highlighted the fact that cloud buyers are kicking the tires of multiple vendors while becoming more concerned about vendor lock-in.

Also, technology leaders often move to the cloud to save money, but CFOs are now telling IT shops to cut costs in the cloud fearing that resources are being wasted. For example:

Also, while overall trust in cloud infrastructure is higher, new concerns are rising about application security delivered through the cloud.

My 7 Tips for Moving Data into the Cloud

So what can technology and security leaders do to protect their data that is moving to the cloud?

Here are seven recommendations that can help you through the journey. Note that the first four items are largely best practices about your current data situation and options before your data moves.

1) Know your data. I mean, really know what is happening now, before you move the data. Think about the analogy of doing a house cleaning and organizing what you own before putting things in storage to sell your house.

If you don't want to catalog everything (which is a mistake), at least know where the most important data is. Who is doing what regarding the cloud already? What data is sensitive? This is your "as is" data inventory situation, with the known protections around current data. And don't forget shadow IT. There are plenty of vendor organizations that can help you through this process.
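
As one possible starting point for that inventory, here is a minimal, hypothetical sketch that lists what already sits in a cloud storage bucket and flags objects whose names hint at sensitive content; the bucket name and keyword list are assumptions, and real classification tools inspect the content itself rather than just key names.

```python
# First-pass "know your data" sweep: list what already sits in cloud storage and
# flag objects whose key names suggest sensitive content. The bucket name and the
# keyword list are illustrative assumptions; real data-classification tools
# inspect content, not just object names.
import boto3

SENSITIVE_HINTS = ("ssn", "payroll", "medical", "passport", "hr_")

def inventory_bucket(bucket_name: str) -> None:
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket_name):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            flagged = any(hint in key.lower() for hint in SENSITIVE_HINTS)
            print(f"{bucket_name}/{key}  size={obj['Size']}  sensitive_hint={flagged}")

inventory_bucket("corp-shared-documents")   # hypothetical bucket name
```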

2) Have a defined and enforced data life cycle policy. You need to know what data is being collected by your business processes, where it goes, who is accountable (now) and what policies are in force.

Ask: Is there appropriate training happening now? Is it working? What policies are in place to govern the movement of your data? For example, my good friend and Delaware CSO Elayne Starkey does a great job in this area of policies. You can visit this Web portal for examples: https://dti.delaware.gov/information/standards-policies.shtml

3) Know your cloud options: private, public, hybrid or community cloud? This simple step often gets confusing, in my experience, because some staff mix these terms up with the public-sector and private-sector definitions, wrongly thinking that a private cloud means a private-sector-owned cloud.

Here are some basic cloud definitions to ponder with your architecture team:

Private Cloud: The organization chooses to have its own cloud, where the resource pooling is done by the organization itself (a single-organization cloud). It may or may not be on premises (in your own data centers).

Public Cloud: Different tenants are doing the resource pooling among the same infrastructure.

Pros: It is easily consumed, and the consumer can provision resources on demand.

Cons: The consumer will not get the same level of isolation as with a private cloud.

Community Cloud: The cloud is shared by different organizations, usually unified by a common community, that share the underlying infrastructure (halfway between private and public); it lets smaller organizations pool resources with others. For example, some state and local government organizations share email hosting with other state and local governments in the U.S. only.

Hybrid: A mixture of both private and public; i.e., an organization might say, "We would like the elasticity and cost-effectiveness of public cloud, and we want to put certain applications in a private cloud."

4) Understand and clearly articulate your Identity and Access Management (IAM) roles, responsibilities and demarcation points for your data. Who owns the data? Who are the custodians? Who has access? Who can add, delete or modify the data? Really (not just on paper)? How will this change with your cloud provider?

Build a system administration list. Insist on rigorous compliance certifications. Incorporate appropriate IAM from the outset, ideally based on roles, especially for administration duties. When you move to the cloud, the customers, not the provider, are responsible for defining who can do what within their cloud environments. Your compliance requirements will likely dictate what your future architecture in the cloud will look like. Note that these staff may need background checks, a process to update lists (for new employees and staff who leave) and segregation of duties as defined by your auditors.
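
To illustrate the role-based, least-privilege idea, here is a small, hypothetical sketch that defines a read-only auditor policy scoped to a single storage bucket; the policy name, bucket and action list are illustrative assumptions, and your own compliance requirements dictate the real lists.

```python
# Sketch of role-based, least-privilege access: a read-only "auditor" policy
# scoped to a single bucket. The policy name, bucket and actions are illustrative
# assumptions; your compliance requirements dictate the real lists.
import json
import boto3

iam = boto3.client("iam")

auditor_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::finance-records",      # hypothetical bucket
            "arn:aws:s3:::finance-records/*",
        ],
    }],
}

iam.create_policy(
    PolicyName="FinanceAuditorReadOnly",
    PolicyDocument=json.dumps(auditor_policy),
    Description="Read-only access to the finance bucket for audit staff",
)
```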

5) Apply encryption thinking end to end, for data at rest and data in transit. We could do an entirely separate blog on this encryption topic, since a recent (and scary) report says there is no encryption on 82 percent of public cloud databases. Here are a few points to consider: Who controls and has access to the encryption keys? What data is truly being encrypted, and when? Only sensitive data? All data?
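
One way to keep control of the keys is to encrypt on the client before anything leaves your environment; the sketch below is a minimal illustration using a generic symmetric cipher, with placeholder bucket and key handling, and is not a substitute for a proper key management service.

```python
# Client-side encryption before upload, so the key never leaves your control.
# Uses the 'cryptography' package's Fernet (authenticated symmetric encryption).
# Key storage (an HSM or a KMS you control) is out of scope for this sketch, and
# the bucket and object names are placeholders.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, load this from your own key store
fernet = Fernet(key)

plaintext = b"customer list: ..."
ciphertext = fernet.encrypt(plaintext)

s3 = boto3.client("s3")
s3.put_object(Bucket="cloud-archive", Key="exports/customers.enc", Body=ciphertext)

# Only a holder of the key can recover the plaintext later.
assert fernet.decrypt(ciphertext) == plaintext
```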

6) Test your controls. Once you move the data, your cloud solution vulnerability testing should be rigorous and ongoing and include penetration testing. Ask: How do you truly know your data is safe? What tools do you have to see your data in the cloud environment? How transparent is this ongoing process?

The cloud service provider should employ industry-leading vulnerability and incident response tools. For example, such incident response tools enable fully automated security assessments that can test for system weaknesses and dramatically shorten the time between critical security audits, from yearly or quarterly to monthly, weekly or even daily.

You can decide how often a vulnerability assessment is required, varying from device to device and from network to network. Scans can be scheduled or performed on demand.
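
As a tiny example of what "test your controls" can mean in practice, the sketch below checks that a storage bucket refuses anonymous listing; the bucket name is a placeholder, and a real testing program would cover many more controls and include penetration testing.

```python
# A very small "test your controls" check: verify that a bucket refuses anonymous
# listing. The bucket name is a placeholder; a real program of ongoing testing
# would cover many more controls and include penetration testing.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import ClientError

def bucket_rejects_anonymous_listing(bucket_name: str) -> bool:
    anonymous = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    try:
        anonymous.list_objects_v2(Bucket=bucket_name)
    except ClientError:
        return True       # access denied: the control is holding
    return False          # anonymous listing succeeded: investigate immediately

print(bucket_rejects_anonymous_listing("cloud-archive"))   # hypothetical bucket
```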

7) Back up all data in a distinct fault domain.

Gartner recommends: "To spread risk most effectively, back up all data in a fault domain distinct from where it resides in production." Some cloud providers offer backup capabilities as an extra-cost option, but that isn't a substitute for proper backups. Customers, not cloud providers, are responsible for determining appropriate replication strategies, as well as maintaining backups.
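
A minimal sketch of that idea, under the assumption of object storage in two regions, is to copy production objects into a bucket that lives in a separate region; the bucket and region names are placeholders, and managed replication or a dedicated backup product may be a better fit at scale.

```python
# Sketch of keeping a backup copy in a separate fault domain: objects from a
# production bucket are copied into a bucket in a different region. Bucket and
# region names are placeholders; managed replication or a dedicated backup
# product may be a better fit at scale.
import boto3

SOURCE_BUCKET = "prod-data-us-east-1"
BACKUP_BUCKET = "backup-data-eu-west-1"    # lives in a different region / fault domain

source = boto3.client("s3", region_name="us-east-1")
backup = boto3.client("s3", region_name="eu-west-1")

paginator = source.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SOURCE_BUCKET):
    for obj in page.get("Contents", []):
        # copy() performs a managed, multipart-capable server-side copy
        backup.copy({"Bucket": SOURCE_BUCKET, "Key": obj["Key"]}, BACKUP_BUCKET, obj["Key"])
```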

Final Thoughts

No doubt, managing your data in the cloud is a complex and ongoing challenge that includes many other pieces beyond these seven items. From contract provisions to measuring the costs incurred for services to overall administration functions, the essential data duties listed here are generally not tasks for technology professionals or contract pros who lack real experience.

Nevertheless, all organizations that move data into and out of cloud providers' data centers are constantly going through this data analysis process. Just because you moved sensitive data into the cloud five years ago for one business area does not mean that new business areas can skip these steps.

If you are in a large enterprise, you may want to consider adding a cloud computing project management office (PMO) to manage vendor engagement and ensure the implementation of best practices across all business areas.

And don't just fall for the typical line: "I know XYZ company (Amazon or Microsoft or Google or fill-in-the-blank) is better at overall security than we are, so just stop asking questions." Yes, these companies are good at what they do, but there are always trade-offs.

You must "trust but verify" your cloud service, because you own the data. Remember, you can outsource the function, but not the responsibility.

Excerpt from:
7 Tips for Securely Moving Data to the Cloud - Government Technology (blog)

Tata Communications expands IZO Private Cloud footprint with focus on data sovereignty – ETCIO.com

London: Tata Communications today announced the launch of three new nodes for its IZO Private Cloud service to support enterprises' hybrid cloud adoption while ensuring regulatory compliance. The new private cloud nodes in Germany, the United Arab Emirates (UAE) and Malaysia enable CIOs to gain unprecedented control over all their applications by creating a truly hybrid, high-performance IT infrastructure where different cloud, co-location and managed hosting environments work together as one.

Today, different clouds often operate in silos, resulting in a complex environment that can hold back enterprises' digital transformation. The fully managed IZO Private Cloud service addresses this complexity by enabling CIOs to create a hybrid IT environment that combines the flexibility of public cloud with enterprise-grade security. It also gives CIOs complete control over the residency of their data, while keeping up with employees' demands for mobile, collaborative and social ways of working.

IZO Private Cloud now spans 13 locations. In addition to Germany, the UAE and Malaysia, Tata Communications has private cloud nodes in India, Singapore, Hong Kong and the UK. The new private cloud nodes address the needs of enterprises in industries with stringent regulatory requirements, including aviation, healthcare, manufacturing, media, banking, IT, financial services and insurance, retail and ecommerce.

Tata Communications' IZO Private Cloud customers in Europe, the Middle East and Asia Pacific include Constantin Medien, a Germany-based international media company specialising in sports, entertainment and event marketing, and its media production subsidiary PLAZAMEDIA; Khimji Ramdas, an Oman-based conglomerate with operations across consumer products, infrastructure, lifestyle and logistics; and SkyLab, a Singapore-based IoT technology solutions provider.

Fred Kogel, CEO of Constantin Medien AG, said: "There are major opportunities for us to transform how we operate and expand PLAZAMEDIA's digital portfolio of video contribution and distribution services through global connectivity and the cloud. Tata Communications' new private cloud in Germany will ensure the security and sovereignty of our and our customers' data, which is a key consideration for businesses in this region."

"In today's digital economy, enterprises' growth is fuelled by cloud-based applications and data," said Srinivasan CR, Senior Vice President, Global Product Management & Data Centre Services, Tata Communications. "Yet, the sovereignty and security of these critical assets is a major concern for CIOs. As a global cloud provider with a local presence, we address these concerns by giving CIOs complete visibility and control over their entire IT estate, across all networks and devices, and empower them to drive organisation-wide digital transformation with maximum agility."

The expansion of IZO Private Cloud in Europe, the Middle East and Asia Pacific strengthens Tata Communications' ability to capitalise on the growth of the private cloud services market in these regions, worth more than USD 77.7 billion in total.

"The value of cloud computing to drive business transformation is indisputable. However, the security aspects of cloud deployments in the context of data privacy, compliance and cyber security top the list of enterprise concerns," said Agatha Poon, Research Director, Asia-Pacific Services, 451 Research. "For cloud providers to become a trusted partner, they must demonstrate the right balance between technical strength and operational excellence while mitigating any business risks that exist in today's enterprise IT infrastructures."

The IZO Private Cloud service is underpinned by Tata Communications' global network, the IZO ecosystem, and partnerships with the world's biggest clouds: Microsoft Azure, Amazon Web Services, Google Cloud Platform, Office 365 and Salesforce. Today, over 25% of the world's Internet routes travel over Tata Communications' network, and the company is the only Tier-1 provider that is in the top five by routes on five continents.

Continue reading here:
Tata Communications expands IZO Private Cloud footprint with focus on data sovereignty - ETCIO.com

XSS Just One Part of Broad Application Threat Landscape: Report – Web Host Industry Review

Only one out of 1,000 cross-site scripting attacks (.001 percent) progresses to the point of requiring a security response, according to research released Tuesday by application security company tCell.

The State of In-Production Application Security report, drawn from analysis of more than 30 major enterprise applications in production, shows that over 40 percent of organizations experience account takeover attacks unrelated to software flaws over just a 30-day period. These attacks typically leverage large credential breaches, and 85 percent of them successfully compromise a user.

More than 90 percent of organizations have orphan application routes, or API endpoints which have been forgotten and left open. More than a quarter of companies have over 100 such vulnerabilities, which represent an attack surface with no business benefit, according to tCell.

"Many enterprise organizations start out thinking they have to replicate the traditional data center security stack for cloud environments," Michael Feiertag, tCell CEO, said in a statement. "The reality is that it's a different, far more dynamic world, with a lot of effort from the cloud provider on securing that infrastructure. Organizations need to focus on protecting what's theirs, the application, which enables all of the goodness that is cloud without weighing it down."

The report findings and insights about securing production applications gathered by tCell since it began broad customer deployments last year underscore the variety of application attack vectors and types, which the company says go beyond the OWASP Top 10.

Along with the report, tCell announced expanded product functionality and platform support. The company now supports enterprise .NET applications, and its latest release adds point-of-attack instrumentation to determine if command injection attempts have breached the app, and field-level encryption for increased data security in regulated industries like healthcare and financial services.

See more here:
XSS Just One Part of Broad Application Threat Landscape: Report - Web Host Industry Review