Category Archives: Cloud Servers
Packet Takes Its ‘Go Anywhere’ Bare Metal Cloud to the Edge – Data Center Frontier
Packet can deploy network-dense racks like this one. But it sees future growth in streamlined infrastructure for edge computing. (Photo: Packet)
Bare metal cloud server specialist Packet has emerged as an early adopter of edge computing capacity. The developer-centric company is building a "go anywhere" cloud model, and expects to deploy its gear in hundreds or even thousands of locations.
"We believe that edge computing will be radically different than the public cloud, and it's not because of robot surgery and flying taxis just yet," Packet says. "Instead, the edge is being driven by the opportunity to engage in a deep manner with billions of people and trillions of devices in a cost-effective way."
The latest deployment of the Packet platform is at the EdgeConneX data center in Detroit, and will support the Sprint Curiosity IoT platform, which runs on Packet's bare metal edge cloud infrastructure.
The goal of edge computing is to process data and services as close to the end user as possible. The trend is driven by the increased use of consumer mobile devices, especially consumption of video and virtual reality content and the growth of sensors as part of the Internet of Things (IoT).
The Sprint Curiosity IoT offering features a dedicated, virtualized network designed to be managed by software. It's based on Sprint's Curiosity operating system, which can manage a range of devices across a distributed network.
"What makes this deployment special is that we're able to activate three key elements at once: bare metal cloud, rich local connectivity, and end-to-end wireless IoT services," said Zachary Smith, the CEO of Packet. "It's like an edge computing triple play."
Packet was founded in 2014 to deliver automated infrastructure for developers. Its canvas is bare metal cloud, a term for dedicated servers that can be provisioned with cloud-like ease and speed. Packet says it can deploy bare metal hardware in eight minutes, providing customers with dedicated boxes on a rapid-deploy model similar to spinning up virtual server instances on major cloud platforms, as opposed to the hours or days required to deploy a typical dedicated server.
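Packet's pitch is that a dedicated box can be requested through an API call much like a virtual instance. As a rough illustration of that developer experience (the endpoint, field names, and plan codes below are hypothetical stand-ins, not Packet's documented API), a provisioning request might be assembled like this:

```python
import json

# Hypothetical bare-metal provisioning client: builds the request a developer
# would send to spin up a dedicated server on demand. The base URL, fields,
# and plan/facility codes are illustrative, not Packet's real API.
API_BASE = "https://api.example-baremetal.net"

def build_device_request(project_id, hostname, plan, facility, os_image):
    """Return (url, json_body) for a hypothetical 'create device' call."""
    url = f"{API_BASE}/projects/{project_id}/devices"
    body = json.dumps({
        "hostname": hostname,
        "plan": plan,              # hardware configuration, e.g. a small x86 node
        "facility": facility,      # data center / edge location code
        "operating_system": os_image,
    })
    return url, body

# One POST of this payload, rather than hours of manual racking, is the
# "cloud-like ease and speed" the article describes.
url, body = build_device_request("proj-123", "edge-node-01",
                                 "small-x86", "det1", "ubuntu_20_04")
```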
The company operates 20 data centers around the world, serving 18,000 customers. Smith said Packet deploys tens of thousands of servers per year.
Fast, automated delivery of compute capacity is the future, and Packet is optimized for younger developers.
Zachary Smith, the CEO of Packet.
"We think the world is dominated by software, and we're focused on the millennial or Gen Z buyer, because they're definitely going to be the boss in the next 10 years," Smith said. "This generation just doesn't (care about the infrastructure). Our biggest challenge is building an experience that is amazing and that people love."
"The (current) delivery model is awful," Smith added. "You have to dramatically increase your expectations. These millennials didn't grow up when the Internet didn't work. It's just magic that happens on the other side of the tube. I think that's going to be a challenge for data center operators."
Smith was previously on the leadership team at hosting company Voxel, and founded Packet with his twin brother Jacob (the Chief Marketing Officer for Packet) and Aaron Welch. Its leadership includes veterans of SoftLayer, Puppet and Internap.
Packet has raised $36 million in three funding rounds, with investors including Softbank, Dell Technologies, Samsung, Battery Ventures and Third Point Ventures. Although it has a large customer base, the majority of its revenue comes from its top 100 customers, which are primarily enterprises. This week the company added Gary Green, who has enterprise software experience at VMware and Synopsys, as Chief Revenue Officer.
Packet's largest customer is Sprint, and the new deployment with EdgeConneX is designed to benefit Detroit's growing autonomous vehicle and manufacturing sectors.
EdgeConneX deployed one of the earliest edge networks. Since 2013 the company has built 40 data centers, extending large-scale content distribution to markets across the U.S., Europe and South America. The company's data centers are designed as "lights out" unmanned facilities, using sophisticated monitoring and remote hands for maintenance. EdgeConneX also operates a network of more than 3,000 wireless stations, which provides the company with deep insights on data traffic and where it needs to go.
"Given the rapid surge in the volume, velocity and variety of data, we will witness rapid growth of infrastructure being built at the Edge," said Phillip Marangella, Chief Marketing Officer at EdgeConneX. "Packet's deployment in Detroit is emblematic of the future demands at the Edge, where emerging applications and workloads like IoT or Autonomous Vehicles are highly latency sensitive and require very proximate compute resources."
Packet works with a number of data center operators, and is an early user of micro-modular data centers near wireless tower infrastructure. Packet has deployed gear in the first two Vapor IO sites at cell tower locations in Chicago, rolling out a 5G-as-a-Service (5GaaS) offering as part of Vapor IO's Kinetic Edge platform. Packet also installed its equipment at an SBA Communications data center at a tower site near Gillette Stadium in Foxborough, Massachusetts.
"We've looked at partners who have vast real estate," said Smith. "Today the (data center) supply chain is built almost exclusively for hyperscalers and putting 100,000 things in one place, and not 100 things in 1,000 places."
"My goal isn't to be in Boston," Smith added. "My goal is to be able to go wherever that customer wants me to go. I think this is the new central office model."
And as with the old telephone network, the customer doesn't much care what the central office looks like. They just want a dial tone. Sorting out the business models is more complicated.
"It's still a work in progress," said Smith. "But the enterprise customers know this is complicated, and they want to find a platform and partners that they can invest in. I think it's moving incredibly quickly. The desire is there now. We see a lot of demand."
As the business models for edge computing come into focus, Smith says, that demand will translate into rapid growth.
"Nothing makes people get serious like dollars," said Smith. "You'll see us deploy more data centers in the next 12 months than in the last five years."
"To me, this looks like 2006 in the cloud," he said. "Infrastructure is exciting again. I'm excited every day about what we're building."
Global Cloud Computing in Education Sector Industry Forecast to 2025 with Top Key Manufacturers – Eastlake Times
The Global Cloud Computing in Education Sector Market report covers status, future forecast, growth opportunity, key markets and key players. The study objectives are to present the Cloud Computing in Education Sector development in the United States, Europe and China.
Cloud computing is the on demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. The term is generally used to describe data centers available to many users over the Internet. Large clouds, predominant today, often have functions distributed over multiple locations from central servers. If the connection to the user is relatively close, it may be designated an edge server.
In 2018, the global Cloud Computing in Education Sector market size was xx million US$ and it is expected to reach xx million US$ by the end of 2025, with a CAGR of xx% during 2019-2025.
Request a sample of this report at http://orbisresearch.com/contacts/request-sample/2986623.
The key players covered in this study
Amazon Web Services
Microsoft Azure
IBM
Aliyun
Google Cloud Platform
Salesforce
Rackspace
SAP
Oracle
Dell EMC
Adobe Systems
Verizon Cloud
NetApp
Baidu Yun
Tencent Cloud
Blackboard
Market segment by Type, the product can be split into
Infrastructure as a Service (IaaS)
Platform as a Service (PaaS)
Software as a Service (SaaS)
Market segment by Application, split into
K-12 Schools
Higher Education
Market segment by Regions/Countries, this report covers
United States
Europe
China
Japan
Southeast Asia
India
Central & South America
For enquiries before buying this report, visit http://orbisresearch.com/contacts/enquiry-before-buying/2986623.
The study objectives of this report are:
To analyze global Cloud Computing in Education Sector status, future forecast, growth opportunity, key market and key players.
To present the Cloud Computing in Education Sector development in United States, Europe and China.
To strategically profile the key players and comprehensively analyze their development plan and strategies.
To define, describe and forecast the market by product type, market and key regions.
About Us: Orbis Research (orbisresearch.com) is a single point of aid for all your market research requirements. We have a vast database of reports from the leading publishers and authors across the globe. We specialize in delivering customized reports per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients map their needs, and we produce the required market research study for our clients.
Contact Us:
Hector Costello
Senior Manager, Client Engagements
4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.
Phone No.: +1 (214) 884-6817; +91 2064101019
Email ID: sales@orbisresearch.com
Microsoft's Biggest Business Could Be the Cloud by 2023, Analyst Says – Barron's
Microsoft stock has had a remarkable renaissance with CEO Satya Nadella at the helm, driven in particular by his full embrace of cloud computing, with the company's Azure public cloud business at the heart. And as one analyst notes this morning, Azure could become a bigger operation than you likely imagine.
In a research note Monday, Stifel Nicolaus analyst Brad Reback notes that Azure is already at a $17 billion annualized revenue run rate, but that businesses that are potential customers are still in the early stages of shifting computing to the cloud, with less than 10% penetration so far. He notes that there is potential for more gigantic contracts like the recent $10 billion Defense Department JEDI contract, in which Microsoft beat out a competing bid by Amazon.com's (AMZN) Amazon Web Services.
Reback rates Microsoft stock (ticker: MSFT) at Buy with a $160 price target.
"Over the last several quarters, it has become increasingly clear, based on large cloud deal activity and strong hybrid cloud growth, that as enterprises of all sizes begin their respective digital transformation journeys, Microsoft is effectively tapping into sizable Tier-1 enterprise workloads that it was not directly exposed to in the client/server world," Reback writes. "The emergence of the hyperscale cloud has fundamentally changed the landscape, allowing Microsoft to now provide many of the services to customers that it used to leave for partners."
In his note, Reback rattles off a string of recent large public cloud contracts. Snap (SNAP) has a $2 billion, five-year deal with Alphabet's (GOOGL) Google Cloud, and a $1.1 billion commitment over five years to AWS. Pinterest (PINS) has a $750 million, six-year deal with AWS. Apple (AAPL) spent $350 million with AWS in 2018. And he points out that Allianz, AT&T, BMW, Campbell, Columbia Sportswear, GAP, Kroger, Nuance, Salesforce.com, Sony, Walgreens Boots Alliance and Walt Disney Studios, among others, have all announced large Azure deals with Microsoft in recent quarters.
His point is that there remains a huge opportunity for the public cloud players, including AWS, Google Cloud, and Azure.
Reback models the Azure business, which was $12.4 billion in the June 2019 fiscal year, at $26.7 billion in fiscal 2021, $35.2 billion in fiscal 2022, and more than $90 billion by fiscal 2030. By FY 2023, he says, it will be larger than Microsoft Office, Windows, and the Server and Tools businesses.
Microsoft stock is down 0.3%, to $149.59.
Write to Eric J. Savitz at eric.savitz@barrons.com
The Top Cloud Computing Books You Need to Read in 2020 – Solutions Review
Are you looking for the best books on cloud computing to read in 2020? The cloud is one of the quickest-growing technologies of recent years; more and more businesses are looking for cloud knowledge and expertise in their current and prospective employees. Books, whether hardcover or digital, are an excellent source for people looking to learn about a specific field of technology, and the cloud is no exception.
We've listed the top 12 cloud computing books that you should add to your reading list below! These books are intended for beginners and experts alike, and are written by authors with proficiency and/or recognition in the field of cloud computing.
If you're looking for a managed service provider to help you manage your cloud deployments, you should check out our free MSP Buyers Guide! The guide contains profiles on the top cloud managed service providers for AWS, Azure, and Google Cloud, as well as questions you should ask vendors and yourself before buying. We also offer an MSP Vendor Map that outlines those vendors in a Venn diagram to make it easy for you to select potential providers.
by Stephen Orban
Cloud computing is the most significant technology development of our lifetimes. It has made countless new businesses possible and presents a massive opportunity for large enterprises to innovate like startups and retire decades of technical debt. But making the most of the cloud requires much more from enterprises than just a technology change. Stephen Orban led Dow Jones's journey toward digital agility as its CIO and now leads AWS's Enterprise Strategy function, where he helps leaders from the largest companies in the world transform their businesses.
by Michael J. Kavis
Cloud computing is all the rage, allowing for the delivery of computing and storage capacity to a diverse community of end-recipients. However, before you can decide on a cloud model, you need to determine what the ideal cloud service model is for your business. Architecting the Cloud is vendor neutral and guides you in making one of the most critical technology decisions that you will face: selecting the right cloud service model(s) based on a combination of both business and technology requirements.
by Moe Abdula, Ingo Averdunk, Roland Barcia, Kyle Brown, and Ndu Emuchay
As cloud technologies continue to challenge the fundamental understanding of how businesses work, smart companies are moving quickly to adapt to a changing set of rules. Adopting the cloud requires a clear roadmap backed by use cases, grounded in practical real-world experience, to show the routes to successful adoption. The Cloud Adoption Playbook helps business and technology leaders in enterprise organizations sort through the options and make the best choices for accelerating cloud adoption and digital transformation.
by Thomas Erl
Thomas Erl, one of the world's top-selling IT authors, teams up with cloud computing experts and researchers to break down proven and mature cloud computing technologies and practices into a series of well-defined concepts, models, technology mechanisms, and technology architectures, all from an industry-centric and vendor-neutral point of view. In doing so, the book establishes concrete, academic coverage with a focus on structure, clarity, and well-defined building blocks for mainstream cloud computing platforms and solutions.
by Thomas Erl
Best-selling service technology author Thomas Erl has brought together the de facto catalog of design patterns for modern cloud-based architecture and solution design. More than two years in development, this book's 100+ patterns illustrate proven solutions to common cloud challenges and requirements. Its patterns are supported by rich, visual documentation, including 300+ diagrams.
by Ray Rafaels
Your Complete Guide to Cloud Computing and Migrating to the Cloud. This book covers not only the technical details of how public and private cloud technology works but also the strategy, technical design, and in-depth implementation details required to migrate existing applications to the cloud. After reading this book, you will have a much better understanding of cloud technology and the steps required to quickly reap its benefits while at the same time lowering your IT implementation risk.
by Joe Weinman
Cloudonomics radically upends the conventional wisdom, clearly explains the underlying principles, and illustrates through understandable examples how cloud computing can create compelling value, whether you are a customer, a provider, a strategist, or an investor. Cloudonomics covers everything you need to consider for the delivery of business solutions, opportunities, and customer satisfaction through the cloud, so you can understand it. Cloudonomics also delivers insight into when to avoid the cloud, and why.
by Raj Samani, Brian Honan, and Jim Reavis
CSA Guide to Cloud Computing brings you the most current and comprehensive understanding of cloud security issues and deployment techniques from industry thought leaders at the Cloud Security Alliance. For years the CSA has been at the forefront of research and analysis into the most pressing security and privacy related issues associated with cloud computing. CSA Guide to Cloud Computing provides you with a one-stop source for industry-leading content, as well as a roadmap into the future considerations that the cloud presents.
by James Bond
If you're planning your long-term cloud strategy, this practical book provides insider knowledge and actionable real-world lessons regarding planning, design, operations, security, and application transformation. This book teaches business and technology managers how to transition their organization's traditional IT to cloud computing. Rather than yet another book trying to sell or convince readers on the benefits of clouds, this book provides guidance, lessons learned, and best practices on how to design, deploy, operate, and secure an enterprise cloud based on real-world experience.
by Zeal Vora
This book covers automating security tasks, such as server hardening with Ansible, as well as automation services, such as Monit, that monitor security daemons and take the necessary action if those daemons are stopped maliciously. In short, this book has everything you need to secure your cloud environment. It is your ticket to industry-adopted best practices for developing a secure, highly available, and fault-tolerant architecture for organizations.
by Kief Morris
Ideal for system administrators, infrastructure engineers, team leads, and architects, this book demonstrates various tools, techniques, and patterns you can use to implement infrastructure as code. In three parts, you'll learn about the platforms and tooling involved in creating and configuring infrastructure elements, patterns for using these tools, and practices for making infrastructure as code work in your environment.
by Edward Mahon
An inconsistency exists between actual cloud adoption rates and the viewpoints and direct actions of those responsible for corporate information technology operations. On the one hand, information technology (IT) leaders generally believe the cloud more easily enables the implementation and management of technology services, from web and mobile application development to on-demand computing and storage. These leaders also appreciate the cloud's consumption-based pricing model over their current capital-intensive cost structures.
Looking for more info on managed service providers for your cloud solutions? Our MSP Buyers Guide contains profiles on the top cloud managed service providers for AWS, Azure, and Google Cloud, as well as questions you should ask vendors and yourself before buying. We also offer an MSP Vendor Map that outlines those vendors in a Venn diagram to make it easy for you to select potential providers.
Check us out on Twitter for the latest in Enterprise Cloud news and developments!
Dan is a tech writer who writes about Enterprise Cloud Strategy and Network Monitoring for Solutions Review. He graduated from Fitchburg State University with a Bachelor's in Professional Writing. You can reach him at dhein@solutionsreview.com
Cisco Restructuring To ‘Strongly Position The Company Against Our Competitors’ – CRN: The Biggest Tech News For Partners And The IT Channel
Cisco Systems is shaking up several of its business units, along with its leadership, to better address how customers are buying its networking and cloud products, according to an internal email obtained by CRN after a published report on Wednesday.
Cisco's enterprise networking and data center networking units will be combined, according to the email sent by David Goeckeler, Cisco's executive vice president and general manager of Networking and Security, which was first viewed and reported by The Information.
Cisco is also renaming its existing cloud computing business to Cloud Strategy and Compute and expanding the segment to include server products, the email explained.
"While I know these updates may seem like a significant change, it's important to understand this alignment will strongly position the company against our competitors," Goeckeler wrote in his email.
[Related: Former Cisco Channel Exec Nirav Sheth Jumps To Google Cloud]
"We are focused on the tremendous opportunities in front of us across cloud, automation, 5G, security and collaboration," said a Cisco spokesperson. "To continue the great progress already made, we are making some organizational updates to our networking and security business. These changes will better align our development process with our customers' needs as they transition to a multi-domain approach."
In the midst of the shakeup, several Cisco executives are being reassigned, according to the email. Perhaps the biggest change is that Cisco's Dave Ward, chief technology officer of engineering and chief architect, will be stepping down from his post to take a new role inside the company. Roland Acra, senior vice president and general manager of Cisco's Data Center business unit, will be his replacement, according to the email.
The soon-to-be combined enterprise and data center networking unit will be led by Cisco's Senior Vice President and General Manager of enterprise networking, Scott Harrell, who has spent the last two years leading the enterprise networking business unit and has an 18-year tenure with Cisco.
Liz Centoni, a 19-year Cisco veteran and current senior vice president and general manager of IoT for the tech giant, will be heading up the new Cloud Strategy and Compute business unit. Cloud computing's former leader, Kip Compton, will move to Cisco's Networking and Security Business group, Goeckeler said in his email.
Cisco's network orchestration products, which are currently part of the Cloud Platforms and Solutions group, will become part of the Service Provider Business. Leading the group will still be Jonathan Davidson, Cisco's senior vice president and general manager of Service Provider Networking.
Cisco's restructure comes at a time in which the San Jose, Calif.-based networking titan is seeing some impact from a "challenging" macro-economic climate, according to Cisco CEO Chuck Robbins. Cisco's stock declined by 5 percent in after-hours trading following its Q2 2020 earnings call last Wednesday.
Cisco in recent years has felt the pinch in its data center business as more service providers virtualize their networks and buy cheaper, white-box gear. Many business customers are also moving to public cloud environments and out of their private data centers.
A reorganization isn't a surprise, especially as more customers move to the cloud and evolve their IT infrastructures, according to one executive for a Cisco partner that did not want to be identified.
"We're not leading with Cisco today but we are using them as needed. We are also displacing Cisco in some cases," the executive said. "But no matter how many times they restructure, they are still the behemoth."
Cisco said it will not be changing any of its products in the wake of the reshuffle.
Kubernetes and the misconception of multi-cloud portability – Diginomica
Container news is flowing hot and heavy this week with the Linux Foundation KubeCon event, now 12,000 strong, serving as the backdrop for no less than 70 vendor and foundation announcements by my count of the pre-event press packet.
Most of these are feature updates and enhancements to a product's container support, i.e. routine vendor news piggybacking off a major conference to magnify their reach.
However, the overriding theme of the event is the expanding penetration of containers in general, and the Kubernetes management software in particular, as an application platform.
Indeed, as Martin Banks' recent column on VMworld Europe illustrated, much of the Kubernetes enthusiasm comes from enterprises seeing it as an alternative to full VMs, particularly now that VMware has given its imprimatur by tightly integrating containers into its ubiquitous infrastructure management software.
There are several incentives for the transition from VMs to containers, including more efficient resource usage, the availability of sophisticated workload management software like Kubernetes, a robust and growing software ecosystem (as evidenced at KubeCon) and a more rapidly scaled platform. However, one of the oft-cited reasons for container adoption, easy workload portability between cloud platforms, seems based more in theory than in practice.
Banks states the commonly-held container portability case this way:
But in practice, the Kubernetes/container movement has already created an environment where it is possible to package up an application and its associated data and move it to a more suitable platform. In future years that is likely to become the common approach, a move made without even thinking or, perhaps, not knowing it has happened.
I contend that such transparent, incognizant workload movement is only possible on vanilla container platforms for the simplest of applications and that in actuality, the dream of automated, multi-cloud application migration depends on moving the platform lock-in risk up a level of abstraction, from infrastructure environments to managed container services and their accompanying workload management systems.
That is, Kubernetes, even with its associated ecosystem of cloud-agnostic add-ons, won't be enough to provide transparent multi-cloud portability, particularly given the seduction of using managed container platforms and cloud-specific platform and application services, along with the friction of multi-cloud data movement and security policy enforcement.
With the understanding that most analogies are imperfect, here goes, since it illustrates an underlying concept of implementation-specific complexity: containerized workloads on Kubernetes are portable across clouds the same way Unix source code is portable between systems. As anyone who has ported application code between Unix platforms in the era before the ascendency of Linux on x86 can attest, there are plenty of devilish details to iron out before make install actually works.
When it comes to container/Kubernetes usage for real-life applications, I see the following issues all thwarting the goal of transparent platform portability:
Banks acknowledges one of these issues, namely how data gravity promotes platform lock-in by following the path of least resistance, when he writes:
The issues of extracting data and applications at the end of a [cloud service] contract and the possibilities that a move to another supplier will involve some degree of re-engineering or at least re-optimising to suit the new environment all threaten the possibility of an additional cost burden in making such a move, adding to the possibility of remaining locked in being seen as the safest option.
However, data friction is one of the easier problems to solve, and not what I see as the chief source of lock-in.
Data movement and replication can be costly, but there are known solutions, and thus it isn't the most forbidding problem, particularly for the new generation of cloud-native applications. Indeed, the term cloud-native captures a larger lock-in threat, once we clear up some confusion. Many people conflate the term cloud-native with containerized applications, particularly those using a microservice, i.e. disaggregated, design. It's a constrained definition that I dispute, since the primary advantage of cloud services is the opportunity to offload the implementation details of commodifiable functions to a service provider.
Such services naturally started at the lowest logical layers with infrastructure services like compute instances, object storage containers and network file shares, but have continually moved to higher levels of service abstraction; first to infrastructure applications like load balancers and nameservers, but later to application components like databases, message queues, notification systems, event-driven functions (serverless) and AI-based components.
Each of these cloud-specific features is one of the Lilliputian ropes of lock-in tying Gulliver, the enterprise developer, to a particular service provider and implementation through proprietary APIs and other non-portable cloud features. The point isn't that the same functionality couldn't be implemented on another cloud; it could, since any clever feature introduced by one is quickly mimicked by the others. It's that the implementations are different, and thus require significant effort by both developers and cloud operations teams to change.
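To make the Lilliputian-ropes point concrete, here is a minimal sketch of how one logical operation, publishing a message to a queue, takes provider-specific shapes. The function names and payload fields below are simplified illustrations, not the real AWS or Google Cloud wire formats:

```python
# Two hypothetical cloud messaging services, each exposing the same logical
# operation ("publish a message") with an incompatible request shape.
# These payloads are illustrative, not any vendor's actual API.

def publish_cloud_a(queue_url: str, text: str) -> dict:
    # Provider A addresses the queue by URL and takes the body as a string.
    return {"QueueUrl": queue_url, "MessageBody": text}

def publish_cloud_b(topic_path: str, text: str) -> dict:
    # Provider B addresses a topic path and requires bytes plus attributes.
    return {"topic": topic_path,
            "data": text.encode("utf-8"),
            "attributes": {}}

a = publish_cloud_a("https://queue.cloud-a.example/jobs", "hello")
b = publish_cloud_b("projects/p1/topics/jobs", "hello")
# Same intent, incompatible shapes: migrating means rewriting every call
# site, or maintaining an abstraction layer over both providers.
```

Either remedy, rewriting call sites or adding an abstraction layer, is exactly the kind of effort by developers and operations teams that the article identifies as the real cost of switching.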
The notion of seamless Kubernetes container-based application portability requires:
Alternatively, it requires shifting the platform and vendor lock-in to another layer by adopting a multi-cloud PaaS or meta-container implementation that abstracts the management control plane from the infrastructure implementation.
Think my scenario is a stretch and that the Kubernetes cognoscenti have more discipline than that? Think again. Datadog used KubeCon as the backdrop for an update to its container orchestration and Docker research reports, and while brief, it has some relevant insights. Notably, Datadog found that of the 45 percent of organizations running Kubernetes, those doing so on cloud platforms (likely, most) are gravitating to managed Kubernetes services.
On Google Cloud, more than 90 percent run GKE, while on AWS, about a third use EKS. What's the problem here? It's standard Kubernetes, you say? Consider this from the Pulumi blog, developers of a multi-cloud development platform, which summarizes the portability problem of CaaS products (emphasis added):
Kubernetes clusters from the managed offerings of AWS EKS, Azure AKS, and GCP GKE all vary in configuration, management, and resource properties. This variance creates unnecessary complexity in cluster provisioning and app deployments, as well as for CI/CD and testing. Additionally, if you wanted to deploy the same app across multiple clusters for specific use cases or test scenarios across providers, subtleties such as LoadBalancer outputs and cluster connection settings can be a nuisance to manage.
Irrespective of whether developers invoke cloud-native services from an application container, each managed container environment has different settings, cloud network interfaces and management interfaces. Sure, once you get them all set up it might be possible to move workloads between them, but what happens when you need to create a new cluster in a new region? Manual work recreating the configuration, that is, unless you've taken the initiative to develop automation scripts on each cloud to do most of the drudgery.
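The automation scripts alluded to above usually amount to normalizing each provider's cluster description into a common shape. A minimal sketch of that kind of translation layer, using invented field names rather than the real EKS or GKE API schemas, which differ in far more than three fields:

```python
from dataclasses import dataclass

# Hypothetical provider-specific cluster descriptions for illustration;
# real managed-Kubernetes APIs expose many more divergent properties.
EKS_CLUSTER = {"clusterName": "shop", "endpointUrl": "https://eks.example", "nodeGroupSize": 3}
GKE_CLUSTER = {"name": "shop", "endpoint": "https://gke.example", "initialNodeCount": 3}

@dataclass
class ClusterSpec:
    """Provider-neutral view of a managed Kubernetes cluster."""
    name: str
    endpoint: str
    nodes: int

def normalize(provider: str, raw: dict) -> ClusterSpec:
    # Every provider needs its own mapping -- this per-cloud drudgery
    # is exactly what the Pulumi quote above complains about.
    if provider == "eks":
        return ClusterSpec(raw["clusterName"], raw["endpointUrl"], raw["nodeGroupSize"])
    if provider == "gke":
        return ClusterSpec(raw["name"], raw["endpoint"], raw["initialNodeCount"])
    raise ValueError(f"unknown provider: {provider}")
```

Multi-cloud management tools essentially productize this mapping, which is also why they introduce the "another layer of abstraction and dependence" trade-off discussed below.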
Source: Datadog survey; 8 facts about the changing container landscape
Datadog also found that almost a fifth of its AWS users run containers on Fargate, its managed instance service that eliminates the need to provision EC2 instances as cluster nodes. Indeed, Fargate usage has almost quadrupled in the past year and for good reason. Services like Fargate are incredibly convenient, but what happens when you want to shift workloads to a new cluster on Azure using Azure Container Instances (ACI)? How transparent is that going to be?
Finally, Datadog's survey found that 70 percent of Kubernetes users turn to NGINX for cluster traffic routing, but again, how will that change when DevOps teams get comfortable with service meshes and start using cloud services like AWS App Mesh, Azure Service Fabric Mesh and Istio on GCP? How easily will routing policies and configurations port between implementations, given that each is based on a different software platform and has different features?
Many of the products announced at KubeCon address the portability issues identified above. For example, Datadog announced multi-cloud performance monitoring for Kubernetes clusters, Yugabyte released a distributed database that works across multi-cloud clusters and several companies updated multi-cloud configuration management and automation tools to support Kubernetes. Indeed, there's a swarm of companies racing to solve the problem of multi-cloud infrastructure and application management, typically by introducing another level of software abstraction and dependence to handle the meta-level configurations.
See the article here:
Kubernetes and the misconception of multi-cloud portability - Diginomica
Fashion retailer AllSaints on using Google Cloud to handle online shopping traffic peaks – ComputerWeekly.com
Adoption of the Google Cloud Platform in the retail space is continuing apace, with fashion retailer AllSaints outlining how its use of the technology is helping the brand deliver a more robust and performant online shopping experience to its customers.
The firm's website regularly receives two million visits a month, and the revenue generated by online sales of its garments and accessories is becoming increasingly important to the overall success of the company.
However, like many retailers, the firm has previously struggled to ensure its website is equipped to cope with surprise and prolonged periods of peak traffic, and had taken to over-provisioning on-premises servers to protect against downtime or service disruptions.
Responsive websites and fast page-load speeds are critical for the mobile, connected customer, said John Bovill, executive consultant of digital and technology at AllSaints.
As such, the company operated a 60-unit strong server farm that had more than enough capacity to cope during peak periods, but when traffic levels returned to normal almost half of these would be left idle, which the company considered to be a waste of resources.
We make projections for rates of business in normal periods, but it's very hard to predict how sales will rise during peak demand, especially online. Our need for infrastructure often doubles at those times, but we only actually need those servers for a very short period, he added.
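The cost of the over-provisioning Bovill describes can be sketched in back-of-the-envelope terms. The server counts below follow the article (a 60-unit farm, roughly half idle off-peak); the peak duration is an assumed illustrative figure, not an AllSaints number:

```python
# Fixed farm vs. autoscaled capacity, in server-hours per month.
PEAK_SERVERS = 60        # capacity needed during peak trading
NORMAL_SERVERS = 30      # roughly half the farm suffices off-peak
HOURS_IN_MONTH = 730
PEAK_HOURS = 72          # assumption: ~3 days of peak demand a month

def server_hours(fixed: bool) -> int:
    """Server-hours consumed by a fixed farm vs. scale-to-demand."""
    if fixed:
        return PEAK_SERVERS * HOURS_IN_MONTH
    return (NORMAL_SERVERS * (HOURS_IN_MONTH - PEAK_HOURS)
            + PEAK_SERVERS * PEAK_HOURS)

# Fraction of a fixed farm's capacity that sits idle over the month.
waste = 1 - server_hours(fixed=False) / server_hours(fixed=True)
```

Under these assumptions roughly 45 percent of the fixed farm's capacity is paid for but unused, which is the gap cloud autoscaling closes.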
At the same time, provisioning additional capacity was often a slow process, which sometimes caused delays in the company's plans to expand into new geographical territories or deploy new website features, for example.
To address these issues, the firm decided to overhaul its infrastructure by moving to the Google Cloud Platform, which it says has given it the ability to provision compute capacity instantaneously across multiple locations and, importantly, to power it down when it is no longer required.
This was not, however, the firms first foray into the Google Cloud, having made the decision to make moving its applications and workloads to the cloud a strategic goal in 2014.
It started out by adopting Google's G Suite portfolio of cloud-based productivity tools to bolster internal collaboration and communication between the firm's employees, as part of an introductory process to get its office and store-based staff used to cloud-based tools.
This work also coincided with a switch in its software development strategy, which has since seen it favour the use of microservices and, in 2016, adopt Kubernetes for container orchestration.
The move to microservices was part of a push by the company to overhaul its continuous integration and continuous delivery (CI/CD) pipeline, which is now based on Jenkins on Google Cloud and Terraform, so that its in-house developers can cut the time it takes to release new features and code changes.
As a result, the company claims it has now cut the time it takes developers to deploy new code from 20 minutes to less than five, while enabling them to test code on the same infrastructure that it will run on in production.
Before, we couldn't confidently say a bug was fixed until we actually tested it in production. Now we can deploy code in test environments that exactly mimic production, said Andy Dean, technical operations manager at AllSaints.
The improved CI/CD pipeline means we can update our services every day, with a shorter lifespan on bugs, and minimal disruption. That makes us more responsive to customer needs, more proactive. And thats exactly what were trying to achieve.
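Test environments that "exactly mimic production" are usually achieved by rendering every environment from one declarative template, the approach tools like Terraform encourage. A minimal Python sketch of the idea, with invented service names and fields rather than AllSaints' actual configuration:

```python
import copy
from typing import Optional

# One source-of-truth template; environments may differ only in the
# parameters deliberately allowed to vary (name, replica count).
BASE_TEMPLATE = {
    "image": "shop-frontend:1.4.2",   # hypothetical image tag
    "replicas": 6,
    "env_vars": {"FEATURE_FLAGS": "checkout_v2"},
}

def render(environment: str, replicas: Optional[int] = None) -> dict:
    """Render an environment spec from the shared template.

    Because test and production derive from the same template, a bug
    reproduced in test behaves the same way in production.
    """
    spec = copy.deepcopy(BASE_TEMPLATE)
    spec["name"] = f"shop-frontend-{environment}"
    if replicas is not None:
        spec["replicas"] = replicas  # e.g. scale test down to save cost
    return spec

prod = render("prod")
test = render("test", replicas=2)
```

Only the replica count and name differ between the two rendered specs; the image and configuration are guaranteed identical, which is what makes pre-production testing trustworthy.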
The company had another cloud provider in its supplier mix at this time, but decided on cost grounds to oust it in favour of deepening its cloud ties with Google, embarking on a strategy that would see all new apps and services built and hosted solely on Google's cloud infrastructure.
At the same time, it also began laying the groundwork to begin migrating its legacy applications to the Google Cloud too.
We were moving 60 individual services, not just one application, said Dean. The interdependencies between them meant that it made more sense to move them all at once, and that took a lot of planning.
All in all, the migration was done and dusted within a week, to avoid any network latency issues arising from running applications and workloads in two different places at once for too long.
It was the biggest infrastructure change we'd made in the history of the company, so one of our goals was that nobody would notice the change, said Dean.
The company's technology department worked closely with the Google Cloud team to ensure the migration progressed as smoothly and efficiently as possible.
We got things stable within the first week, which was crucial for us. In that week, we moved 60 individual services, including our enterprise resource planning [ERP] tills, to a microservices cloud environment.
On the back of all this work, the retailer has been able to cut the number of servers it operates from 60 to 30, reducing its infrastructure costs by 50% in the process, by handing off a portion of its infrastructure requirements to Google Compute Engine, with autoscaling support provided by Google Kubernetes Engine.
We monitor the architecture using Stackdriver, but Google Kubernetes Engine really looks after itself, said Dean.
The self-healing aspect of Google Kubernetes Engine means we no longer have to make time to restart some of the [virtual machines]. Scaling is now automatic, and so are key maintenance tasks.
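The automatic scaling Dean refers to is driven, in Kubernetes, by the Horizontal Pod Autoscaler, whose core rule can be stated in a few lines. This is a simplified sketch of the documented formula (it omits tolerance bands, stabilization windows and min/max replica bounds):

```python
import math

def desired_replicas(current_replicas: int,
                     current_utilization: float,
                     target_utilization: float) -> int:
    """Simplified Kubernetes HPA rule:
    desired = ceil(current * currentMetric / targetMetric)."""
    return math.ceil(current_replicas * current_utilization / target_utilization)

# 4 replicas running at 90% CPU against a 60% target -> scale to 6.
scale_up = desired_replicas(4, 90, 60)
# 6 replicas running at 30% CPU against a 60% target -> scale to 3.
scale_down = desired_replicas(6, 30, 60)
```

The same proportional rule handles both directions, which is why traffic spikes and quiet periods alike need no operator intervention.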
The company still has some cloud migration work to do, a process it predicts will stretch into 2021, as it prepares to containerise all of its back-end and internal systems, including its ERP and point-of-sale systems, and move them to the cloud.
Strategically we are looking to maximise our usage of Google Cloud, driving this and associated technologies to provide the best possible AllSaints experience for our customers, said Bovill.
See original here:
Fashion retailer AllSaints on using Google Cloud to handle online shopping traffic peaks - ComputerWeekly.com
Exposed database left terabyte of travelers’ data open to the public – CNET
One of the largest travel booking companies in Europe left its data exposed on an unprotected server, researchers say.
When it comes to travel, most people are concerned with planning their trip, getting the best price and making sure they've packed everything. Now they also need to worry about whether their reservation companies have properly secured their data: Security researchers found that one of Europe's largest hotel booking companies left more than a terabyte of sensitive data exposed on a public server.
The exposed database contained travelers' information like names, home addresses, lodging, children's personal information, credit card numbers and thousands of passwords stored in plaintext, the security researchers said Wednesday. The database stores information on 140,000 clients, each of which could be an individual, a group of travelers or an organization.
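Storing passwords in plaintext, as found here, is avoidable with standard salted key derivation, so a leak exposes only hashes rather than reusable credentials. A minimal sketch using Python's standard-library scrypt; any modern password KDF (bcrypt, Argon2) serves the same purpose:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple:
    """Derive a salted hash; only (salt, digest) is ever stored."""
    salt = os.urandom(16)  # unique per password, defeats rainbow tables
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

With this scheme, an attacker who copies the database must brute-force a deliberately slow function per password instead of reading credentials directly.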
The database belongs to Gekko Group, a subsidiary of France-based AccorHotels, Europe's largest hospitality company. Gekko Group handles business travel and luxury travel with more than 600,000 hotels across the world, according to its website. AccorHotels referred to Gekko Group for comment.
Fabrice Perdoncini, Gekko Group's CEO, said that the company has secured the database and is launching an internal investigation into its IT systems.
"Ensuring the adequate protection of our clients' data is of utmost importance to Gekko Group, a B2B company," Perdoncini said in a statement. "We acknowledge the seriousness of this matter and confirm that no malicious use or misuse of data has been reported so far."
The company said that it was informing its affected clients and that less than 1,000 unencrypted credit card numbers were stored on the database. But more credit card numbers could have been seen in document scans stored on the server.
The pile of leaked passwords contained the credentials for the World Health Organization, and a potential hacker could have used those credentials to book travel using the group's budget, the security researchers said. The WHO didn't respond to a request for comment.
The discovery came via independent security researchers Noam Rotem and Ran Locar, who worked with Israeli security company VPNMentor to find the exposed database. "It's unfortunately not the first time we see a data breach of this scale with that type of sensitive information. It's sadly a much more common issue than one would think," Rotem said in a statement.
The researchers found the database, which is hosted on Elasticsearch, through an online scan while looking for servers that lacked proper protections.
"This breach represents a serious lapse in data security by Gekko Group and its subsidiaries, compromising the privacy of their customers, clients, AccorHotels, and the businesses themselves," VPNMentor said in a blog post Wednesday.
As more companies move to store their data on cloud servers, they're driving cybersecurity concerns about properly protecting sensitive data. Security researchers have found volumes of sensitive data exposed online in unsecured databases as they look to warn companies to protect that data before a malicious hacker finds it.
In the past year, researchers found exposed databases showing debt from millions of people, along with open servers hosting millions of Facebook records. While security researchers found those first, hackers have also taken advantage of open servers. In July, a hacker allegedly stole the credit card applications of more than 100 million US citizens from Capital One's Amazon Web Services cloud server.
Rotem and Locar said they reported the exposed database to Gekko Group and AccorHotels on Nov. 7 and got a response on Nov. 13. The company told the researchers that it's since secured the server, according to Rotem and Locar.
Even if you've never interacted with those two companies, data from their partners was also exposed, the researchers said. The database had a significant amount of data from websites like Booking.com and Hotelbeds.com open to the public, including personal information and credit card numbers, researchers said.
Booking.com and Hotelbeds.com didn't respond to a request for comment.
VPNMentor's researchers also saw travel itineraries left on the open server, like tickets to Euro Disney and travel plans between hotels and airports with personal information.
The server was hosted in France, but the affected travelers came from several countries including Spain, the United Kingdom, the Netherlands, Portugal, France, Belgium, Italy and Israel, researchers said.
"For two companies of their respective sizes and market shares, Gekko Group and AccorHotels would be expected to have more robust data security," VPNMentor said. "By exposing such a huge amount of sensitive data, they will likely face questions over how this happened, and their wider data security policies for all brands they own."
Read the rest here:
Exposed database left terabyte of travelers' data open to the public - CNET
Vodafone picks Google Cloud to develop and host its global data platform – FierceTelecom
In its quest to be a digital operator, Vodafone Group has partnered up with Google Cloud to host its cloud platform for data analytics, business intelligence, and machine learning.
Vodafone's Neuron big-data analytics platform, which "acts as a brain and driver for AI and business intelligence" for Vodafone's global business, will be hosted on Google Cloud.
Neuron serves as a single "data ocean" of analytic insights to support services and applications such as 5G optimization and smart retail. Neuron is currently being used across 11 countries where it combines data from more than 600 servers.
Vodafone will also rely on Google Cloud Platform (GCP) for hybrid infrastructure and containerization and to develop its next-generation business intelligence platform. The container approach will deliver faster insights in a more standardized way, making it easier to compare performance across departments and local markets.
RELATED: Google Cloud pumps profits into parent company Alphabet's bottom line
Neuron will leverage Google Cloud to improve its operations by making Vodafone's existing software cloud-compatible, which allows local markets to tap into new platform capabilities without disrupting existing campaigns.
"The project is complex and multi-faceted," Google Cloud CEO Thomas Kurian said in a blog post. "Vodafone's existing on-premises group data platform is a shared service consisting of eight clusters with more than 600 servers and is used in 11 countries. The platform relies on legacy Hadoop architecture that lacks the agility or scalability to support demands for analytics and an increasing list of innovation projects.
"To begin, Vodafone will perform a large-scale migration of its global data into our highly secure public cloud. It will also create a custom platform for data performance that lets disparate data from across the organization be aggregated into one data oceanrather than multiple data lakeswithin which analytics and business intelligence can take place."
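The "one data ocean rather than multiple data lakes" idea amounts to flattening per-market datasets into a single queryable collection with provenance preserved. A toy sketch with invented market codes and fields, not Vodafone's actual schema:

```python
# Hypothetical per-market "data lakes", each holding its own records.
LAKES = {
    "de": [{"subscriber": "a", "data_gb": 3.2},
           {"subscriber": "b", "data_gb": 1.1}],
    "it": [{"subscriber": "c", "data_gb": 2.4}],
}

def build_ocean(lakes: dict) -> list:
    """Flatten per-market lakes into one 'ocean', tagging each record
    with its market so cross-market analytics (the article's 5G
    optimization, smart retail) can run over a single dataset."""
    ocean = []
    for market, records in lakes.items():
        for record in records:
            ocean.append({**record, "market": market})
    return ocean

def total_usage(ocean: list) -> float:
    """A query that only works once the data is aggregated in one place."""
    return sum(r["data_gb"] for r in ocean)
```

In practice the aggregation and querying would run on services like BigQuery and Dataflow, listed below, rather than in application code; the sketch only shows the shape of the consolidation.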
To simplify the integration, the Neuron platform will also use other Google Cloud services including Dataflow, Dataproc, Cloud Composer, Data Fusion, BigQuery, and Google Kubernetes Engine.
"We want to lead the industry in capturing the benefits of digital," said Vodafone Group CTO Johan Wibergh in a statement. "The capabilities that Google Cloud gives us will help accelerate our digital transformation."
Follow this link:
Vodafone picks Google Cloud to develop and host its global data platform - FierceTelecom
Global Cloud Security Market Size is Expected to Reach 8.9 Billion US$ with a CAGR of 23.5% During the Forecast Period 2015-2020 – Valuates Reports -…
BANGALORE, India, Nov. 19, 2019 /PRNewswire/ -- Data protection requires a set of policies and controls that tackle the cloud's security aspects by protecting software, information, and infrastructure.
Cloud-based security services are expected to see increased market acceptance thanks to the ability to configure the set of services as necessary. Managed security services protect against intruders and cyber-attacks, and include next-generation firewalls, content filtering, managed two-factor authentication and even security consultancy. This is expected to offer industry players ample opportunities.
Inquire for Sample @ https://reports.valuates.com/request/sample/ALLI-Auto-1R41/Cloud_Security_Market
Growing adoption of cloud services by large and medium-sized companies and increased demand for managed security services generate ample opportunities for cloud security market players.
View Full Report @ https://reports.valuates.com/market-reports/ALLI-Auto-1R41/cloud-security-market
Trends Influencing The Cloud Security Market Share:
Region Wise Cloud Security Market Analysis:
Inquire for Regional/Country @ https://reports.valuates.com/request/regional/ALLI-Auto-1R41/Cloud_Security_Market
Growing Reliance On Cloud-based Services
Increasing adoption of cloud services across diverse verticals has resulted in greater dependence on the cloud for storage and other applications. The growing number of internet users and rising adoption of cloud services are the key factors driving uptake of cloud security solutions. Over the forecast period, the growth of online business will underline the significance of this factor.
Increasing Number Of Cyber-attacks
An increasing number of cyber-attacks due to an upsurge in digitalization is one of the driving factors of the cloud security market. Cyber-attacks have increased rapidly, thereby, resulting in a strong need for cloud security services.
The number of data theft cases has increased exponentially in the last five years, owing to the increased generation of digital content and a lack of security to protect financial and corporate data. BFSI, followed by IT & telecom and retail, are the most targeted industries. Therefore, the burgeoning number of cyber-attacks and data breach cases should boost the growth of the market in the future.
Growing Market For Managed Security Services
The concept of managed security service offers protection against intruders & cyber-attacks and includes next-generation firewall, content filtering, managed two-factor authentication and even security consultancy. This is expected to provide ample opportunities for market players.
Cloud Security Key Segments
The market segmentation is illustrated below:
Cloud Security Market by Type
Cloud Security Market by End User
Cloud Security Market by Vertical
Cloud Security Market by Deployment
Cloud Security Market by Geography
Key Players
Key Benefits of Market Study:
Buy Report @ https://reports.valuates.com/api/directpaytoken?rcode=ALLI-Auto-1R41
Similar Reports :
Global Cloud Security Solutions Market : https://reports.valuates.com/market-reports/PROF-Auto-21U225/global-cloud-security-solutions-market
Global Cloud Security in Retail Market :
https://reports.valuates.com/market-reports/PROF-Auto-38A254/global-cloud-security-in-retail-market
Global Multi-Cloud Security Solutions Market :
Global Hybrid Cloud Security Solutions Market :
About Us:
Our aim is to collate unparalleled market insights and notify our customers as events unfold. Valuates curates premium market research reports from leading publishers around the globe. We will help you map your information needs to our repository of market research reports and guide you through your purchasing decision. We are based in the Silicon Valley of India (Bengaluru), provide 24/7 online and offline support to all our customers, and are just a phone call away.
Contact Us:
Valuates Reports
Email: sales@valuates.com
For U.S. Toll Free, call +1-(315)-215-3225
For IST, call +91-8040957137
WhatsApp: +91-9945648335
Website: https://reports.valuates.com
SOURCE Valuates Reports
Here is the original post:
Global Cloud Security Market Size is Expected to Reach 8.9 Billion US$ with a CAGR of 23.5% During the Forecast Period 2015-2020 - Valuates Reports -...