Category Archives: Cloud Computing
Red Hat Joins Forces with U.S. Department of Energy Laboratories to Bridge the Gap Between High Performance Computing and Cloud Environments -…
RALEIGH, N.C.--(BUSINESS WIRE)--Red Hat, Inc., the world's leading provider of open source solutions, today announced it is collaborating with multiple U.S. Department of Energy (DOE) laboratories, including Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, to bolster cloud-native standards and practices in high-performance computing (HPC).
Adoption of HPC is expanding beyond traditional use cases. Advancements in artificial intelligence, machine learning and deep learning, as well as compute and data-driven analytics, are driving greater interest in and need for organizations to run scalable containerized workloads on traditional HPC infrastructure. According to industry analyst firm Hyperion Research, roughly one-third of all HPC system revenue will be dedicated to AI-centric systems by 2025, representing a nearly 23% CAGR over the five-year period,1 driven by the influx of AI workloads. Additionally, nearly 20% of HPC users' HPC-enabled AI workloads are currently being run in the cloud.2
Red Hat is a leader in cloud-native innovation across hybrid and multicloud environments, while the laboratories understand the needs and unique demands of massive-scale HPC deployments. By establishing a common foundation of technology best practices, this collaboration seeks to use standardized container platforms to link HPC and cloud computing footprints, helping to fill potential gaps in building cloud-friendly HPC applications while creating common usage patterns for industry, enterprise and HPC deployments.
Together with the laboratories, Red Hat will focus on advancing four specific areas that address current gaps and help lay the groundwork for exascale computing: standardization, scale, cloud-native application development, and container storage. Examples of collaborative projects between Red Hat and DOE laboratories include:
Bringing standard container technologies to HPC: Red Hat and the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab recognize the importance of standards-based solutions in enabling computing innovation, especially when technologies must span from the edge to the cloud to HPC environments. From container security to scaling containerized workloads, common, accepted practices help HPC sites get the most from container technologies. To better meet the unique requirements of large-scale HPC systems and pave the way for organizations to take advantage of containers in exascale computing, Red Hat and NERSC are collaborating on enhancements to Podman, a daemonless container engine for developing, managing and running container images on a Linux system, to enable it to replace NERSC's custom-developed container runtime, Shifter.
Running Kubernetes at massive scale: Red Hat has been collaborating with Sandia National Laboratories on the SuperContainers project for several years, working to make Linux containers and other building blocks of cloud-native computing more readily accessible to supercomputing operations. In this expanded collaboration, Red Hat and Sandia National Laboratories intend to explore the deployment scenarios of Kubernetes-based infrastructure at extreme scale, providing easier, well-defined mechanisms for delivering containerized workloads to users.
Bridging traditional HPC jobs with cloud-native workloads: Red Hat and Lawrence Livermore National Laboratory are collaborating to bring HPC job schedulers, such as Flux, to Kubernetes through a standardized programmatic interface, helping IT teams that support supercomputing operations better manage traditional parallel workflows alongside containerized jobs, including how this mix of technologies operates with low-level hardware devices, like accelerators or high-speed networks.
Reimagining storage for containers: For containers to be used effectively across both HPC and commercial cloud resources, a set of standard interfaces is needed to manage various container image formats and to provide access to distributed file systems. Red Hat and the three DOE national laboratories aim to define the mechanisms by which container images can be migrated from one container engine and deployed with another, allowing users to freely move their applications across popular container runtime platforms, as well as create mechanisms that allow containers to use distributed file systems as persistent storage.
Through this collaboration and Red Hat's experience supporting some of the most powerful supercomputers in the world, HPC sites will be able to abstract away the immense complexities their environments can present, benefiting the range of United States exascale machines being deployed by the DOE.
Supporting Quotes
Chris Wright, senior vice president and chief technology officer, Red Hat
"The HPC community has served as the proving ground for compute-intensive applications, embracing containers early on to help deal with a new set of scientific challenges and problems. That early adoption, however, led to a lack of standardization across various HPC sites, creating barriers to building and deploying containerized applications that can effectively span large-scale HPC, commercial and cloud environments while also taking advantage of emerging hardware accelerators. Through our collaboration with leading laboratories, we are working to remove these barriers, opening the door to liberating next-generation HPC workloads."
Earl Joseph, Ph.D., chief executive officer, Hyperion Research
"High performance computing infrastructure must adapt to the requirements of today's heterogeneous workloads, including workloads that use containers. Red Hat's partnership with the DOE labs is designed to allow the new generation of HPC applications to run in containers at exascale while utilizing distributed file system storage, providing a strong example of collaboration between industry and research leaders."
Shane Canon, senior engineer, Lawrence Berkeley National Laboratory
"The collaboration with the Podman community and Red Hat engineers is helping us to explore and co-develop enhancements that will allow Podman to scale and perform for the largest HPC workloads. We have already demonstrated this across 512 GPU nodes on Perlmutter. NERSC sees a convergence of HPC and cloud-native workloads, and Podman can be an important tool in helping to bridge between these two worlds."
Bronis R. de Supinski, chief technology officer, Lawrence Livermore National Laboratory
"High performance computing infrastructure is becoming more diverse and is increasingly being used to run non-traditional HPC workflows. We need to provide mechanisms for scheduling various types of workflows, and we expect container orchestration frameworks like Kubernetes and Red Hat OpenShift to be a significant part of the software ecosystem, effectively contributing to the convergence of the HPC and cloud realms."
Andrew J. Younge, Ph.D., R&D manager and computer scientist, Sandia National Laboratories
"Sandia and the DOE are seeing an increased need to support more diverse HPC workloads, beyond traditional batch-based modeling and simulation codes. This requires us to find new and innovative ways to enable services, tasks, and data persistence models in tight coordination with current simulation capabilities. Furthermore, workload portability remains an important consideration, and containers are now a key component of our code deployment strategy. Sandia's collaboration with Red Hat on Podman and Kubernetes-based OpenShift enables us to investigate approaches for delivering modeling and simulation capabilities as a service to Sandia's designer and analyst communities."
About Red Hat, Inc.
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.
Red Hat's Forward-Looking Statements
Except for the historical information and discussions contained herein, statements contained in this press release may constitute forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are based on the company's current assumptions regarding future business and financial performance. These statements involve a number of risks, uncertainties and other factors that could cause actual results to differ materially. Any forward-looking statement in this press release speaks only as of the date on which it is made. Except as required by law, the company assumes no obligation to update or revise any forward-looking statements.
Red Hat, the Red Hat logo and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.
1 Source: Hyperion Research, Worldwide HPC-based Artificial Intelligence (AI) Market Forecast, 2020-2025
2 Source: Hyperion Research, HPC and Containers: An Intriguing Combination
Healthcare Cloud Computing Market is Growing Rapidly with Recent Demand, Trends, Development, Revenue and Forecast to 2029 The Greater Binghamton…
The quality of the Global Healthcare Cloud Computing Market research report is what earns customer confidence and trust. This market report directs a business in the right direction by giving timely insights about products, the market, customers, competitors and marketing strategy, and hence acts as a backbone to the business. It helps develop a successful marketing strategy and is very helpful to businesses of all sizes, making it simpler to take informed decisions about the different aspects of the healthcare industry. The worldwide Global Healthcare Cloud Computing Market report also contains the company profiles of the major players and brands.
Download Sample Report @ https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-healthcare-cloud-computing-market&kb
The Global Healthcare Cloud Computing Market report also provides a concise, decisive summary of the market. Along with this, the multiple factors that have affected its advancement and improvement, in both a positive and a negative manner, are studied in the report. Likewise, the various factors that will act as opportunities for the development and growth of the Global Healthcare Cloud Computing Market in the forecast period are also mentioned.
A combination of the best industry insight, practical solutions, talent solutions and the latest technology has been used to structure this excellent Global Healthcare Cloud Computing Market report. The thoroughly analysed market segmentation provides a clear idea about product consumption based on several factors, ranging from type, application, deployment model and end user to geographical region. Moreover, the drivers and restraints assessed in this wide-ranging report show how the product is being utilized in the recent market environment and provide estimations about future usage. The proper utilization of established statistical tools and coherent models for the analysis and forecasting of market data makes the Global Healthcare Cloud Computing Market report stand out.
Top Key Players in the Global Healthcare Cloud Computing Market Report:
CareCloud Corporation, Carestream Health, ClearDATA, Dell, NetDepot.com LLC, IBM Corporation, Iron Mountain Incorporated, IBM Watson Health, Cisco Systems Inc., HP Development Company, L.P., Microsoft, VMware, Inc., Salesforce.com, Inc., SYNOPTEK, LLC., Oracle, GE Healthcare, CitiusTech Inc., Napier Healthcare Solutions Pte. Ltd., HealthAsyst, Optum, Inc., McKesson Corporation, 3M, Omnicell, Inc., Ciox Health, Wipro, Nuance Communications, Inc., Infosys Limited, and Cognizant, among others.
Feel Free To Ask Questions Before Purchasing The Report @ https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-healthcare-cloud-computing-market&kb
Regional Analysis
U.S., Canada and Mexico in North America; Germany, France, U.K., Netherlands, Switzerland, Belgium, Russia, Italy, Spain, Turkey and the rest of Europe; China, Japan, India, South Korea, Singapore, Malaysia, Australia, Thailand, Indonesia, Philippines and the rest of Asia-Pacific (APAC); Saudi Arabia, U.A.E., South Africa, Egypt, Israel and the rest of the Middle East and Africa (MEA); and Brazil, Argentina and the rest of South America.
Request for TOC @ https://www.databridgemarketresearch.com/toc/?dbmr=global-healthcare-cloud-computing-market&kb
An excellent Global Healthcare Cloud Computing Market research report has several benefits that extend to wide-ranging aspects of the healthcare industry. It helps draw in target audiences for clients before launching any advertising campaign. With the market report, it becomes easy to collect industry information more quickly. Moreover, this industry analysis report also serves to validate information that has been gathered through internal or primary research. With the finest Global Healthcare Cloud Computing Market report, it becomes possible to achieve a holistic view of the market and then benchmark all the companies in the healthcare industry.
Highlights of TOC:
Chapter 1: Market overview
Chapter 2: Global Healthcare Cloud Computing Market analysis
Chapter 3: Regional analysis of the Global Healthcare Cloud Computing Market industry
Chapter 4: Market segmentation based on types and applications
Chapter 5: Revenue analysis based on types and applications
Chapter 6: Market share
Chapter 7: Competitive Landscape
Chapter 8: Drivers, Restraints, Challenges, and Opportunities
Chapter 9: Gross Margin and Price Analysis
Goals and Objectives of the Global Healthcare Cloud Computing Market Study
Understand the opportunities and progress of the Global Healthcare Cloud Computing Market, as well as the key regions and countries involved in market growth.
Study the different segments of the Global Healthcare Cloud Computing Market and its market dynamics.
Categorize Global Healthcare Cloud Computing Market segments with increasing growth potential and evaluate the future market for each segment.
Analyze the most important trends related to the different segments that help decipher and forecast the Global Healthcare Cloud Computing Market.
Verify region-specific growth and development in the Global Healthcare Cloud Computing Market.
Understand the key stakeholders in the Global Healthcare Cloud Computing Market and the competitive position of the market leaders.
Study the key plans, initiatives and strategies for the development of the Global Healthcare Cloud Computing Market.
Access Full Report @ https://www.databridgemarketresearch.com/reports/global-healthcare-cloud-computing-market?kb
How DBMR is different from other Market Research Providers:
The inception of DBMR has been backed by providing clients with a holistic view of market conditions and future possibilities/opportunities to reap maximum profits out of their businesses and assist in decision making. Our team of in-house analysts and consultants works tirelessly to understand your needs and suggest the best possible solutions to fulfill your research requirements.
Related Reports:
https://www.databridgemarketresearch.com/reports/global-nicotine-replacement-therapy-market
https://www.databridgemarketresearch.com/reports/global-chemiluminescence-analyzer-cla-market
https://www.databridgemarketresearch.com/reports/global-desmopressin-market
About Data Bridge Market Research:
Data Bridge Market Research has set itself forth as an unconventional and neoteric market research and consulting firm with an unparalleled level of resilience and integrated approaches. We are determined to unearth the best market opportunities and foster efficient information for your business to thrive in the market. Data Bridge endeavors to provide appropriate solutions to complex business challenges and initiates an effortless decision-making process.
Contact:
Data Bridge Market Research
US: +1 888 387 2818
UK: +44 208 089 1725
Hong Kong: +852 8192 7475
Corporatesales@databridgemarketresearch.com
United States Insurance Distribution Technology Market Report 2022: Impact of COVID-19 and Forecast up to 2026 – ResearchAndMarkets.com – Business…
DUBLIN--(BUSINESS WIRE)--The "The US Insurance Distribution Technology Market: Analysis By Function, By Application, By Technology Type, By End-User Size & Trends with Impact of COVID-19 and Forecast up to 2026" report has been added to ResearchAndMarkets.com's offering.
The US insurance distribution technology market was valued at US$181.37 billion in 2021 and is expected to reach US$261.59 billion by 2026.
Insurance distribution technologies mainly focus on improving the efficiency of the existing insurance industry model. These technologies also focus on improving communications with clients and on capabilities to implement automation processes.
The US region offers strong growth potential to the insurance distribution technology market. Insurance technology firms have been showing significant growth in the areas of auto, homeownership, and cyber insurance. Such strong growth would stimulate traditional insurers to either acquire technology capabilities or partner with InsurTech companies.
With an increasing demand for innovative products and services from millennials, such collaboration would become a critical imperative. Due to such collaborations, traditional insurers would benefit from faster results in establishing a tech culture. The US insurance distribution technology market is projected to rise at a CAGR of 7.60% during the forecast period of 2022-2026.
Market Segmentation Analysis:
In 2021, the commission segment led the insurance distribution technology market, accounting for around a 40% share of the market. The commission segment is expected to experience high growth. The rising penetration of digital and telesales models is expected to increase the share of the broker commission. It is also expected that the bulk of integrated distributors' growth would come from the auto and individual health and Medicare markets, as both have a growing digital aspect.
The US advertising insurance distribution technology market can be further divided into five applications: Auto, Health, Life, Home, and SMB Commercial P&C. The US home advertising market is expected to grow significantly over the forecast period, owing to the rising development of back-end call centers in the home insurance markets, the growing use of connected devices to streamline processes, etc.
Cloud computing held the major share in the market. The cloud computing market is anticipated to grow at a CAGR of 26.8% during the year 2022-2026. The growth is expected to increase as cloud computing is a valuable delivery model that insurers can use to facilitate or accelerate business transformation. Moreover, cloud computing has a huge impact on the insurance industry, with benefits for internal processes, new customer acquisition, and building policyholders' loyalty.
Based on end user, the market is segmented into Manufacturing, Transportation, Government, Healthcare, Retail, and Others. The healthcare segment is anticipated to grow at the highest CAGR of 19.7% during the forecast period. The growth can be attributed to the widespread use of the platform and peer-to-peer business models and lower insurance premium rates leading to a rise in the number of policyholders.
The US Insurance Distribution Technology Market Dynamics:
Growth Drivers: One of the most important factors impacting insurance distribution technology market dynamics is the increasing use of mobile technology and applications. By using mobile app technology, insurance distributors can now easily reach out to their tech-savvy customers. These apps are extremely useful and viewed as invaluable assets when it comes to claims. They are capable of capturing and transferring data quickly, making the process hassle-free and expediting the complete insurance-related claims process. Furthermore, the market has been growing over the past few years due to factors such as a soaring urban population, a rising millennial population, a rise in the use of the internet to buy insurance products, the integration of artificial intelligence (AI) with insurance, etc.
Challenges: However, the market has been confronted with some challenges, specifically security concerns, infrastructure bottlenecks, etc.
Trends: The market is projected to grow at a fast pace during the forecast period, due to various latest trends such as increasing use of social media as a distribution channel, rising personalization trend, growing adoption of telematics, rising penetration of cloud-based insurance technology services, escalating demand for digital self-service, collaboration with bigtechs, surging popularity of virtual insurance advisor, emergence of subscription models, growing adoption of connected insurance, increasing use of blockchain in insurance, rising popularity of gamification in insurance industry, etc.
Telematics refers to devices that merge telecommunications and information technology. With the proliferation of smartphones in the US, telematics has become a convenient method of data collection, with the ability to analyze information and compare data of users, improving underwriting with greater accuracy and efficiency. Therefore, it is expected that the growing adoption of telematics would further provide significant opportunities to the IT sector by making insurance distribution more efficient, and better coordinated.
Impact Analysis of COVID-19 and Way Forward:
The US insurance distribution technology market growth was positively influenced by the COVID-19 pandemic, which is forcing almost all organizations to speed up their digital transformation priorities. With regard to the insurance distribution technology industry, the most significant change is the growing interest throughout the pandemic in digital products that are more dynamic. This change would lead to much greater levels of personalization and so change the customer experience and value proposition. In the post-COVID era, insurers must explore technologies like virtual reality, augmented reality, and virtual-visit solutions for facilitating virtual sales and intermediary enablement.
Competitive Landscape:
The US insurance distribution technology market is fragmented in nature, owing to the presence of a large number of small businesses catering to the demands of life and non-life insurance sectors.
Some of the strategies among key players in the insurance distribution technology market are partnerships, mergers, acquisitions, and collaborations. For instance, Lemonade entered into an agreement in 2021 to acquire Metromile (a data science company focused on auto insurance).
GoHealth, meanwhile, has invested in multiple sectors such as HRTech, Employer Insurance, Employee Health IT, and more. MediaAlpha is one of the largest digital aggregator companies in the market, while companies such as GoHealth, Goosehead, and Porch are integrated distributors.
Insurance distribution technology firms' ability to promote insurance industry innovation by generating new products would aid insurance companies in meeting the changing needs of their customers. As a result, various insurance distribution technology companies are gaining traction by providing a new and diverse set of services.
The key players in the US insurance distribution technology market are:
Key Topics Covered:
1. Executive Summary
2. Introduction
2.1 Insurance Distribution Technology: An Overview
2.1.1 Introduction to Insurance Distribution
2.1.2 Introduction to Insurance Distribution Technology
2.2 Insurance Distribution Segmentation: An Overview
2.2.1 Insurance Distribution Segmentation
2.3 Insurance Distribution Technology Segmentation: An Overview
2.3.1 Insurance Distribution Technology Segmentation
3. The US Market Analysis
3.1 The US Insurance Distribution Market: An Analysis
3.2 The US Insurance Distribution Market: Application Analysis
3.3 The US Insurance Distribution Market: Segment Analysis
3.4 The US Insurance Distribution Technology Market: An Analysis
3.5 The US Insurance Distribution Technology Market: Function Analysis
3.6 The US Insurance Distribution Technology Market: Technology Type Analysis
3.7 The US Insurance Distribution Technology Market: End-User Analysis
4. Impact of COVID-19
4.1 Impact of COVID-19 on Insurance Industry
4.1.1 Impact of COVID-19 on Insurance Industry
4.2 Impact of COVID-19 on Insurance Technology Industry
4.2.1 Impact of COVID-19 on Insurance Technology Industry
5. Market Dynamics
5.1 Growth Driver
5.1.1 Soaring Urban Population
5.1.2 Rising Millennial Population
5.1.3 Rise in the Use of Internet to Buy Insurance Products
5.1.4 Increasing Use of Mobile Technology and Applications (Apps)
5.1.5 Integration of Artificial Intelligence (AI) with Insurance
5.2 Challenges
5.2.1 Security Concerns
5.2.2 Infrastructure Bottlenecks
5.3 Market Trends
5.3.1 Increasing Use of Social Media as a Distribution Channel
5.3.2 Rising Personalization Trend
5.3.3 Growing Adoption of Telematics
5.3.4 Rising Penetration of Cloud-based Insurance Technology Services
5.3.5 Escalating Demand for Digital Self-Service
5.3.6 Collaboration with BigTechs
5.3.7 Surging Popularity of Virtual Insurance Advisor
5.3.8 Emergence of Subscription Models
5.3.9 Growing Adoption of Connected Insurance
5.3.10 Increasing Use of Blockchain in Insurance
5.3.11 Rising Popularity of Gamification in Insurance Industry
6. Competitive Landscape
6.1 The US Insurance Distribution Technology Market Players by Business Models
7. Company Profiles
7.1 Business Overview
7.2 Operating Segment
7.3 Business Strategy
For more information about this report visit https://www.researchandmarkets.com/r/6qhgzm
Cloud Computing in Government Market increasing demand with Industry Professionals: Adobe Systems, Blackboard The Greater Binghamton Business…
JCMR recently broadcast a new study in its database that highlights the in-depth market analysis with future prospects of the Cloud Computing in Government market. The study covers significant data which makes the research document a handy resource for managers, industry executives and other key people to get a ready-to-access, self-analyzed study along with graphs and tables to help understand market trends, drivers and market challenges. Some of the key players mentioned in this research are Adobe Systems, Blackboard, Cisco, Ellucian, Dell EMC, Instructure, Microsoft, NetApp, Oracle, Salesforce, SAP.
Get Access to Free Sample Report PDF @ jcmarketresearch.com/report-details/1102465/sample
COVID-19 is an infectious disease caused by the most recently discovered novel coronavirus. Largely unknown before the outbreak began in Wuhan (China) in December 2019, COVID-19 moved from a regional crisis to a global pandemic in just a matter of a few weeks.
In addition, production and supply chain delays were also witnessed during the second quarter, which posed a challenge to the Cloud Computing in Government market, since end-user industries were still not operating at their full capacity.
Enquire for customization in Report @ jcmarketresearch.com/report-details/1102465/enquiry
What are the market's challenges in Cloud Computing in Government?
Changing regulatory landscapes, operational barriers, and the emergence of alternative technologies are all impacting the Cloud Computing in Government industry.
What are the various types of segments covered in the Cloud Computing in Government Market?
[Segments]
Who are the top key players in the Cloud Computing in Government market?
Adobe Systems, Blackboard, Cisco, Ellucian, Dell EMC, Instructure, Microsoft, NetApp, Oracle, Salesforce, SAP
Which region is the most profitable for the Cloud Computing in Government market?
The emerging economies in the Asia Pacific region will be the lucrative markets for Cloud Computing in Government products.
What is the current size of the Cloud Computing in Government market?
The current market size of global Cloud Computing in Government market is estimated to be USD XX in 2022.
Get an instant FLAT 40% discount on the full Cloud Computing in Government report @ jcmarketresearch.com/report-details/1102465/discount
North America is the largest regional market for Cloud Computing in Government.
North America includes countries such as the US, Canada, and Mexico. North America is the second-largest consumer and producer of electricity, after Asia Pacific. The US and Canada, which are among the largest consumers in this region as well as globally, constitute the largest share of the Cloud Computing in Government market.
Secondary Research:
This Cloud Computing in Government research study made extensive use of secondary sources, directories, and databases such as Hoovers, Bloomberg BusinessWeek, Factiva, and OneSource to identify and collect information useful for a technical, market-oriented, and commercial study of the global Cloud Computing in Government market. Other secondary sources included company annual reports, press releases, and investor presentations, white papers, certified publications, articles by recognized authors, manufacturer associations, trade directories, and databases.
Cloud Computing in Government Primary Research:
Various sources from both the supply and demand sides were interviewed during the Cloud Computing in Government primary research process to obtain qualitative and quantitative information for this report. Primary sources included industry experts from the core and related industries, as well as preferred suppliers, manufacturers, distributors, technology developers, researchers, and organizations from all segments of the value chain of this industry. To obtain and verify critical qualitative and quantitative information, in-depth interviews were conducted with a variety of primary respondents, including key industry participants, subject-matter experts, C-level executives of key market players, and industry consultants.
Estimation of Cloud Computing in Government Market Size
The total size of the Cloud Computing in Government market was estimated and validated using both top-down and bottom-up approaches. These methods were also widely used to estimate the size of various market sub-segments. The following research methodologies were used to estimate market size:
Extensive secondary research was used to identify the industry's key players.
The revenues generated by the market's leading players have been determined through primary and secondary research.
All percentage shares, splits, and breakdowns were calculated using secondary sources and confirmed using primary sources.
TABLE OF CONTENTS OF Cloud Computing in Government Market Report
1 INTRODUCTION
1.1 Cloud Computing in Government study objectives
1.2 Cloud Computing in Government definition
1.3 Cloud Computing in Government inclusions & exclusions
1.4 Cloud Computing in Government market scope
1.5 Cloud Computing in Government report years considered
1.6 Cloud Computing in Government currency
1.7 Cloud Computing in Government limitations
1.8 Cloud Computing in Government industry stakeholders
1.9 Cloud Computing in Government summary of changes
2 RESEARCH METHODOLOGY
2.1 Cloud Computing in Government research data
2.2 Cloud Computing in Government market breakdown and data triangulation
2.3 Cloud Computing in Government scope
2.4 Impact of COVID-19 on Cloud Computing in Government industry
2.5 Cloud Computing in Government market size estimation
3 Cloud Computing in Government EXECUTIVE SUMMARY
4 Cloud Computing in Government PREMIUM INSIGHTS
4.1 Attractive opportunities in Cloud Computing in Government market
4.2 Cloud Computing in Government market, by region
4.3 Cloud Computing in Government market in North America, by end user & country
4.4 Cloud Computing in Government market, by application
4.5 Cloud Computing in Government market, by end user
5 Cloud Computing in Government MARKET OVERVIEW
5.1 Cloud Computing in Government introduction
5.2 COVID-19 Cloud Computing in Government health assessment
5.3 Cloud Computing in Government road to recovery
5.4 COVID-19 Cloud Computing in Government economic assessment
5.5 Cloud Computing in Government market dynamics
5.6 Cloud Computing in Government trends
5.7 Cloud Computing in Government market map
5.8 Average pricing of Cloud Computing in Government
5.9 Cloud Computing in Government trade statistics
5.10 Cloud Computing in Government value chain analysis
5.11 Cloud Computing in Government technology analysis
5.12 Cloud Computing in Government tariff and regulatory landscape
5.13 Cloud Computing in Government patent analysis
5.14 Cloud Computing in Government Porter's five forces analysis
6 Cloud Computing in Government MARKET, BY APPLICATION
6.1 Cloud Computing in Government Introduction
6.2 Cloud Computing in Government Emergency
6.3 Cloud Computing in Government Prime/Continuous
7 Cloud Computing in Government MARKET, BY END USER
7.1 Cloud Computing in Government Introduction
7.2 Cloud Computing in Government Residential
7.3 Cloud Computing in Government Commercial
7.4 Cloud Computing in Government Industrial
8 GEOGRAPHIC ANALYSIS
8.1 Cloud Computing in Government Introduction
8.2 Cloud Computing in Government industry by North America
8.3 Cloud Computing in Government industry by Asia Pacific
8.4 Cloud Computing in Government industry by Europe
8.5 Cloud Computing in Government industry by Middle East & Africa
8.6 Cloud Computing in Government industry by South America
9 Cloud Computing in Government COMPETITIVE LANDSCAPE
9.1 Cloud Computing in Government Key Players Strategies
9.2 Cloud Computing in Government Market Share Analysis Of Top Five Players
9.3 Cloud Computing in Government Market Evaluation Framework
9.4 Revenue Analysis Of Top Five Cloud Computing in Government Market Players
9.5 Cloud Computing in Government Company Evaluation Quadrant
9.6 Cloud Computing in Government Competitive Leadership Mapping Of Start-Ups
9.7 Cloud Computing in Government Competitive Scenario
10 Cloud Computing in Government COMPANY PROFILES
10.1 Cloud Computing in Government Major Players
10.2 Cloud Computing in Government Startup/SME Players
11 APPENDIX
11.1 Insights Of Cloud Computing in Government Industry Experts
11.2 Cloud Computing in Government Discussion Guide
11.3 Cloud Computing in Government Knowledge Store
11.4 Cloud Computing in Government Available Customizations
11.5 Cloud Computing in Government Related Reports
11.6 Cloud Computing in Government Author Details
Buy an instant copy of the Cloud Computing in Government research report @ jcmarketresearch.com/checkout/1102465
Find more research reports on the Cloud Computing in Government Industry by JC Market Research.
About Author:
JCMR's global research and market intelligence consulting organization is uniquely positioned to not only identify growth opportunities but also to empower and inspire you to create visionary growth strategies for the future, enabled by our extraordinary depth and breadth of thought leadership, research, tools, events and experience that assist you in making goals a reality. Our understanding of the interplay between industry convergence, mega trends, technologies and market trends provides our clients with new business models and expansion opportunities. We are focused on identifying the accurate forecast in every industry we cover, so our clients can reap the benefits of being early market entrants and can accomplish their goals and objectives.
Contact Us: https://jcmarketresearch.com/contact-us
JCMARKETRESEARCH
Mark Baxter (Head of Business Development)
Phone: +1 (925) 478-7203
Email: sales@jcmarketresearch.com
Connect with us at LinkedIn
Cloud Computing Data Center IT Asset Disposition Market Size, Scope and Forecast | SAP Ariba, Sims Recycling, Apto Solutions, Iron Mountain,…
New Jersey, United States - The Cloud Computing Data Center IT Asset Disposition Market report is the result of extensive and expert research into the Cloud Computing Data Center IT Asset Disposition industry. The Cloud Computing Data Center IT Asset Disposition Market report explains what the market is all about, the market prognosis, several segmentations, and everything that falls under the market's umbrella. It also looks at major and secondary market drivers, market share, potential sales volume, regional analysis, and the market's key segments. The research also includes key variables that contribute to the market's growth as well as elements that might stifle the market's growth. The professionals at VM Reports applied precise research techniques and other analyses.
PESTLE analysis and SWOT analysis are two of the analyses utilized in the Cloud Computing Data Center IT Asset Disposition Market research, and they shed light on numerous internal and external variables that affect the market in every situation. The report also includes a section that identifies strengths, weaknesses, opportunities, and threats, as well as Porter's five forces model.
Get Full PDF Sample Copy of Report: (Including Full TOC, List of Tables & Figures, Chart) @ https://www.verifiedmarketresearch.com/download-sample/?rid=28290
There is also a section dedicated to significant actors and their plans, such as acquisitions, collaborations, and so on. The material in the Cloud Computing Data Center IT Asset Disposition Market research was gathered mostly via expert opinions, interviews, and surveys. Verified Market Research experts have created a Cloud Computing Data Center IT Asset Disposition Market research that is full of clarity, accuracy, and useful information. The data in the report is quite accurate and reliable, with no duplicates or mistakes.
Key Players Mentioned in the Cloud Computing Data Center IT Asset Disposition Market Research Report:
SAP Ariba, Sims Recycling, Apto Solutions, Iron Mountain, CloudBlue.
Cloud Computing Data Center IT Asset Disposition Market Segmentation:
Global Cloud Computing Data Center IT Asset Disposition (ITAD) Market, By Asset Type
IT Equipment
Support Infrastructure
Global Cloud Computing Data Center IT Asset Disposition (ITAD) Market, By Solution
Data Sanitization
Recovery (Reuse, Resale and Remarketing)
Recycling
Inquire for a Discount on this Premium Report@ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=28290
Cloud Computing Data Center IT Asset Disposition Market Report Scope
Key questions answered in the report:
1. Which are the five top players of the Cloud Computing Data Center IT Asset Disposition market?
2. How will the Cloud Computing Data Center IT Asset Disposition market change in the next five years?
3. Which product and application will take a lion's share of the Cloud Computing Data Center IT Asset Disposition market?
4. What are the drivers and restraints of the Cloud Computing Data Center IT Asset Disposition market?
5. Which regional market will show the highest growth?
6. What will be the CAGR and size of the Cloud Computing Data Center IT Asset Disposition market throughout the forecast period?
For More Information or Query or Customization Before Buying, Visit @ https://www.verifiedmarketresearch.com/product/cloud-computing-data-center-it-asset-disposition-market/
Visualize Cloud Computing Data Center IT Asset Disposition Market using Verified Market Intelligence:-
Verified Market Intelligence is our BI-enabled platform for narrative storytelling of this market. VMI offers in-depth forecasted trends and accurate insights on over 20,000+ emerging and niche markets, helping you make critical revenue-impacting decisions for a brilliant future.
VMI provides a holistic overview and global competitive landscape with respect to region, country, segment, and key players of your market. Present your market report and findings with an inbuilt presentation feature, saving over 70% of your time and resources for investor, sales and marketing, R&D, and product development pitches. VMI enables data delivery in Excel and interactive PDF formats with over 15+ key market indicators for your market.
Visualize Cloud Computing Data Center IT Asset Disposition Market using VMI @ https://www.verifiedmarketresearch.com/vmintelligence/
About Us: Verified Market Research
Verified Market Research is a leading Global Research and Consulting firm that has been providing advanced analytical research solutions, custom consulting and in-depth data analysis for 10+ years to individuals and companies alike that are looking for accurate, reliable and up to date research data and technical consulting. We offer insights into strategic and growth analyses, Data necessary to achieve corporate goals and help make critical revenue decisions.
Our research studies help our clients make superior data-driven decisions, understand market forecasts, capitalize on future opportunities and optimize efficiency by working as their partner to deliver accurate and valuable information. The industries we cover span a large spectrum, including Technology, Chemicals, Manufacturing, Energy, Food and Beverages, Automotive, Robotics, Packaging, Construction, and Mining & Gas.
We, at Verified Market Research, assist in understanding holistic market indicating factors and most current and future market trends. Our analysts, with their high expertise in data gathering and governance, utilize industry techniques to collate and examine data at all stages. They are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research.
Having serviced over 5000+ clients, we have provided reliable market research services to more than 100 Global Fortune 500 companies such as Amazon, Dell, IBM, Shell, Exxon Mobil, General Electric, Siemens, Microsoft, Sony and Hitachi. We have co-consulted with some of the world's leading consulting firms, like McKinsey & Company, Boston Consulting Group, and Bain and Company, on custom research and consulting projects for businesses worldwide.
Contact us:
Mr. Edwyne Fernandes
Verified Market Research
US: +1 (650)-781-4080
UK: +44 (753)-715-0008
APAC: +61 (488)-85-9400
US Toll-Free: +1 (800)-782-1768
Email: sales@verifiedmarketresearch.com
Website:- https://www.verifiedmarketresearch.com/
Microsoft: Don’t Stress Over The Slashed Guidance (NASDAQ:MSFT) – Seeking Alpha
Microsoft (NASDAQ:MSFT) issued a guidance revision Thursday for both its top- and bottom-line, citing stronger-than-expected FX headwinds amidst a surging U.S. dollar. The revision comes on the heels of a similar sales outlook adjustment by software peer Salesforce (CRM) earlier this week, which cited a $600 million negative FX impact on its revenues for the current fiscal year. And more can be expected on the way, including technology giants like Amazon (AMZN), Apple (AAPL), Meta Platforms (FB), and Google (GOOG/GOOGL), which all generate a meaningful portion of sales overseas.
However, a resilient demand environment buoyed by robust secular tailwinds has largely overshadowed such international risks. Despite a looming economic recession, Microsoft has yet to see demand slow, citing that "IT spend would not be the first to see budget cuts" as digital transformation remains a strategic investment area across the corporate sector. Our view is that Microsoft remains a core long-term software investment, underpinned by its market leadership in a wide array of critical software categories and robust growth trajectory, balanced product portfolio and continued margin expansion.
The company decreased its fiscal fourth quarter revenue guidance from a range of $52.4 billion (+14% y/y; +6% q/q) to $53.2 billion (+15% y/y; +8% q/q), to $51.94 billion (+13% y/y; +5% q/q) to $52.74 billion (+14% y/y; +7% q/q), with the difference fully attributable to additional FX headwinds anticipated through the rest of June. Margins have also been trimmed, with gross profit expected to come in at $35.45 billion to $36.05 billion (vs. previous guidance of $35.80 billion to $36.40 billion), and operating income at $20.6 billion to $21.3 billion (vs. previous guidance of $20.9 billion to $21.6 billion). As a result, the fiscal fourth quarter earnings outlook has been revised from $2.28 to $2.35 per share down to $2.24 to $2.32 per share, which falls slightly short of the consensus estimate of $2.33 per share.
However, Microsoft's latest downward revision to its revenue and margins due to the FX overhang is not expected to result in a material change to its valuation outlook. Adjusting our previous financial forecast for Microsoft for the latest outlook revisions, as well as expectations for further FX headwinds through the calendar year (Microsoft has roughly 50% revenue exposure overseas), the company is still estimated to maintain long-term revenue growth at a 10.0% compounded annual growth rate ("CAGR"). Specifically, revenues are expected to total $198.7 billion (+18% y/y) by the end of the current fiscal year ending June 30th, with further expansion towards $357.4 billion by fiscal 2027. Higher-margin Intelligent Cloud sales will remain the core powerhouse for growth and continue to represent a higher mix of consolidated revenues over the coming years with accelerated deployment of digital investments across the corporate sector.
Meanwhile, signs of a softening PC market will likely contribute to a moderated pace of growth across More Personal Computing sales, which we anticipate to be in the low single digits over the forecast period. Productivity and Business Processes revenue, which includes Office and Dynamics sales, is estimated to remain a substantial contributor to consolidated growth at an approximately 9% CAGR through fiscal 2027.
Microsoft Revenue Forecast (Author)
Taking into consideration the near-term macro impacts to business, including FX headwinds, the company is expected to generate net income of $73.3 billion by the end of fiscal 2022 (+20% y/y). With additional consideration of the increasing shift to higher-margin Intelligent Cloud sales over the forecast period, as well as management's discipline in balancing revenue growth with sustained margin expansion, Microsoft's net income is expected to further advance towards $134.2 billion by fiscal 2027.
Microsoft Financial Forecast (Author)
On the valuations front, the latest FX headwinds are not expected to materially alter the Microsoft stock's outlook. Despite the currently volatile macroeconomic environment, the company's fundamental runway remains strong. Microsoft continues to show substantial resilience against the potential recession as corporate spending on core digital transformation categories like cloud migration and collaboration/productivity software adoption remain robust over coming years.
Microsoft Valuation Analysis (Author)
As such, we are maintaining our price target for the Microsoft stock in the mid-$300s (~$340) despite the recent FX impacts, as well as the broader near-term macroeconomic overhang on growth tech stocks and equities in general. This would represent upside potential of close to 25% based on the stock's last traded share price of $274.58 on June 2nd (a quick arithmetic check follows the figures below).
Microsoft Valuation Analysis (Author)
Microsoft Sensitivity Analysis (Author)
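The "close to 25%" upside quoted above is simple arithmetic on the two prices the article itself gives. A minimal illustrative check (not part of the original analysis):

```java
// Quick check of the quoted upside: the ~$340 price target versus the
// $274.58 close on June 2nd. Both values are taken from the article.
public class UpsideCheck {
    public static void main(String[] args) {
        double lastPrice = 274.58; // last traded share price, June 2nd
        double target = 340.0;     // mid-$300s price target
        double upsidePct = (target / lastPrice - 1) * 100;
        System.out.printf("Implied upside: %.1f%%%n", upsidePct); // ~23.8%, i.e. "close to 25%"
    }
}
```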
Market leadership in key software segments. Microsoft benefits from a large moat created by its market leadership across key software segments, spanning cloud-computing and security services to collaboration and productivity software. This prowess bodes well given ongoing secular tailwinds buoyed by an accelerating multi-year investment cycle of digital transformation across the corporate sector.
With proven expertise across core software requirements needed to remain economically and operationally competitive in the "digital-first" world, Microsoft is uniquely positioned to capitalize on related growth opportunities given its market leadership and "strong value proposition through effective bundling" of services offered. As Microsoft CEO Satya Nadella has noted, tech spending's contribution to GDP growth is expected to double by 2030, as software becomes a key "deflationary force" going forward by driving "better economic outcomes". These trends, paired with Microsoft's market leadership across key software segments, underscore the favourable impact anticipated on Microsoft's longer-term fundamentals.
Robust fundamental performance. Microsoft has largely defied the law of large numbers, as it continues to maintain strong double-digit revenue growth in recent years. The company's consolidated revenues expanded at a 15% CAGR over the past three reported fiscal years, primarily driven by organic growth across core businesses that include Office 365, Azure, and gaming. As discussed in earlier sections, our forecast estimates continued top-line growth at a 10% CAGR through fiscal 2027, buoyed by increased penetration into cloud opportunities.
The bullish commercial demand environment for cloud-computing solutions has continued to provide partial insulation for Microsoft from any "consumer slowdown", assuaging investors' months-long fears that the increasingly complex macroeconomic environment might result in a pullback in corporate IT budgets and backfire on the business' growth outlook. During the fiscal third quarter, Microsoft saw the "number of $100+ million Azure deals more than double year-over-year", which underscores "broad-based acceleration in [cloud] consumption". And momentum is expected to persist for the segment, benefiting from both new cloud customers looking to transition away from legacy on-premise IT infrastructures, as well as matured cloud customers looking to bolster their IT capabilities through expanded adoption of Microsoft's Azure-based data offerings (e.g., Synapse, Purview, Azure SQL Server).
In addition to Azure, Microsoft also continues to benefit from robust commercial demand for its cloud-based productivity solutions like Office 365/Microsoft 365/Dynamics 365. Collaboration app Teams is also gaining uptake momentum, especially across emerging markets and small and medium enterprises ("SMEs"), in addition to large corporations, to support the post-pandemic era work-from-home norm. Commercial take rate of Office 365 has accumulated to almost 350 million seats as of mid-May, courtesy of additional penetration into peripheral opportunities across smaller-scale business operations.
The rise in demand for security in cloud-based collaborative and productivity tools from the commercial sector has also continued to underpin success for the company's latest Microsoft 365 E5 offering. With a focus on enhancing overall security, compliance, voice and analytic capabilities, the E5 suite is encouraging a flurry of upgrades from existing customers of the preceding E3 suite.
Multiple growth drivers. Microsoft benefits from a diverse growth profile composed of a wide array of core software offerings spanning cloud-computing, security, collaboration and productivity, hyper-automation solutions, and more. The company is uniquely positioned to benefit from growth opportunities arising from key software segments as the consolidation of enterprise IT spend continues to accelerate:
Sustained margin expansion. Balancing revenue growth with margin expansion through disciplined execution of day-to-day operational requirements and investment opportunities remains a priority for Microsoft. The company's gross margin has steadily paced towards 70% in recent quarters, while the operating margin has also gradually edged above 40%. As mentioned in earlier sections, continued capitalization on accelerating demand for higher-margin Intelligent Cloud offerings like those offered by Azure will continue to pave the way for gradual margin expansion, albeit at a moderating pace over the longer term as the business matures. Meanwhile, Office margins are likely approaching a "steady-state" given the segment's current size and maturity.
Overall, we remain confident that Microsoft's investment positives will continue to overshadow the near-term FX risks, which it is not alone in facing. Compared to SaaS peers with a similar international strategy, Microsoft is relatively better off, given its market leadership and the general stickiness of its cloud-based solution offerings in the commercial sector, especially productivity tools like Office 365. The company's diverse growth portfolio across key software categories, paired with a robust digital spending cycle across both the private and public sectors, serves as an effective insulation from the near-term macroeconomic uncertainties. The critical role of Microsoft's offerings in both the corporate and consumer digital fabric will continue to drive long-term growth, providing for robust valuation prospects ahead despite the near-term FX bump.
Going from COBOL to Cloud Native The New Stack – thenewstack.io
Virtually every technology publication these days is full of cloud stories, often about the success that has been achieved by webscale companies doing amazing things exclusively in the cloud. Unlike Netflix, Twitter and Facebook, however, most companies have a heritage that predates the availability of cloud computing.
Mark Hinkle
Mark has a long history in emerging technologies and open source. Before co-founding TriggerMesh, he was the executive director of the Node.js Foundation and an executive at Citrix, Cloud.com and Zenoss, where he led its open source efforts.
Unlike these relatively young companies that have the benefit of starting more recently and growing to maturity in the cloud native era, there are myriad companies that may feel they are held hostage by legacy infrastructure that can't be migrated to the cloud for reasons of risk, compliance or compatibility.
Just because you have a legacy investment that would be disruptive to move doesn't mean you can't adopt cloud or cloud native systems that enable new digital initiatives and still capitalize on those legacy investments. However, it does mean that you need to find ways to integrate in a nondisruptive way.
There are a few practices you can put in place to get to a cloud native environment while still using your existing legacy investment. I advocate adopting cloud native practices and architecture patterns that ease your move to cloud computing incrementally, which can start with adopting cloud computing architecture patterns on premises.
In the early days of the internet, the idea of stacks was prevalent. For delivering web-based services, Microsoft had WIMSA (Windows, IIS, SQL Server and ASP), and open source users had LAMP (Linux, Apache, MySQL, PHP). The LAMP stack was the most democratic, allowing you to choose the vendors for your stack, while a single-vendor stack gave you one throat to choke should something go awry. The freedom to choose the layers of the stack is a benefit many users of legacy technology may not realize today.
When you look at today's applications, the gold standard for reliability is Java, though it comes with operational overhead: you need to manage the JVMs, tune the stack and rely on garbage collection to manage memory. You also need an app server to serve the instances. By taking a container-based approach to running individual services, you can leverage Kubernetes and Knative (both housed in the CNCF), which can simplify things by scaling containers automatically, both up and down, as needed.
Kubernetes and containers make application environments portable from on-premises to the cloud and back again. One way to get the best of both worlds is Spring Boot, an open source framework for Java developers aimed at cloud native deployments: a Spring Boot application packaged in a container can run on-premises with Kubernetes or in the cloud.
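Spring Boot itself is Java, but the portability pattern is language-agnostic. As a minimal, hypothetical sketch in Python (the PORT and DEPLOY_ENV variables are illustrative assumptions, not a standard): a service that takes all of its configuration from the environment can ship as a single container image and run unchanged under Kubernetes on-premises or on a managed cloud platform.

```python
# A self-contained HTTP service, standard library only. All configuration
# comes from environment variables, so the same container image runs
# unchanged on-premises under Kubernetes or in the cloud.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = int(os.environ.get("PORT", "8080"))          # injected by the platform
DEPLOY_ENV = os.environ.get("DEPLOY_ENV", "local")  # e.g. "on-prem" or "cloud"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok", "env": DEPLOY_ENV}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", PORT), Handler).serve_forever()
```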
Using composable infrastructure is the best practice: taking the best technologies and solutions to build systems that are decoupled but well integrated. Gartner describes the composable enterprise as a business made from interchangeable building blocks, following four principles: modularity, autonomy, orchestration and discovery. What is often overlooked is that any system or application can benefit from composability. Anything can be part of composable infrastructure, not just cloud services.
We experience batch processing every day. Our banks typically process our deposits overnight, and we don't see them in our banking app until after the batch runs. The same applies to utilities that process usage on a monthly basis; we only see our consumption once a month.
Batch processing was used because the load placed on the data warehouse could potentially interrupt or slow down business operations. The goal, then, is to move to an architecture that increases the speed of data delivery without interrupting current business operations. That's where extract, load and transform (ELT) and event-driven architecture (EDA) can help.
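To make the ELT ordering concrete, here is a minimal sketch in Python: raw records land in the warehouse first, untransformed, and the reporting shape is derived inside the warehouse afterward, so the source system never carries the transformation load. sqlite3 stands in for a real warehouse, and the table and column names are hypothetical.

```python
# ELT in miniature: Extract and Load the raw rows first, Transform later
# inside the "warehouse" with SQL. sqlite3 is a stand-in for a real warehouse.
import sqlite3

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE raw_deposits (account TEXT, amount_cents INTEGER)")

# Extract + Load: source rows are written as-is, no transformation in flight.
rows = [("acct-1", 12500), ("acct-2", 99999)]
warehouse.executemany("INSERT INTO raw_deposits VALUES (?, ?)", rows)

# Transform: derive the reporting view inside the warehouse, on demand,
# instead of running a nightly batch against the operational system.
warehouse.execute("""
    CREATE VIEW deposits_usd AS
    SELECT account, amount_cents / 100.0 AS amount_usd FROM raw_deposits
""")
print(warehouse.execute("SELECT * FROM deposits_usd").fetchall())
```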
Many times, we use the terms "replicating data" and "syncing data" interchangeably. Technically, there's an important difference. Replication implies a copy of the data (or some subset thereof) is maintained to keep the data closer to the user, often for performance or latency reasons. Synchronization implies that two or more copies of data are kept up to date, but not necessarily that each copy contains all the data, though some consistency is maintained between the data sources.
Using an event-streaming technology like Apache Kafka, you can replicate data from read-only data producers (databases, ERP systems), keeping your attack surface smaller since you aren't granting writes to the database. You can also choose to replicate only what's needed for other systems, like mobile apps, web portals and other customer-facing systems, without having them place load on the canonical database.
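As a rough sketch of that replication pattern, assuming the kafka-python client, a broker on localhost and a hypothetical orders table (none of which come from the article):

```python
# Replicate rows from a read-only source into a Kafka topic. The producer
# never writes to the source database, which keeps the attack surface small.
import json
import sqlite3
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Open the source strictly read-only.
db = sqlite3.connect("file:orders.db?mode=ro", uri=True)

for order_id, status in db.execute("SELECT id, status FROM orders"):
    # Publish only the fields downstream systems need, not the whole row.
    producer.send("orders.replica", {"id": order_id, "status": status})

producer.flush()
```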
Figure 1.1 Extract, transform, and load versus extract, load, and transform
When you look at any major cloud provider, the pattern of event-driven architecture is prevalent. In AWS, for example, services are decoupled and run in response to events. They are made up of three types of infrastructure: event producers, event consumers and an event router.
While AWS deals exclusively in services, your enterprise likely has things like message buses and server software that logs activity on the server. These systems can be event producers. They can be streamed via Kafka or consumed from your log server directly by an event router. For this, I suggest the project I work on, the open source TriggerMesh Cloud Native Integration platform, to connect, split, enrich and transform these event sources.
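To illustrate the producer/router/consumer split, here is a hand-rolled sketch of the pattern. This is not the TriggerMesh API; the topic names, event types and kafka-python client are all assumptions.

```python
# A minimal event router: consume from one stream, enrich each event, and
# forward it to a destination topic chosen by event type.
import json
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

consumer = KafkaConsumer(
    "raw.events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Routing table: event type -> destination topic.
ROUTES = {"deposit": "bank.deposits", "alert": "ops.alerts"}

for message in consumer:
    event = message.value
    destination = ROUTES.get(event.get("type"))
    if destination is None:
        continue                        # drop events nobody subscribes to
    event["routed_by"] = "demo-router"  # enrichment on the fly
    producer.send(destination, event)
```

Note that swapping a destination (say, from an AWS-bound topic to a Google-bound one) only touches the routing table, which is exactly the decoupling benefit described below.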
For example, you can forward messages from your mainframe over the IBM MQ message bus to integrate your legacy systems with cloud services like Snowflake. Using the event payloads, you get data replication without additional load on the producer, and you can transform or enrich an event on the fly into a format consumable by the event consumer.
By decoupling the event consumer and producer, you can change destinations if you switch vendors (say, move from AWS to Google) or add sources where you want to replicate data. You also get the benefit of synchronization in real time, in contrast to waiting on batched data to arrive.
EDA isn't a silver bullet. There are times when you may need to make synchronous API calls. Using APIs, you can make queries based on conditions that can't be anticipated. In that case, I am a fan of using open source, cloud native technologies like Kong's API Gateway.
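For contrast with the event-driven path, a synchronous lookup through a gateway looks roughly like this; the gateway URL, route and credential are hypothetical, and the requests library is assumed.

```python
# A synchronous API lookup through a gateway such as Kong. The client waits
# for the answer, so this suits queries whose conditions can't be anticipated.
import requests  # pip install requests

GATEWAY = "https://gateway.example.com"  # hypothetical gateway address

def lookup_account(account_id: str) -> dict:
    # The gateway handles authentication, rate limiting and routing;
    # the client just makes a plain HTTPS request.
    resp = requests.get(
        f"{GATEWAY}/accounts/{account_id}",
        headers={"apikey": "REPLACE_ME"},  # placeholder credential
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()

print(lookup_account("acct-1"))
```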
When you talk about code, you might have heard the term WET (Write Everything Twice) as opposed to DRY (Don't Repeat Yourself). In the world of development, WET refers to poor code that needs to be rewritten, and DRY to more efficient code that doesn't. In integration, it's not an exact analogy, but I believe synchronous API integration is often WET: you write to the API and then write the response that the API returns.
There are many good reasons to do this when you need to complete a complex integration that requires look-ups and a complex answer. However, it can be overkill.
Event-driven architecture provides a way to make integration DRY by offering an event stream that can be consumed passively. There are many advantages. If you are forwarding changes via event streams, you can even do what's called change data capture (CDC).
Change data capture is a software process that identifies and tracks changes to data in a database. CDC provides real-time or near-real-time movement of data by moving and processing data continuously as new database events occur. Event-driven architectures can accomplish this by using events that are already being written but then can be streamed to multiple sources.
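A production CDC pipeline typically tails the database's transaction log (Debezium is a well-known open source example), but the idea can be sketched with a simple high-water-mark poll. The orders table and its updated_at column here are hypothetical.

```python
# Naive change data capture: poll for rows changed since the last check and
# emit each one as an event. Log-based CDC avoids the polling, but the
# high-water-mark idea is the same.
import sqlite3
import time

db = sqlite3.connect("app.db")
last_seen = "1970-01-01 00:00:00"  # high-water mark

def emit(change):
    print("change event:", change)  # stand-in for producer.send(...)

while True:
    rows = db.execute(
        "SELECT id, status, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_seen,),
    ).fetchall()
    for row in rows:
        emit(row)
        last_seen = row[2]  # advance the mark past this change
    time.sleep(5)
```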
One of the most entrenched pieces of legacy technology corporations face when moving to the cloud is the mainframe; until I went digging, I didn't realize the full extent of this. Mainframes still run a large amount of COBOL. In fact, our whole financial system relies on technology that is unlikely to move to the cloud in the near future.
One of the most interesting and unforeseen integrations I have run into is the integration of mainframes with the cloud. While Amazon doesn't have an AWS Mainframe-as-a-Service, there is a benefit in integrating workflows between mainframes and the cloud. One global rental car company I work with has an extensive workflow that takes data stored in IBM mainframe copybooks and transforms it into events that are consumed to automate workflows in AWS SQS.
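A sketch of that kind of bridge, with an invented fixed-width record layout standing in for the copybook and a hypothetical queue URL (boto3 and configured AWS credentials are assumed):

```python
# Parse a fixed-width, copybook-style mainframe record into JSON and publish
# it to an SQS queue, where cloud-side workflows can consume it.
import json
import boto3  # pip install boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/rental-events"

def parse_record(line: str) -> dict:
    # Invented layout: cols 0-9 reservation id, 10-19 branch, 20-27 date.
    return {
        "reservation_id": line[0:10].strip(),
        "branch": line[10:20].strip(),
        "date": line[20:28],
    }

record = "RES0000042NAIROBI   20220615"
sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(parse_record(record)))
```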
There are many reasons you might want to forward mainframe traffic, not just for workflows but for data replication, real-time dashboards or to take advantage of cloud services that have no data center equivalent. Also, because you aren't logging in to the event-producing system, there can be a security benefit: a smaller attack surface that exposes only the event stream and not the host system.
I believe strongly that going forward there will be two main types of infrastructure: services delivered by cloud providers, and open source software. Open source has eaten the world. Linux is the dominant operating system in the cloud and the data center. Kubernetes is becoming the open source cloud native fabric of the cloud. And there is an abundance of free and open source data center software from multibillion-dollar corporations, consortia and innovative start-ups alike.
One incredibly interesting example of composable infrastructure is the ONUG Cloud Security Notification Framework (CSNF). CSNF is an open source initiative led by FedEx, Raytheon and Cigna that tackles the difficulty of providing security assurance for multiple clouds at scale, a difficulty caused by the large volume of events and security state messaging. The problem is compounded when using multiple cloud service providers (CSPs) due to the lack of standardized events and alerts among CSPs.
Figure 1.2 Architecture diagram of composable infrastructure for ONUG Cloud Security Notification Framework
This gap translates into increased toil and decreased efficiency for the enterprise cloud consumer. CSNF, developed by the ONUG Collaborative's Automated Cloud Governance (ACG) Working Group, is working to create a standardization process without sacrificing innovation.
The interesting thing about CSNF is that it's a loosely coupled set of technologies that can incorporate both cloud services and on-premises technologies. While the initial goal is to normalize security events from cloud providers into a single format, it can also incorporate any number of other tools and data sources as appropriate.
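The normalization step at the heart of that goal is easy to picture. The sketch below invents its own field mappings purely for illustration; it is not the CSNF schema.

```python
# Collapse provider-specific security events into one common shape so a
# single downstream pipeline can consume findings from any cloud.
def normalize(provider: str, event: dict) -> dict:
    if provider == "aws":  # e.g. a Security Hub-style finding
        return {
            "provider": "aws",
            "severity": event.get("Severity", {}).get("Label"),
            "resource": (event.get("Resources") or [{}])[0].get("Id"),
            "title": event.get("Title"),
        }
    if provider == "gcp":  # e.g. a Security Command Center-style finding
        return {
            "provider": "gcp",
            "severity": event.get("severity"),
            "resource": event.get("resourceName"),
            "title": event.get("category"),
        }
    raise ValueError(f"unknown provider: {provider}")

print(normalize("aws", {
    "Severity": {"Label": "HIGH"},
    "Resources": [{"Id": "arn:aws:s3:::example-bucket"}],
    "Title": "Publicly readable bucket",
}))
```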
While your existing infrastructure may not be completely modern, there's no reason you can't benefit from modern technologies and cloud services through integration. Integration is arguably the key to modernization without the dreaded lift and shift, and if you look at your integration layer today, there are a number of tactics worth considering.
For IT operations to thrive, they need to adopt agile practices like DevOps and technologies that are open source, event-driven and cloud native. Even if you have an IT heritage to consider, it doesn't mean you are stuck in the past. In the modern world of open source cloud native technologies, you can still reap the benefits without a wholesale move to the cloud.
View post:
Going from COBOL to Cloud Native The New Stack - thenewstack.io
Edge coming to the rescue of cloud – ITWeb
Pramod Venkatesh, Group Chief Technology Officer, Inq.
Cloud computing is at risk of being throttled by its own success, overwhelmed by a rising tsunami of data. However, edge computing is riding to the rescue, particularly in regions like Africa, which are most vulnerable to the cloud's three major weaknesses: bandwidth limitations, excess latency and network congestion.
That's the view of Pramod Venkatesh, Group CTO at Inq. While acknowledging that the concept of edge computing isn't new (its roots go back to the earliest days of remote or distributed computing), he maintains that edge computing is the next evolution of cloud computing.
Cloud computing itself is an evolution of traditional enterprise, client-server computing, where data is moved from a user's computer across a WAN or the internet to a centralised computer, where it is either stored or worked on, and the results are sent back to the user.
"With the rise of 5G networks, more companies than ever can harness comprehensive data analysis without the IT infrastructure needed in previous generations. That's the power of the cloud," Venkatesh says.
But the quantity of data moving across the internet is enormous and growing by the minute. The World Economic Forum estimated that at the start of 2020 there were about 44 zettabytes of data in the world: 40 times more bytes than there are stars in the observable universe. By 2025, another 463 exabytes of newly generated data would be added to that unfathomable number by an ever-increasing number of connected devices, every single day.
Gartner has predicted that by 2025, three-quarters of all enterprise-generated data will be created outside centralised data centres on or by these devices. The internet would buckle under the load.
Traditional cloud platforms, including those set up and operated by the world's largest providers, which are struggling to cope now, could be overwhelmed. The impact on time- and disruption-sensitive data could be catastrophic. It doesn't take much imagination to appreciate the chaos that would result if data sent from a self-driving car for analysis at some distant data centre were delayed, disrupted or distorted; by the time the confirmation came back that the car was approaching a hazard, lives could be lost.
The further the data centre where the analysis happens is from the endpoint where the result is needed, the greater the risk of delay. And in Africa, those distances are not only large, they are also (with the exception of South Africa) across borders.
"From a regulatory perspective, cross-border data transfer can be problematic in some African countries, requiring certain types of data to be processed in-country," Venkatesh says.
Another major issue with traditional cloud computing in Africa is the cost of connectivity, which is still significantly higher than in the rest of the world. Add to that the fact that the locations from which data is being generated may be in hostile environments (down a mine, for example) with limited or intermittent connectivity.
According to Venkatesh, edge computing effectively addresses all these issues as the data is processed as close as possible to where it is generated, even possibly on the device that collects or generates the data in the first place.
"The beauty of edge computing is that it has endless potential applications, particularly when those applications require some form of AI. This can range from security and medical monitoring to self-driving vehicles, video conferencing and enhanced customer experiences," he says.
Many users today are not even aware that they are using some form of edge computing. "For example, it's already widely used in entertainment and gaming: streaming music and video platforms often cache information to reduce latency, thus offering more network flexibility when it comes to user traffic demands," he says.
An ever-increasing number of devices need to communicate and process data in a localised environment, such as voice assistants. Without the help of decentralised processing power, devices like Amazon Alexa and Google Assistant would take far longer to find requested answers for users.
Manufacturers use edge computing to keep a closer eye on their operations. Edge computing enables companies to monitor equipment and production lines for efficiency and even detect failures before they happen, helping to avoid costly downtime delays.
Edge computing is even being used in a mine in Zambia to detect dangerous snakes and warn miners of their location in real-time. Edge computing is also being used to detect cars and goods arriving and leaving company premises, thus preventing unauthorised use as well as theft.
As all or most of the computing work is done at the edge, only data that requires deeper analysis, review or other human interaction need be sent back to the main data centre. The amount of data to be sent is thus vastly reduced, requiring less bandwidth or connectivity time than would otherwise be the case. "Edge computing is thus reshaping IT and business computing," he adds.
However, Venkatesh warns that edge computing comes with challenges of its own, not least of which is security, both physical and cyber, as well as the management and control of edge devices.
Nevertheless, he believes we are only just beginning to scratch the surface of edge computing's potential and predicts that the uptake of edge computing, particularly in Africa, will continue to accelerate over the next decade.
See more here:
Edge coming to the rescue of cloud - ITWeb
Cloud computing to unveil the enigmas of our galaxy – EurekAlert
The Galactic RainCloudS project, an initiative led by members of the Faculty of Physics, the Institute of Cosmos Sciences (ICCUB) of the University of Barcelona and the Institute for Space Studies of Catalonia (IEEC), won first place in the Cloud Funding for Research call of the European project Open Clouds For Research Environments (OCRE).
The project competed against 27 proposals from twelve countries across a wide range of research disciplines. This first edition of Cloud Funding for Research funds the use of commercial cloud computing resources for research. The project counts on collaboration from the private sector: Pervasive Technologies, which brings experience in artificial intelligence and cloud computing; Google, with the computing infrastructure of Google Cloud; and Telefónica, which offers experience in cloud resource management.
Professor Xavier Luri, director of ICCUB and principal researcher of the project, highlights that "the Galactic RainCloudS project is a pioneering one in Europe in the use of commercial cloud infrastructures for research in astronomy, and results from the will to show the benefits of cloud resources to the scientific community."
The key to the project lies in interdisciplinarity: by combining the extraordinary volumes of data from the European Space Agency's Gaia satellite with the great computational power and flexibility of cloud infrastructures, and with data mining techniques, the University of Barcelona team will be able to study the links between past galaxy collisions and star formation in a holistic way, with the Milky Way and its satellite galaxies as the experimental laboratory. "Cloud computing is like renting powerful customized computers for a certain period of time, which will enable us to make the necessary calculations to study the interaction between galaxies," notes Mercè Romero, researcher at ICCUB.
The project also includes the development of a system to detect traces of past small galaxy collisions with the halo of our galaxy. Teresa Antoja, researcher at ICCUB, notes that "the existence of granularities in the galactic halos is a prediction of the current cosmological model of the formation of our Universe: the active search for substructures of this type in the Gaia data can provide vital information on the history of the Milky Way and on the nature of dark matter."
Artificial intelligence and cloud computing
The participation of the private sector in this project shows the closeness between research and industry in the use of cutting-edge technologies, as well as their shared interests. "At Pervasive Technologies, we are glad to offer our knowledge of artificial intelligence and cloud computing to a pioneering research project. We will work to get the highest performance out of the cloud infrastructure and artificial intelligence for this project," notes Rodolfo Lomascolo, CEO of Pervasive Technologies.
In order to be successful, the Galactic RainCloudS project needs, among other things, big data infrastructure. "The Gaia satellite data hide the answer to many questions we want to solve, but we need the right tools to retrieve them," notes Roger Mor, data scientist at Pervasive Technologies and ICCUB collaborator. He adds: "The big data platforms and artificial intelligence services available in the commercial cloud are fundamental tools to find out, for instance, whether the interaction of Sagittarius with the Milky Way caused the reignition of star formation in our galaxy between 5 and 7 billion years ago, as stated in some studies."
Enrique González Lezana, head of cloud sales specialists at Telefónica Tech, says that Telefónica has accompanied the University of Barcelona in the definition and deployment of the Google Cloud architecture that will host the hypercomputing solution required for the Galactic RainCloudS project. The deployed infrastructure, he adds, will enable the processing and analysis of big data in a flexible, scalable way, adjusted to the needs of the researchers of the University of Barcelona. Telefónica will work with the UB throughout the entire process to guarantee the successful implementation of the project, with teams specialized in Google Cloud services and technologies.
The project launched this May and will last a year. "Galactic RainCloudS is a necessary step in the transition of the world of research toward the efficient use of cloud computing resources. In this sense, we are pioneers in its use at the University of Barcelona, and we hope our experience serves to encourage adoption. Research teams' needs are becoming more specific, and we are making an effort for this project to open the doors of commercial cloud computing to future projects in all research disciplines," concludes Xavier Luri.
Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.
See more here:
Cloud computing to unveil the enigmas of our galaxy - EurekAlert
Army official teases updated cloud plan, expected later this year – Breaking Defense
WASHINGTON: The director of the Army's cloud office said the service's updated cloud plan will be made publicly available later this fiscal year as the service pushes to update its digital operations at home and in the field.
"What you're going to see is a greater level of detail and maturity for not just leveraging commercial cloud computing, but really how that starts to extend into our on-premise locations, how this starts to extend into our tactical locations and how we start to kind of poke at some of the mission-enabling capabilities of how the Army operates, enabled by cloud computing," Paul Puckett, director of the Army's Enterprise Cloud Management Office, said Tuesday at an Amazon Web Services conference.
Puckett said the plan is currently being reviewed but did not offer a specific deadline for its release, though he alluded to how the release of the service's 2020 cloud plan was pushed back from January to May.
"If it doesn't happen by the end of this fiscal year, someone will probably chat with me about that," he said. "So you'll see it at the very least [by] September or before."
He added the service is focusing on leveraging commercial cloud services and extending the Army network with the Unified Network Plan.
"I think what you're also going to see is really a hybrid of where we can make appropriate investments on-premise in our data centers that complements the greater initiatives of the DoD around what DISA [Defense Information Systems Agency] is doing, where we can start to have more on-demand cloud infrastructure in on-premise locations, both in strategic locations CONUS as well as OCONUS," Puckett said.
RELATED: DISA Has 14 Ways It Wants Industry To Help It Move Into The Future
"So I see commercial revenue, our use of it growing, but I also see some kind of persistent use case for on-premise that really needs to be teased out … It's not just the lift and shift from a data center construct to an on-premise cloud construct, but truly how that becomes seen as a strategic asset and greater global architecture of computing is really important."
In January, Army Chief Information Officer Raj Iyer said the service was well on its way toward implementing its first cloud outside of the US in the Indo-Pacific, a move that would allow the Army to integrate cloud services into all aspects of experimentation in the region. The Army's plan is to deliver tactical cloud capabilities to its Multi-Domain Task Forces in the Indo-Pacific region and at Joint Base Lewis-McChord in Washington state this year.
Iyer told the audience at an AFCEA Northern Virginia Army IT Day event that the service is taking a data-centered approach to migrating applications to the cloud: moving applications to the cloud isn't the end goal; rather, the service needs to make sure it is able to harvest the data in those applications and systems.
"So it's not just the traditional ways of pulling data," Iyer said. "It's now getting to an API-driven architecture to build a Command Post Computing Environment of the future … We are now actively working with units where we are allowing them to now experiment [with] mission threats and operational scenarios using the capacity in the cloud and on the platforms we've established."
Original post:
Army official teases updated cloud plan, expected later this year - Breaking Defense