Category Archives: Cloud Computing

Global Server Virtualization Software Strategic Business Report 2023: Proliferation of Cloud and OS Technologies Drive Server Virtualization – Yahoo…

DUBLIN, June 8, 2023 /PRNewswire/ -- The "Server Virtualization Software: Global Strategic Business Report" report has been added to ResearchAndMarkets.com's offering.


The global market for Server Virtualization Software, estimated at US$8.3 Billion in the year 2022, is projected to reach a revised size of US$13.8 Billion by 2030, growing at a CAGR of 6.6% over the analysis period 2022-2030.
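
As a quick illustration of how the headline figures fit together, the short Python sketch below (written for this summary, not taken from the report) recomputes the 2030 projection from the 2022 base and the stated CAGR:

```python
# Recompute the report's projection: US$8.3 Billion in 2022 growing at a
# 6.6% CAGR over the 8-year analysis period (2022-2030). Figures are the
# report's own; the calculation is only an illustrative cross-check.
base_2022 = 8.3          # US$ Billion
cagr = 0.066             # 6.6% per year
years = 2030 - 2022      # 8-year analysis period

projected_2030 = base_2022 * (1 + cagr) ** years
print(f"Projected 2030 market size: US${projected_2030:.1f} Billion")  # ~US$13.8 Billion
```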

Para Virtualization, one of the segments analyzed in the report, is projected to record a 6.7% CAGR and reach US$4.4 Billion by the end of the analysis period.

Taking into account the ongoing post-pandemic recovery, growth in the Full Virtualization segment is readjusted to a revised 5.9% CAGR for the next 8-year period.

The U.S. Market is Estimated at $2.8 Billion, While China is Forecast to Grow at 7.6% CAGR

The Server Virtualization Software market in the U.S. is estimated at US$2.8 Billion in the year 2022. China, the world's second largest economy, is forecast to reach a projected market size of US$1.4 Billion by the year 2030, trailing a CAGR of 7.6% over the analysis period 2022 to 2030.

Among the other noteworthy geographic markets are Japan and Canada, forecast to grow at 6.2% and 6.4%, respectively, over the 2022-2030 period.

Within Europe, Germany is forecast to grow at approximately 6.3% CAGR. Led by countries such as Australia, India, and South Korea, the market in Asia-Pacific is forecast to reach US$1.4 Billion by the year 2030.

Select Competitors (Total 48 Featured) -

What's New for 2023?

Special coverage on Russia-Ukraine war; global inflation; easing of zero-Covid policy in China and its 'bumpy' reopening; supply chain disruptions; global trade tensions; and risk of recession.

Global competitiveness and key competitor percentage market shares

Market presence across multiple geographies - Strong/Active/Niche/Trivial

Online interactive peer-to-peer collaborative bespoke updates

Access to digital archives and Research Platform

Complimentary updates for one year


Key Topics Covered:

I. METHODOLOGY

II. EXECUTIVE SUMMARY

1. MARKET OVERVIEW

Influencer Market Insights

World Market Trajectories

Impact of Covid-19 and a Looming Global Recession

Virtualization and Virtualization Software: A Review

World Virtualization Software Market by Technology (2020 & 2027): Percentage Breakdown of Revenues for Network Function Virtualization, Desktop Virtualization, Software-defined Storage and Server Virtualization

Server Virtualization

Development of the Server Virtualization Market

Rising Uptake and Software Maintenance Revenues Help Server Virtualization Software Market Post Decent Growth

Market Drivers & Restraints

North America to Retain Commanding Position in Server Virtualization Software Market

World Server Virtualization Software Market by Region (2020 & 2027): Percentage Breakdown of Revenues for Developed and Developing Regions

World Server Virtualization Software Market - Geographic Regions Ranked by CAGR (Revenues) for 2020-2027

Competitive Scenario

Vendor Focus on Sophisticated Features Catalyzes Server Virtualization Software Market

Recent Market Activity

Server Virtualization Software - Global Key Competitors Percentage Market Share in 2022 (E)

Competitive Market Presence - Strong/Active/Niche/Trivial for Players Worldwide in 2022 (E)

2. FOCUS ON SELECT PLAYERS

3. MARKET TRENDS & DRIVERS

Server Virtualization: An Important Cog in the IT Wheel

Cost Benefits from Server Consolidation Drive the Business Case for Server Virtualization

Server Virtualization: Enabling Green IT Strategy

Energy Consumption in Server Rooms: Comparison of Power Consumption (in %) for IT Equipment, Air Conditioning and Distribution Losses

Changing ICT Landscape to Influence Server Virtualization Software Market

Global Shipments of Smartphones (in Million Units) for the Years 2011 through 2019

Proliferation of Cloud and OS Technologies Drive Server Virtualization

Global Public Cloud Computing Market (in US$ Billion) by Segment for the Years 2019 and 2022

Status of Serverless Computing, Containers, and Modern Applications

Data Center Virtualization Trends

Select Innovations in Server Virtualization Space

Challenges Facing the Server Virtualization Software Market

Server Virtualization: Technology Overview

Virtualization Technology: A Background Study

An Introduction to Server Virtualization Software

Future of Server Virtualization Software

Server Virtualization Approaches

Virtual Server

Key Steps in Implementation of Server Virtualization

Key Benefits Package

Disadvantages of Virtualization

Backup Issues

Data Recovery

4. GLOBAL MARKET PERSPECTIVE

III. MARKET ANALYSIS

IV. COMPETITION

For more information about this report visit https://www.researchandmarkets.com/r/vvsf3m

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com
For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900
U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

Logo: https://mma.prnewswire.com/media/539438/Research_and_Markets_Logo.jpg


View original content: https://www.prnewswire.com/news-releases/global-server-virtualization-software-strategic-business-report-2023-proliferation-of-cloud-and-os-technologies-drive-server-virtualization-301846223.html

SOURCE Research and Markets

Follow this link:
Global Server Virtualization Software Strategic Business Report 2023: Proliferation of Cloud and OS Technologies Drive Server Virtualization - Yahoo...

Fog Computing: The Backbone of Future Smart Transportation … – CityLife

Fog Computing: The Backbone of Future Smart Transportation Systems

Fog computing, a decentralized computing infrastructure, is rapidly emerging as the backbone of future smart transportation systems. As urban populations continue to grow, cities worldwide are turning to innovative technologies to improve the efficiency, safety, and sustainability of their transportation networks. One such technology is fog computing, which extends cloud computing capabilities to the edge of the network, enabling real-time data processing and decision-making at the source of data generation.

The concept of fog computing was introduced by Cisco in 2014 as a means to address the limitations of traditional cloud computing in the context of the Internet of Things (IoT). In cloud computing, data generated by IoT devices is sent to remote data centers for processing and storage. However, this centralized approach can result in high latency, bandwidth consumption, and security risks, particularly when dealing with the massive amounts of data generated by smart transportation systems.

Fog computing addresses these challenges by distributing computing resources across the network, allowing data to be processed closer to where it is generated. This decentralized approach significantly reduces latency and bandwidth consumption, enabling real-time data processing and decision-making that is critical for the efficient operation of smart transportation systems.

One of the key applications of fog computing in smart transportation is traffic management. By processing data from traffic sensors, cameras, and connected vehicles in real-time, fog computing can help traffic management systems to optimize traffic flow, reduce congestion, and improve overall transportation efficiency. For example, fog computing can enable dynamic traffic light control, adjusting signal timings based on real-time traffic conditions to minimize delays and improve traffic flow.
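
As a rough sketch of the kind of logic a fog node might run locally, the Python example below adjusts a signal's green time from real-time vehicle counts without a round trip to a remote data center. The sensor inputs, timing bounds, and proportional rule are invented for illustration, not drawn from any deployed system:

```python
# Hypothetical fog-node logic: split green time between two approaches in
# proportion to vehicle counts reported by local sensors.
BASE_GREEN_S = 30   # default green phase in seconds (illustrative)
MIN_GREEN_S = 15
MAX_GREEN_S = 90

def adjust_green_time(north_south_count: int, east_west_count: int) -> dict:
    """Allocate green time to each approach based on observed demand."""
    total = north_south_count + east_west_count
    if total == 0:
        return {"north_south": BASE_GREEN_S, "east_west": BASE_GREEN_S}

    ns_share = north_south_count / total
    ns_green = min(MAX_GREEN_S, max(MIN_GREEN_S, round(2 * BASE_GREEN_S * ns_share)))
    ew_green = min(MAX_GREEN_S, max(MIN_GREEN_S, 2 * BASE_GREEN_S - ns_green))
    return {"north_south": ns_green, "east_west": ew_green}

# Example: heavy north-south demand detected by roadside sensors.
print(adjust_green_time(north_south_count=42, east_west_count=9))
```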

Fog computing can also play a crucial role in enhancing the safety of smart transportation systems. By processing data from various sources such as connected vehicles, roadside sensors, and cameras, fog computing can enable real-time detection of potential hazards and accidents. This information can then be used to alert drivers, emergency services, and traffic management centers, enabling a rapid response to incidents and reducing the risk of secondary accidents.

In addition to traffic management and safety, fog computing can also support the growing adoption of electric vehicles (EVs) and the development of smart charging infrastructure. By processing data from EVs and charging stations in real-time, fog computing can help to optimize energy consumption, reduce peak demand on the power grid, and enable dynamic pricing based on real-time energy market conditions.

Furthermore, fog computing can support the deployment of autonomous vehicles, which rely on real-time data processing and decision-making to navigate complex urban environments safely. By enabling low-latency communication between vehicles, infrastructure, and traffic management systems, fog computing can help to ensure the safe and efficient operation of autonomous vehicles in smart transportation systems.

In conclusion, fog computing is poised to play a critical role in the development of future smart transportation systems. By enabling real-time data processing and decision-making at the edge of the network, fog computing can help to improve the efficiency, safety, and sustainability of urban transportation networks. As cities around the world continue to invest in smart transportation infrastructure, fog computing will undoubtedly become an essential component of the technology ecosystem that underpins these systems.

Read more:
Fog Computing: The Backbone of Future Smart Transportation ... - CityLife

AMD Stock: Bear vs. Bull – The Motley Fool

Companies joining the artificial intelligence (AI) market have experienced a bull run in 2023, with companies like Nvidia and Microsoft enjoying stock rises of 165% and 39% since Jan. 1. As a leading chipmaker with much to gain from the sector's development, Advanced Micro Devices (AMD) also benefited from the rally as its shares have soared 92% in the same period.

AMD's prospects in AI seem to grow by the day as its expansion in the industry develops. However, some analysts have voiced concerns that the company's price may be overinflated after its latest rally and might not be able to live up to the hype.

Here is the bear versus bull review for AMD stock.

While AMD's stock has skyrocketed this year on the prospects of AI, its valuation has become far less attractive. The company's price-to-earnings ratio (P/E) is up more than 1,000% year to date, hitting 540. That figure is far above the level of 20 or below that often indicates a reasonably valued stock.

As a result, if you're considering buying AMD stock, it's crucial to keep a long-term mindset. The company's shares could take a temporary dive if AMD can't immediately deliver on its AI potential. However, that is unlikely to affect its growth over the next decade and beyond.

AMD shares have risen about 695% in the last five years and more than 3,000% in the last decade. The company has a solid outlook thanks to its ability to supply chips to a variety of industries. Yet, with such a rapid stock rise this year based on the prospects of a largely untested market, the company's short-term stock performance could be volatile.

The good news is, despite the uncertainty, AMD's forward price/earnings-to-growth ratio (PEG), which takes into account expected earnings growth, is at an attractive 0.2. The metric suggests AMD has not veered from its growth path and remains an attractive investment for patient investors.
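
For readers unfamiliar with the two valuation metrics cited above, the sketch below shows how a P/E ratio and a forward PEG are typically computed. The input numbers are placeholders for illustration, not AMD's reported financials:

```python
# Illustrative valuation math only; the inputs are placeholders, not AMD data.
def price_to_earnings(share_price: float, earnings_per_share: float) -> float:
    """Trailing P/E: price paid per dollar of past earnings."""
    return share_price / earnings_per_share

def forward_peg(forward_pe: float, expected_eps_growth_pct: float) -> float:
    """Forward PEG: forward P/E relative to expected annual earnings growth (%)."""
    return forward_pe / expected_eps_growth_pct

pe = price_to_earnings(share_price=120.0, earnings_per_share=0.22)
peg = forward_peg(forward_pe=30.0, expected_eps_growth_pct=150.0)
print(f"P/E ~= {pe:.0f}, forward PEG ~= {peg:.2f}")
```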

Amid the buzz concerning AI, AMD has garnered a lot of attention from tech enthusiasts and Wall Street. However, the complexity of the technology might leave you wondering how exactly the company will profit from the industry's growth. So, let's take a closer look.

AMD is home to a variety of powerful chips with its line of central processing units (CPUs), graphics processing units (GPUs), and data processing units (DPUs). These chips are required to run a wide range of tasks from gaming to video editing, cloud computing, and of course, running and developing AI programs.

For reference, OpenAI's ChatGPT utilized about 20,000 GPUs in 2020 and is expected to increase that number to 30,000 as it readies for commercialization.

Nvidia gained a slight edge over AMD by becoming the primary supplier of GPUs to ChatGPT. However, the fight isn't over yet. AMD's prospects in AI have strengthened thanks to a recent partnership with OpenAI's biggest investor, Microsoft.

According to a Bloomberg report from May 4, Microsoft is supporting AMD's AI chip expansion by providing financial and engineering resources. The Windows company's aim is to help AMD become an equal alternative to Nvidia.

Moreover, data from Grand View Research states the AI market is projected to expand at a compound annual growth rate (CAGR) of 37% through 2030 after hitting $137 billion in 2022. The monster CAGR indicates it's still early days for the industry, with plenty of market share still up for grabs.

Nvidia may have had a head start, but it's not too late for AMD to capture a substantial part of the sector in the coming years.

AMD shares might be pricey right now, suggesting it's wise to exercise caution. However, if you're willing to hold for the long term, the company's prospects in AI are worth an investment.

Dani Cook has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Microsoft, and Nvidia. The Motley Fool has a disclosure policy.

Read more from the original source:
AMD Stock: Bear vs. Bull - The Motley Fool

Fisher Funds: AI is changing the world but big winners could be the … – New Zealand Herald

The first California gold rush started in 1848 - but now another is underway. Photo / Getty Images

OPINION

ChatGPT is changing the world as we know it, leaving some companies in its wake. But what's the best investment option to capitalise on this new wave of technology-led disruption?

Dan Rosensweig, CEO of online education company Chegg, recently painted an alarming picture for investors.

Chegg provides homework help, digital and physical textbook rental, online tutoring and other student services.

Speaking to analysts on the latest earnings call, he said the significant spike in student interest in ChatGPT is impacting Chegg's ability to get new customers, as students opt to use the AI tool for assistance with homework instead.

The statement sent the stock tumbling, closing down 48 per cent in a single day and casting a shadow over the future of the company.

AI is in its infancy, but is already causing reverberations across many industries, capturing the minds and imaginations of scientists, politicians, and corporations, and bringing terms like ChatGPT to the forefront of public consciousness.

While it may be a blessing for many companies, potentially enabling significant productivity boosts, some may face existential risks to their business models.

New gold rush

Investors hoping to capitalise on the AI boom need to look no further than the gold rushes of the 19th century to figure out the best investment strategy.

Just as it was on the riverbeds of California, Victoria, and Otago in the 1800s, the best business was not in panning and digging for gold but rather in selling the miners the picks and shovels to do so.

Back then it was Levi Strauss and Jacob Davis who were in the box seat, but today with AI it's the companies selling the tools and services that support the growing AI industry. With an early lead, Amazon, Google and Microsoft are well placed to play the enablers and backbones of the AI revolution.

Many companies have already been using AI for computational and quantitative tasks.

JPMorgan Chase bank and Mastercard are using AI to automate tasks like fraud detection and risk assessment, freeing up employees to focus on other things, and Netflix is constantly developing new AI-powered features to improve its recommendation engine.

The big recent innovation has been the wave of releases of Large Language Models (LLMs) such as ChatGPT. LLMs have moved AI beyond just numbers and quantitative tasks into language, creative content creation, and replicating human interactions.

Productivity software stands to benefit substantially.

For example, GitHub, Microsoft's coding platform, rolled out its AI feature named Copilot, which aids developers' code writing by offering autocomplete-style suggestions as they code.

There has often been talk of '10x' super developers who are 10 times more productive than the average developer.

GitHub CEO Thomas Dohmke strongly believes 10x developers can become widespread, saying: "With AI at every step, we will realise the 10x developer."

Some business models could be upended by the technology

Chegg is one example of an early company impacted by AI, and companies with a large component of repetitive and manual tasks are also likely to be impacted.

Industries previously requiring people with specialist skills, such as copywriting and website design, can now use AI to help complete these tasks in minutes, rather than days or weeks.

Instead of using a copywriter, simple prompts can be fed into Word Copilot to generate full articles and content for publishing and advertising. Websites can be built simply by taking a photo of a hand-drawn website design and feeding this into an AI model.

This isn't necessarily a negative for these industries if they can use the productivity gains on the more mundane tasks and spend more time on value-add activities.

Satya Nadella, CEO of Microsoft, believes AI could enable every developer to focus all their creativity on the big picture: building the innovation of tomorrow and accelerating human progress, today.

The largest beneficiary from AI will be the cloud computing companies.

AI is computationally intensive. One conversation on ChatGPT costs only a few cents, but when used by billions of people many times over, this cost escalates rapidly, with estimates putting it at over NZD$1 million per day.
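
To make that claim concrete, here is a back-of-the-envelope sketch of how a few cents per conversation scales with heavy usage. Both inputs are assumptions chosen purely for illustration:

```python
# Back-of-the-envelope scaling of inference cost; both figures are assumptions.
cost_per_conversation_nzd = 0.05      # "a few cents" per ChatGPT conversation
conversations_per_day = 25_000_000    # assumed daily usage volume

daily_cost_nzd = cost_per_conversation_nzd * conversations_per_day
print(f"Estimated daily cost: NZD${daily_cost_nzd:,.0f}")  # NZD$1,250,000
```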

The lure of Azure

Amazon AWS, Microsoft Azure and Google Cloud Platform dominate the cloud computing market and AI has expanded their horizons significantly.

Only a handful of companies ultimately have the scale and ability to provide the computing infrastructure to host AI applications. Both Amazon and Microsoft have made large investments in building their own AI-specific chips for training and running AI models more efficiently, something smaller competitors are unlikely to be able to replicate.

These three providers are best positioned to capture the lion's share of AI workloads, either by licensing their own AI models or by providing the computing muscle to host AI models from others.

Right now, AI is like the internet in the 90s, before smartphones, app stores, streaming and social media were on the radar of most people.

Current commercial adoption remains relatively low but there are unlimited use cases. Being invested in the picks and shovels like AWS, Azure and Google Cloud Platform could offer the greatest likelihood for success regardless of which platform or AI model dominates.

Like in the infancy of the internet, we can only guess what AI's ultimate size, impact and application will be, but what we do know is companies will have to adapt to harness the power of AI or risk being left behind.

- Daniel Moser is an investment analyst at Fisher Funds

Disclaimer: The information and opinions provided here are of a general nature, and not intended to be personalised financial advice. We encourage you to seek appropriate advice from a qualified professional to suit your individual circumstances.

Follow this link:
Fisher Funds: AI is changing the world but big winners could be the ... - New Zealand Herald

IDMining offers the best way to Make Money with Cloud Mining – Analytics Insight

Cloud mining has revolutionized the world of cryptocurrency mining, providing individuals with an opportunity to participate in the lucrative crypto market without the need for extensive technical knowledge or expensive hardware. Among the various cloud mining providers, IDMining has emerged as a leading platform, offering an exceptional solution for individuals looking to make money through cloud mining. In this article, we will explore why IDMining stands out as the best way to generate income through cloud mining.

One of the key factors that sets IDMining apart is its reputation for reliability and security. With a strong track record and positive user testimonials, IDMining has established itself as a trustworthy platform in the cloud mining industry. The platform employs state-of-the-art security measures to protect users' investments and personal information, giving investors peace of mind while mining cryptocurrencies.

IDMining offers a diverse range of mining contracts, catering to the needs and preferences of different investors. Whether you are a beginner looking to start small or an experienced miner seeking higher returns, IDMining has a contract suitable for you. The platform provides flexibility in terms of contract duration, mining algorithms, and cryptocurrencies, allowing users to choose the option that aligns with their investment goals.

To maximize mining efficiency and profitability, IDMining utilizes cutting-edge technology. The platform is equipped with high-performance mining hardware and advanced algorithms that optimize the mining process. By leveraging the power of cloud computing, IDMining ensures users can mine cryptocurrencies at a faster rate and with increased profitability compared to traditional mining methods.

IDMining believes in fostering transparency and providing users with a seamless mining experience. The platform offers a user-friendly interface that makes it easy for investors to navigate and monitor their mining activities. Users can access real-time data on their mining performance, earnings, and withdrawal options.

Recognizing the importance of customer satisfaction, IDMining offers dedicated customer support to assist users throughout their cloud mining journey. The support team is available to address queries, provide guidance, and resolve any issues that users may encounter. This commitment to customer service enhances the overall user experience and sets IDMining apart from its competitors.

Contract Price | Contract Terms | Fixed Return
$200 | 1 Day | $200 + $5
$350 | 3 Days | $350 + $29.40
$860 | 7 Days | $860 + $174.58
$1,600 | 15 Days | $1,600 + $720
$6,400 | 30 Days | $6,400 + $5,952
$9,600 | 40 Days | $9,600 + $13,056

Minimum investment amount: $200

For additional information on IDMining, please visit their website at https://idmining.com/.

When it comes to making money with cloud mining, IDMining is undoubtedly the best choice. With its reliable and secure platform, diverse range of mining contracts, cutting-edge technology, transparent interface, and dedicated customer support, IDMining offers a comprehensive solution for individuals seeking to enter the world of cryptocurrency mining. By partnering with IDMining, investors can tap into the immense potential of cloud mining and generate a sustainable income stream in the ever-evolving crypto market.

See the original post here:
IDMining offers the best way to Make Money with Cloud Mining - Analytics Insight

Cloud computing and blue-sky thinking: An atmospheric scientist … – Purdue University

WEST LAFAYETTE, Ind. - Alexandria Johnson does hard science on the most nebulous of subjects: clouds. As an atmospheric scientist and assistant professor of practice in Purdue University's College of Science, she studies clouds wherever they are: in her lab, on Earth, throughout the solar system and into the galaxy.

"The coolest thing about my research is that I can see clouds every day," Johnson said. "I can look up into our own atmosphere and watch them change and evolve. Then I can take that knowledge and apply it to other planetary bodies, both within and outside our solar system."

The science of clouds covers a lot of ground. Her research shines light on topics ranging from rainfall and microplastic precipitation in Indiana to the climates of moons and planets far outside the realm of human experience.

Studying clouds in their natural environments can be complex and subject to the variations of climate, weather and observation devices. Johnson's solution is to create her own homegrown clouds to study in her lab in the Department of Earth, Atmospheric, and Planetary Sciences. She strips the systems down to their basics to get a clear understanding of how the particles that make up clouds form, develop and interact with their environment. Nothing in her lab actually looks like a cloud; there are no mists swirling picturesquely in glass bottles. It's mostly lasers and big black boxes. But the behavior of these lab-based cloud particles mimics the behavior of cloud particles in massive sky-sweeping clouds, only in miniature.

"Of course, we don't grow them at quite the same scale you see in an atmosphere," Johnson said. "Instead, we can take one particle that is representative of a cloud, pump in different gases, and change the temperature and pressure of the system. We then watch as that particle grows, shrinks or changes phase with time, which are processes that happen in clouds everywhere."

Clouds on Earth don't often form without the aid of a nucleus, or particle, and in some cases what would be considered a nucleus on Earth may be an exotic cloud elsewhere. The particles in Johnson's lab, like all particles, have a charge. Johnson and her team use an electric field to levitate and contain the individual particles so that they can't move around. These particles are then stable for extended periods of time, which enables long-term research experiments, where the pressure, temperature, electric field and laser illumination may be tweaked, and observations recorded. Other methods build upon these to allow the team to look at groups of particles and observe how they scatter and polarize light.

Using methods like these, Johnson can study how clouds form and what different cloud particle shapes and compositions can reveal, and she is able to understand the conditions that lead to different cloud types and behaviors. Like aeronautical engineers using a wind tunnel to observe how currents move around structures, Johnson uses these particles to understand the microphysics that underpin vast and complex systems.

Many scientists (climatologists, meteorologists and planetary scientists, to name a few) study clouds as part of their broader research. But Johnson is one of the few who studies the particular physics of clouds in the laboratory.

"There are not many of us who dig into the microphysics of how clouds form," Johnson said. "Anyone who studies the atmosphere has a general sense of knowledge about clouds. But none of those systems work without the physics. We have to understand the microphysics to truly grasp the complexities and implications."

It's a long-running joke that the nights of notable astronomical events on Earth seem to be almost supernaturally disposed to be cloudy. That is true of other planets, too.

Using enormous, advanced, vastly powerful telescopes, astronomers can peer through miles and light-years of space just to find clouds blocking their view of the planet itself. Rather than the planets surface, they can only perceive the opaque atmosphere that enswathes it.

Every planetary body in the solar system that has a dense atmosphere, and many outside of it, has clouds in that atmosphere. Even bodies with thin, wispy or intermittent atmospheres like Pluto have particulates hanging in the atmosphere that, while not true clouds, are a haze of particles and share many of clouds' properties.

"Clouds are a ubiquitous feature of planetary atmospheres," Johnson said. "This is something we've seen from our own solar system, and when we look at exoplanet atmospheres, it's no surprise that we find clouds there too. Unfortunately, they tend to block our view of the atmosphere that is below."

Scientists have been able to send probes and rovers to close planetary neighbors, including Venus and Mars. But for bodies that are farther away, including exoplanets (planets in other star systems entirely), scientists must come up with clever ways to conduct science.

"The astronomers find the clouds to be an annoyance. They get in the way of the data they want, whether that's learning about the surface of the planet or its atmospheric composition," Johnson said. "We see it a little differently. Yes, they're there. We can't get rid of them. So let's use our understanding of clouds on Earth and planetary atmospheres of our solar system to learn about these things that we can't observe in exoplanets."

Most of the planets Johnson studies are cool planets. While Earth seems balmy (with planetary temperature averages around 60 degrees Fahrenheit), it is actually chilly by planetary standards when contrasted with large gas giants orbiting close to their stars, like "hot Jupiters."

Johnson and her team accumulate information about planetary bodies in Earth's solar system or exoplanets. Astronomers can collect spectrographic data to analyze the chemical compounds that make up the atmosphere and use mathematical models, observations and gravitation studies to determine a planet's mass, speed and orbit. Combining that information with insights from her laboratory studies, Johnson can help astronomers determine what a planet's atmosphere might be like and extrapolate its chance for hosting life.

"Our big questions are when, where and why do clouds form in these atmospheres?" Johnson said. "If we want to understand these enshrouded exoplanets, we need to understand the clouds. That understanding gives us insights into the atmospheric chemistry at work, atmospheric circulation and the climate. In a way we ground-truth astronomical observations."

Johnson is also looking up at the clouds from below, a little closer to home. In a current study, she is examining the role microplastics play in cloud formation. Microplastics pollution, which has been found just about everywhere, including large bodies of water like the Great Lakes, may form a part of clouds or be scavenged by precipitation, then shower the landscape in rainstorms and snowfall. Those microplastics have dire implications for ecosystem health, human health and agriculture.

Understanding how they become attached to clouds, move through weather systems and impact the landscape when deposited can help Johnson and her team protect life on Earth, just as they explore the possibility of livable conditions on other planets.

"It's the same physics," Johnson said. "It's the same processes, all throughout the universe, and it brings me a huge amount of wonder and joy. As an undergraduate physics major, I chose a senior research project studying how water droplets froze under varying conditions. I literally watched a droplet freeze hundreds of times to study the process and was entranced. I said, 'This is what I want to do with my life. This is amazing. I want to study clouds.'"

About Purdue University

Purdue University is a top public research institution developing practical solutions to today's toughest challenges. Ranked in each of the last five years as one of the 10 Most Innovative universities in the United States by U.S. News & World Report, Purdue delivers world-changing research and out-of-this-world discovery. Committed to hands-on and online, real-world learning, Purdue offers a transformative education to all. Committed to affordability and accessibility, Purdue has frozen tuition and most fees at 2012-13 levels, enabling more students than ever to graduate debt-free. See how Purdue never stops in the persistent pursuit of the next giant leap at https://stories.purdue.edu.

Writer/Media contact: Brittany Steff, bsteff@purdue.edu

Source: Alexandria Johnson, avjohns@purdue.edu

Link:
Cloud computing and blue-sky thinking: An atmospheric scientist ... - Purdue University

Ampere Computing launches its custom chips aimed at cloud … – Reuters

May 18 (Reuters) - Ampere Computing on Thursday released a new family of data center chips with technology it has custom-designed for cloud computing companies.

Founded by former Intel Corp (INTC.O) president Renee James, Ampere has focused on courting cloud companies that buy thousands of chips at a time and in turn rent them out. The company has deals in place with Alphabet Inc's (GOOGL.O) Google Cloud, Microsoft Corp's (MSFT.O) Azure and Oracle Corp's (ORCL.N) cloud unit, among others.

Unlike Intel, Ampere uses a computing architecture from SoftBank Group Corp-owned (9984.T) Arm Ltd, which is also an investor in Ampere. But the new AmpereOne offerings announced Thursday are the first to use Ampere's own custom-designed computing cores, the most important part of the chips, which are in turn the brains of the data center servers that power everything from business apps to social media sites.

The new Ampere chips will have as many as 192 of those cores, where Intel chips tend to have only a few dozen. The high core counts are because cloud companies make money by slicing up chips and selling just a piece of their computing power to customers, and having a large number of cores makes doing so easier.
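
As a simple illustration of why high core counts make slicing easier, the sketch below partitions a 192-core part into small rentable virtual machine shapes. The instance sizes are assumptions for illustration, not Ampere's or any cloud provider's actual offerings:

```python
# Illustrative only: carving a high-core-count chip into rentable VM shapes.
TOTAL_CORES = 192
instance_shapes = {"small": 2, "medium": 4, "large": 8}  # assumed vCPUs per instance

for name, vcpus in instance_shapes.items():
    print(f"{name}: up to {TOTAL_CORES // vcpus} instances per chip")
# A chip with only a few dozen cores yields far fewer slices per socket.
```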

After Ampere disclosed its approach, Advanced Micro Devices (AMD.O) announced a 128-core chip based on what is called the "x86" architecture used by AMD and Intel. Intel also has a high-core-count chip in the works.

"It's flattering that the x86 vendors have been able to get closer to us, but we're well on our way to higher core counts now," said Jeff Wittich, Ampere's chief product officer.

Ampere last year filed a confidential registration with U.S. securities regulators for an initial public offering. Oracle, where Ampere's CEO James sits on the board, is a major investor. James declined to say when Ampere might go public.

"We did not pull our registration. We are ever hopeful that the market will open and that it will open for growth companies," James said.

Reporting by Stephen Nellis in San Francisco; Editing by Mark Potter

Our Standards: The Thomson Reuters Trust Principles.

The rest is here:
Ampere Computing launches its custom chips aimed at cloud ... - Reuters

Red Hat Summit’s first day reveals key themes for the future of cloud … – SiliconANGLE News

As day one of Red Hat Summit came to a close in Boston, analysts and attendees were left reflecting on the key insights and takeaways from the event.

"The big theme is how to make it simpler for the end users," said theCUBE analyst Rob Strechay (pictured, left), emphasizing the focus on driving users toward cloud, Kubernetes and Red Hat Inc.'s OpenShift, all with an end goal of improving accessibility and efficiency.

This push toward simplification was reiterated throughout the day. "The announcements from day 1 were all about simplification," according to analyst Paul Gillin (right). Red Hat's new offerings, including Lightspeed and an event-driven version of Ansible, are designed to reduce complexity and ease the lives of end users and developers.

Strechay, Gillin and co-analyst John Furrier broke down Red Hat Summit day 1 during an exclusive broadcast on theCUBE, SiliconANGLE Media's livestreaming studio. (* Disclosure below.)

Ansible, an automation platform acquired by Red Hat in 2015, saw a significant shift in positioning during this week's Summit. Strechay observed a shift in emphasis from Ansible as a small configuration management niche to becoming a central theme of the conference.

"They made Ansible the star of the show today," Gillin said, adding that he saw this as a sign of Red Hat recognizing the prime opportunity in addressing the escalating complexity of information technology landscapes with Ansible's automation capabilities.

The integration of Ansible into Red Hat's event agenda was further underlined by Furrier.

"They're shutting down and folding in AnsibleFest, that's coming into the fold," he said. "That's big. And they were dominating most of the thematic content."

Another significant topic that emerged from the discussions was the relationship between AI and cloud computing. The panel debated the concept of AI guardrails, necessary guidelines that prevent AI from spiraling out of control.

Strechay connected this to Red Hat's emphasis on hybrid cloud: "Nobody knows where AI is going to really live and all that data."

On this theme, Gillin highlighted how AI's potential disasters are lurking in our future. While AI's potential problems are a hot topic, there are likely young innovators emerging, ready to solve these problems and create safer, more effective AI systems, he added.

Concerning the concept of multicloud, theCUBE's analysts expressed a certain level of skepticism. While the idea is full of promise, implementation often falls back on homegrown solutions, according to Gillin.

Strechay concurred, noting that the vendors selling the software are not the ones living with the complexities of implementation.

Despite these challenges, the analysts agreed on the essential role of open source in the future of cloud computing. Gillin asserted that the natural pull of the market now is toward open. In the context of AI, the analysts acknowledged the need for open-source AI to improve transparency and prevent monopolistic moats.

Here's the complete video interview, part of SiliconANGLE's and theCUBE's coverage of Red Hat Summit:

(* Disclosure: This is an unsponsored editorial segment. However, theCUBE is a paid media partner for the Red Hat Summit event. Red Hat Inc. and other sponsors of theCUBE's event coverage have no editorial control over content on theCUBE or SiliconANGLE.)


Original post:
Red Hat Summit's first day reveals key themes for the future of cloud ... - SiliconANGLE News

Cloud Computing: Quality and Cataloging are Top Challenges … – Formtek Blog

By Dick Weisinger

Businesses are moving their data to the cloud but are being faced with challenges managing their data once it is there. A report by Forrester on behalf of Capital One found that two huge challenges include data quality and data that is not cataloged or categorized.

Hugo Noreno, editorial director at Forbes, said that the better the data quality, the more confidence users will have in the outputs they produce, lowering risk in the outcomes and increasing efficiency.

Capital One told Edward Segal, a senior contributor at Forbes, that without data cataloging, decision-makers struggle to understand what data they have, how the data is used, and who owns the data.

The Capital One report found that decision-makers need to address key challenges to ensure they are getting the most out of their data and can leverage that data at scale, gaining agility, increasing cost efficiency, and making better-informed decisions. Firms that fail to do this will miss the moment and fall behind.

Excerpt from:
Cloud Computing: Quality and Cataloging are Top Challenges ... - Formtek Blog

Evolution of Cloud Security | Looking At Cloud Posture Management … – SentinelOne

When cloud computing saw its earliest waves of adoption, businesses only had to decide whether or not they wanted to adopt it. The notion of cloud security in these first few years came as a secondary consideration. Though cloud computing has undergone many improvements since it made a splash following the advent of the World Wide Web, the challenge of cloud security has only become more complex and the need for it more acute.

Today's hyperconnected world sees the cloud surface face a variety of risks, from ransomware and supply chain attacks to insider threats and misconfigurations. As more businesses have moved their operations and sensitive data to the cloud, securing this environment against developing threats continues to be an ever-changing challenge for leaders.

This post walks through a timeline of how cloud security has grown over recent years to combat new and upcoming risks associated with its use. Following this timeline, security leaders can implement the latest in cloud security based on their own unique business requirements.

When businesses first began to embrace the web in the 90s, the need for data centers boomed. Many businesses had a newfound reliance on shared hosting as well as the dedicated servers upon which their operations were run. Shortly after the turn of the century, this new, virtual environment became known as the cloud. Blooming demand for the cloud then spurred a digital race between Amazon, Microsoft, and Google to gain more shares across the market as cloud providers.

Once the idea and benefits of cloud technology had gained widespread attention, the tech giants of the day focused on relieving businesses of the big investments needed for computing hardware and expensive server maintenance. Amazon Web Services (AWS), and later, Google Docs and Microsoft's Azure and Office 365 suite, all provided an eager market with more and more features and ways to rely on cloud computing.

However, the accelerating rates of data being stored in the cloud bred the beginnings of a widening attack surface that would signal decades of cloud-based cyber risks and attacks for many businesses. Cyberattacks on the cloud during this time mostly targeted individual computers, networks, and internet-based systems. These included:

Cloud security in this decade thus put its focus on network security and access management. Dedicated attacks targeting cloud environments became more prominent in the following decades as cloud computing gained traction across various industries.

In the 2000s, the cybersecurity landscape continued to evolve rapidly, and the specific types and sophistication of attacks targeting cloud environments expanded. Cloud computing was becoming more popular, and cyberattacks specifically targeting cloud environments started to emerge. This decade marked a new stage of cloud security challenges directly proportional to the significant increase in the adoption of cloud.

While past its infancy, cloud computing was not as prevalent as it is now, and many businesses still relied on traditional on-premises infrastructure for their computing needs. Consequently, the specific security concerns related to cloud environments were not widely discussed or understood.

Cloud security measures in the 2000s were relatively basic compared to today's standards. To secure network connections and protect data in transit, security measures for the cloud primarily focused on Virtual Private Networks (VPNs), commonly used to establish secure connections between on-premises infrastructure and the cloud provider's network. Further, organizations relied heavily on traditional security technologies that were adapted for these new cloud environments. Firewalls, intrusion detection systems, and access control mechanisms were employed to safeguard network traffic and protect against unauthorized access.

The 2000s also saw few industry-specific compliance standards and regulations explicitly addressing cloud security. Since compliance requirements were generally focused on traditional on-premises environments, many businesses had to find their own way, testing out combinations of security measures through trial and error, since there were no standardized cloud security best practices.

Cloud security at the beginning of the millennium was largely characterized by limited control and visibility and was heavily reliant on the security measures implemented by the cloud service providers. In many cases, customers had limited control over the underlying infrastructure and had to trust the provider's security practices and infrastructure protection. This also meant that customers had limited visibility over their cloud environments, adding to the challenge of monitoring and managing security incidents and vulnerabilities across the cloud infrastructure.

In the 2010s, cloud security experienced significant advancements as cloud computing matured and became a staple of many businesses infrastructures. In turn, attacks on the cloud surface had also evolved into much more sophisticated and frequent events.

Data breaches occupied many news headlines in the 2010s, with attackers targeting cloud environments for cryptojacking or to gain unauthorized access to sensitive data. Many companies fell victim to compromises that leveraged stolen credentials, misconfigurations, and overly permissive identities. A lack of visibility into the cloud surface meant breaches could go undiscovered for extended periods.

Many high-profile breaches exposed large amounts of sensitive data stored in the cloud including:

The severity of cloud-based attacks led to increased awareness of the importance of cloud security. Organizations recognized the need to secure their cloud environments and began implementing specific security measures. As cloud adoption continued to grow, so did the motivation for attackers to exploit cloud-based infrastructure and services. Cloud providers and organizations responded by increasing their focus on cloud security practices, implementing stronger security controls, and raising awareness for globally recognized countermeasures.

Enter the Cloud Shared Responsibility Model. Introduced by cloud service providers (CSPs) to clarify the division of security responsibilities between the CSP and the customers utilizing their services, the model gained significant prominence and formal recognition in the 2010s.

During this period, major providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) began emphasizing the shared responsibility model as part of their cloud service offerings. They defined the respective security responsibilities of the provider and the customer, outlining the areas for which each party was accountable. This model helped a generation of businesses better understand their role in cloud security and enabled them to implement appropriate security measures to protect their assets.

This decade also popularized the services of cloud access security brokers (CASBs); a term coined by Gartner in 2012 and defined as:

"On-premises, or cloud-based security policy enforcement points, placed between cloud service consumers and cloud service providers to combine and interject enterprise security policies as the cloud-based resources are accessed. CASBs consolidate multiple types of security policy enforcement. Example security policies include authentication, single sign-on, authorization, credential mapping, device profiling, encryption, tokenization, logging, alerting, malware detection/prevention and so on."

To help businesses navigate and address the changing cloud security landscape, CASBs emerged as a critical security solution for organizations, acting as intermediaries between cloud service providers and consumers. Their main goals were to provide visibility, control, and security enforcement across cloud environments through services such as data loss prevention (DLP), cloud application discovery, encryption and tokenization, compliance, and governance.

The 2010s saw the emergence of Cloud Security Posture Management solutions and were also the starting point for improved compliance and standardization for the use of cloud in modern businesses. Industry-specific compliance standards and regulations began to address cloud security concerns more explicitly. Frameworks such as the Cloud Security Alliance (CSA) Cloud Controls Matrix and both ISO 27017 and ISO 27018 now sought to provide guidelines for cloud security best practices.

In current times, cloud technology has laid down a foundation for a modern, digital means of collaboration and operations on a large scale. Especially since the COVID-19 pandemic and the rise of remote workforces, more businesses than ever before are moving towards hybrid or complete cloud environments.

While cloud technologies, services, and applications are mature and commonly used across all industry verticals, security leaders are still facing challenges of securing this surface and meeting new and developing threats. Modern businesses need a cloud posture management strategy to effectively manage and secure their cloud environments. This involves several key elements to ensure agile and effective protection against today's cloud-based risks.

CSPM solutions have now gained a large amount of traction, enabling organizations to continuously assess and monitor their cloud environments for security risks and compliance. CSPM tools offer visibility into misconfigurations, vulnerabilities, and compliance violations across cloud resources, helping organizations maintain a secure posture.

An essential element of CSPM is cloud attack surface management. Since cloud environments introduce unique security challenges, a cloud posture management strategy helps businesses assess and mitigate risks. It allows organizations to establish and enforce consistent security controls, monitor for vulnerabilities, misconfigurations, and potential threats, and respond to security incidents in a timely manner. A robust strategy enhances the overall security posture of the cloud infrastructure, applications, and data.
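
As a minimal example of the kind of check a CSPM tool automates, the sketch below uses the AWS boto3 SDK to flag S3 buckets that lack a fully enabled public access block. It is a simplified illustration written for this article, not a production scanner or any particular vendor's implementation:

```python
# Minimal CSPM-style misconfiguration check (illustrative, not a product):
# flag S3 buckets whose public access block is missing or incomplete.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        fully_blocked = all(cfg.values())
    except ClientError:
        fully_blocked = False  # no configuration at all counts as a finding
    if not fully_blocked:
        print(f"[FINDING] Bucket '{name}' may allow public access")
```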

CSPM also encompasses what's called the shift-left paradigm, a cloud security practice that integrates security measures earlier in the software development and deployment lifecycle. Rather than implementing security as a separate and downstream process, the shift left addresses vulnerabilities and risks at the earliest possible stage, reducing the likelihood of security issues and improving overall security posture. It emphasizes the proactive inclusion of security practices and controls from the initial stages of development, rather than addressing security as an afterthought or at later stages.
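
A common way to apply the shift left in practice is to scan infrastructure-as-code in a CI job before anything is deployed. The sketch below checks a simplified, assumed representation of planned security group rules (not a real Terraform schema) and fails the pipeline if anything beyond web ports is open to the whole internet:

```python
# Illustrative shift-left gate run in CI: block security group rules that
# expose non-web ports to 0.0.0.0/0. The rule structure is an assumption.
import sys

planned_rules = [
    {"resource": "aws_security_group.web", "port": 443, "cidr": "0.0.0.0/0"},
    {"resource": "aws_security_group.db", "port": 5432, "cidr": "0.0.0.0/0"},
]

ALLOWED_PUBLIC_PORTS = {80, 443}  # only web traffic may be world-reachable

violations = [
    rule for rule in planned_rules
    if rule["cidr"] == "0.0.0.0/0" and rule["port"] not in ALLOWED_PUBLIC_PORTS
]

for rule in violations:
    print(f"[BLOCKED] {rule['resource']} exposes port {rule['port']} to the internet")

sys.exit(1 if violations else 0)  # non-zero exit fails the pipeline
```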

In addition, Cloud Infrastructure Entitlement Management (CIEM) tools have emerged to help organizations manage access entitlements across multicloud environments, helping to reduce the risks associated with excessive permissions.

As cloud adoption rates continue to increase, many businesses have turned to Kubernetes (K8s) to help orchestrate and automate the deployment of containerized applications and services. K8s has risen as a popular choice for many security teams that leverage its mechanism for reliable container image build, deployment, and rollback, which ensures consistency across development, testing, and production.

To better assess, monitor and maintain the security of K8s, teams often use the Kubernetes Security Posture Management (KSPM) framework to evaluate and enhance the security posture of Kubernetes clusters, nodes, and the applications running on them. It involves a combination of activities including risk assessments of the K8s deployment, configuration management for the clusters, image security, network security, pod security, and continuous monitoring of the Kubernetes API server to detect suspicious or malicious behavior.
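
As one concrete example of the continuous monitoring described above, the sketch below uses the official Kubernetes Python client to flag pods running privileged containers, a common posture finding. It is a minimal illustration rather than a complete KSPM assessment:

```python
# Minimal KSPM-style check (illustrative): list pods with privileged containers.
# Requires a reachable cluster and the 'kubernetes' Python client.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces().items:
    for container in pod.spec.containers:
        sc = container.security_context
        if sc is not None and sc.privileged:
            print(f"[FINDING] {pod.metadata.namespace}/{pod.metadata.name}: "
                  f"container '{container.name}' runs privileged")
```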

Additionally, Cloud Workload Protection Platforms (CWPPs) and runtime security help protect workloads against active threats once the containers have been deployed. Implementing K8s runtime security tools protects businesses from malware that may be hidden in container images, privilege escalation attacks exploiting bugs in containers, gaps in access control policies, or unauthorized access to sensitive information that running containers can read.

The zero trust security model has gained prominence in the 2020s. It emphasizes the principle of "trust no one" and requires authentication, authorization, and continuous monitoring for all users, devices, and applications, regardless of their location or network boundaries. Zero trust architecture helps mitigate the risk of unauthorized access and lateral movement within cloud environments.

Implementing the zero trust security model means taking a proactive and robust approach to protecting cloud environments from evolving cyber threats. Compared to traditional network security models, which relied on perimeter-based defenses and assumed that everything inside the network is trusted, zero trust architecture:

Cloud-native security solutions continue to evolve, providing specialized tools designed specifically for cloud environments. These tools offer features such as cloud workload protection, container security, serverless security, and cloud data protection. Many businesses leverage cloud-native tools to address the unique challenges of modern cloud deployments in a way that is scalable, effective, and streamlined to work in harmony with existing infrastructure.

Cloud-native security tools often leverage automation and orchestration capabilities provided by cloud platforms. Based on predefined templates or dynamically changing conditions, they can automatically provision and configure security controls, policies, and rules to reduce manual effort. Since many cloud breaches are the result of human errors, such tools can help security teams deploy consistent and up-to-date security configurations across their businesses cloud resources.

Continuous monitoring of cloud environments is essential for early threat detection and prompt incident response. Cloud-native security tools enable centralized monitoring and correlation of security events across cloud and on-premises infrastructure. As they are designed to detect and mitigate cloud-specific threats and attack vectors, cloud-native solutions can cater to characteristics of cloud environments, such as virtualization, containerization, and serverless computing, identifying the specific threats targeting these technologies.

The use of advanced analytics, threat intelligence, artificial intelligence (AI) and machine learning (ML) is on the rise in cloud security. These technologies enable the detection of sophisticated threats, identification of abnormal behavior, and proactive threat hunting to mitigate potential risks.

Both AI and ML are needed to accelerate the decision-making required to identify and respond to advanced cyber threats and a fast-moving threat landscape. Businesses that adopt AI and ML algorithms can analyze vast amounts of data and identify patterns indicative of cyber threats. They can detect and classify known malware, phishing attempts, and other malicious activities within cloud environments.
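
As a small illustration of the pattern-based detection described above, the sketch below trains an Isolation Forest on a handful of login-event features and scores a suspicious event. The features and data are synthetic; a real deployment would use actual cloud telemetry:

```python
# Illustrative anomaly detection on synthetic cloud login events.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [login hour (0-23), failed attempts before success, MB downloaded]
normal_logins = np.array([[9, 0, 12], [10, 1, 8], [14, 0, 20], [11, 0, 15],
                          [16, 1, 10], [9, 0, 18], [13, 0, 9], [15, 1, 14]])
suspicious = np.array([[3, 7, 900]])  # 3 a.m. login, many failures, huge download

model = IsolationForest(contamination=0.1, random_state=0).fit(normal_logins)
print(model.predict(suspicious))  # [-1] marks the event as anomalous
```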

By analyzing factors such as system configurations, vulnerabilities, threat intelligence feeds, and historical data, the algorithms allow security teams to prioritize security risks based on their severity and potential impact. This means resources can be focused on addressing the most critical vulnerabilities or threats within the cloud infrastructure.

From a long-term perspective, the adoption of AI and ML in day-to-day operations enables security leaders to build a strong cloud security posture through security policy creation and enforcement, ensuring that policies adapt to changing cloud environments and truly address emerging threats.

Securing the cloud is now an essential part of a modern enterprise's approach to risk and cyber threat management. By understanding how the cloud surface has evolved, businesses can better evaluate where they are on this development path and where they are headed. Business leaders can use this understanding to ensure that the organization's security posture includes a robust plan for defending and protecting cloud assets. By prioritizing and investing in cloud security, enterprises can continue to safeguard their organizations against developing threats and build a strong foundation for secure and sustainable growth.

SentinelOne focuses on acting faster and smarter through AI-powered prevention and autonomous detection and response. SentinelOne's Singularity Cloud ensures organizations get the right security in place to continue operating in their cloud infrastructures safely.

Learn more about how Singularity helps organizations autonomously prevent, detect, and recover from threats in real time by contacting us or requesting a demo.

Singularity Cloud

Simplifying security of cloud VMs and containers, no matter their location, for maximum agility, security, and compliance.

More:
Evolution of Cloud Security | Looking At Cloud Posture Management ... - SentinelOne