Category Archives: Cloud Computing

The Impact of Global White-box Server Adoption on Cloud … – Fagen wasanni

Exploring the Impact of Global White-box Server Adoption on Cloud Computing and Telecommunications

The global adoption of white-box servers is having a profound impact on the fields of cloud computing and telecommunications. These servers, which are unbranded and often assembled from off-the-shelf components, are increasingly being used by companies to reduce costs and increase flexibility. This trend is reshaping the landscape of the IT industry, with significant implications for both cloud computing and telecommunications.

White-box servers are gaining popularity due to their cost-effectiveness and customization capabilities. Unlike traditional servers, which are often expensive and come with pre-installed software and hardware, white-box servers allow companies to choose the components that best suit their needs. This flexibility can lead to significant cost savings, as companies can avoid paying for unnecessary features or capabilities. Moreover, the use of off-the-shelf components can further reduce costs, as these parts are typically cheaper than their branded counterparts.

The rise of white-box servers is particularly impactful in the realm of cloud computing. As more and more businesses migrate their operations to the cloud, the demand for servers is skyrocketing. White-box servers, with their cost-effectiveness and customization capabilities, are an attractive option for companies looking to expand their cloud infrastructure. By using white-box servers, companies can scale their operations more affordably and efficiently, which is crucial in the fast-paced, ever-evolving world of cloud computing.

In addition to cloud computing, the telecommunications industry is also feeling the effects of the white-box server trend. Telecommunications companies are increasingly using white-box servers to power their networks, as these servers offer the flexibility and cost-effectiveness needed to keep up with the rapid pace of technological change. For instance, white-box servers can be easily upgraded or modified to support new technologies or standards, which is a significant advantage in an industry where innovation is key.

Furthermore, the adoption of white-box servers in telecommunications is facilitating the shift towards software-defined networking (SDN) and network functions virtualization (NFV). These technologies, which aim to make networks more flexible and efficient by virtualizing network functions and decoupling them from the underlying hardware, are ideally suited to the flexible, customizable nature of white-box servers. By using white-box servers, telecommunications companies can more easily implement SDN and NFV, leading to more efficient, adaptable networks.

However, the rise of white-box servers is not without its challenges. For one, these servers typically come with less support and fewer warranties than their branded counterparts, which can be a concern for companies that rely on their servers for critical operations. Additionally, the use of off-the-shelf components can lead to compatibility issues, which can complicate the server assembly and maintenance processes.

Despite these challenges, the global adoption of white-box servers is expected to continue, driven by the ongoing demand for cost-effective, customizable server solutions in the fields of cloud computing and telecommunications. As this trend continues, it will be interesting to see how the IT industry adapts, and what new opportunities and challenges arise.


Global Application Transformation: Unlocking the Potential of Cloud … – Fagen wasanni

Exploring the Power of Cloud and Edge Computing in Telecommunications: A Deep Dive into Global Application Transformation

The global application transformation in telecommunications is a rapidly evolving landscape, driven by the power of cloud and edge computing. This transformation is unlocking unprecedented potential, enabling telecom companies to deliver innovative services, enhance operational efficiency, and create new revenue streams.

Cloud computing, with its on-demand availability of computer system resources, especially data storage and computing power, is revolutionizing the telecommunications industry. It offers telecom companies the ability to store vast amounts of data and run applications on shared systems, reducing the need for physical infrastructure and lowering operational costs. Moreover, cloud computing provides the flexibility to scale resources up or down as per the demand, ensuring optimal utilization and efficiency.

In addition to cloud computing, edge computing is also playing a pivotal role in the global application transformation in telecommunications. Edge computing refers to the practice of processing data near the edge of the network, where the data is generated, rather than in a centralized data-processing warehouse. This approach reduces latency, improves speed, and enhances the user experience, particularly in applications that require real-time data processing.
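The latency benefit described above usually comes from doing some of the work before data ever leaves the site. As a rough illustration (the function and field names are hypothetical, not from any real telecom system), an edge node might reduce a batch of raw sensor samples to a compact summary so that only a few numbers travel to the central cloud:

```python
# Hypothetical sketch: an edge node pre-aggregates raw readings so only
# a small summary payload is sent upstream, cutting bandwidth and the
# round-trip delay for applications that need near-real-time answers.

def summarize_at_edge(readings):
    """Reduce a batch of raw readings to a small summary payload."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.4, 21.9, 22.1, 35.0, 21.7]   # e.g. one window of sensor samples
payload = summarize_at_edge(raw)        # four numbers instead of the full stream
```

The same idea scales up: anomaly detection or video pre-filtering at the edge keeps the cloud round-trip out of the latency-critical path.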

The integration of cloud and edge computing in telecommunications is creating a paradigm shift. It is enabling telecom companies to move away from a hardware-centric model to a more software-focused approach. This shift is not only reducing capital expenditure but also enabling faster deployment of new services.

One of the key benefits of this transformation is the ability to deliver personalized services. With cloud and edge computing, telecom companies can analyze customer data in real-time, understand individual preferences, and deliver customized services. This capability is enhancing customer satisfaction and loyalty, leading to increased revenue.

Furthermore, the combination of cloud and edge computing is facilitating the deployment of advanced technologies such as artificial intelligence (AI) and machine learning (ML) in telecommunications. These technologies are helping telecom companies to automate processes, predict network issues, and enhance decision-making.

However, the global application transformation in telecommunications is not without challenges. Data security and privacy are major concerns, given the sensitive nature of the data handled by telecom companies. Ensuring compliance with various regulatory standards is another challenge. Telecom companies need to invest in robust security measures and adopt best practices to mitigate these risks.

Despite these challenges, the benefits of cloud and edge computing in telecommunications are undeniable. They are driving the global application transformation, enabling telecom companies to stay competitive in a rapidly evolving market.

In conclusion, the global application transformation, powered by cloud and edge computing, is reshaping the telecommunications industry. It is unlocking new opportunities, enhancing operational efficiency, and paving the way for innovative services. As telecom companies continue to navigate this transformation, they need to address the associated challenges and leverage the potential of these technologies to stay ahead in the digital race. The future of telecommunications lies in harnessing the power of cloud and edge computing, and the journey has just begun.


The Rising Costs of Cloud Computing: Big Tech Responds with In … – Fagen wasanni

The shift to the cloud and the subsequent boom in the sector promised companies the ability to digitally transform themselves while keeping their data secure. However, the cost of this transformation is on the rise, particularly with the addition of generative AI tools.

Big Tech companies, burdened with hefty cloud bills, find themselves in a catch-22 situation. They cannot opt out for fear of being left behind, so they are seeking ways to cut corners. One solution being explored is the development of in-house AI chips to reduce costs.

IBM, at a semiconductor conference in San Francisco, announced that it is considering using its in-house AI chips, specifically the Artificial Intelligence Unit, to lower cloud computing costs. Other tech giants like Google, Microsoft, and Amazon are already designing their own AI chips in an effort to save money on their AI endeavors. Previously, the focus had been on specialized chips like graphics chips, but the demand is expanding.

Microsoft has accelerated its project to design its own AI chips, aiming to make them available within the company and OpenAI by next year. Google's AI chip engineering team has also moved to its Google Cloud unit to expedite progress.

Not only are cloud providers facing high costs, but clients themselves are also grappling with soaring prices. Shifts to on-premises solutions are being considered due to the expense of building on-premises AI/ML resources. However, enterprises are wary of falling behind competitors in terms of AI/ML capabilities. Cloud solutions offer an attractive option for businesses that need to strengthen their infrastructure for AI/ML integration.

To maximize return on investment, clients must carefully consider their needs in terms of tools and compare the cost of creating and using models. It's also important to avoid trying to do everything independently and instead use open-source and paid models as a base, training them on specific enterprise data.

Cloud providers are also attempting to lower prices to attract more customers. Amazon Web Services (AWS), for example, aims to lower the cost of training and operating AI models.

As the demand for cloud services continues to increase, fueled by AI workloads, a Gartner report predicts that AI will be one of the top factors driving IT infrastructure decisions through 2023.

In this landscape, businesses may opt to outsource cloud management and maintenance to third-party firms or tools. Adopting a hybrid approach where on-premises AI hardware is used for sensitive data processing and latency-sensitive applications, while cloud services are utilized for data storage, distributed training, and deploying AI models, allows for cost optimization.

Given the bright trajectory of AI, the cloud industry is expected to continue experiencing significant growth and benefit from it.


Juniper Stock Slides on Cut to Outlook as Cloud Business Slows – Barron’s

Juniper Networks shares are losing ground after the infrastructure hardware provider issued disappointing financial guidance, citing weaker-than-expected demand from cloud computing customers.

While Juniper thinks it is a long-term beneficiary of the artificial intelligence software trend, it will take some time for that opportunity to develop. The shift appears to be hurting the company in the short run.

For the second quarter, Juniper (ticker: JNPR) posted revenue of $1.43 billion, up 13% from a year ago and slightly above the Wall Street consensus of $1.42 billion. Adjusted profits of 58 cents a share were likewise three cents above the Street consensus view of 55 cents.

"We delivered better than expected results during the June quarter as our teams continued to execute well and we benefited from improved supply," CEO Rami Rahim said in a statement. "We were particularly encouraged by the momentum we experienced in our enterprise business, which not only had a record quarter, but also represented both our largest and fastest growing vertical for a third consecutive quarter."

But management's financial forecasts proved disappointing. For the third quarter, Juniper sees revenue of $1.385 billion, which would be down about 2% from the year-earlier period, falling short of the Street consensus of $1.48 billion. The company projects non-GAAP profits for the quarter of 54 cents, while the consensus call was for 62 cents.


Analysts at Needham, Citi, Raymond James, and others all trimmed their financial forecasts and stock-price targets following the Thursday report.

In prepared remarks, CFO Kenneth Miller said orders were weaker than expected in the second quarter. He said he expects continued weakness in bookings in the third quarter, mostly from cloud customers, but to a lesser extent from telecom service providers.

"We believe the softness in bookings is largely attributable to customer digestion of previously placed orders and certain projects being pushed to future periods," Miller said. "We expect the macroeconomic environment to remain challenged, which may continue to impact customer spending. These factors are negatively impacting our revenue expectations."


The company reduced its forecast of full-year revenue growth to between 5% and 6%, from 9%.

On a conference call with analysts, Rahim said that the focus by cloud providers right now is on building their AI offerings, which he says might be a bit of a negative for Juniper for now, although he adds that it will help in the longer term.

"To the extent that AI is a new killer app that's going to be offered by cloud providers, it's going to result in an increase of traffic in areas where we have a significant footprint," he says.

Juniper's disappointing results are weighing on shares of other network equipment providers with significant cloud exposure. Particularly hard hit is Arista Networks (ANET), which gets almost half of its business from Microsoft (MSFT) and Meta Platforms (META).

Arista shares, which had already been under pressure this week from concerns that Microsoft and Meta spent less than expected on capital equipment in the June quarter, are down 6% on Friday. That increased the stock's four-day loss to 13%.


Write to Eric J. Savitz at eric.savitz@barrons.com


Analyzing the Environmental Impact of Cloud Computing – Analytics Insight

Examine some of the initiatives that organizations and cloud providers may take to reduce their environmental impact.

In recent years, cloud computing has been an increasingly popular choice for organizations trying to simplify operations and save expenses. Organizations may minimize their dependency on on-premise hardware and software by accessing remote servers and computing resources, which can result in considerable savings in energy usage and carbon emissions. Yet the shift to cloud computing has environmental consequences, and as more firms use this technology, it is critical to examine the possible environmental impact of this change.

Lower Energy Consumption: Cloud computing can result in significant energy savings. The average data center needs massive amounts of energy to power and cool its servers. Cloud providers, on the other hand, run their data centers with exceptional energy efficiency, employing innovative cooling systems and power management techniques.

Cloud companies also employ virtualization technology, enabling several users to share a single server and lowering the number of physical servers necessary. As a result, the overall carbon footprint of data centers is reduced.
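The consolidation effect of virtualization is easy to see with some back-of-the-envelope arithmetic (the workload counts and utilization figures below are illustrative assumptions, not measured data):

```python
# Illustrative arithmetic: if each workload only uses a fraction of a
# server's capacity, virtualization lets many workloads share one host,
# shrinking the physical fleet and its energy footprint accordingly.
import math

def hosts_needed(workloads, avg_util, host_capacity=1.0):
    """Physical hosts required when each workload uses avg_util of one host."""
    return math.ceil(workloads * avg_util / host_capacity)

before = hosts_needed(200, avg_util=1.0)    # one dedicated server per workload
after = hosts_needed(200, avg_util=0.15)    # 15% average utilization, shared hosts
```

Under these assumed numbers, 200 dedicated servers collapse to a few dozen shared hosts, which is the mechanism behind the carbon-footprint reduction described above.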

Carbon Footprint: According to the International Energy Agency (IEA), data centers' energy usage could treble by 2030, and the industry's carbon footprint could account for up to 3.2% of global greenhouse gas emissions. Several cloud companies are looking into renewable energy to reduce the carbon footprint of data centers. Some businesses even construct their own renewable energy plants to power their data centers. Data centers are being constructed to be more energy-efficient, using features such as hot and cold aisle containment, air-side economization, and virtualization.

E-Waste: Cloud computing adds to the rising problem of electronic waste in numerous ways. First, as more businesses migrate to cloud computing, they routinely upgrade their IT equipment to keep up with the newest technical breakthroughs, which leads to an increase in electronic waste disposal. Second, cloud computing requires a significant amount of energy and resources to operate, resulting in increased e-waste from servers, data centers, and networking equipment.

If e-waste disposal is not adequately handled, it results in negative environmental repercussions. Toxic chemicals like lead, mercury, and cadmium can be found in e-waste and can harm the air, water, and soil. When these compounds are improperly disposed of, they can leak into the ground and water systems, potentially polluting the local ecosystem and endangering human and animal health.

Green Cloud Computing: While cloud computing has been criticized for its environmental impact, some providers are taking deliberate steps to lessen their environmental footprint, a practice also known as green cloud computing or sustainable cloud computing.

By adopting renewable energy sources, cloud companies may reduce their environmental impact. Amazon, Microsoft Azure, and Google Cloud have all committed to using 100% renewable energy in their data centers. In fact, Amazon Web Services (AWS) declared in 2019 that it would use 80% renewable energy by 2024 and 100% renewable energy by 2030.

Green cloud computing has the potential to improve public relations. Companies may improve their image and reputation as good corporate citizens by advertising their environmental activities. This can set them apart from rivals and serve as an effective marketing strategy.

Challenges Implementing Green Cloud Computing: While green cloud computing has several potential benefits, it also faces several obstacles that must be overcome. Renewable energy is becoming more cost-effective, but in some areas it is still more expensive than traditional energy sources. This makes it difficult for cloud providers, particularly smaller providers with limited funds, to transition to renewable energy. Adopting green practices may also require a significant reworking of present processes, incurring costs and taking up significant time.


Today's Cache | Twitter's new name has legal baggage; Generative AI boom complicates cloud computing; Adobe's Figma deal may be investigated – The Hindu

(This article is part of Today's Cache, The Hindu's newsletter on emerging themes at the intersection of technology, innovation and policy. To get it in your inbox, subscribe here.)

The social media platform known as Twitter will be renamed X, announced owner Elon Musk this week. However, the letter is so widely used and trademarked that a lawsuit against Twitter is inevitable, according to legal experts. Hundreds of companies have active U.S. trademark registrations for the letter, including Twitter rival Meta and software giant Microsoft. A lawsuit could be initiated if any of the brands feels that Twitter's rebranding to X could lead to confusion with its own company or services. Twitter (X), Microsoft, and Meta are yet to make public their legal actions, if any.

Musk has floated the idea of an "everything app," like China's WeChat, where entertainment, socialisation, and global payments are all covered by one application. However, he admitted earlier in the month that Twitter has lost around half of its advertising revenue.

The boom in generative AI technologies and tools this year has made cloud computing services more expensive, but companies are feeling pressured to continue along this path. To protect their budgets, Big Tech firms such as Google, Microsoft, and Amazon are designing their own AI chips, with IBM perhaps set to join them soon. Among these, Amazon Web Services (AWS) is trying to draw in potential customers for its cloud services by stressing lower prices than its competitors.

Chipmakers and the cloud industry have benefited from the boom in AI in business, but rising hardware costs and a spike in demand continue to challenge companies trying to match the pace in growth. Some companies are choosing another route: letting a third party take over the management and maintenance of their cloud.

Figma, the cloud-based platform for designers, is used by major companies such as Zoom, Airbnb, and Coinbase. However, the EU may investigate a $20 billion deal by Adobe to acquire it. Sources reported that an antitrust investigation for the deal may be expected after a preliminary review.

Adobe and the European Commission did not issue official statements about the legal proceedings. While Adobe stressed that it was working with regulators worldwide, the European Commission had earlier spoken against the deal, citing harm to healthy competition in the interactive product design sector.


The Role of Cloud Computing in Optimizing Meat Packing Plant … – EnergyPortal.eu

Exploring the Impact of Cloud Computing on Efficiency and Optimization of Effluent Treatment in Meat Packing Plants

Cloud computing has been making waves in various industries for its potential to enhance efficiency and productivity. One such industry where this technology is making a significant impact is the meat packing sector, particularly in the area of effluent treatment.

Effluent treatment, the process of treating waste water and making it suitable for reuse or safe disposal, is a crucial aspect of meat packing plants. These plants generate a significant amount of waste water that contains various contaminants, including blood, fat, and other organic materials. If not properly treated, this effluent can have a severe impact on the environment.

Traditionally, effluent treatment in meat packing plants has been a labor-intensive and time-consuming process. It requires constant monitoring and adjustment to ensure that the treatment process is effective and that the treated water meets regulatory standards. However, with the advent of cloud computing, this process is being revolutionized.

Cloud computing allows for real-time monitoring and control of the effluent treatment process. Sensors placed throughout the treatment system can continuously collect data on various parameters, such as pH, temperature, and concentration of contaminants. This data is then sent to the cloud, where it can be accessed and analyzed from anywhere, at any time.

The real power of cloud computing lies in its ability to analyze this data and provide actionable insights. Advanced algorithms can identify patterns and trends in the data, predict potential issues before they occur, and suggest adjustments to optimize the treatment process. This not only improves the efficiency of the treatment process but also reduces the risk of non-compliance with environmental regulations.

Moreover, cloud computing enables meat packing plants to automate many aspects of the effluent treatment process. For instance, based on the data analysis, the system can automatically adjust the pH or temperature of the treatment process, reducing the need for manual intervention. This not only saves time and labor but also improves the accuracy and consistency of the treatment process.
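A minimal sketch of the automation loop described above might look like the following. The thresholds, field names, and actions here are purely illustrative assumptions, not taken from any real treatment system:

```python
# Hypothetical sketch: a cloud-side rule evaluates one incoming sensor
# sample and returns a corrective action for the dosing equipment,
# replacing the manual monitoring-and-adjustment step.

PH_TARGET = (6.5, 8.5)  # illustrative compliance window for discharge pH

def recommend_action(sample):
    """Map one sensor sample to a corrective action for the dosing system."""
    ph = sample["ph"]
    if ph < PH_TARGET[0]:
        return {"action": "dose_alkali", "severity": PH_TARGET[0] - ph}
    if ph > PH_TARGET[1]:
        return {"action": "dose_acid", "severity": ph - PH_TARGET[1]}
    return {"action": "none", "severity": 0.0}

low = recommend_action({"ph": 5.9})   # acidic effluent -> add alkali
ok = recommend_action({"ph": 7.2})    # within window -> no intervention
```

In a production system this rule would be one of many, fed by the streaming sensor data and backed by the predictive models the article mentions; the point of the sketch is only the closed loop from measurement to automated adjustment.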

In addition to improving the efficiency and effectiveness of the effluent treatment process, cloud computing also offers significant cost savings. By optimizing the treatment process, plants can reduce the amount of chemicals and energy used, leading to substantial cost savings. Furthermore, the automation of the process reduces labor costs and the risk of human error, further enhancing cost-efficiency.

The use of cloud computing in effluent treatment also contributes to sustainability efforts. By optimizing the treatment process, plants can reduce their environmental impact by ensuring that the treated water is safe for disposal or reuse. This not only helps to protect the environment but also enhances the sustainability image of the meat packing industry.

In conclusion, cloud computing is playing a pivotal role in optimizing effluent treatment in meat packing plants. By enabling real-time monitoring, data analysis, automation, and optimization of the treatment process, this technology is enhancing efficiency, reducing costs, and contributing to sustainability efforts. As the meat packing industry continues to embrace this technology, we can expect to see further improvements in the efficiency and effectiveness of effluent treatment.


KPMG to invest $2bn in AI and cloud services with Microsoft – DatacenterDynamics

Major accounting firm KPMG is planning to invest $2 billion in artificial intelligence (AI) and cloud services over the next five years with Microsoft.

The Big Four accounting firm is expanding its relationship with Microsoft globally, and expects that this investment could bring in up to $12 billion in revenue over the next half-decade.

That amount would represent about seven percent of the company's annual global revenue, which last year reached $34.64bn.

By investing more in AI and cloud computing, KPMG will be able to automate parts of its tax, audit, and consulting services, as well as help employees provide faster analysis. The company will use Microsoft 365 Copilot, an AI assistant not yet available to the public, and will continue to use OpenAI models via the Azure cloud platform.

It is additionally hoped that through the use of AI, KPMG will be able to simplify its ESG reporting by analyzing transactions in a more efficient way.

Microsoft CEO Satya Nadella said: "Whether it's ESG reporting or audit, all of these things are going to be fundamentally transformed because the core processes of those knowledge workflows are going to be accelerated using this AI technology."

Bill Thomas, KPMG global chair and CEO, does not think that the investment will see people replaced with AI.

"I certainly don't expect that we'll lay off a lot of people because we've invested in this partnership," Thomas said. "I would expect that our organization will continue to grow and we will reskill people to the extent possible and, frankly, create all sorts of opportunities in ways that we can't even imagine yet."

Despite this, KPMG has laid off a significant number of employees. The company has an overall workforce of around 265,000, and in June laid off around 2,000 employees just four months after another 700 lost their jobs.

This is not the company's first major investment in tech. In 2019, KPMG invested $5bn in general technology, 20 percent of which went directly to Microsoft, with the rest dedicated to moving to the cloud, among other things.


FDIC Office of Inspector General Cites Gaps in Cloud Migration … – Executive Gov

The Federal Deposit Insurance Corporation's Office of Inspector General recently revealed that the agency had implemented effective cloud migration strategies and governance processes, but stated that improvements are needed in some areas of its cloud operations.

In a new report published Wednesday, the FDIC OIG said that the organization did not adhere to cloud computing standards recommended by the National Institute of Standards and Technology and the Office of Management and Budget, such as creating a well-organized inventory of its cloud data assets.

The FDIC did not establish an exit strategy to address issues if it needs to terminate a contract with a cloud service provider and did not develop contract management plans for 17 cloud contracts totaling over $546 million, according to the report.

Moreover, the agency did not create strategies to dispose of or decommission its legacy systems.

The FDIC expects to migrate most of its mission-essential and mission-critical systems to the cloud by 2024.


IPOPHL to adopt Cloud computing strategies for IP registration … – BusinessMirror

The Intellectual Property Office of the Philippines (IPOPHL) announced it would adopt cloud computing strategies for intellectual property (IP) registration services to improve internal processes and efficiencies in data management.

IPOPHL Director General Rowel S. Barba spoke at a side event at the World Intellectual Property Organization's (WIPO) 64th General Assemblies, where the IP offices of Norway and Uganda also discussed their respective digital transformation initiatives, IPOPHL said.

Moreover, the agency said Barba shared IPOPHL's experiences in using the current version of the IP Administration System, its limitations, pain points, and his wish list for the new IPAS 4.0 implementation.

According to IPOPHL, the IPAS is a software developed and owned by WIPO and is offered to IP offices under collaborative arrangements for its provision, hosting and maintenance.

The software, it noted, enables the electronic processing of IP registration documents and is customizable to adopt internal workflows that reflect an offices rules and procedures.

The most recent release, known as IPAS version 4.0, will be exclusively hosted on Amazon Web Services (AWS) infrastructure based in the EU, IPOPHL said.

Meanwhile, an earlier version, IPAS 3.x, was installed at IPOPHL in early 2012 and is now showing its age. Barba reiterated that upgrading to IPAS 4.0, though very challenging, is the most logical choice for IPOPHL.

"We know that the majority of the features previously missing in IPAS 3.x now come built-in as part of the IPAS 4.0 package. It is also expected to include workflow fixes that had resulted in downtimes in the old version," Barba said.

Barba revealed that WIPO has already committed to exploring how it can best support IPOPHLs new requirements as the latter transitions to the latest IPAS version. WIPO has already given IPOPHL access to navigate the IPAS 4 test version by end-July, allowing the office more time to prepare for the transition.

"We are truly grateful to WIPO for its valuable technical assistance. We hope to engage with them soon to have a deeper exchange on the level of support we will need to implement IPAS 4. We want to transition smoothly and sustain it at minimal costs and disruption as we move toward greater ICT ambitions in the long term," Barba added.

Last week, the IPOPHL chief told reporters that one of the vital amendments that IPOPHL wishes to see in the Intellectual Property code is the modernization of the intellectual property system.

"It's modernizing our IP system, because that is the way of the world now; everything is tech, so we want to keep pace with the various IP offices in different countries so that we can modernize our own system here," Barba said last week.

He also noted, "Right now, we're proud that IPOPHL is, I think, one of the government agencies that is fully online. All of our filings here now, including payment, are 100 percent online."
