Category Archives: Cloud Computing

Microsoft has hired an Apple Chip Architect to work on a new Server Chip to power Azure cloud computing services+ – Patently Apple

Microsoft has hired one of Apple's chip engineers to design its own custom server chips. Mike Filippo, who worked at Apple for only 2.9 years, had previously worked at Arm for a decade, at Intel for five years, and at AMD for eight years. The report characterizing Filippo as a key Apple engineer therefore seems a little hyped.

Microsoft is working on in-house processors for the servers running its cloud-computing services and Surface line of personal computers.

The cloud computing heavyweight currently relies on Intel and Advanced Micro Devices Inc. to supply chips for its Azure cloud computing services as well as its Surface PCs.

The move to hire Filippo implies that Microsoft is accelerating a push to create homegrown chips for the servers powering its Azure cloud computing services, the report added. For more, read the full Reuters and/or BNN Bloomberg reports.

Read more:
Microsoft has hired an Apple Chip Architect to work on a new Server Chip to power Azure cloud computing services+ - Patently Apple

What Are IaaS, PaaS, and SaaS? – IoT For All

Thanks to the continued growth of IoT, cloud computing has great potential to keep driving technological advancement. Born in 2007, cloud computing has aided technological revolutions through 14 years of development. You may have noticed that in recent years cloud computing has expanded its functions well beyond simple storage services such as iCloud and Google Drive. These functions include IaaS, PaaS, and SaaS.

So what are IaaS, PaaS, and SaaS, and how do they play such an important role in cloud computing? First, let us look at the definition of cloud computing.

The cloud refers to a shared pool of configurable computing resources. It plays a vital role in integrating computing resources and enabling automated management through online platforms. This means that users of cloud computing can reduce labor costs while improving resource utilization efficiency.

Cloud computing means even more in commercial activities. Like all other commercial resources, computing resources have become purchasable, with flexible liquidity, through resource pooling. Their low prices also make them a top option for software developers and engineers.

There are three layers of cloud computing: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). I will introduce each of them more specifically in the following sections.

To illustrate the concept of the three layers of cloud computing, let us begin with an example introduced by Albert Barron, a software client architect at IBM.

Suppose you are a caterer who plans to start a pizza business and wants to make handmade pizzas entirely on your own, from start to finish. But the complicated preparation work may leave you feeling stressed, so you decide to outsource part of your work to reduce your workload. You are offered three plans:

The outsourcer provides you with resources such as the kitchen, oven, and gas. You use this infrastructure to make your pizzas yourself. (This corresponds to IaaS.)

Besides the infrastructure, the outsourcer also provides you with pizza crusts. All you have to do is sprinkle your ingredients on the crust and let the outsourcer bake it for you. In other words, once you have specified your needs, the platform helps you realize them. (This corresponds to PaaS.)

The outsourcer has already prepared the pizzas without any participation from you. You simply package them, print your logo on them, and sell them. (This corresponds to SaaS.)

If we map pizza production onto software delivery, we can easily see the differences between IaaS, PaaS, and SaaS.

Mapping the plans onto cloud services, it is obvious that the user's workload decreases as we move up the service stack: IaaS > PaaS > SaaS.

Simply put, IaaS is the bottom layer of cloud services and mainly provides essential raw resources; examples include Amazon EC2, Microsoft Azure, and Rackspace. Apart from the infrastructure itself, which they cannot change, users can install any operating system or other software at will. However, installation and operation are relatively complicated and maintenance costs are high, because users must manage everything above the infrastructure layer themselves.
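To make the IaaS division of labour concrete, here is a minimal sketch using AWS's boto3 SDK. It assumes AWS credentials are already configured, and the AMI ID is a placeholder: the provider hands over a raw virtual machine, and everything above it remains the user's job.

```python
import boto3  # AWS SDK for Python; assumes credentials are configured

# IaaS in practice: request one small raw virtual machine. The provider
# supplies the hardware and virtualization; the OS, runtime, and app are
# still ours to install, patch, and maintain.
ec2 = boto3.resource("ec2", region_name="us-east-1")
instances = ec2.create_instances(
    ImageId="ami-xxxxxxxx",   # placeholder machine image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
print("Launched instance:", instances[0].id)
```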

PaaS provides a managed runtime, hiding hardware and operating system details and scaling seamlessly. Developers only need to focus on their business logic instead of lower-layer concerns. Platforms such as Google App Engine and AWS Elastic Beanstalk illustrate this very well. Generally speaking, the PaaS provider keeps the platform software in the cloud up to date; users simply deploy the applications they need on top of the managed platform.
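By contrast, a PaaS deployment reduces to business logic only. Here is a minimal sketch of a web app of the kind a platform such as Google App Engine could run; deployment tooling (e.g. `gcloud app deploy`) is assumed to be configured separately.

```python
# main.py - business logic only; the PaaS platform supplies everything below it.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # The only code we own: the platform handles OS, runtime, and scaling.
    return "Hello from a PaaS-managed runtime!"

if __name__ == "__main__":
    # Local testing only; in production the platform serves the app for us.
    app.run(host="127.0.0.1", port=8080)
```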

SaaS means leaving the development, management, and deployment process to the provider, freeing users from technological concerns. All of the resources provided are ready to use at any time. The internet services that ordinary users encounter are almost all SaaS, such as Facebook, Twitter, and Instagram. Its advantage is highly optimized resource utilization: because everything from the operating system up to the application is already deployed in the cloud, users can simply log in and start working.
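Consuming SaaS, in turn, usually means nothing more than calling a hosted application over HTTPS. A minimal sketch follows; the endpoint and token are hypothetical placeholders, not a real service.

```python
import requests  # third-party HTTP client (pip install requests)

# SaaS in practice: no installation, no servers; just authenticate and use
# the finished application over the network. Endpoint and token are
# hypothetical placeholders.
resp = requests.get(
    "https://api.example-saas.com/v1/reports",
    headers={"Authorization": "Bearer <access-token>"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```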

All in all, what IaaS, PaaS, and SaaS do is make our work and lives more convenient; therein lies the charm of technological progress. In the world of cloud computing, both information and technology can be shared. Even without maintenance staff who specialize in cloud computing, the tiered service offerings of cloud platforms let you use their full functionality easily. This technology can meaningfully reduce the work of digital transformation.

See more here:
What Are IaaS, PaaS, and SaaS? - IoT For All

Enabling Cloud Adoption in the Government – DevOps.com

In 2022, federal agencies are estimated to spend $7.8 billion on cloud computing. While this spending on cloud computing is necessary, it's imperative that the government uses its allotted funding appropriately to leverage the cloud's full potential and avoid common pitfalls.

I've frequently witnessed government agencies get locked into a cloud reseller that doesn't give them the full capabilities offered by AWS, Google Cloud or Microsoft Azure. Most people outside of the federal market don't realize that the government does not buy cloud directly from the cloud providers, apart from some very large contracts like the upcoming JEDI replacement contract vehicle, JWCC. Instead, agencies buy through a network of reseller partners that offer volume discounts and try to bundle managed services to help with cloud management.

This approach gives the reseller, not the customer, the power to hold the keys to the kingdom, and can make it difficult for the customer to easily switch between cloud providers in the event of poor performance or a lack of access or billing transparency. The federal government would benefit from keeping a small, but purposeful, cloud project management office (PMO) in-house, ensuring that root access is available to all the accounts, subscriptions and projects, and giving the agency the freedom to change partners more easily as its cloud maturity evolves.

I've also seen government customers lose focus on the holistic end-user experience of actually using the cloud. The result is a fragmented collection of different systems, processes and steps that leaves engineers frustrated and inefficient. Customers must transform the end-to-end cloud provisioning process so that self-service is paramount and existing technology investments are seamlessly and transparently integrated for the end user. When done correctly, innovation will be the byproduct, because focus can go toward developing the next generation of mission systems and tools, rather than trying to determine how to get a cloud account to begin a project.

Challenges exist, but there are proven steps the government can take to get the most value from their cloud infrastructure.

More here:
Enabling Cloud Adoption in the Government - DevOps.com

Cloud and SaaS Security: Mind the Gap – MSSP Alert

by Netsurion Jan 14, 2022

Software-as-a-Service (SaaS) applications and infrastructure providers like AWS and Microsoft Azure have become the norm for organizations large and small. Enhancing cloud security maturity is even more critical given the proliferation of cloud workloads and a chronic shortage of cloud expertise. Instead of achieving the desired digital transformation and cloud optimization, organizations that ignore cloud cybersecurity gaps or underinvest can do more harm than good. Service providers are well-positioned to capitalize on cloud computing and cybersecurity growth as trusted advisors to business decision makers.

Author: Paula Rhea, CISSP, product marketing manager, Netsurion

This article walks through cloud responsibilities, the benefits of comprehensive attack surface protection, cloud security considerations, and how Managed Security Service Providers (MSSPs) can capitalize on this cloud security opportunity.

Cloud adoption has gone mainstream, with almost 95% of businesses using the cloud today. Top drivers for cloud use include:

Additional cloud workloads and apps mean sensitive data like Personal Health Information (PHI) and credit card numbers are even more widely dispersed. Organizations need to apply the same rigorous cybersecurity controls, compliance, and threat detection used for on-premises resources to cloud infrastructure. Still, there is often uncertainty regarding cloud security roles and responsibilities, and where to begin.

Customers may erroneously believe that their MSSP is responsible for virtually all aspects of IT and network infrastructure and security. Protecting cloud workloads and SaaS applications is a shared responsibility with MSSPs, end customers, and cloud infrastructure providers like AWS. According to the Center for Internet Security, a SaaS provider is solely responsible for host infrastructure, physical security, and network controls. On the other hand, service providers and customers share responsibility for areas such as application-level controls, Identity and Access Management (IAM), and endpoint protection. While it's a shared responsibility, the end customer ultimately retains full responsibility for protecting their data and managing the risk.

Businesses aren't the only ones to capitalize on public cloud and pervasive SaaS applications. Cyber criminals have quickly embraced the cloud and know how to exploit cloud and SaaS technology, looking for easy targets like misconfigurations on public-facing websites that are straightforward to attack and monetize.

Organizations use hundreds of operational tools to manage on-premises and cloud-based workloads and SaaS applications. This fragmented approach creates data siloes and blind spots that can impact security and operational effectiveness. Without end-to-end visibility and control, detecting and remediating threats wherever they reside can take longer and give cyber criminals a foothold into your infrastructure. A holistic approach to security analytics can also overcome another common data challenge: filtering out false positives to get to actionable insights that matter to each organization.

Augment your traditional technologies like anti-virus and help desk support, and assess how cloud security can strengthen engagement with organizations focused on improving cybersecurity maturity. These businesses understand that financially motivated cyber criminals will exploit security gaps, whether on-premises, in the cloud, or in a hybrid approach.

Look for cloud security solutions that:

The threat landscape has evolved. Investment in cloud security capabilities helps future-proof your portfolio and prepare you for emerging areas of customer spend.

As you help organizations embark on or expand their cloud journey, it's crucial to outline cloud security gaps, and how to mitigate them, as their trusted advisor. Gartner projects cloud spending growth of 23%, so protecting cloud workloads and SaaS applications demands the same oversight and resources as on-premises assets, albeit with the challenges surrounding a shortage of cybersecurity and cloud experts. To streamline vendor and portfolio complexity, you now have access to comprehensive attack surface coverage for endpoints, data centers, and cloud workloads. Learn more about Netsurion's Managed Threat Protection with cloud coverage across infrastructure providers such as AWS and Microsoft Azure along with out-of-the-box support for hundreds of SaaS applications.

Author Paula Rhea, CISSP, is product marketing manager at Netsurion, which develops the Managed Threat Protection platform for MSSP and MSP partners. Read more Netsurion guest blogs here. Regularly contributed guest blogs are part of MSSP Alert's sponsorship program.

See the original post here:
Cloud and SaaS Security: Mind the Gap - MSSP Alert

Cobalt Iron patents analytics-based cloud brokering of data protection operations – Security Systems News

LAWRENCE, Kan. -- Cobalt Iron Inc., a leading provider of SaaS-based enterprise data protection, announced that it has received a patent on its technology for analytics-based cloud brokering of data protection operations.

U.S. Patent 11206306, issued on Dec. 21, 2021, describes new techniques that will be implemented in Cobalt Iron Compass, an enterprise SaaS backup platform. These techniques enable Compass to optimize cloud and on-premises computing resources automatically, making IT operations more secure, more cost-effective, and better-performing.

Security administrators, backup administrators, and other IT leaders responsible for maintaining the service levels and cost-effectiveness of IT operations are increasingly turning to the cloud for various aspects of enterprise data protection. However, cloud resources are rarely used efficiently, which leads to high cloud expenses, inefficient data protection operations, and poor service levels for backups and restores.

In addition, cloud resource utilization is often statically configured and unresponsive to changing conditions and events, and some conditions (e.g., changes in backup operational behavior; changes in the costs, availability, or performance of cloud resource services; or cyber-events associated with a cloud resource) might indicate the need to use other on-premises or cloud computing resources to process various data protection operations effectively. As a result of these deficiencies, enterprises need a more dynamic way to reconfigure cloud and on-premises computing resource usage that is responsive to operational and security conditions.

Cobalt Iron's new patented techniques use unique operational and infrastructure analytics to respond to changing conditions and to determine optimal usage of cloud and on-premises resources, thereby solving the problem. They can dynamically adjust the use of these resources by IT operations (e.g., data backups or disaster recovery operations) based on operational behaviors and conditions (e.g., poor operational performance or cyber-events).

The techniques disclosed in this patent are:

For example, if there is a cyber event in process against certain cloud computing resources, or if there is a change in the cost or availability of a cloud resource, then Cobalt Iron's patented techniques can dynamically reconfigure cloud computing operations to use a combination of on-premises and/or other cloud resources.
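As a rough sketch of that brokering pattern (a minimal illustration of the idea, not Cobalt Iron's actual patented implementation; all names, costs, and conditions are hypothetical), a broker can re-evaluate its resource choices whenever conditions change:

```python
# Hypothetical sketch: re-route data protection work to the best available
# resource pool when operational or security conditions change.
RESOURCES = {
    "on_prem": {"cost": 1.0, "healthy": True},
    "cloud_a": {"cost": 0.7, "healthy": True},
    "cloud_b": {"cost": 0.9, "healthy": True},
}

def broker(conditions):
    """Pick the cheapest resource pool not affected by an adverse condition."""
    for name, event in conditions.items():
        if event in ("cyber_event", "outage"):
            RESOURCES[name]["healthy"] = False
    candidates = {n: r for n, r in RESOURCES.items() if r["healthy"]}
    return min(candidates, key=lambda n: candidates[n]["cost"])

# A cyber event against cloud_a shifts backups to the next-cheapest pool.
print(broker({"cloud_a": "cyber_event"}))  # -> 'cloud_b'
```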

"Companies need to leverage the cloud responsibly. That is, they should use cloud services in a manner that is secure, cost-effective, and operationally beneficial. However, few tools exist to help companies optimize their cloud resource usage," said Greg Tevis, vice president of strategy at Cobalt Iron. "Cobalt Iron continues to innovate in the area of dynamic, analytics-based infrastructure optimization, including the cloud. This patent discloses techniques that broker cloud and on-premises computing resources so that enterprises end up with the best combination of security, cost-effectiveness, and efficiency for their cloud-based data protection operations."

Excerpt from:
Cobalt Iron patents analytics-based cloud brokering of data protection operations - Security Systems News

Future Growth: Growing Shift of Businesses Towards Modernizing & Revolutionizing their Processes to Foster the Growth of the Global Cognitive…

New York, USA, Jan. 12, 2022 (GLOBE NEWSWIRE) -- Research Dive states that the global cognitive cloud computing market is estimated to garner a revenue of $108,788.7 million by 2027, rising at a CAGR of 31.3% in the forecast period from 2020 to 2027. The all-inclusive report on the cognitive cloud computing market provides a brief summary of the current market scenario along with the key aspects of the industry, such as significant growth and restraining factors, challenges, and multiple growth opportunities. Besides, the report provides all the estimations of the market, making it easier for new participants to understand the global market.

Download FREE Sample Report of the Global Cognitive Cloud Computing Market: https://www.researchdive.com/download-sample/2800

Cognitive Cloud Computing Market Dynamics

Analysts at Research Dive state that the gradual shift of businesses across the globe toward modernizing their processes and the increasing utilization of artificial intelligence (AI) tools in cognitive cloud computing models are the major factors anticipated to propel the growth of the global cognitive cloud computing market. In addition, the increasing implementation of cognitive cloud computing models in the OTT sector is another factor predicted to boost the global market growth by 2028. Moreover, the increasing adoption of innovative tools and emerging technologies is projected to open up massive growth opportunities for the global cognitive cloud computing market in the coming years. Conversely, the high costs associated with the implementation of cognitive cloud computing platforms may restrict the market growth in the forecast period.

Impact of COVID-19 on the Cognitive Cloud Computing Market

The outbreak of COVID-19 has favorably impacted the global cognitive cloud computing market during the pandemic period. The positive growth of the market is majorly owing to the growing importance of natural language processing (NLP) techniques in healthcare and pharmaceutical organizations to support scientists and healthcare professionals during the pandemic. Being automated, NLP allows clinicians to efficiently monitor and manage patient populations by finding coronavirus-related symptoms in real time. Thus, the rising demand for NLP is directly driving demand for cognitive cloud computing methods in healthcare systems.

Check out How COVID-19 impacts the Global Cognitive Cloud Computing Market: https://www.researchdive.com/connect-to-analyst/2800

Natural Language Processing Sub-segment to Witness Significant Growth

Based on technology, the natural language processing sub-segment accounted for the majority of market share in 2019 and is predicted to grow at a CAGR of 32.2% during the analysis period. This sub-segment growth of the global cognitive cloud computing market is mainly owing to the increasing utilization of NLP along with cognitive computing technologies across industry verticals of almost all sizes, as it enables computers to efficiently communicate with humans in real time.

Large Enterprises Sub-segment to Witness Lucrative Growth

Based on enterprise size, the large enterprises sub-segment is expected to surpass a revenue of $73,711.1 million by 2027 and is predicted to observe lucrative growth during the analysis period. This is majorly owing to the increasing usage of cognitive computing techniques in the large enterprises mainly because it helps employees in dealing with complex decision making.

Access Varied Market Reports Bearing Extensive Analysis of the Market Situation, Updated With The Impact of COVID-19: https://www.researchdive.com/covid-19-insights

Healthcare Sub-segment to be Most Dominant

Based on industry vertical, the healthcare sub-segment is predicted to hold the largest market share and rise at a CAGR of 32.5% over the forecast period. This sub-segment growth of the global cognitive cloud computing market is mainly because cognitive computing technologies are widely utilized in the healthcare sector for assisting healthcare professionals in improved and better treatment of diseases.

North America Region to Create Massive Growth Opportunities

By region, the North America cognitive cloud computing market was valued at $3,849.9 million in 2019 and is expected to dominate the global industry during the analysis period. This is majorly due to the presence of technically progressive economies, such as the U.S. and Canada. Besides, the U.S. is a major hub for numerous upcoming technologies. Moreover, most organizations are adopting novel technologies to revolutionize and modernize their business activities, which is predicted to drive the regional market growth by 2027.

Check out all Information and communication technology & media Industry Reports: https://www.researchdive.com/information-and-communication-technology-and-media

Prominent Cognitive Cloud Computing Market Players

The report presents several aspects of these major players, such as strategic moves, business and financial performance, latest developments, SWOT analysis, and product portfolio. Some of the key players of the global cognitive cloud computing market are:

Nuance Communications, Inc.; SparkCognition; Numenta; Cisco; Microsoft; SAP; CognitiveScale; Hewlett Packard Enterprise Development LP; EXPERT.AI; and IBM.

These players are executing several strategies to gain a dominant position in the global industry.

For instance, in July 2020, SparkCognition, the leading industrial AI company, entered into a collaboration with a data science solutions company, Cendana Digital, to expand the global presence of SparkCognition and bring advanced AI solutions to Malaysia's oil & gas market.

The report consists of various facets of all the vital players that are operative in the market, such as financial performance, product portfolio, present strategic moves, major developments and SWOT. Click Here to Get Absolute Top Companies Development Strategies Summary Report.

TRENDING REPORTS WITH COVID-19 IMPACT ANALYSIS

Post Production Market: https://www.researchdive.com/covid-19-insights/195/post-production-market

Gaming Simulator Market: https://www.researchdive.com/covid-19-insights/210/global-gaming-simulator-market

Enterprise Data Management Market: https://www.researchdive.com/167/enterprise-data-management-market

More:
Future Growth: Growing Shift of Businesses Towards Modernizing & Revolutionizing their Processes to Foster the Growth of the Global Cognitive...

Ising on the cake: Sync Computing spots opportunity for cloud resource optimisation – Blocks and Files

Startup Sync Computing has devised a hardware answer to the problem that NetApp's Spot solves with software: how to optimise large-scale public cloud compute and storage use.

Update: CEO Jeff Chou positions Sync vs NetApp's Spot. 14 January 2022.

It's operating in near stealth, and what we describe here is not based on company announcements. Instead it relies on an article by one of its funders: The Engine, an MIT-based financial backer.

Enterprises are finding that using hundreds, if not thousands, of cloud compute instances and storage resources costs significant amounts of cash. It's virtually impossible to navigate the complex compute and storage cloud infrastructure environments in real time or manage them effectively over time, meaning cloud customers spend more, much more, than they actually need to in order to get their application jobs done in AWS, Azure, Google and so on.

The genius of Spot.io, the company bought by NetApp, lay in recognising that software could help solve the problem. Its Elastigroup product provisions applications with the lowest-cost, discounted cloud compute instances, while maintaining service level agreements, and with a 70–90 per cent cost saving.

Now, two years later, a pair of MIT Lincoln Laboratory researchers argue the problem is getting so bad that navigating the maze of instance classes across time and clouds needs attacking with hardware as well as software. They say the problem, classed as combinatorial optimisation (CO), is analogous to physical-world CO issues such as the classic travelling salesman scenario: finding a route for a sales rep between a set of destinations that minimises the time and distance travelled.

They have applied their CO algorithm expertise to designing hardware, a parallel processing device, to solve the specific cloud instance optimisation problem more effectively.

Sync Computing was founded in 2019 by two people: CEO Jeff Chou and CTO Suraj Bramhavar. Chou was a high-speed optical interconnect researcher at UC Berkeley and a postdoctoral researcher running high-performance computing optical simulations at MIT. Bramhavar was a photonics researcher at Intel and then a technical staff member at MIT, developing photonic ICs and new electronic circuits for unconventional computing architectures.

Their company took in a $1.3 million seed round in November 2019 and more cash from an undisclosed venture round in October 2021. The company website provides a flavour of what they are doing, declaring: "Future performance will be defined not by individual processors but by careful orchestration over thousands of them. The Sync Optimization Engine is key to this transition, instantly unlocking new levels of performance and savings. Our technology is poised to accelerate scientific simulations, data analytics, financial modeling, machine learning, and more. These workloads are scaling at an unprecedented rate."

Sync Computing's Optimization Processing Unit (OPU) has a non-conventional circuit architecture designed to cope when the number of potential combinations (of instances and instance types for a job in the cloud) is too high for a current server to search through to find the best one. They say that as the number of combinations scales up, the OPU's performance overtakes that of general-purpose CPUs and GPUs, taking orders of magnitude less time to find the best combination.

The OPU uses a design described in a 2019 Nature article by the two founders and others, "Analog Coupled Oscillator Based Weighted Ising Machine". It presents an analog computing system with coupled non-linear oscillators that is capable of solving complex combinatorial optimisation problems using the weighted Ising model. The circuit is composed of a fully-connected four-node LC oscillator network built with low-cost electronic components and compatible with traditional integrated circuit technologies.

The Ising model is a mathematical description of ferromagnetism in statistical mechanics and has become a generalised mathematical model for handling phase transitions in statistics.
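In its standard form (a textbook formulation, not quoted from the paper), the weighted Ising model assigns each node a spin $s_i \in \{-1, +1\}$ and seeks the configuration that minimises the energy:

$$H(s) = -\sum_{i<j} J_{ij}\, s_i s_j$$

A weighted MAX-CUT instance maps onto this by setting $J_{ij} = -w_{ij}$: minimising $H$ then drives connected spins to opposite signs, which is exactly maximising the total weight of the edges cut between the two spin groups.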

The paper showed that the OPU, an oscillator-based Ising machine instantiated as a breadboard, could solve random MAX-CUT problems with 98 per cent success. MAX-CUT is a CO benchmark problem in which the goal is to partition a graph's nodes into two groups so that the total weight of the edges crossing between the groups is as large as possible.
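To see why conventional machines struggle here, note that a brute-force MAX-CUT solver must evaluate all 2^n spin assignments. The Python sketch below (a toy four-node graph with made-up weights, purely for illustration) does exactly that:

```python
import itertools

# Toy weighted MAX-CUT by exhaustive search: evaluate every one of the
# 2^n spin assignments. Edge weights are arbitrary illustrative values.
edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 1.5, (3, 0): 1.0, (0, 2): 0.5}
n = 4

def cut_value(spins):
    # An edge is "cut" when its endpoints carry opposite spins.
    return sum(w for (i, j), w in edges.items() if spins[i] != spins[j])

best = max(itertools.product((+1, -1), repeat=n), key=cut_value)
print("best partition:", best, "cut weight:", cut_value(best))
```

The search space doubles with every node added, which is the scaling an analog Ising machine aims to escape by letting its coupled oscillators settle into a low-energy, high-cut state within a few cycles.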

The paper argues: "Solutions are obtained within five oscillator cycles, and the time-to-solution has been demonstrated to scale directly with oscillator frequency. We present scaling analysis which suggests that large coupled oscillator networks may be used to solve computationally intensive problems faster and more efficiently than conventional algorithms. The proof-of-concept system presented here provides the foundation for realizing such larger scale systems using existing hardware technologies and could pave the way towards an entirely novel computing paradigm."

Chou sent us his views on how Sync's technology relates to NetApp Spot, saying: "Our solution goes much deeper technically than theirs; in fact you can use us on top of Spot (Duolingo is already using Spot). The gains we got for them were on top of Spot instances."

"Fundamentally we deploy a level of optimisation that goes from the application down to the hardware, which is how we're able to get even more gains. We are not just cost-based; we can accelerate jobs as well. We let companies choose if they want to go faster, cheaper or both."

"We are also cloud platform-agnostic; we work with AWS EMR, Databricks, etc. Whereas [NetApp's] Data Mechanics is only Spark on Kubernetes within the NetApp ecosystem."

"Longer term, our Orchestrator product goes into cluster-level scheduling to perform a global optimisation of all resources and applications; something nobody else is doing."

Sync Computing's OPU could optimise large-scale public cloud resources better, meaning faster and at lower cost, and dynamically too, beyond the point where conventional server processors and even GPUs give up. It is very early days for this startup, but its area of focus is the core of NetApp's CloudOps business unit.

Earlier this month data protector Cobalt Iron said it had been awarded a patent that covered technology for the optimal use of on-premises and public cloud resources. This technology is based on operational and infrastructure analytics and responds to changing conditions; it's dynamic.

We have two established companies highlighting software approaches to solving the public cloud CO problem. If they have identified a problem that is large enough and growing, then Sync Computing has a good shot at making it.

Read the original here:
Ising on the cake: Sync Computing spots opportunity for cloud resource optimisation - Blocks and Files

5 Edge Computing Stocks That Could Win The 2020s – Investment U

Cloud computing dominated most of the past decade. Accordingly, major cloud computing stocks like Amazon, Microsoft and Google enjoyed impressive returns. Looking forward, edge computing stocks could be poised to dominate the 2020s.

If you're not familiar, edge computing is a new type of internet architecture. In the 2010s, cloud computing was the most popular internet architecture. Cloud computing involves storing data in the cloud, which mainly lives in huge data hubs. However, over the past decade, the amount of data that we create has risen exponentially. Routing all of this information to one central hub is getting increasingly inefficient.

Edge computing has emerged as a more efficient internet architecture. It works by moving information out of the cloud and closer to the end-user.

To get an idea of how edge computing works, let's examine a company like Walmart. Walmart processes billions of pieces of information each day. Each Walmart location experiences tens of thousands of transactions every day. Walmart can use this data to get better insight into its business. For example, it can help Walmart's management understand who is buying which products and when. Walmart can then leverage this data to run specific marketing campaigns or redesign its stores.

Let's say that Walmart uses a cloud computing system to help process all of this data. Data from thousands of Walmart locations is routed to just a handful of data centers. This process takes time, can overwhelm the network, and can lead to congestion.

A more efficient method is for Walmart to use a localized data center at each Walmart location. Each localized data center would process data from that specific store. Once actionable insights are pulled from the data, they can be sent to Walmart's main data hub. This allows Walmart to process information much more quickly.
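A hypothetical sketch of that pattern in Python (store data and field names are invented for illustration): the heavy lifting happens at the edge, and only a compact summary travels to the central hub.

```python
from collections import Counter

# Edge pattern: each store reduces its raw transactions to a small summary
# locally, instead of streaming every record to a central data center.
def summarize_store(transactions):
    """Run at the edge: turn raw transactions into actionable counts."""
    by_product = Counter(t["product"] for t in transactions)
    return {"top_sellers": by_product.most_common(3),
            "n_transactions": len(transactions)}

transactions = [{"product": "milk"}, {"product": "bread"}, {"product": "milk"}]
summary = summarize_store(transactions)
# Only `summary` (a few bytes) travels to the main data hub.
print(summary)
```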

Edge computing is a driving force behind technologies like self-driving cars, IoT and artificial intelligence. With that said, let's take a look at how you can invest in this rapidly growing industry.

Here are five of the best edge computing stocks to buy.

NOTE: I'm not a financial advisor and am just offering my own research and commentary. Please do your own due diligence before making any investment decisions.

Nvidia used to be primarily a gaming-card vendor. In recent years, it has transitioned to a leader in artificial intelligence. It has dozens of different business lines for both software and hardware. This includes a range of solutions for edge computing.

Nvidia is considered one of the most versatile edge computing stocks. This is because it sells both hardware and software. In this sense, it operates a little bit like Apple. Apple sells iPhones (hardware) but also owns the App Store (software) where people download apps.

Nvidia's edge computing solutions are part of its EGX Platform. This suite of tools offers hardware that's capable of speech recognition, business forecasting, and immersive graphical experiences. In addition to this hardware, Nvidia also offers Nvidia AI Enterprise, a suite that can be used for rapid deployment, management, and scaling of AI workloads.

In general, there are countless applications of edge computing technology. However, Nvidia outlines five specific use cases on its website. These use cases are for retail, manufacturing, healthcare, telecommunications, and smart cities. To help boost its solutions, Nvidia recently acquired Bright Computing.

In FY 2021, Nvidia reported annual revenue of $16.68 billion. It also reported a net income of $4.33 billion. Nvidias stock was up approximately 98% during 2021. It is also up about 936% over the past five years.

IBM anticipates that edge computing will become a huge market. Right now, only 10% of enterprise data is processed at the edge. By 2025, IBM estimates that this number will jump to 75%. This could be part of the reason that IBM is spinning off its legacy business. This move will allow IBM to focus on more cutting-edge technologies. One area of focus? Edge computing.

IBM plans to split into two companies. The new business, Kyndryl, will take over IBM's infrastructure services business. IBM, on the other hand, will be free to focus on its hybrid cloud platform and artificial intelligence. From an investor perspective, this could help paint IBM in a new light. It's similar to Ford's transition to electric vehicles. Ford has recently put its full weight behind developing EVs. This has caused investors to see the stock more favorably. In 2021, Ford stock rose 150%. IBM could go through a similar turnaround post-spinoff. However, it will probably take several months, if not years, to work through the kinks.

In 2020, IBM reported annual revenue of $73.62 billion. It also reported a net income of $5.59 billion. IBM's stock was up approximately 11% during 2021. However, it is down 15% over the past five years.

Fastly is one of the best pure-play edge computing stocks. I say this because companies like IBM and Nvidia have dozens of other business lines. Granted, they make money from edge computing. However, this revenue stream is just a small slice of their total income. On the other hand, Fastly's entire business is edge computing. In fact, its mission is to provide developers with a secure and programmable edge computing platform.

So far, Fastly is still working towards becoming profitable. However, it has done incredibly well in expanding its enterprise client base. From 2018 to 2020, Fastly added 97 new enterprise clients. During this time, the average spend per client also increased from $536,000 to $782,000. Fastly is increasing both its number of clients and the average spend per client. Enterprise clients are by far the most valuable, and so far Fastly has done a great job of securing them. This could set Fastly up to become very profitable in a hurry.

In 2020, Fastly reported $290.87 million in revenue. It also reported a net loss of $95.93 million. Fastly's stock was down approximately 62% in 2021. However, it is up 33% over the past five years.

Just like Fastly, Cloudflare is one of the best pure-play edge computing stocks. Its mission statement is to help build a better internet. It claims that the modern internet was not built for what it has become. Cloudflare wants to be the foundation that new applications and infrastructure are built on.

Cloudflare has also done an excellent job prioritizing bigger clients. From 2019 to 2021, its number of large customers grew from 451 to 1,260, a compounded annual growth rate of 67%. Cloudflare defines large customers as those that spend over $100,000 in annualized revenue.

In 2020, Cloudflare reported annual revenue of $431.06 million. It also reported a net loss of $119.37 million. Cloudflare's stock was up 25% during 2021. It is up almost 400% since going public in late 2019.

Usually, it's a better idea to invest in industries, not companies. This is especially true in emerging industries. For example, electric vehicles could potentially be the future of transportation. However, in 10 years, which company will be on top? It could be Tesla, Rivian, Lucid, Nio, a legacy automaker, or even a company that doesn't exist yet.

The odds of correctly picking a winning industry are relatively high. However, the odds of picking a winning company are much lower. Due to this, many investors will buy shares of exchange-traded funds (ETFs).

An exchange-traded fund (ETF) is a fund that owns shares of many different companies. This allows you to get exposure to entire industries without putting all your eggs in one basket. There are two good ETFs when it comes to edge computing stocks. These are the Ark Next Generation Internet ETF and the Evolve Innovation Index Fund.

I hope that you've found this article on the five best edge computing stocks to be valuable! As usual, please base all investment decisions on your own due diligence and risk tolerance.

A University of Miami grad, Teddy studied marketing and finance while also playing four years on the football team. He's always had a passion for business and used his experience from a few personal projects to become one of the top-rated business writers on Fiverr.com. When he's not hammering words onto paper, you can find him hammering notes on the piano or traveling to some place random.

Continue reading here:
5 Edge Computing Stocks That Could Win The 2020s - Investment U

Spatiotemporal change analysis of long time series inland water in Sri Lanka based on remote sensing cloud computing | Scientific Reports – Nature.com

Comparison of spectral water index methods

Figure 3 shows the results of the different spectral water index methods. Overlay analysis with the original image and detailed visual inspection showed that AWEIsh had the best extraction performance and could accurately identify the boundaries of water bodies. NDWI, MNDWI, and EWI showed different degrees of omission: NDWI and EWI showed evident omissions in the northwest corner of the image, while MNDWI's omissions were mainly concentrated in the middle of the image. WI misidentified a large amount of non-water area as water, especially in the southeast corner of the image.

Results of water extraction from different spectral water index methods. (a) The original image. The threshold values and extracted water bodies from (b) NDWI, (c) MNDWI, (d) EWI, and (f) AWEIsh methods determined by the OTSU algorithm. (e) The extraction result of WI.
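For reference, the standard definitions of three of these indices, as commonly given in the remote sensing literature, are shown below ($\rho$ denotes surface reflectance in the named band; the paper's exact band choices may differ, and EWI and WI take various published forms, so they are omitted here):

$$\mathrm{NDWI}=\frac{\rho_{\mathrm{green}}-\rho_{\mathrm{NIR}}}{\rho_{\mathrm{green}}+\rho_{\mathrm{NIR}}},\qquad \mathrm{MNDWI}=\frac{\rho_{\mathrm{green}}-\rho_{\mathrm{SWIR1}}}{\rho_{\mathrm{green}}+\rho_{\mathrm{SWIR1}}}$$

$$\mathrm{AWEI_{sh}}=\rho_{\mathrm{blue}}+2.5\,\rho_{\mathrm{green}}-1.5\,(\rho_{\mathrm{NIR}}+\rho_{\mathrm{SWIR1}})-0.25\,\rho_{\mathrm{SWIR2}}$$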

Based on visual interpretation of the water boundary, 100 test samples were selected and the confusion matrix [32] was calculated to obtain the water extraction accuracy from three aspects: commission error, omission error, and overall accuracy (Table 1). As seen from the table, the overall accuracy of AWEIsh was the highest, at 99.14%, with extremely low commission and omission errors. WI had the lowest overall accuracy and the highest commission error, and could not effectively distinguish water bodies from low-reflectivity features. The overall accuracies of NDWI, MNDWI, and EWI were similar. Based on both the visual interpretation and the quantitative analysis, the rapid surface water extraction model built on Google Earth Engine with the AWEIsh index was used for assessing the spatiotemporal changes of water bodies.
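For readers who want a feel for what such a rapid extraction model looks like, here is a minimal sketch using the Earth Engine Python API. The collection ID, dates, point location, and fixed threshold are illustrative assumptions; in the paper the threshold is derived per image with the OTSU algorithm.

```python
import ee

ee.Initialize()  # assumes Earth Engine credentials are already configured

# Median Landsat-8 Collection-2 surface reflectance composite near Sri Lanka
# (illustrative dates and location; the paper's exact inputs may differ).
img = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
       .filterDate("2015-01-01", "2015-12-31")
       .filterBounds(ee.Geometry.Point(80.77, 7.87))
       .median())

# AWEIsh = blue + 2.5*green - 1.5*(NIR + SWIR1) - 0.25*SWIR2
awei = img.expression(
    "B + 2.5 * G - 1.5 * (N + S1) - 0.25 * S2",
    {
        "B": img.select("SR_B2"),   # blue
        "G": img.select("SR_B3"),   # green
        "N": img.select("SR_B5"),   # near-infrared
        "S1": img.select("SR_B6"),  # shortwave-infrared 1
        "S2": img.select("SR_B7"),  # shortwave-infrared 2
    })

# A fixed threshold stands in here for the per-image OTSU step.
water = awei.gt(0)
```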

To understand the inter-annual variation trend and intra-annual variation of reservoir area in the dry zone of Sri Lanka, a time series analysis was conducted with the Maduru Oya Reservoir as the case study area. The Maduru Oya Reservoir is the second-largest reservoir in Sri Lanka, located in the east-central region; it is the main water source for irrigation and drinking, and the area has a high incidence of chronic kidney disease of unknown aetiology (CKDu). Figure 4 shows the inter-annual and intra-annual variations of the Maduru Oya Reservoir area.

Observed area change in the Maduru Oya Reservoir. (a) Inter-annual variation of the Maduru Oya Reservoir area; (b) Intra-annual variation of the Maduru Oya Reservoir area in 2017.

Figure 4 shows that the inter-annual fluctuation of the Maduru Oya Reservoir area is slight, while the intra-annual fluctuation is significant. From 1988 to 2018, the reservoir area showed an overall increasing trend with slight fluctuation; the smallest area was recorded in 1992 (27.43 km2) and the largest in 2013 (42.97 km2) (Fig. 4a). The rainy season in the dry zone of Sri Lanka runs from October to February, and the dry season from March to September. In 2017, the maximum area of the Maduru Oya Reservoir was recorded in February and the minimum in September; the February area was 2.24 times that of September, a difference of 31.58 km2. The maximum area of reservoirs or lakes generally occurs at the end of the wet season (February), and the minimum at the end of the dry season (September) [2], which is consistent with the timing of the maximum and minimum areas of the Maduru Oya Reservoir in 2017 (Fig. 4b). The reservoir area increased significantly in May, during the dry season. According to meteorological data [33], there were persistent strong winds and torrential rains in Sri Lanka in May 2017, resulting in an abnormal increase in the reservoir area. Generally, setting aside abnormal weather, the area increased from October to February (rainy season) and decreased from March to September (dry season). The intra-annual fluctuation of the reservoir was severe, with simultaneous risks of drought and flooding, implying that seasonal regulation of water resources must be a focus in the future.

To systematically analyze the spatiotemporal variation characteristics of inland water in Sri Lanka in recent years, and considering the cloud cover of Landsat-5/8 images, 1995, 2005, and 2015 were selected as the study years, at an interval of 10 years. The distribution of surface water in the three periods was obtained by running the rapid surface water extraction model in Google Earth Engine. The surface water areas of Sri Lanka in 1995, 2005, and 2015 were 1654.18 km2, 1964.86 km2, and 2136.81 km2, respectively; over the past 20 years, the water area of Sri Lanka has increased significantly. To further analyse the spatiotemporal changes of inland lakes and reservoirs, a 5-m buffer of the 2015 river network was produced in ArcGIS 10.3, and the areas corresponding to river channels were removed from the three images, preserving the lake, reservoir, and lagoon areas. Lagoons are ubiquitous in the coastal areas of Sri Lanka, serving flood discharge, aquaculture, coastal protection, and other functions [34]. The results consisting of the extracted lakes, reservoirs, and lagoons are shown in Fig. 5.

Water extraction results for Sri Lanka in 1995, 2005, and 2015. The administrative boundary data of Sri Lanka comes from the Humanitarian Data Exchange (HDX) open platform (https://data.humdata.org). The maps were generated by geospatial analysis of ArcGIS software (version ArcGIS 10.3; http://www.esri.com/software/arcgis/arcgis-for-desktop).

The overall water area of lakes and reservoirs in Sri Lanka showed an increasing trend from 1995 to 2015, and the lagoon area also increased over these 20 years (Fig. 5). Because lagoons are not inland freshwater sensu stricto, they were excluded from the following statistical analysis. The total areas of lakes and reservoirs in Sri Lanka were 1020.41 km2, 1270.53 km2, and 1417.68 km2 in 1995, 2005, and 2015, respectively. Over the past 20 years, the area of lakes and reservoirs in Sri Lanka has increased by a considerable margin, 397.27 km2. To further analyse the spatiotemporal variation of inland lakes and reservoirs, they were divided into four grades according to area: I (<0.1 km2), II (0.1–1 km2), III (1–5 km2), and IV (>5 km2). The number and area of each grade of lakes and reservoirs for each year are shown in Fig. 6.

Number and area of lakes and reservoirs in Sri Lanka. (a) The number of lakes and reservoirs in 1995, 2005, and 2015; (b) Changes in lake and reservoir area in 1995, 2005, and 2015.

Figure 6a shows the number of lakes and reservoirs in the four grades, which increased from 1995 to 2015; the lower the grade, the greater the increase observed. The number of I-grade lakes and reservoirs increased most significantly, while the IV-grade count increased by only 11. Among the newly added IV-grade lakes and reservoirs, seven were transformed from other lakes and reservoirs, and four were newly built large reservoirs: the Rambukkam Oya, the Weheragala, the Daduru Oya, and the Mau Ara reservoirs. From 1995 to 2015, the areas of all four grades of lakes and reservoirs showed an increasing trend, with the area of IV-grade lakes and reservoirs increasing most significantly, by a total of 197.36 km2 (Fig. 6b). The higher the grade, the larger the total area: in 2015, the total area of IV-grade lakes and reservoirs was 760.53 km2, accounting for 54% of the total area across the four grades.

Figure 7 shows the statistical results for the number and area of lakes and reservoirs in each province of Sri Lanka. From 1995 to 2015, the increases in the number and area of lakes and reservoirs were mainly concentrated in the dry zone, in the Northern, North Central, Eastern, Sabaragamuwa, Uva, and Central provinces. The number and area of lakes and reservoirs in the Southern Province remained unchanged, whereas those in the North Western and Western provinces decreased slightly. Nisansala et al. reported that the eastern, south-eastern, northern, and north-central regions of the country experienced increasing rainfall trends from 1987 to 2017, while the western regions and parts of the north-western and central regions displayed a decreasing rainfall trend during the same period [35]. In recent years, Sri Lanka has built a large number of new water conservancy facilities to support agricultural irrigation, aquaculture, and local economic development, which can regulate water distribution between the wet and dry seasons [36]. Therefore, in the provinces where the number and area of lakes and reservoirs decreased, the primary reason was reduced local rainfall; in the provinces where they increased, the increase was mainly due to increased local rainfall and the construction of water conservancy facilities. In general, although the number and area of lakes and reservoirs in the four grades differed, the amount of available surface water in Sri Lanka's lakes and reservoirs showed an increasing trend.

Number and area of lakes and reservoirs in each province in Sri Lanka. (a) The number of lakes and reservoirs in 1995, 2005 and 2015 in each province. (b) Changes in lakes and reservoirs area in 1995, 2005 and 2015 in each province.

Read more from the original source:
Spatiotemporal change analysis of long time series inland water in Sri Lanka based on remote sensing cloud computing | Scientific Reports - Nature.com

DigitalOcean: The Decentralized Future Of Technological Innovation – Seeking Alpha

DigitalOcean (NYSE:DOCN) makes cloud computing less complex. The company describes itself as a cloud infrastructure provider; in reality, it is decentralizing cloud computing power to a variety of developers, small and large. There are multi-billion dollar companies like GitLab (NASDAQ:GTLB) that use DigitalOcean, and small startups like the autonomous driving startup Ghost. Overall, I like the future of the company and think there is a wide variety of customers it can serve across the world. With increased investor confidence and further growth, DigitalOcean can become a stronger international player and offer the lowest-cost cloud computing access for developers everywhere. The quest of the company, in my opinion, is a noble one, and I have no problem supporting it. By decentralizing cloud computing, the company is helping usher in a new standard of technology offered by companies to consumers. Overall, the value of its product portfolio should stimulate growth for the foreseeable future.

Management's growth-at-low-cost mentality has been a low-risk endeavor. The company has successfully positioned its product portfolio to become the lowest-cost cloud computing service for small development teams everywhere, with hosting options ranging from database hosting to scalable virtual machines. The variety of products is a differentiator compared to Azure (NASDAQ:MSFT) and Amazon Web Services (NASDAQ:AMZN). Overall, there is still plenty of opportunity for disruptors in the market.

Source: (DigitalOcean Q3 2021 Earnings Presentation)

There is rapidly growing demand for cloud computing. Every day, developers are required to engage with increasingly complex problems, because programming is such a rapidly changing field. The company's market opportunity is huge, and moving forward I would look for DigitalOcean's market capitalization to grow with the massive market share opportunity the company has. With a 27% yearly CAGR, it is easy to see the opportunity players in the cloud computing space have to generate long-term profits.

Source: (DigitalOcean Q3 2021 Earnings Presentation)

The accelerating top-line growth should continue as the cloud-computing-as-a-service industry grows. In addition, the variety of product offerings DigitalOcean has within its cloud computing portfolio is winning back customers that had been lost to competitors due to complexity issues. I believe the recent setbacks in the company and the stock have created a great buying opportunity for investors.

Source: (DigitalOcean Q3 2021 Earnings Presentation)

There are a variety of fundamental performance factors that have not been priced into the stock. Scaling ARR has improved by 36% YoY, a sign of a healthy young enterprise. Currently, the company is scaling faster than the expected industry CAGR, but I think that's fine. Analysts don't necessarily know what the cloud-computing-as-a-service industry will look like, and DigitalOcean has the potential to roll out more product offerings for consumers. There is long-term growth in DigitalOcean's consumer base, and the financial metrics, along with the operational metrics, paint a strong picture of future growth.

DigitalOcean is a high-margin growth company with a solid earnings base. The company was profitable even before the IPO and remains so. I believe earnings can accelerate due to strong top-line growth. Overall, profitability is not a problem for DigitalOcean, unlike so many other software companies. At today's $8 billion valuation, DigitalOcean is undervalued relative to its future prospects.

Source: (DigitalOcean Q3 2021 Earnings Presentation)

There has been strong scalability alongside profitability: even though the company is scaling at a rapid pace, its profitability has only increased. This is due to DigitalOcean's high-margin, low-cost business model and the SaaS ARR model it uses to scale relationships with customers across its full technology stack.

Source: (DigitalOcean Q3 2021 Earnings Presentation)

An important part of maintaining customers is ensuring their success. Customers have been staying, and DigitalOcean has done a great job of increasing its rates while maintaining customer satisfaction. The company remains the lowest-cost solution for cloud computing services and will remain so for some time. Overall, the digital infrastructure the company provides is essential for many startup and corporate developers alike.

There are risks due to the macroeconomic picture over the next 12 months. With the now-rumored four rate hikes, there could be a large rotation out of technology stocks that persists for some time. Tech stocks that generate consistent profits, on the other hand, will be rewarded, and I believe DigitalOcean is a great small-cap way for investors to balance a mega-cap, tech-heavy portfolio with a higher-growth option. Overall, the opportunities for DigitalOcean are massive, and I believe the company could change technology entrepreneurship for the better. By decentralizing cloud services, DigitalOcean offers developers around the world access to cloud computing power that had previously been accessible only to major technology institutions. I believe DigitalOcean could be the start of a larger trend in technology development that challenges the existing giants that oligopolize the technology stack available to developers.

As mentioned, the four rate hikes could create a volatile valuation picture in 2022. However, I am of the opinion that DigitalOcean can work past this and actually have a successful 2022. With its improving operations and rapidly improving EPS, the valuation picture could naturally improve. I believe the organic growth funded by the high debt that companies have been allowed to take out will pay off. Inflation will persist until the Fed gets interest rates under control. However, certain companies will surely not become less valuable due to the rate hikes. This is the opportunity for technology investors to take advantage of the rotation out of growth and pick up stocks at low prices. For my comparison, I used two data center services providers. First, Wix.com (NASDAQ:WIX) helps small business owners build websites to promote and scale their businesses; the variety of offerings Wix has gives small business owners the opportunity to own their own storage at a data center for an ARR cost. GDS Holdings Limited (NASDAQ:GDS) is a leading Chinese operator of data centers, primarily for businesses. Both of these companies are valued at under $10 billion, so they experience similar price fluctuations as investors move in and out of the profitable small-cap tech sector.

Source: (DigitalOcean Seeking Alpha Total Return Peer Comparison)

Overall, DigitalOcean produced a higher return than both of these competitors during 2021. This is due to the strong demand for DigitalOcean's products, driven by their low cost and scalability. Even though momentum has faded due to a rotation into value, DigitalOcean should hold up over the medium to long term.

Source: (DigitalOcean Seeking Alpha EV/EBITDA Peer Comparison)

Even though DigitalOcean has been on a massive run, its valuation hasn't increased that much, thanks to the company's strong sales numbers, which have been driving stock price growth on top of the business's higher margins. This is a great sign for the stock moving forward, even in the face of macro pressure. DigitalOcean is one of my favorite small-cap stocks for 2022.

Investors can't go wrong with DigitalOcean: strong earnings with a relatively low valuation. Most developers I talk to know about DigitalOcean and are strong believers in its products. The company, on a fundamental basis, checks out. I am long DigitalOcean and rate the company as very bullish. I look forward to analyzing the company's reports moving forward.

Read more:
DigitalOcean: The Decentralized Future Of Technological Innovation - Seeking Alpha