Category Archives: Cloud Computing

Morningstar warns about thematic tech investing, popularized recently by Cathie Wood – CNBC

Thematic tech investing is exploding in popularity. Everyone wants to invest in funds that own clean energy, cybersecurity, electric vehicles, e-sports, robotics, 3D printing and cloud computing.

But watch out, Morningstar warns: Most of these funds do not outperform the broader market; they carry high fees, suffer high failure rates, and run a concentration risk that could cause stocks to tank should investors suddenly flee the funds.

A bull market and easy-to-understand themes have created an explosion of interest in thematic investing as people seek to put money into narrow areas of the investing universe, most of them technology-themed.

In the U.S. alone, thematic funds have ballooned to more than $160 billion in assets by the end of March, from about $49 billion at the end of December 2019, according to Morningstar.

But a small group of funds is gathering the lion's share of the business. While 41 thematic funds now have assets over $1 billion, the top 10 largest account for just under half of all the assets.

While most thematic funds are tied to indexes, some are now actively managed. Cathie Wood's ARK funds, a series of actively managed funds with a small number of concentrated bets, have attracted massive inflows.

Not surprisingly, investors tend to rush into thematic funds, particularly thematic tech, during bull markets. Launches set records in 2016, 2018 and 2019, according to Morningstar. In 2019, a particularly strong year for stocks (the S&P 500 was up 29%), providers launched almost 40 new thematic ETFs. In 2020, also a strong year, a record 55 new thematic strategies came to market.

So does thematic investing pay off? The answer, for the most part, is no.

While thematic tech investing did well during 2020, the record is much poorer over longer periods.

Thematic investing by its nature is prone to investors chasing hot themes: clean energy, cloud computing, electric vehicles, battery technology, virtual reality, gaming, robotics, artificial intelligence, cybersecurity. These themes come in and out of popularity. Thematic investors are making a bet that they have picked a winning theme, but because thematic funds pick narrow, often volatile parts of the market that fall in and out of favor, they suffer from high failure rates: nearly one-third of all thematic funds have closed in the last 10 years and 34% have underperformed the broader equity market.

A major contributor to thematic funds' underperformance has been high fees. Thematic funds, many of which are actively managed, charge much higher fees than broad market funds and even most ETFs.

By comparison, the iShares S&P 500 ETF, a passively managed fund, charges an annual fee of only 0.03%.

"High fees charged by thematic funds have contributed to their relatively poor performance versus broad market indexes over longer periods," Morningstar said.

Morningstar also noted that a small number of actively managed funds have accumulated very large positions in many small-cap stocks, particularly Wood's Ark Innovation fund.

This concentrated ownership means that the stock prices may be subject to the whims of investors in the fund: "The liquidity promised by the ETF structure means investments can be pulled out on a whim," Morningstar noted.

Thematic investing is fun and easy to understand. I'm buying cybersecurity! I own fintech! I love clean energy!

If you like the theme but don't have the resources to investigate individual companies, thematic investing makes sense.

But Morningstar's report is a flag for investors: Don't kid yourself. You have not found the Secret of Stock Market Wealth.

"Some of the same individuals will likely get burnt when the fortunes of the more popular themes inevitably wane and investors swarm for the exits," Morningstar warns. "As timely as they may seem now, some themes will age poorly. Investors must ask themselves:Will that work-from-home ETF still be relevant in three years' time?"

What's the average investor to do? "Because of their narrower exposure and higher risk profile, thematic funds are best used to complement rather than replace existing core holdings," said Kenneth Lamont, senior research analyst at Morningstar and the author of the report.

Read more:
Morningstar warns about thematic tech investing, popularized recently by Cathie Wood - CNBC

Global Healthcare Cloud Computing Market 2021 Impact of COVID-19 on Industry Revenue, Growth and Forecast Period 2028 – The Courier

The latest Healthcare Cloud Computing Market research report, covering 2021-2028, has been published by Market Research Inc. The report provides a comprehensive analysis of the market's current and future trends, opportunities, size, share, status and revenue growth. It first introduces the Healthcare Cloud Computing Market basics: definitions, classifications, applications, market overview, product specifications, manufacturing processes, cost structures, raw materials and so on. The main objectives of this research study include an overview of Healthcare Cloud Computing market strategies, SWOT analysis, technical innovation, market competition, goods and services, and government policies and regulation.

Request to Get the PDF Sample of the Healthcare Cloud Computing Market Report: @ https://www.marketresearchinc.com/request-sample.php?id=16263

The Healthcare Cloud Computing Market is growing at a high CAGR during the forecast period 2021-2028. Increasing interest in this industry is the major reason for the expansion of this market. The report provides a detailed overview of the industry, including both qualitative and quantitative information. It provides an overview and forecast of the Global Healthcare Cloud Computing Market based on various segments, along with market size and forecast estimates from 2021 to 2028.

Healthcare Cloud Computing Market Segmented by Major Key Players, Major Type, Major Application and Regional Outlook

Key Manufacturers/Players:

Major Types

Major Application

Regional Outlook:

The report offers an in-depth assessment of the growth and other aspects of the Healthcare Cloud Computing market in important regions. Key regions covered in the report are North America (United States, Canada and Mexico), Europe (Germany, France, United Kingdom, Russia and Italy), Asia-Pacific (China, Japan, Korea, India, Southeast Asia and Australia), South America (Brazil, Argentina), and the Middle East & Africa (Saudi Arabia, UAE, Egypt and South Africa).

Request a Discount on the Healthcare Cloud Computing market Report: @ https://www.marketresearchinc.com/ask-for-discount.php?id=16263

COVID-19 Impact on Each Industry:

The COVID-19 pandemic is first and foremost a health and humanitarian crisis, and businesses are adjusting rapidly. The report clarifies the level of COVID-19 impact on each segment within the scope of the report, along with its trend over the forecast period. The pandemic has led to a significant slowdown in production and manufacturing across various sectors, and strict lockdowns imposed by several governments have led to temporary shutdowns of small as well as major market players. Our risk services identify actual and potential threats around the world and help clients understand the implications for their organizations. Five-year forecasts, analysis of key topics and news analysis for key industries in major economies are included; these forecasts are based on the most recent data and in-depth analysis of industry trends.

Reason to Buy

Ask any Query or Customization about this Report: @ https://www.marketresearchinc.com/enquiry-before-buying.php?id=16263

The report provides insights on the following pointers:

Note: If you have any special requirement, please let us know and we will offer you the report as you want it.

For a better understanding of the market, the report uses effective graphical presentation techniques such as graphs, charts, tables and pictures. These help industry players gain better insights for improving the performance of their companies and products.

Table of Content:

1.1 Study Scope

1.2 Key Market Segments

1.3 Players Covered

1.4 Market Analysis by Type

1.5 Market by Application

1.6 Study Objectives

1.7 Years Considered

2.1 Market Size

2.2 Growth Trends by Regions

2.3 Industry Trends

3.1 Market Size by Manufacturers

3.2 Key Players Head office and Area Served

3.3 Key Players Product/Solution/Service

3.4 Date of Entry into the Market

3.5 Mergers & Acquisitions, Expansion Plans

4.1 Sales by Product

4.2 Revenue by Product

4.3 Price by Product

5.1 Overview

5.2 Breakdown Data by End User

About Us

Market Research Inc is farsighted in its view and covers massive ground in global research. Local or global, we keep a close check on both markets. Trends and concurrent assessments sometimes overlap and influence each other. When we say market intelligence, we mean a deep and well-informed insight into your products, market, marketing, competitors, and customers. Market research companies are leading the way in nurturing global thought leadership. We help your product or service become the best it can be with our informed approach.

Contact Us

Market Research Inc

Author: Kevin

US Address: 51 Yerba Buena Lane, Ground Suite,

Inner Sunset San Francisco, CA 94103, USA

Call Us: +1 (628) 225-1818

Write Us: sales@marketresearchinc.com

Website: https://www.marketresearchinc.com

Read the rest here:
Global Healthcare Cloud Computing Market 2021 Impact of COVID-19 on Industry Revenue, Growth and Forecast Period 2028 - The Courier

Investors Should Check Out This Cloud Computing Stock That’s Down More Than 25% – The Motley Fool

As corporate computing infrastructures become more complex, it's important for information technology teams to keep tabs on how their tech stack is performing. Datadog (NASDAQ:DDOG) was founded to make this easier by helping its customers observe and monitor all facets of their network and every application users need. The stock has more than doubled since the company went public in September 2019, but shares have pulled back with the tech sell-off in the market. Even with shares off double digits from their recent high, Fool contributor Brian Withers explains why this cloud specialist is worth a look on a Fool Live episode that was recorded on May 13.

Brian Withers: I'm going to talk about Datadog. I really like this company. It's set up for the future in the cloud. If you don't know what Datadog does, it has a set of what it calls observability tools, which allow information technology teams to observe or look into their applications, their network, their logs. If you're not familiar with software, it creates a bunch of logs, which are just dumps of huge amounts of data. What Datadog does is pull all the stuff together in a semblance where you can look at it all on one pane of glass.
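For readers unfamiliar with the idea, here is a toy Python sketch of that "single pane of glass" concept: merging log lines from several sources into one time-ordered view. Real observability platforms such as Datadog do far more (metrics, traces, dashboards); the data below is invented purely for illustration.

```python
# Toy log aggregation: merge entries from several sources into one timeline.
from datetime import datetime

app_logs = [("2021-05-13T10:00:02", "app", "checkout failed: timeout")]
db_logs = [("2021-05-13T10:00:01", "db", "slow query: 2100 ms")]
lb_logs = [("2021-05-13T10:00:03", "lb", "upstream 504 returned to client")]

merged = sorted(app_logs + db_logs + lb_logs,
                key=lambda entry: datetime.fromisoformat(entry[0]))

for timestamp, source, message in merged:
    print(f"{timestamp} [{source:>3}] {message}")
# One ordered view across sources makes the cause-and-effect chain visible.
```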

What happens over time is networks and applications are getting more complicated. Companies may have stuff that they host on-premises in their own data center that has been around for a while, some Oracle. I remember when I was in corporate, we had an Oracle instance to run our manufacturing business, and that was hosted on site. But then there's all these cloud platforms that are hosted somewhere else, on Azure, AWS, whatnot. More and more companies are getting into this hybrid environment, where it's almost like users can be anywhere and the software can be anywhere. It's really important for companies that depend on their websites being available for customers, which is just about everybody nowadays.

Datadog is really helpful in pulling things together. In fact, they shared one customer this past quarter who took the eight different observability tools they were using and consolidated them down to just Datadog.

Some of the numbers are just really impressive if you look at the most recent quarter: 51% top-line revenue growth. I really like how they're having customers land and then expand. They have a bunch of different products. Customers spending $100,000 or more make up a majority of their annual recurring revenue, about 75%, and that's growing at a healthy 50% a year. They have about 1,400 customers that are spending more than $100,000 a year.

Their remaining performance obligations, the sum of all the contracts together, and how long they are, and the monthly fees, and whatnot, that's grown 81% year over year. To me, what that says is, since that's growing faster than revenue, customers are signing up for bigger contracts and potentially longer contracts. That really bodes well for the future of this company.
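A quick back-of-the-envelope sketch shows why RPO growing faster than revenue points to bigger or longer contracts. Only the 81% and 51% growth rates come from the transcript; the index values below are hypothetical.

```python
# If remaining performance obligations (RPO) grow faster than recognized
# revenue, the contracted backlog is getting larger relative to revenue,
# suggesting bigger and/or longer customer commitments.
rev_prev, rpo_prev = 100.0, 100.0          # index both to 100 a year ago (hypothetical)
rev_now = rev_prev * 1.51                  # revenue up 51% year over year
rpo_now = rpo_prev * 1.81                  # RPO up 81% year over year

coverage_prev = rpo_prev / rev_prev        # backlog-to-revenue ratio then
coverage_now = rpo_now / rev_now           # backlog-to-revenue ratio now

print(f"Backlog-to-revenue ratio a year ago: {coverage_prev:.2f}")
print(f"Backlog-to-revenue ratio now:        {coverage_now:.2f}")  # ~1.20
# A roughly 20% rise in the ratio implies customers are signing up for
# larger and/or longer contracts than Datadog currently recognizes.
```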

I talked about the products being a land-and-expand model. At the end of the first quarter, 75% of all their customers were using more than one product, which is up from 63% last year, and those using four or more doubled to 25%.

This company has got an addressable market of about $35 billion that it shared in its filing to go public. But this monitoring and observability stuff is really just getting started.

To me, I'm still really super positive about this company. The only thing that's changed for me is the stock price. If you haven't taken a look at this one, maybe it's time you did.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Continued here:
Investors Should Check Out This Cloud Computing Stock That's Down More Than 25% - The Motley Fool

Going to the Moon via the Cloud – The New York Times

Before the widespread availability of this kind of computing, organizations built expensive prototypes to test their designs. "We actually went and built a full-scale prototype, and ran it to the end of life before we deployed it in the field," said Brandon Haugh, a core-design engineer, referring to a nuclear reactor he worked on with the U.S. Navy. "That was a 20-year, multibillion-dollar test."

Today, Mr. Haugh is the director of modeling and simulation at the California-based nuclear engineering start-up Kairos Power, where he hones the design for affordable and safe reactors that Kairos hopes will help speed the world's transition to clean energy.

Nuclear energy has long been regarded as one of the best options for zero-carbon electricity production, except for its prohibitive cost. But Kairos Power's advanced reactors are being designed to produce power at costs that are competitive with natural gas.

"The democratization of high-performance computing has now come all the way down to the start-up, enabling companies like ours to rapidly iterate and move from concept to field deployment in record time," Mr. Haugh said.

But high-performance computing in the cloud also has created new challenges.

In the last few years, there has been a proliferation of custom computer chips purposely built for specific types of mathematical problems. Similarly, there are now different types of memory and networking configurations within high-performance computing. And the different cloud providers have different specializations; one may be better at computational fluid dynamics while another is better at structural analysis.

The challenge, then, is picking the right configuration and getting the capacity when you need it, because demand has risen sharply. And while scientists and engineers are experts in their domains, they aren't necessarily experts in server configurations, processors and the like.

This has given rise to a new kind of specialization: experts in high-performance cloud computing, and new cross-cloud platforms that act as one-stop shops where companies can pick the right combination of software and hardware. Rescale, which works closely with all the major cloud providers, is the dominant company in this field. It matches the computing problems of businesses like Firefly and Kairos with the right cloud provider to deliver computing that scientists and engineers can use to solve problems faster or at the lowest possible cost.
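Conceptually, the matching problem looks something like the sketch below: score each provider configuration against a workload's needs and pick the best fit. Rescale's actual matching logic is not public, so all names, categories and weights here are invented for illustration.

```python
# Illustrative workload-to-provider matching via a weighted fit score.
WORKLOAD = {"fluid_dynamics": 0.7, "structural": 0.1, "memory_bound": 0.2}

PROVIDERS = {
    "provider_a": {"fluid_dynamics": 0.9, "structural": 0.4, "memory_bound": 0.5},
    "provider_b": {"fluid_dynamics": 0.5, "structural": 0.9, "memory_bound": 0.6},
    "provider_c": {"fluid_dynamics": 0.6, "structural": 0.6, "memory_bound": 0.9},
}

def score(provider_strengths, workload):
    # Weighted sum: how well a provider's strengths line up with the
    # workload's actual needs.
    return sum(workload[k] * provider_strengths.get(k, 0.0) for k in workload)

best = max(PROVIDERS, key=lambda name: score(PROVIDERS[name], WORKLOAD))
for name, strengths in PROVIDERS.items():
    print(f"{name}: fit score {score(strengths, WORKLOAD):.2f}")
print(f"Best match for this workload: {best}")
```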

Read the rest here:
Going to the Moon via the Cloud - The New York Times

Cadence Unleashes Clarity 3D Solver on the Cloud for Straightforward, Secure and Scalable Electromagnetic Analysis of Complex Systems on AWS -…

SAN JOSE, Calif.--(BUSINESS WIRE)--Cadence Design Systems, Inc. (Nasdaq: CDNS) today announced a straightforward, secure and cost-effective approach to gaining access to compute resources in the cloud when executing 3D electromagnetic (EM) simulations with Cadence Clarity 3D Solver Cloud. Clarity 3D Solver Cloud provides the ability to scale 3D finite element method (FEM) simulation capacity from 32 cores to thousands of cores using secure connections to Amazon Web Services (AWS).

This inventive hybrid approach gives users the option to simulate using local compute resources or cloud simulation resources without having to add to either on-premises or cloud computing budgets. Cloud simulations keep design data safe on local computers while sending only encrypted simulation-specific data to the cloud. Clarity 3D Solver Cloud automatically sets up and simulates in a private and secure AWS chamber, accessing the user-defined number of compute cores. Simulation results are returned to the local on-premises computer(s) while the data in the secure AWS chamber is immediately deleted. This ensures 3D EM data is safe, protected and only discoverable on the customer's local device.
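The hybrid flow described above can be pictured with a short conceptual sketch. This is not Cadence's implementation or API; every class and function below is a hypothetical stand-in, and the point is only the sequence: encrypt locally, solve in an isolated cloud chamber, pull the results back, then wipe the cloud copy.

```python
# Conceptual model of a hybrid on-prem/cloud simulation job. All names
# are illustrative placeholders, not Cadence or AWS APIs.

class SecureCloudChamber:
    """Hypothetical isolated compute session with a user-defined core count."""

    def __init__(self, cores: int):
        self.cores = cores
        self._data = None

    def run(self, encrypted_payload: bytes) -> bytes:
        self._data = encrypted_payload
        # Placeholder for the 3D FEM solve; here we simply echo the payload.
        return encrypted_payload

    def delete_all_data(self) -> None:
        self._data = None              # nothing persists after the job


def encrypt(data: bytes) -> bytes:     # stand-in for real encryption
    return data[::-1]

def decrypt(data: bytes) -> bytes:
    return data[::-1]

def run_hybrid_em_job(simulation_data: bytes, cores: int = 32) -> bytes:
    chamber = SecureCloudChamber(cores)          # design files never leave the premises
    try:
        encrypted_results = chamber.run(encrypt(simulation_data))
        return decrypt(encrypted_results)        # results return to the local machine
    finally:
        chamber.delete_all_data()                # cloud-side data is wiped immediately

print(run_hybrid_em_job(b"mesh-and-ports"))
```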

"Cadence's approach to accelerating Clarity 3D Solver simulations on AWS allows customers to leverage high-performance cloud resources and achieve faster turnaround time," said David Pellerin, Head of Worldwide Industry Business Development, Semiconductors, AWS. "We are delighted to be working with Cadence on Clarity Cloud and on future innovations that support our shared customers in their transition to cloud."

As "More than Moore" drives electronic systems that are higher in frequency and speed, the demand for 3D EM simulation also grows. With design complexity increasing, complicated by ever-faster data transfer speeds and frequency specs such as DDR5, 112G and 5G NR, Clarity 3D Solver Cloud closes the gap between simulation needs and available time and compute resources.

With Clarity 3D Solver Cloud, EM simulations on entire complex systems are now possible. Instead of purchasing additional on-premises compute resources to run more and more EM simulations, Cadence's innovative business model provides customers with 24-hour-per-day access to the cloud. Powered by the production-proven Cadence CloudBurst Platform, Clarity 3D Solver Cloud allows the number of compute cores to be controlled by the design engineer. When additional compute nodes are needed beyond what is available locally, flexible terms are available for adding capacity to accelerate simulation time during peak usage or to decrease time-to-simulation results when simulating larger, more complex structures. The goal is to empower customers to decrease EM simulation times while expanding EM design analysis.

"We continue to innovate with our system-level analysis solutions to provide unprecedented speed and accuracy," said Ben Gu, vice president of multiphysics system analysis in the Custom IC & PCB Group at Cadence. "Clarity 3D Solver Cloud delivers a flexible, scalable solution for our customers so they can solve large, complex and leading-edge EM designs on demand, without waiting for compute resources to become available. The hybrid cloud approach keeps design data secure on-premises, provides a straightforward and familiar user experience for our Clarity 3D Solver customers, and most importantly, accelerates EM results using state-of-the-art AWS compute resources to generate 3D FEM electromagnetic analysis results."

Endorsements

"Simulation to ensure first-time-right designs is a prerequisite for success in a high-speed interconnect system, and the ability to rapidly deliver these advanced technology services enhances Sanmina's solution offerings to our leading-edge customers. The Clarity 3D Solver provides unsurpassed speed and capacity, and now, with the new cloud solution, we can quickly respond to demanding simulation needs without front-end-loaded capital expenses and delays from procurement of compute resources. In addition, the convenience of running large simulations from our notebook computers allows us to optimize IT budgets without impacting engineering productivity." - Drew Doblar, VP of Engineering, Sanmina Corporation

"Inventec serves our customers with excellence because of our innovation, quality, open mind and execution. Fast and accurate simulation of our designs used in cloud computing, wireless communication, intelligent devices and IoT ensures we can deliver working prototypes in the shortest possible time. Cadence has long been our trusted partner to provide simulation solutions that tightly integrate with our design tools to help us rapidly find and fix design problems. We utilize Clarity 3D Solver with on-premises hardware, but with the new Clarity 3D Solver Cloud, we have access to unlimited compute resources with zero wait time, allowing us to more quickly optimize our designs. This reduces our turnaround time and enables us to produce more robust products with fewer design iterations." - Steven Ting, Director at Inventec

"At Enflame Technology, our AI chips need to be delivered in reliable hardware to serve our downstream partners. We cannot afford multiple design respins, so we trust Cadence to provide us with integrated design and analysis solutions to enable us to streamline system-level design. The Clarity 3D Solver has provided unparalleled speed and capacity with proven accuracy. We utilize Clarity 3D Solver Cloud to break the bottleneck during peak demand simulation times as it gives us access to unlimited compute resources, allowing us to take full advantage of the scalability of Clarity technology to enable true 3D simulation results. This makes our design cycles predictable and increases company-wide productivity." - Arthur Zhang, Founder and COO of Enflame Technology

Availability

The Clarity 3D Solver Cloud technology supports Cadence's Intelligent System Design strategy, enabling system innovation. Clarity 3D Solver Cloud is available today. Customers can learn more at http://www.cadence.com/go/Claritycloud as well as register for the on-demand EM Forum event where Clarity 3D Solver Cloud will be discussed in detail. Register at http://www.cadence.com/go/ClarityWebinar.

About Cadence

Cadence is a pivotal leader in electronic design, building upon more than 30 years of computational software expertise. The company applies its underlying Intelligent System Design strategy to deliver software, hardware and IP that turn design concepts into reality. Cadence customers are the world's most innovative companies, delivering extraordinary electronic products from chips to boards to systems for the most dynamic market applications, including consumer, hyperscale computing, 5G communications, automotive, mobile, aerospace, industrial and healthcare. For seven years in a row, Fortune magazine has named Cadence one of the 100 Best Companies to Work For. Learn more at cadence.com.

© 2021 Cadence Design Systems, Inc. All rights reserved worldwide. Cadence, the Cadence logo and the other Cadence marks found at http://www.cadence.com/go/trademarks are trademarks or registered trademarks of Cadence Design Systems, Inc. All other trademarks are the property of their respective owners.

Read more from the original source:
Cadence Unleashes Clarity 3D Solver on the Cloud for Straightforward, Secure and Scalable Electromagnetic Analysis of Complex Systems on AWS -...

Conduct most becoming – Europe's new Cloud Code of Conduct shines a light on trust and transparency between buyers and sellers – Diginomica

A lighthouse project that will shine a light to guide both users and providers towards more trustworthy and transparent adoption of cloud computing in Europe and beyond.

That's the bold claim made for the new EU Cloud Code of Conduct (CoC), the result of over four years of collaboration between the European Commission and suppliers from the cloud computing industry. Its stated mission is to make it easier for customers to determine whether cloud services from various providers are suitable for their designated purposes, as well as creating an environment of trust that adherence to its terms will result in a default level of data protection, built, of course, around GDPR (General Data Protection Regulation).

This is a very important moment for the industry, argues Agnieszka Bruyere, VP, IBM Cloud EMEA:

Up to now, we have been missing a reliable, easy tool to assess the compliance of cloud computing services with data protection regulations. This created an uncertainty for users and also cloud providers that slowed down the adoption of cloud computing all over Europe. This period has now come to an end.

This is the first tool in Europe that allows us to demonstrate not only compliance, but also to bring proof of compliance for cloud users and cloud providers all over the Europe. It's also very important because for the first time we have an independent monitoring body that has been accredited.

So what's actually in the Code? As per the CoC's own descriptor, the Cloud Code of Conduct:

The Code is a voluntary instrument and service providers demonstrate compliance through self-evaluation and self-declaration and/or through third-party certification, attaining one of three levels of approval.

Providers have to:

convincingly explain how the requirements of the Code are met. The Monitoring Body will refer to questionnaires. The first set of questions is a derivative of the Controls Catalogue. Depending on the information provided, there will be follow-up questions or requests; questions are mostly related to better understanding the actual measures, while requests are mostly related to further evidence and samples. In case the provided information leaves doubts about a CSP's compliance, requests may also relate to particular remedies and/or confirmations.

Once a cloud service has been verified as compliant, it will be listed in the Public Register where customers can view the results. To date, a number of vendors have gone through or are in the process of going through the certification procedures. For example, SAP, Google, Workday and Microsoft already boast some level of compliance, while Oracle and Salesforce are completing their respective certifications.
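As a rough mental model of the adherence scheme, a register entry could be represented like the sketch below. The field names and level labels are assumptions for illustration only; they are not the Code's own schema.

```python
# Illustrative data model of a CoC public-register entry: a provider,
# a verification route, and one of three compliance levels.
from dataclasses import dataclass
from enum import Enum

class Verification(Enum):
    SELF_DECLARATION = "self-evaluation and self-declaration"
    THIRD_PARTY = "third-party certification"

@dataclass
class RegisterEntry:
    provider: str
    service: str
    service_model: str        # "SaaS", "PaaS" or "IaaS" - the Code covers all three
    verification: Verification
    compliance_level: int     # 1-3, one of the three levels of approval

    def summary(self) -> str:
        return (f"{self.provider} / {self.service} ({self.service_model}): "
                f"level {self.compliance_level} via {self.verification.value}")

entry = RegisterEntry("ExampleCloud", "Example CRM", "SaaS",
                      Verification.THIRD_PARTY, compliance_level=2)
print(entry.summary())
```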

Workday has been involved in the CoC collaboration since early 2018, says Barbara Cosgrove, the firm's Chief Privacy Officer, attracted in part by its broad focus on all types of cloud services:

As a cloud provider, we continue to look for innovative ways to demonstrate to our customers that we meet our obligations. We think that you must have that 'trust, but verify it' approach. We have binding corporate rules for processors, we use that as a data transfer mechanism, along with other mechanisms. But we really thought that the Code of Conduct was a unique mechanism under GDPR where, in addition to having a contractual mechanism, you were able to tie it to much more detailed audits and monitoring.

The fact that the CoC covers SaaS, PaaS and IaaS offerings is being pitched as a significant differentiator compared to other similar initiatives. As Jo Copping, Salesforce's EMEA Senior Director of Privacy, explains:

That makes it a very unique proposition and a really all-encompassing solution. The Code does bring really an unprecedented level of certainty and coherence to the market. It's a really robust tool of reference that can be used by cloud providers and cloud users alike to ensure that data protection standards really are high. As cloud providers, we get a robust tool that we can use to demonstrate our commitment to EU privacy principles and also to fundamental EU values.

European customers are very highly-sophisticated when it comes to their privacy and security requirements and they have very high expectations...More trust leads to more cloud adoption, which leads to more efficiencies and ultimately more benefits for European customers, citizens and also for the economy as a whole.

Another appealing aspect of the CoC is the existence of an independent body to monitor its operation and future development, argues Karine Picard, Vice-President of Business Development EMEA, Oracle:

It was absolutely key for us as well to have an authority, like SCOPE Europe, that can monitor and drive continuous innovation of the Code. It's the first code, but it's not going to be the only one. Actually it's an evolutive code. With cloud and innovation, things are moving very fast, so to have this authority to monitor the code and to make it evolve in the right direction is really important as well.

The CoC sets a high bar for other parts of the world to emulate, suggests Nathaly Rey, Head of Data Governance EMEA, Google Cloud:

Normally our privacy programs are developed to scale globally. The things that you do - the product changes, the engineering changes, many of the contractual changes and the operational changes - not only impact Europe, but they impact your services globally. So, by virtue of adopting the Code, you're raising the bar for the industry and many times you're raising the bar for your compliance program globally.

GDPR and European privacy is inspiring what you do in other jurisdictions, in other countries...Other jurisdictions have taken the European model as an inspiration in terms of the principles that govern data protection. We're seeing many of the processes and provisions inspired by GDPR and the previous [EU Data Protection] directive. So, definitely there is a trend here...Codes of conduct like this are helpful to help operationalize high-level principles into the state of the art for an industry.

So, thumbs up from all round from the supplier perspective, but what about the other side of the coin - will buyers really care about this or dismiss it as gesture politics from the sell-side? After all, every cloud services provider talks up trust and transparency and adherence to the likes of GDPR as a de facto part of their outreach already.

Oracle's Picard argues that one obvious appeal for buyers is simplification:

If a customer already has multiple clouds in their landscape, they have to manage multiple SLAs (Service Level Agreements), and of course there will be interoperability of those clouds. We have to simplify their life and having IaaS, PaaS and SaaS under one code is really an asset for the customer. For a provider like us, because we cover all of these elements, it would have been very complex for us to have to follow up different codes. Our customers already know that we are following and committed to follow GDPR and ISO and SOC, but the Code is really providing another layer of insurance for them...It's really simplifying our life internally and the life of our customer.

And customers are, to date, taking note, suggests Workday's Cosgrove:

We've definitely had conversations with our customers about it. It helps pull together that entire picture. It helps us to not send them three different documents, to try to look at one place, to be able to have one mapping that we didn't do ourselves that's coming from an approved source. GDPR can be complex and [when you try] to apply it to different cloud industries, everybody starts to have different interpretations. It's so detailed and so having a place [customers] can go to and have this controls catalog and have an overall one stop to take a look and be able to review, has been really helpful. Then they can look at our other [corporate] resources and they can look up all of our audits and third-party certifications, but having one place that does that overall kind of mapping and expectation has been really helpful in the conversation.

It is that sort of ambition that makes this EU CoC so important, concludes Matthias Cellarius, Head of SAP Data Protection & Privacy:

It's not just one code of conduct, it's the Code of Conduct as I would put it. It's the first code that's been fully acknowledged by the European Data Protection Board, and it's operational as of today. I think that's the big, big difference. It's a role model for other codes of conduct. It's a lighthouse project and it'll be something that will transform the entire industry and from which all of our customers can benefit.

Cynics observe that the great thing about having a standard is there are so many to choose from. My own personal experience of standards bodies has been to observe years and years of collaboration and co-operation as the detailed work gets done, only for participating bodies to end up fighting like a bag of cats before the ink is even dry on the press release that work is complete on the first version of the standard.

This looks different. The SaaS, PaaS and IaaS remit is hugely welcome and the existence and involvement of SCOPE bodes well. There are some impressive supplier testimonials to back the Code up, as well as some notable omissions to date. (Amazon - can you hear me?)

Critical here will be to watch how this EU initiative does or does not influence similar movements in other parts of the world. For all the fine words that get expressed about GDPR's impact, the harsh reality is that we're a long way from beating down the vested interests at play in stopping a GDPR-US avatar from taking shape.

That said, this is impressive work to date. It's also important to recognize that it is ongoing work, and work on which suppliers must be held to account moving forward. As SAP's Cellarius puts it:

We have made it the lighthouse project with the potential to transform the industry. It's now on us to actually fill it with life, to make sure that all the merits that are perceived today are actually going to turn into reality, and that we all show that we can comply and that we continue working on the Code and keeping it relevant, not only for now, but for the future.

Definitely something that we'll be keeping an eye on.

Read this article:
Conduct most becoming - Europe's new Cloud Code of Conduct shines a light on trust and transparency between buyers and sellers - Diginomica

Global Cloud Computing Data Center IT Asset Disposition (ITAD) Market 2021 Opportunities and Key Players To 2026 Arrow Electronics, Sims Recycling,…

Global Cloud Computing Data Center IT Asset Disposition (ITAD) Market Growth (Status and Outlook) 2021-2026 qualitatively and quantitatively analyzes the market, with a description of market sizing and growth. The report offers complete data to help businesses develop their business and plan their way towards growth. It presents figures on the latest trends and developments in the global Cloud Computing Data Center IT Asset Disposition (ITAD) industry along with important facts, and it entails a comprehensive database of market estimates based on historical data analysis. It sheds light on the different factors and trends affecting the development of the global market.

NOTE: Our report highlights the major issues and hazards that companies might come across due to the unprecedented outbreak of COVID-19.

DOWNLOAD FREE SAMPLE REPORT: https://www.marketsandresearch.biz/sample-request/153665

Complete Overview:

The primary objective of this report is to provide brief knowledge of the industry landscape and the opportunities wide open in the market. In the research study, the analysts have conducted a detailed study of all the market segments and have categorized the segments and regions that market players should concentrate on in the coming years. The regional dominance and the highest-growing regions are properly segregated for clients so that they can channel their investments and strategize their plans accordingly. The major segments categorized for the global Cloud Computing Data Center IT Asset Disposition (ITAD) market include the kind of product, application across specific regions, and distribution channel or vendors.

Prominent vendors included in the market are: Arrow Electronics, Sims Recycling, IBM, HPE, Atlantix Global Systems, Iron Mountain Incorporated, GEEP, Dell, ITRenew Inc., Apto Solutions, CloudBlue Technologies, Dataserv

The market drivers and opportunities are presented pictorially and explained across the different segmentations, including product, application, competitive landscape, and geography. The research report provides a complete overview and analysis of the global Cloud Computing Data Center IT Asset Disposition (ITAD) market, and the market dynamics are well explained. Key attributes incorporated in the report include the market drivers, opportunities, and the technologies helping the market prosper. The competitive landscape section provides further detail on historical and future market plans, strategies, business operations, and acquisitions.

The product types covered in the report include: IT Equipment, Support Infrastructure

The application types covered in the report include: Data Sanitization, Recovery, Recycling, Other

Regions And Countries Level Analysis:

A more comprehensive part of the global market research and analysis study presented in the report is regional analysis. This section highlights sales growth in various regional and country-level markets. The report provides detailed and accurate country-wise volume analysis and region-wise market size analysis of the global Cloud Computing Data Center IT Asset Disposition (ITAD) market for the historical and forecast period 2021 to 2026.

ACCESS FULL REPORT: https://www.marketsandresearch.biz/report/153665/global-cloud-computing-data-center-it-asset-disposition-itad-market-growth-status-and-outlook-2021-2026

Market analysis is provided for major regions as follows: Americas (United States, Canada, Mexico, Brazil), APAC (China, Japan, Korea, Southeast Asia, India, Australia), Europe (Germany, France, UK, Italy, Russia), Middle East & Africa (Egypt, South Africa, Israel, Turkey, GCC Countries)

The Research Aims to Address the Following Questions Pertaining to the Market:

The worldwide study gives an outline, with features and forecasts, of the growth of the global Cloud Computing Data Center IT Asset Disposition (ITAD) industry. The manufacturers segment also provides SWOT analysis, products, production, value, capacity, and other vital factors for each individual player. The report delivers data on imports and exports, revenue, production, and key players for all regional markets studied.

Customization of the Report:

This report can be customized to meet the client's requirements. Please connect with our sales team (sales@marketsandresearch.biz), who will ensure that you get a report that suits your needs. You can also get in touch with our executives on +1-201-465-4211 to share your research requirements.

Contact Us

Mark Stone

Head of Business Development

Phone: +1-201-465-4211

Email: sales@marketsandresearch.biz

Web: http://www.marketsandresearch.biz

The rest is here:
Global Cloud Computing Data Center IT Asset Disposition (ITAD) Market 2021 Opportunities and Key Players To 2026 Arrow Electronics, Sims Recycling,...

Twilio backs Terazo to power the API economy and third wave of cloud – VentureBeat


Companies are grappling with the rapid migration to cloud computing, which opens a Pandora's box of problems at a time when technology talent is at a premium.

Internal development teams are already stretched thin and are often focused on core product development. This can leave a vacuum when it comes to building robust cloud-native applications that rely on application programming interfaces (APIs), the glue that holds modern software together. The API economy is powering digital transformation across the spectrum, helping companies improve their efficiency and profitability by funneling into the expertise of the wider digital ecosystem.

This is where Terazo comes in. The five-year-old Virginia-based company helps everyone from stealth startups to large publicly traded companies integrate and automate their internal systems by providing the tech talent (or "engineering rockstars," as Terazo calls them) companies need to flourish in the cloud era. While Terazo sometimes serves as a simple outsourced software development team, it also works with internal development teams in what Terazo CEO Mark Wensell calls a "blended team" model, with Terazo bringing its experience in API management to the mix.

"Internal development teams are often focused on the core business systems and software development initiatives of a company, with large product development backlogs that they're assigned to," Wensell told VentureBeat. "When strategic API integration or API publishing opportunities come along, using internal teams for that work often means their focus on core product development is taken away."

To help with this, the company today announced it has raised $10.5 million in a round of funding led by Tercera, an investment firm specializing in cloud services companies. Cloud communications API giant Twilio also participated in the round, a notable strategic investment for a company that relies on consultancy companies such as Terazo to help businesses deploy and manage their APIs.

Above: Terazo CEO Mark Wensell

In many ways, Terazo is analogous to APIs themselves. Rather than having to build a robust communications infrastructure from scratch, tech companies can leverage the expertise of third-party specialists and integrate with their smarts through an API. For example, why would Box develop and maintain its own two-factor authentication (2FA) system when it can use Twilio's infrastructure instead?
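As an illustration of that kind of delegation, the sketch below uses Twilio's Verify API through the twilio Python helper library. The credentials, Verify service SID and phone number are placeholders, and exact method paths can differ between library versions, so treat it as a sketch rather than production code.

```python
# Minimal 2FA sketch delegating code delivery and checking to Twilio Verify.
from twilio.rest import Client

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"         # placeholder
AUTH_TOKEN = "your_auth_token"                              # placeholder
VERIFY_SERVICE_SID = "VAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"   # placeholder

client = Client(ACCOUNT_SID, AUTH_TOKEN)

def send_code(phone_number: str) -> str:
    """Ask Twilio to text a one-time code to the user."""
    verification = client.verify.services(VERIFY_SERVICE_SID) \
        .verifications.create(to=phone_number, channel="sms")
    return verification.status              # e.g. "pending"

def check_code(phone_number: str, code: str) -> bool:
    """Let Twilio check the code the user typed in."""
    result = client.verify.services(VERIFY_SERVICE_SID) \
        .verification_checks.create(to=phone_number, code=code)
    return result.status == "approved"
```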

Terazo and its ilk fulfill a similar role, but from a human resources perspective.

"For us, things like REST API integrations are a relatively repeatable pattern across our client portfolio, whereas for an internal team, there might be a much larger learning curve to learn the necessary patterns and practices and get it right," Wensell said. "That becomes even more true when a client is building on a technology platform like Twilio. We have deep experience and a lot of reusable assets that mean our clients' spend is focused on the customization unique to their business process."

While Terazo generally works under tight NDAs and can't disclose its clients, it did reveal that real estate platform PrimeStreet is one of its customers and that Terazo has helped it build a more cohesive, unified digital platform powered by APIs.

"Rather than just building a collection of mobile apps and websites, we helped them make the core of their business a platform in and of itself," Wensell added.

All of this feeds into a trend that has been touted as the third wave of cloud computing, with some estimates predicting cloud spending could hit $1 trillion in 2030. This opens the door to third-wave consultancies such as Terazo, which can plug a gap with specialized skills and let in-house developers focus on what they do best.

"We define a third-wave consultancy as a cloud-focused professional services firm whose business values, delivery, go-to-market models, and talent are optimized for the cloud's next wave of growth," Tercera partner Dan Lascell told VentureBeat. "The cloud has matured over the last 20 years, and the attributes that made services firms successful in the first two waves won't necessarily cut it in the next decade."

See original here:
Twilio backs Terazo to power the API economy and third wave of cloud - VentureBeat

Healthcare Cloud Computing Market – Global Industry Analysis, Share, Growth, Trends and Forecast 2021-2027 available in the latest report – WhaTech

Guest Post By

Healthcare Cloud Computing Market - Global Industry Analysis, Size, Share, Growth, Trends and Forecast 2021-2027. Global Healthcare Cloud Computing Market size is expected to grow at an annual average of 17% during 2021-2027.

The Global Healthcare Cloud Computing Market size is expected to grow at an annual average of 17% during 2021-2027. The healthcare cloud computing market is expected to grow exponentially over the next few years.

The amount of digital data managed by medical centers has multiplied over the years due to changes in payment methods, growing patient pools, and more. Advances in technology and increasing medical costs have further fueled an increase in the amount of information to be stored and managed.
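As a quick illustration of what 17% average annual growth implies over the 2021-2027 window, the short Python sketch below compounds an indexed market size. Only the 17% figure comes from the report summary; the base value is an index, not a dollar estimate.

```python
# Compound growth at the report's stated ~17% CAGR, indexed to 2021 = 1.0.
CAGR = 0.17
BASE_YEAR, END_YEAR = 2021, 2027

size = 1.0   # index the 2021 market size to 1.0
for year in range(BASE_YEAR, END_YEAR + 1):
    print(f"{year}: {size:.2f}x the 2021 market size")
    size *= 1 + CAGR

# At 17% per year the market grows roughly 2.6x over the six-year window:
print(f"Growth multiple 2021->2027: {(1 + CAGR) ** (END_YEAR - BASE_YEAR):.2f}x")
```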

(Get this Report)

A full report of Global Healthcare Cloud Computing Market is available at: http://www.orionmarketreports.com/healthcket/10742/

The following segmentations are covered in this report:

By Product

By Deployment Model

By Component

By Pricing Model

By Service Model

The report covers the following objectives:

Scope of the Report

The research study analyses the global Healthcare Cloud Computing industry through a 360-degree analysis of the market, delivering insights for better business decisions and considering multiple aspects, some of which are listed below:

Recent Developments

Geographic Coverage

Key Questions Answered by Healthcare Cloud Computing Market Report

About Us:

Orion Market Reports (OMR) endeavours to provide an exclusive blend of qualitative and quantitative market research reports to clients across the globe. Our organization helps both multinational and domestic enterprises to bolster their business by providing in-depth market insights and the most reliable future market trends.

Our reports address all the major aspects of the markets providing insights and market outlook to global clients.


Excerpt from:
Healthcare Cloud Computing Market - Global Industry Analysis, Share, Growth, Trends and Forecast 2021-2027 available in the latest report - WhaTech

Perera envisions the future of edge computing: faster and more adaptable than ever before – Communique

As humans and our billions of devices become increasingly dependent on digital systems, computers are running out of processing power to keep up.

Fortunately, writes Darshika Perera, assistant professor of electrical and computer engineering, there is a solution: customized and reconfigurable architectures that can bring next-generation edge-computing platforms up to speed.

Perera published the research in a feature story for the spring edition of the Institute of Electrical and Electronics Engineers Canadian Review. In it, she describes a key concern for those of us living in the Internet of Things era: as more systems process massive amounts of data in the cloud, the cloud is facing serious obstacles to keeping up. Poor response times, high power consumption and security issues are just a few of the challenges it faces when transmitting, processing and analyzing the enormous burden of global data.

Perera's research, conducted with a team of ten graduate students in the Department of Electrical and Computer Engineering, highlights a twin path forward.

First, Perera writes, data processing must move away from its reliance on traditional cloud infrastructures and towards a complementary solution: edge computing.

Non-computer scientists can think of edge computing like a popular pizza restaurant opening new locations across town. A pizza that travels 20 miles from the restaurant will be cold by the time it reaches the customer. But a pizza cooked right down the street will arrive faster and reduce the strain on the original restaurant's kitchen.

Similarly, edge computing processes data nearer to its source, on phones, smart watches and personal computers, rather than farming the job out to the cloud. Perera writes that edge computing addresses nearly all of the challenges faced by cloud computing, from speed and bandwidth to security and privacy.
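The trade-off can be made concrete with a small sketch: process on the device when the round trip to a distant data center would dominate, otherwise ship the data to the cloud. All of the numbers below are invented for illustration.

```python
# Toy edge-vs-cloud placement decision based on end-to-end latency.
def total_latency_ms(payload_mb, compute_ms, uplink_mbps=None, rtt_ms=0.0):
    """Time to get a result: optional network transfer plus compute."""
    transfer_ms = 0.0
    if uplink_mbps is not None:
        transfer_ms = (payload_mb * 8 / uplink_mbps) * 1000 + rtt_ms
    return transfer_ms + compute_ms

payload_mb = 5.0                                    # e.g. a burst of sensor data
edge = total_latency_ms(payload_mb, compute_ms=80)  # slower chip, no network hop
cloud = total_latency_ms(payload_mb, compute_ms=10, uplink_mbps=20, rtt_ms=60)

print(f"Edge:  {edge:.0f} ms")
print(f"Cloud: {cloud:.0f} ms")
print("Process at the edge" if edge < cloud else "Send to the cloud")
```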

But edge computing is still in its infancy, and most of the edge-computing solutions currently being visualized rely on processor-based, software-only designs.

"At the edge, the processing power needed to analyze and process such enormous amount[s] of data will soon exceed the growth rate of Moore's law," Perera writes. "As a result, edge-computing frameworks and solutions, which currently solely consist of CPUs, will be inadequate to meet the required processing power."

That's where Perera's second conclusion comes into play. To process and analyze ever-increasing amounts of data, and to handle associated problems along the way, the next generation of edge-computing platforms needs to incorporate customized, reconfigurable architectures optimized for specific applications.

What does this mean? It means that computer processors will no longer perform one dedicated job. Instead, like shapeshifters, they will configure themselves to perform any computable task set before them.

The flexibility of these systems puts them head and shoulders above general-purpose processors. Perera writes that reconfigurable computing systems, such as field-programmable gate arrays (FPGAs), are more flexible, durable, upgradable, compact and less expensive, as well as faster to produce for the market, all of which helps to support real-time data analysis.

As Perera envisions the future of edge computing, her analysis shows that multiple applications and tasks can be executed on a single FPGA by dynamically reconfiguring the hardware on chip from one application or task to another as needed.
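A purely conceptual sketch of that idea follows: one reconfigurable device serving several tasks by loading a different hardware configuration on demand. This models the scheduling idea only; it is not FPGA tooling, and all names are illustrative.

```python
# Conceptual model of dynamic reconfiguration: swap in the "hardware" for
# a task only when the currently loaded configuration does not match.
import time

class ReconfigurableFabric:
    def __init__(self):
        self.loaded = None     # which accelerator is currently on the chip

    def run(self, task: str, accelerators: dict):
        if self.loaded != task:
            # Dynamic reconfiguration: load the configuration for this task.
            print(f"Reconfiguring fabric: {self.loaded} -> {task}")
            time.sleep(0.01)   # stand-in for reconfiguration overhead
            self.loaded = task
        return accelerators[task]()   # execute on the (simulated) hardware

ACCELERATORS = {
    "fft": lambda: "FFT result",
    "aes": lambda: "AES ciphertext",
    "cnn": lambda: "CNN inference output",
}

fabric = ReconfigurableFabric()
for job in ["fft", "fft", "aes", "cnn"]:   # back-to-back identical jobs skip the swap
    print(fabric.run(job, ACCELERATORS))
```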

These kinds of improvements over traditional computing processes, Perera writes, will make next-generation edge-computing platforms smart and autonomous enough to seamlessly and independently process and analyze data in real time, with minimal or no human intervention. In the future, they could allow technologies that rely on lightning-fast edge computing, like self-driving cars, to become ubiquitous, and they could certainly enable technologies that are unimaginable today.

One thing is for certain: they will make computing faster, more autonomous, and more adaptive than ever before.

Read Perera's full article for IEEE Canadian Review online.

Darshika Perera is an assistant professor of electrical and computer engineering in the College of Engineering and Applied Sciences at the University of Colorado Colorado Springs. She has extensive experience in embedded systems, digital systems, data analytics and mining, hardware acceleration and dynamic reconfiguration and machine learning techniques. Her research is conducted with a team of graduate students in the Department of Electrical and Computer Engineering at UCCS. Learn more online.

See original here:
Perera envisions the future of edge computing: faster and more adaptable than ever before - Communique