
I am Just an Architect With His Head in the Cloud – hackernoon.com

"Cloud Architect" has become a trendy title in the information technology sector. Ask many people the career path they want, and they'll respond "cloud architect." But, what is a cloud architect, really? People often repeat the buzzy phrase without knowing what it entails. Not to worry though, we're here to help you clear the air.

Copywriter, community manager, editor. Interested in fintech, investing, fund management.

"Cloud Architect" has become a trendy title in the information technology sector.

Ask many people the career path they want, and they'll respond "cloud architect."

But, what is a cloud architect, really? People often repeat the buzzy phrase without knowing what it entails. Not to worry though, we're here to help you clear the air.

Let's first define what cloud architecture generally means. Cloud architecture refers to the various components that form a cloud computing system.

It refers to how individual technologies combine to create cloud environments where numerous computers share resources from a single network.

A cloud architect is a person responsible for conceptualizing and developing cloud architecture. They're responsible for converting the technical concepts and requirements for a project into a working cloud system.

A cloud architect is typically in charge of a company's cloud strategy, a very delicate role. Their duty is critical because failure in a company's cloud system can affect all the aspects of its business.

Hence, enterprises often seek highly skilled cloud architects and pay top dollar for them. It's no surprise that the profession has become trendy as of late, given the prestige and monetary resources businesses now assign to the role.

The cloud computing sector is already huge yet growing enormously. According to research firm Markets and Markets, the global cloud computing market is expected to grow from $445 billion in 2021 to $947 billion in 2026. Hence, cloud architects are well-positioned to ride this growth wave. It's a wise career choice.
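As a quick sanity check on those figures, the implied compound annual growth rate works out to roughly 16%. A minimal sketch of the arithmetic in Python, using only the dollar figures quoted above:

```python
# Implied compound annual growth rate (CAGR) from the Markets and Markets
# figures: $445B in 2021 growing to $947B in 2026 (5 years).
start, end, years = 445, 947, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 16.3% per year
```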

You've heard good things about the cloud architect profession. But how can you become one? There are several vital steps along the way, and they start with some core skills you must have.

Every cloud architect must be well versed in computer programming. The most common coding languages used in cloud architecture are Java, Python, and C++, but there are many more you can learn.

You need computer programming skills to convert technical requirements into real projects. Likewise, a good cloud architect should be able to program quickly to create a proof of concept for the desired product.
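As an illustration of the kind of quick proof of concept meant here, the sketch below stands up a minimal health-check HTTP service using only the Python standard library. The port and endpoint are arbitrary choices for the example, not a prescribed design.

```python
# A minimal proof-of-concept HTTP service, the kind of quick prototype a
# cloud architect might stand up to validate an idea before committing to
# a full design. Pure standard library, so it runs anywhere Python 3 does.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to /health with a JSON status, as a monitoring probe expects.
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```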

You can't create a reliable cloud solution without sufficient knowledge of computer networking. A good cloud architect must know how to interact with the various components that make up a computer network.

For example, you should know how to use a content delivery network for geographic distribution or a virtual private cloud (VPC) to isolate parts of your cloud network.
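For the VPC side, a minimal sketch with boto3 (the AWS SDK for Python) might look like the following. It assumes configured AWS credentials; the region and CIDR blocks are illustrative, and routing tables, gateways, and security groups are left out.

```python
# Sketch: carving out an isolated VPC with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create the VPC that will isolate this part of the cloud network.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# A private subnet inside the VPC for backend workloads.
ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
print(f"Created isolated VPC {vpc_id}")
```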

Security is essential to any cloud network. Cloud computing has brought many benefits, but one of its drawbacks is opening up enterprises to a higher risk of compromise.

According to IBM, the average cost of a cloud breach is $4.2 million, so you want to avoid that.

Every cloud architect must implement advanced security measures to protect their enterprise from compromise.
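As one small, concrete example of such a measure, the boto3 sketch below blocks all public access to an S3 bucket. The bucket name is a hypothetical placeholder, and real hardening covers much more (IAM policies, encryption, network controls).

```python
# Sketch: one basic hardening step, blocking all public access to a bucket.
import boto3

s3 = boto3.client("s3")
s3.put_public_access_block(
    Bucket="example-enterprise-data",  # hypothetical bucket name
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```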

Every cloud architect must know how to work with various database technologies.

Many data storage options are available, so you're free to choose any of them. For example, you can use Amazon S3 for object storage or Hadoop clusters for analyzing large amounts of data.
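To illustrate the object-storage option, here is a minimal boto3 sketch that writes and reads back an object in Amazon S3. The bucket and key names are hypothetical.

```python
# Sketch: storing and retrieving an object in Amazon S3.
import boto3

s3 = boto3.client("s3")
s3.put_object(Bucket="example-bucket", Key="reports/q1.json", Body=b'{"ok": true}')

obj = s3.get_object(Bucket="example-bucket", Key="reports/q1.json")
print(obj["Body"].read())
```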

A good cloud architect must be well-versed with general or specialized cloud platforms. For example, a cloud architect in a finance firm should be familiar with the MQL5 Cloud Network, a specialized distributed network for finance experts developing and deploying automated trading models.

The MQL5 Cloud Network reached a capacity of 34,000 agents in January 2022, according to Bloomberg. The network continues to grow as MQL5.community users sell the idle time of their computers' processors.

The above list isn't exhaustive. There are many other things a cloud architect must know, but we listed the most basic ones.

It's essential to learn the skills required for a cloud architect. But, many people won't believe you have the skills if you don't have evidence to back it up. Professional certificates are the easiest way to signal your cloud architecture expertise to prospective employers.

The most sought-after certificates in the cloud industry come from three cloud providers: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).

AWS

Amazon Web Services (AWS) is the world's biggest cloud computing provider by volume and sales. Hence, enterprises around the globe give greater credence to official AWS certifications.

AWS currently offers 11 certificates covering both basic and specialty cloud topics.

There are four certificate tiers: Foundational, Associate, Professional, and Specialty. Foundational assumes six months of AWS knowledge and experience, Associate one year, Professional two years, and Specialty an unspecified amount of time.

Microsoft Azure

Azure is the second-biggest cloud provider trailing AWS. It's the cloud computing unit of tech giant Microsoft.

Microsoft offers 12 cloud certifications with 14 exams, classified into three levels: Fundamentals, Associate, and Expert. Some are role-based, including Azure Administrator, Azure Solutions Architect, and Azure AI Engineer.

A Microsoft Azure certification will help you understand how to use the cloud platform effectively.

Google Cloud Platform

Google Cloud Platform (GCP) is the third-biggest cloud provider, owned by tech giant Google. The company currently offers ten role-based certifications, including a specialized Cloud Architect certification.

The Cloud Architect certification takes you through the fundamentals of the Google Cloud Platform, including Kubernetes, BigQuery, App Engine, and Cloud Firestore. It'll give you the chance to build and deploy solutions in live GCP environments.

Getting a cloud certification isn't always easy, especially for the high-level ones. Endeavor to study as much as required to pass the certification exams.

Enterprise spending on cloud computing is ballooning. According to Gartner, more than half of enterprise IT spending by 2025 will be on cloud services.

You can observe virtually endless growth in this sector. A certification from a leading cloud provider paired with your innate cloud computing knowledge will open up many opportunities.

With sufficient cloud computing knowledge and certification to prove it, you can offer your services to employers. The demand is outsized, so you shouldn't have problems finding a job.

The IT world is your oyster as a certified cloud architect. You have endless opportunities to apply your expertise in this sector.

"Cloud Architect" has become a trendy title in the information technology sector.

Ask many people the career path they want, and they'll respond "cloud architect."

But, what is a cloud architect, really? People often repeat the buzzy phrase without knowing what it entails. Not to worry though, we're here to help you clear the air.

Let's first define what cloud architecture generally means. Cloud architecture refers to the various components that form a cloud computing system.

It refers to how individual technologies combine to create cloud environments where numerous computers share resources from a single network.

A cloud architect is a person responsible for conceptualizing and developing cloud architecture. They're responsible for converting the technical concepts and requirements for a project into a working cloud system.

A cloud architect is typically in charge of a company's cloud strategy, a very delicate role. Their duty is critical because failure in a company's cloud system can affect all the aspects of its business.

Hence, enterprises often seek highly-skilled cloud architects and pay top dollar for them. It's no surprise that the profession of a cloud architect has become trendy as of late, given the prestige and monetary resources businesses now assign to them.

The cloud computing sector is already huge yet growing enormously. According to research firm Markets and Markets, the global cloud computing market is expected to grow from $445 billion in 2021 to $947 billion in 2026. Hence, cloud architects are well-positioned to ride this growth wave. It's a wise career choice.

You've heard good things about the profession of a cloud architect. But, how can you become one? There are several vital steps to take to become one, and it starts with some initial skills you must have.

Every cloud architect must be well versed in computer programming. The most common coding languages used in cloud architecture are Java, Python, and C++, but there are many more you can learn.

You need computer programming skills to convert technical requirements into real projects. Likewise, a good cloud architect should be able to program quickly to create a proof of concept for the desired product.

You can't create a reliable cloud solution without sufficient knowledge of computer networking. A good cloud architect must know how to interact with the various components that make up a computer network.

For example, you should know how to use a content delivery network for geographic distribution or a virtual private cloud (VPC) to isolate parts of your cloud network.

Security is essential to any cloud network. Cloud computing has brought many benefits, but one of its drawbacks is opening up enterprises to a higher risk of compromise.

According to IBM, the average cost of a cloud breach is $4.2 million, so you want to avoid that.

Every cloud architect must implement advanced security measures to protect their enterprise from compromise.

Every cloud architect must know how to work with various database technologies.

Many data storage options are available, so you're free to choose anyone. For example, you can use Amazon S3 for object storage or Hadoop clusters for analyzing large amounts of structured data.

A good cloud architect must be well-versed with general or specialized cloud platforms. For example, a cloud architect in a finance firm should be familiar with the MQL5 Cloud Network, a specialized distributed network for finance experts developing and deploying automated trading models.

The MQL5 Cloud Network reached a capacity of 34,000 agents in January 2022, according to Bloomberg. The network continues to grow due to users of MQL5.community, selling idle time of their computers' processors.

The above list isn't exhaustive. There are many other things a cloud architect must know, but we listed the most basic ones.

It's essential to learn the skills required for a cloud architect. But, many people won't believe you have the skills if you don't have evidence to back it up. Professional certificates are the easiest way to signal your cloud architecture expertise to prospective employers.

The highly sought-after certificates in the cloud industry are from three cloud providers; Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).

AWS

Amazon Web Services (AWS) is the world's biggest cloud computing provider by volume and sales. Hence, enterprises around the globe give greater credence to official AWS certifications.

AWS currently offers 11 certificates covering both basic and specialty cloud topics.

They're four certificate tiers; Foundational, Associate, Professional, and Specialty. Foundational covers six months of AWS knowledge, Associate covers one year, Professional covers two years, and Specialty for an unspecified amount of time.

Microsoft Azure

Azure is the second-biggest cloud provider trailing AWS. It's the cloud computing unit of tech giant Microsoft.

Microsoft offers 12 cloud certifications with 14 exams classified into three levels; Fundamental, Associate, and Expert. Some are role-based, including Azure Administrator, Azure Solution Architect, Azure AI Engineer, etc.

A Microsoft Azure certification will help you understand how to use the cloud platform effectively.

Google Cloud Platform

Google Cloud Platform (GCP) is the third-biggest cloud provider, owned by tech giant Google. The company currently offers ten role-based certifications, including for a specialized Cloud Architect.

The Cloud Architect certification takes you through the fundamentals of the Google Cloud Platform, including Kubernetes, BigQuery, App Engine, and Cloud Firestore. It'll give you the chance to build and deploy solutions in live GCP environments.

Getting a cloud certification isn't always easy, mainly for high-level ones. Endeavor to study as required to pass the certification exams.

Enterprise spending on cloud computing is ballooning. According to Gartner, more than half of enterprise IT spending by 2025 will be on cloud services.

You can observe virtually endless growth in this sector. A certification from a leading cloud provider paired with your innate cloud computing knowledge will open up many opportunities.

With sufficient cloud computing knowledge and certification to prove it, you can offer your services to employers. The demand is outsized, so you shouldn't have problems finding a job.

The IT world is your oyster as a certified cloud developer. You have endless opportunities to apply your expertise in this sector.

Read more:
I am Just an Architect With His Head in the Cloud - hackernoon.com

Read More..

DigitalOcean Doubles Down on Its Frugal Strategy to Win Customers – The Motley Fool

Cloud computing provider DigitalOcean (DOCN 0.45%) is built for developers and small businesses. Most of the company's customers spend less than $50 each month, and all customers have access to 24/7 support and a wealth of resources. Getting started is easy, pricing is simple, and the list of products is short to avoid overwhelming users with options.

This focus on smaller customers means that DigitalOcean can't spend too heavily on customer acquisition. A direct sales force makes sense if you're selling enterprise customers on long-term contracts worth many thousands of dollars annually. When your customer base is small and fickle, this approach just doesn't make much sense.

On top of word-of-mouth marketing fueled by satisfied customers, DigitalOcean pulls in potential users with a vast array of articles, tutorials, and guides. Instead of hiring expensive sales teams or dumping cash into pricey online ads, DigitalOcean has put in the work to build out a vast collection of helpful content.


When DigitalOcean went public, its content was drawing in around 5 million unique visitors to its website each month. This traffic isn't entirely free; that content must be created and updated. But compared to buying search ads, this strategy is about as cost-effective as it gets. DigitalOcean spent just 15% of its revenue on sales and marketing in the first quarter, a small fraction of what's typical for fast-growing tech companies.

DigitalOcean supercharged this content strategy in the first quarter by acquiring CSS-Tricks, a website that features thousands of articles, videos, and guides focused on front-end development. CSS-Tricks will remain a stand-alone website, but it now prominently displays DigitalOcean branding.

With CSS-Tricks now part of the DigitalOcean family, the company recorded an average of 9 million unique website visitors during the first quarter, up 70% year over year. In a world where cloud computing is dominated by the major cloud giants, building up brand recognition is critical to DigitalOcean's long-term growth.

Acquiring websites with high-quality content may be a better use of capital for DigitalOcean than acquiring cloud computing companies. One of DigitalOcean's biggest strengths is the simplicity of its platform. The company could go out and expand its platform through acquisitions, but that would put that simplicity at risk. By increasing the number of visitors to its website, DigitalOcean can pitch its answer to the complexity of cloud computing to a greater number of potential customers.

Shares of DigitalOcean took a beating on Thursday following its first-quarter report. The company's results were mixed relative to expectations, but revenue continued to grow swiftly, and full-year guidance was reiterated. With growth stocks in general being hammered, DigitalOcean hasn't been able to escape the tidal wave of selling.

DigitalOcean's market cap has fallen to $3.8 billion as I write this, about 6.7 times its guidance for full-year revenue. DigitalOcean isn't profitable, and it will be susceptible to any slowdown in the cloud computing market. But this is a company that is capable of growing at a double-digit rate for a very long time. DigitalOcean's total addressable market is expected to top $115 billion by 2024, and it serves a type of customer that just isn't a priority for the cloud giants.

DigitalOcean's beaten-down valuation would probably have been considered rich prior to the pandemic, so some caution is warranted. But DigitalOcean looks like a good way to bet on the growing cloud computing market, and there's likely more upside potential compared to the trillion-dollar cloud giants.


Cloud computing the most critical area for construction investment – survey – Bizcommunity.com

A survey conducted by RIB CCS in Q4 2021 identified cloud computing as the most critical area for construction industry investment. This was followed by building information modelling (BIM), mobile technology, and integrated technology platforms.

RIB CCS vice president Peter Damhuis notes that each time a construction company moves onto a new site, it has to set up some form of infrastructure for employees and support teams. The complexity of this infrastructure differs from site to site, from relatively basic setups at smaller sites to more complex arrangements at large ones.

Before cloud computing was widely adopted by the industry, people on site would require an IT infrastructure, printers and, in some instances, a dedicated server room to facilitate the exchange of data between teams. During the setup phase, a team of IT specialists would arrive on site and go from one container to the next, installing equipment and running software.

Less infrastructure also means fewer security concerns. "When construction companies work in remote areas, they often have to guard against theft. When there is less equipment and infrastructure on-site, there is less to worry about," says Damhuis.

In addition, cloud computing promotes greater efficiency when it comes to construction projects. For example, programmes such as BuildSmart can be accessed from wherever the various team members are located and provide one source of information for everyone. All of the manual processes of seeking information, submitting requisitions and creating orders can now be completed in the cloud, in real time, improving the outcomes for everyone involved.

He says while construction companies have begun to move to the cloud, the process is not happening fast enough. "There is a perceived cost element involved that construction companies cite as a hindrance. I say perceived because if these businesses conducted a cost-value exercise, they would realise that the costs saved on infrastructure, people efficiencies, and other peripheral issues far outweigh the cost of introducing cloud computing."

Another challenge is trust. While most people will happily conduct all of their financial transactions on their mobile phones, construction companies are loath to put confidential information in the cloud, even with stringent security measures in place to keep their data secure.

Damhuis says when he started conversations about moving to the cloud with his clients a few years ago, there was little interest in doing so. "Those same clients are now asking us to help them make the transition. I believe the Covid-19 pandemic, Microsoft, and other players in the industry are major drivers behind this."

Another compelling reason for choosing the cloud is the concept of generative design, an iterative design process that uses the full power of the cloud to compute design alternatives. "For example, if the construction team were building a complex arch, a generative design would calculate the optimum span, shape and load," explains Damhuis.

Damhuis says each job has its own information, but once construction companies start compiling information over numerous job sites, they are able to track trends on projects and make better executive decisions.

Notably, information captured by drones or video streamed from the site also allows events on site to be tracked in real time, letting people at the support office follow progress and creating a connection between people in the support office and people on site.


What’s Next in Computing? Important Trends for Startups and Investors – Entrepreneur

Opinions expressed by Entrepreneur contributors are their own.

Since technology and business evolve continuously, it is critical for startup founders, corporate executives and venture capital investors to understand the latest trends. Doing so puts them in a better position to design product or service offerings, build partnerships between startups and corporations and make financially smart investments. Let's look at the latest developments in computing, and I'll share my advice to help you navigate what's ahead.

As big data-based research and analysis grow and cloud computing evolves, I expect high-performance computing usage to surge in 2022. It will become increasingly important and more mainstream because quantum computing technology needs to keep up with business demand. Businesses across the globe use quantum computing to explore space, conduct medical research and make discoveries more effectively.

We've recently seen innovation and growth in the quantum computing space from players including Google, IBM, Microsoft, Amazon, and Alibaba. Cutting-edge startups Rigetti Computing, D-Wave Systems, ColdQuanta, 1QBit, Zapata Computing, and QC Ware have surprised industry insiders with their technology and their growth.

As we progress through 2022, we will likely see even more evolution in quantum computing, especially due to the massive demand for larger machine power. This demand will rapidly accelerate quantum computing development. I advise that startups, corporations and investors be on the lookout for large qubit developments soon. Commercial quantum computing will be within our reach, and new breakthroughs will occur rapidly.

While we once regarded artificial intelligence (AI) as a far-away notion, it's already become a key component of our lives. AI helps computers improve personalization, recommend products, control devices and rank search results. It's also integral to build improved models of the environment. Moving forward, I expect that we'll see AI in an even wider range of business and technology applications.

What's more, predictive analytics are being used to forecast potential future trends. Based on a report published by Facts & Factors, the global predictive analytics market is growing at a compound annual growth rate (CAGR) of around 24.5% and is expected to reach $22.1 billion by the end of 2026.

I also expect major growth in cloud computing this year, as it becomes more prevalent with more computing workloads running on the cloud. The growth of the Internet of Things will help push cloud computing forward. According to predictions from Gartner, global spending on cloud services is expected to reach over $482 billion in 2022, up from $314 billion in 2020.

Internet connectivity and reliability is more critical than ever as employees continue to work remotely. The Internet of Things (IoT) makes the internet an even more integral part of our lives; developments at all network levels will drive research and stimulate the internet economy.

The Cisco Visual Networking Index forecast update from 2018 projects 1.4 billion more people using the internet in 2022 than the 3.4 billion users of 2017. That equates to almost 60% of the global population, assuming a population of 8 billion in 2022. By then, internet users are expected to consume 4.8 zettabytes of data per year, 11 times the 437 exabytes of IP traffic generated in 2012.
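Those figures hang together arithmetically, as a quick illustrative check shows:

```python
# Cross-checking the Cisco VNI figures quoted above.
users_2017 = 3.4                # billions of internet users in 2017
users_2022 = users_2017 + 1.4   # forecast adds 1.4 billion -> 4.8 billion
print(users_2022 / 8.0)         # ~0.60, i.e. almost 60% of 8 billion people
print(4800 / 437)               # 4.8 ZB = 4,800 EB, ~11x the 437 EB of 2012
```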

Even though 5G is in its infancy, I anticipate that we'll see more focus on 6G in 2022. Back in 2018, China started researching 6G technology. By late 2020, the country launched a satellite to test terahertz signal transmission. Huawei and ZTE were involved. The U.S. also started 6G research in 2018; in fact, the Federal Communications Commission (FCC) opened a higher frequency spectrum for experimental use. A Next G Alliance was started in 2020, with companies on board including Apple, AT&T and Google. Japan, Korea and several European countries now take 6G seriously, and I expect that we'll see more announcements about this technology.

The growth of high-performance computing makes security and privacy even more important for businesses and consumers, so it's important to weigh both when making decisions. Several corporations, including Colonial Pipeline, Acer and the National Basketball Association, have been the target of major cyber attacks recently, and businesses have become more vulnerable to destructive attacks.

A growing number of cyber attacks will spur more counteractive measures across the board. Defending against cyber attacks will include educating individuals so they can identify and avoid network assaults, thereby safeguarding their company's image. AI can enable cybersecurity tools to spot risks and learn the behaviors of cybercriminals, thus helping prevent future attacks.

While we can't predict the future, we can make smart business decisions today by understanding the latest technology and business trends in computing. This helps businesses and investors know where to apply their efforts now and anticipate how computing will evolve in the years ahead. Prediction is never easy, but understanding the patterns of innovation in computing will help startups, and the corporations and venture capital firms that invest in them, become even more successful.


Cloud Computing in Cell Biology Market Overview with Demographic Data and Industry Growth Trends 2022-2028 Queen Anne and Mangolia News – Queen Anne…

The Global Cloud Computing in Cell Biology Market 2022 report includes precise company profiling of the leading players in the cloud computing in cell biology market. All of the segments studied in the report are analyzed based on factors such as market share, revenue, and CAGR. To better understand the market, it is very important to consider the opinions of market experts, and the report includes them. The report is also divided into segments by type and by application, each described for the forecast period 2022-2028.

The researchers have also thoroughly analyzed different regions, such as North America, Europe, and the Asia Pacific, on the basis of production, revenue, and sales in the cloud computing in cell biology market. They used advanced primary and secondary research methodologies and tools in preparing this report.

Get a Sample PDF of Cloud Computing in Cell Biology Market Report @

https://trouve360reports.com/global-cloud-computing-in-cell-biology-market-58479/?ff_landing=3&form=request-sample

Cloud Computing in Cell Biology Market Size and Growth 2022-2028:

Cloud computing provides fundamental support for addressing these challenges with shared computing resources spanning compute, storage, networking and analytical software. Progress in biomedical research is increasingly driven by insight gained through the analysis and interpretation of large and complex data sets. Recently, cloud computing has emerged as a powerful, flexible, and scalable approach to disparate computational and data-intensive problems. The report predicts the global cloud computing in cell biology market will grow from USD 1,798 million in 2021 to USD 5,830 million by 2028, a CAGR of 18.3 percent.

The report provides in-depth analysis and insights regarding the current global market scenario, latest trends and drivers into global Cloud Computing in Cell Biology market. It offers an exclusive insight into various details such as market size, key trends, competitive landscape, and growth rate and market segments. This study also provides an analysis of the impact of the COVID-19 crisis on the Cloud Computing in Cell Biology industry.

Top manufacturers of Cloud Computing in Cell Biology Market in 2022:

This industry report offers market estimates and forecasts of the global market, followed by a detailed analysis by application and region. The global market for cloud computing in cell biology can be segmented by application: discovery and preclinical research, clinical trials, pharmaceuticals manufacturing, and others. The market is further segmented by region: Asia-Pacific, Europe, North America, Middle East and Africa (MEA), and South America.

Any special requirements about this report, please let us know and we can provide custom report.

Cloud Computing in Cell Biology Market Segmentation:

By deployment mode: public cloud, private cloud, and hybrid.

Key questions answered in this report-

Table of Content:

PART 1. INTRODUCTION: Report description; Objectives of the study; Market segment; Years considered for the report; Currency; Key target audience
PART 2. METHODOLOGY
PART 3. EXECUTIVE SUMMARY
PART 4. MARKET OVERVIEW: Introduction; Drivers; Restraints; Impact of COVID-19 pandemic
PART 5. MARKET BREAKDOWN BY DEPLOYMENT MODE: Public cloud; Private cloud; Hybrid
PART 6. MARKET BREAKDOWN BY APPLICATION: Discovery and preclinical research; Clinical trials; Pharmaceuticals manufacturing; Others
PART 7. MARKET BREAKDOWN BY END USER: Biotechnology and pharmaceutical companies; Clinical laboratories; Contract research organizations (CRO)
PART 8. MARKET BREAKDOWN BY REGION: Asia Pacific; Europe; North America; Rest of the World (RoW)
PART 9. KEY COMPANIES: Accenture plc; Alphabet Inc.; Amazon Web Services, Inc.; Cisco Systems, Inc.; Dell Technologies Inc.; Huawei Technologies Co., Ltd.; International Business Machines Corporation (IBM); Microsoft Corporation; Oracle Corporation; Salesforce.com Inc.; SAP SE

Get the Cloud Computing in Cell Biology Market Report (72 pages) at USD 2,600 for a single-user license.

Enquire For Corporate Licence Copy of Cloud Computing in Cell Biology Market @

https://trouve360reports.com/global-cloud-computing-in-cell-biology-market-58479/?ff_landing=6&form=corporate-license

About Trouve 360 Reports:

The growth of your organization depends largely on how much of a grip you have on your market. An elemental part of it is market research. With end-to-end industry reports that cover a plethora of factors, your search ends here at Trouve360reports.

Contact Us

Email us: info@trouve360reports.com

Call Us: +1 (415) 655-1709

Website: https://trouve360reports.com/



Explore options for FinOps certifications and training – TechTarget

If you are looking to prove you can effectively manage and optimize cloud costs, a FinOps certification is a good place to start.

FinOps refers to a set of practices for managing operational expenses in cloud computing. Just as the term DevOps highlights collaboration between development and IT operations teams, FinOps suggests collaboration between finance and IT operations and engineering teams.

FinOps practices were developed by the FinOps Foundation, a program within the Linux Foundation.

FinOps helps teams ensure financial accountability across an organization. By using economic incentives offered by cloud providers, as well as cloud financial management strategies, the goal is to have a cost-effective cloud deployment. FinOps also aims to help organizations make careful tradeoffs between cloud costs and performance.

For IT pros who want to learn more about FinOps, or demonstrate their FinOps expertise, there are certifications and training programs available. Some come directly from the FinOps Foundation. Others, such as those from AWS and Coursera, offer training in cloud cost management and optimization more generally. While some of these courses may not explicitly mention FinOps in their titles, the concepts they cover can lend themselves to a FinOps practice.

The FinOps Foundation has a variety of options for certifications and training programs.

The self-paced FinOps Certified Practitioner course is targeted at a wide variety of IT and business roles, including cloud budget owners, cloud architects, FinOps analysts, IT program and portfolio management leaders, and financial business advisors. The course introduces FinOps fundamentals and key concepts. Recommended prerequisites include knowledge of cloud computing basics, at least one public cloud platform and pay-as-you-go consumption models.

The $599 course provides access to the following:

Alternatively, there is a virtual, instructor-led FinOps Certified Practitioner course that spans two days. The FinOps Foundation offers this course, which costs $1,500, on specific dates.

The FinOps Certified Practitioner exam costs $300. It lasts one hour and has 50 multiple-choice questions. A grade of 75% is needed to pass, and the certification is valid for two years.

The Linux Foundation also offers bundled options for the FinOps Certified Practitioner exam. For $599, individuals have access to the online course and exam. For $1,500, individuals attend an instructor-led course and can then take the exam.

This foundational course trains engineers on how to work with FinOps, finance and procurement teams to efficiently manage cloud use and costs. It's aimed primarily at software and system engineers as well as DevOps engineers and managers. While there are no prerequisites, it is a practitioner-level course that explores FinOps from the perspective of engineering teams. The $299 course provides access to the following:

The hands-on FinOps Certified Professional course is the FinOps Foundation's most advanced training option. A mix of instructor-led and self-paced learning, it requires an estimated 40 to 50 hours of work over a span of a few weeks. Prerequisites include at least six months of FinOps work experience and a FinOps Certified Practitioner certificate. The course costs $3,750.

To meet the requirements for this certification, you'll need to do the following:

The two-hour FinOps Certified Professional exam includes 100 questions. A grade of at least 75% is required to pass.

An AWS certification shows that the recipient has specific AWS knowledge, skills and capabilities. The AWS Cloud Financial Management for Builders course is specifically for individuals who want to more effectively manage and optimize their AWS cloud costs. It's aimed at developers, system admins and architects. An Architecting on AWS certificate is recommended for this course.

This intermediate-level course covers the following:

There are virtual and in-person options in various locations and languages. The three-day course costs $2,025.

Cloud Academy is an online training platform that offers certificates on a variety of topics, from cloud computing fundamentals to business management. Subscriptions include access to the content library, hands-on labs and exam preparation. They cost $39 per month, or $399 per year.

Cloud Academy offers an Optimizing Cloud Costs course that focuses on cloud financial management. Participants learn how to do the following:

The course averages 11 hours, broken down into 17 modules that include hands-on labs, courses, webinars and an exam. The free exam consists of 30 multiple-choice questions and lasts 45 minutes. You must score at least 60% to pass.

In addition to this, Cloud Academy offers numerous other courses, exams and resources on its platform.

AWS offers a course called Cloud Cost Management: Optimization Strategies through Coursera, another online training platform. This course covers how to get started with optimizing AWS costs and capacity. Set over three weeks, this six-hour, online, beginner-level course offers flexible deadlines and a certificate upon completion. The objective is to develop skills in financial management, cloud computing and cost management. Topics covered include AWS pricing basics, VM right-sizing and using the AWS Billing console to analyze spending. You can enroll for free, but the certificate costs $49.
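To give a flavor of the cost-analysis work such courses revolve around, here is a hedged boto3 sketch that pulls monthly unblended cost from the AWS Cost Explorer API. It assumes credentials with Cost Explorer access, and the date range is illustrative.

```python
# Sketch: querying monthly AWS spend via the Cost Explorer API.
import boto3

ce = boto3.client("ce", region_name="us-east-1")
resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2022-01-01", "End": "2022-04-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
)
for period in resp["ResultsByTime"]:
    amount = period["Total"]["UnblendedCost"]["Amount"]
    print(period["TimePeriod"]["Start"], amount)
```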

From practice tests to study guides, there are various ways to supplement FinOps training and prepare for a certification exam.

For $25, online course provider Udemy offers two practice tests for the FinOps Certified Practitioner exam. Each practice test has 50 questions.

The book Cloud FinOps: Collaborative, Real-Time Cloud Financial Management by J.R. Storment and Mike Fuller is an additional study resource. It uses real-life examples to help readers learn about cloud costs and financial management, two concepts you'll need to understand well to attain FinOps certification.


AMZN, GOOGL, CRM among top 5 cloud stocks to explore in May – Kalkine Media

Cloud computing has been one of the fastest-growing sectors in recent years. The trend, which was already growing, has been accelerated due to the Covid-19 pandemic. The shift towards digitalization during the period has helped the companies engaged in cloud computing.

Although the market has witnessed choppy trading in recent months due to various uncertainties, the cloud segment of the companies has seen growth. With the earnings season on track, investors are exploring opportunities in the cloud computing sector, given the sector's growth potential.

Here we explore some of the top cloud computing companies to explore in May.


Amazon.com, Inc. is one of the leading e-commerce-focused technology companies based in Seattle, Washington. It also provides cloud computing, media streaming, artificial intelligence, and other related services.

The shares of the company traded at US$2,460.09 at 2:02 pm ET on May 4, down by 1.01% from their closing price of May 3. Its stock value decreased by 27.08% YTD.

The firm has a market cap of US$1.23 trillion, a P/E ratio of 58.82, and a forward one-year P/E ratio of 95.65. Its EPS is US$41.43.
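For readers checking the figures, the trailing P/E ratio is simply the share price divided by earnings per share; a one-line check against the numbers quoted here (small differences reflect which closing price the data provider used):

```python
# Trailing P/E = share price / earnings per share (EPS).
price, eps = 2460.09, 41.43
print(round(price / eps, 2))  # 59.38, close to the quoted 58.82
```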

The 52-week highest and lowest stock prices were US$3,773.08 and US$2,367.50, respectively. Its trading volume was 3,956,668 on May 3.

The company's net sales increased by 7% YoY to US$116.4 billion in Q1 FY22. It swung to a net loss of US$3.84 billion, or US$7.56 per diluted share, from a net income of US$8.10 billion, or US$15.79 per diluted share, in Q1 FY21.


Microsoft is another leading technology firm that specializes in software, consumer electronics, personal computers, cloud computing, and other related services. It is based in Redmond, Washington.

The stock of the company traded at US$281.18 at 2:07 pm ET on May 4, down by 0.21% from its previous closing price. The MSFT stock fell 15.82% YTD.

The market cap of the company is US$2.11 trillion, the P/E ratio is 29.48, and the forward one-year P/E ratio is 30.23. Its EPS is US$9.58.

The stock saw the highest price of US$349.67 and the lowest price of US$238.07 in the last 52 weeks. Its share volume on May 3 was 25,978,610.

The company's revenue increased by 18% YoY to US$49.4 billion in Q3, FY22. On a GAAP basis, its net income came in at US$16.72 billion, or US$2.22 per diluted share, as compared to US$15.45 billion, or US$2.03 per diluted share in Q3, FY21.


Alphabet is one of the major conglomerate firms based in Mountain View, California. It offers various internet and technology-related services like search engines, cloud computing, etc., to its clients. Notably, Alphabet is the parent company of Google.

The shares of the company traded at US$2,365.91 at 2:12 pm ET on May 4, up 0.82% from their closing price of May 3. Its stock value declined 19.08% YTD.

The firm has a market cap of US$1.56 trillion, a P/E ratio of 21.53, and a forward one-year P/E ratio of 20.84. Its EPS is US$110.56.

The 52-week highest and lowest stock prices were US$3,030.93 and US$2,193.62, respectively. Its trading volume was 1,248,368 on May 3.

The company's revenue surged 23% YoY to US$68.01 billion in Q1, FY22. Its net income came in at US$16.43 billion, or US$24.62 per diluted share, as compared to US$17.93 billion, or US$26.29 per diluted share in the year-ago quarter.


Salesforce is a cloud-based software firm based in San Francisco, California. It provides customer relationship management (CRM) software and tools to its clients to help them in their sales, customer service, analytics, etc.

The stock of the company traded at US$177.37 at 2:19 pm ET on May 4, down 0.56% from its previous closing price. The CRM stock plunged 30.18% YTD.

The market cap of the company is US$178.27 billion, the P/E ratio is 119.58, and the forward one-year P/E ratio is 84.93. Its EPS is US$1.50.

The stock saw the highest price of US$311.75 and the lowest price of US$167.55 in the last 52 weeks. Its share volume on May 3 was 5,047,686.

The company's revenue increased by 26% YoY to US$7.33 billion in Q4, FY22, while its net loss came in at US$28 million, or US$0.03 per diluted share. For fiscal 2022, the company's revenue surged 25% YoY to US$26.49 billion.

Meanwhile, it now expects its revenue to be between US$32.00 billion and US$32.1 billion in fiscal 2023.


Adobe is a software company that provides a wide range of services and products for professionals, marketers, app developers, and other related sectors. It is based in San Jose, California.

The shares of the company traded at US$404.01 at 2:39 pm ET on May 4, down 0.88% from their closing price of May 3. Its stock value decreased by 27.78% YTD.

The firm has a market cap of US$191.72 billion, a P/E ratio of 40.3, and a forward one-year P/E ratio of 36.75. Its EPS is US$10.07.

The 52-week highest and lowest stock prices were US$699.54 and US$394.04, respectively. Its trading volume was 2,196,257 on May 3.

The company's revenue surged 9% YoY to US$4.26 billion in Q1, FY22. Its net income came in at US$1.266 billion, or US$2.66 per diluted share, as compared to US$1.261 billion, or US$2.61 per diluted share in Q1, FY21.


Although experts anticipate that the cloud computing sector will maintain steady growth in the coming days, recent macroeconomic factors have forced investors to keep away from these growth stocks. Meanwhile, the S&P 500 technology sector declined 17.48% YTD after witnessing strong momentum in the previous year. On the other hand, the tech-heavy Nasdaq Composite dropped 20.52% YTD.

Hence, investors should closely evaluate the companies before investing in their stocks.


How can we be sure machine learning is accurate? – University of Rochester

May 2, 2022

Scientists rely increasingly on models trained with machine learning to provide solutions to complex problems. But how do we know the solutions are trustworthy when the complex algorithms the models use are not easily interrogated or able to explain their decisions to humans?

That trust is especially crucial in drug discovery, for example, where machine learning is used to sort through millions of potentially toxic compounds to determine which might be safe candidates for pharmaceutical drugs.

"There have been some high-profile accidents in computer science where a model could predict things quite well, but the predictions weren't based on anything meaningful," says Andrew White, associate professor of chemical engineering at the University of Rochester, in an interview with Chemistry World.

White and his lab have developed a new counterfactual method, described in Chemical Science, that can be used with any molecular structure-based machine learning model to better understand how the model arrived at a conclusion.

"Counterfactuals can tell researchers the smallest change to the features that would alter the prediction," says lead author Geemi Wellawatte, a PhD student in White's lab. "In other words, a counterfactual is an example as close to the original as possible, but with a different outcome."

Counterfactuals can help researchers quickly pinpoint why a model made a prediction, and whether it is valid.

The paper identifies three examples of how the new method, called MMACE (Molecular Model Agnostic Counterfactual Explanations), can be used to explain model predictions.
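The core search can be sketched in a few lines. This is an illustration of the counterfactual idea rather than the authors' actual code: `model_predict` and `similarity` are hypothetical stand-ins for a trained property predictor and a molecular-similarity metric (such as Tanimoto similarity over fingerprints), and the candidates would come from a generator of similar molecules.

```python
def find_counterfactual(original, candidates, model_predict, similarity):
    """Return the candidate most similar to `original` whose prediction flips."""
    base_label = model_predict(original)
    # A counterfactual is a similar molecule with a different predicted outcome.
    flipped = [c for c in candidates if model_predict(c) != base_label]
    if not flipped:
        return None  # no counterfactual found among the candidates
    # The closest flipped candidate is the most informative explanation.
    return max(flipped, key=lambda c: similarity(original, c))
```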

The lab had to overcome some major challenges in developing MMACE. They needed a method that could be adapted for the wide array of machine-learning methods that are used in chemistry. In addition, searching for the most-similar molecule for any given scenario was also challenging because of the sheer number of possible candidate molecules.

Coauthor Aditi Seshadri in White's lab helped solve that problem by suggesting the group adapt the STONED (Superfast traversal, optimization, novelty, exploration, and discovery) algorithm developed at the University of Toronto. STONED efficiently generates similar molecules, the fuel for counterfactual generation. Seshadri is an undergraduate researcher in White's lab and was able to help on the project via a Rochester summer research program called Discover.

White says his team is continuing to improve MMACE, for example by trying other databases in the search for the most similar molecules and by refining the definition of molecular similarity.

The project was supported by grants from the National Science Foundation and the National Institute of General Medical Sciences of the National Institutes of Health. The University of Rochester Center for Integrated Research Computing (CIRC) provided computational resources and technical support.



Rapid Adaptation of Deep Learning Teaches Drones to Survive Any Weather – Caltech

To be truly useful, drones (that is, autonomous flying vehicles) will need to learn to navigate real-world weather and wind conditions.

Right now, drones are either flown under controlled conditions, with no wind, or are operated by humans using remote controls. Drones have been taught to fly in formation in the open skies, but those flights are usually conducted under ideal conditions and circumstances.

However, for drones to autonomously perform necessary but quotidian tasks, such as delivering packages or airlifting injured drivers from a traffic accident, drones must be able to adapt to wind conditions in real time, rolling with the punches, meteorologically speaking.

To face this challenge, a team of engineers from Caltech has developed Neural-Fly, a deep-learning method that can help drones cope with new and unknown wind conditions in real time just by updating a few key parameters.

Neural-Fly is described in a study published on May 4 in Science Robotics. The corresponding author is Soon-Jo Chung, Bren Professor of Aerospace and Control and Dynamical Systems and Jet Propulsion Laboratory Research Scientist. Caltech graduate students Michael O'Connell (MS '18) and Guanya Shi are the co-first authors.

Neural-Fly was tested at Caltech's Center for Autonomous Systems and Technologies (CAST) using its Real Weather Wind Tunnel, a custom 10-foot-by-10-foot array of more than 1,200 tiny computer-controlled fans that allows engineers to simulate everything from a light gust to a gale.

"The issue is that the direct and specific effect of various wind conditions on aircraft dynamics, performance, and stability cannot be accurately characterized as a simple mathematical model," Chung says. "Rather than try to qualify and quantify each and every effect of turbulent and unpredictable wind conditions we often experience in air travel, we instead employ a combined approach of deep learning and adaptive control that allows the aircraft to learn from previous experiences and adapt to new conditions on the fly with stability and robustness guarantees."

Time-lapse photo shows a drone equipped with Neural-Fly maintaining a figure-eight course amid stiff winds at Caltech's Real Weather Wind Tunnel.

O'Connell adds: "We have many different models derived from fluid mechanics, but achieving the right model fidelity and tuning that model for each vehicle, wind condition, and operating mode is challenging. On the other hand, existing machine learning methods require huge amounts of data to train yet do not match state-of-the-art flight performance achieved using classical physics-based methods. Moreover, adapting an entire deep neural network in real time is a huge, if not currently impossible task."

Neural-Fly, the researchers say, gets around these challenges by using a so-called separation strategy, through which only a few parameters of the neural network must be updated in real time.

"This is achieved with our new meta-learning algorithm, which pre-trains the neural network so that only these key parameters need to be updated to effectively capture the changing environment," Shi says.

After obtaining as little as 12 minutes of flying data, autonomous quadrotor drones equipped with Neural-Fly learn how to respond to strong winds so well that their performance significantly improves, as measured by their ability to precisely follow a flight path. The error rate in following that flight path is around 2.5 to 4 times smaller than that of current state-of-the-art drones equipped with similar adaptive control algorithms that identify and respond to aerodynamic effects but do not use deep neural networks.

Out of the lab and into the sky: engineers test Neural-Fly in the open air on Caltech's campus

Neural-Fly, which was developed in collaboration with Caltech's Yisong Yue, Professor of Computing and Mathematical Sciences, and Anima Anandkumar, Bren Professor of Computing and Mathematical Sciences, is based on earlier systems known as Neural-Lander and Neural-Swarm. Neural-Lander also used a deep-learning method to track the position and speed of the drone as it landed and modify its landing trajectory and rotor speed to compensate for the rotors' backwash from the ground and achieve the smoothest possible landing; Neural-Swarm taught drones to fly autonomously in close proximity to each other.

Though landing might seem more complex than flying, Neural-Fly, unlike the earlier systems, can learn in real time. As such, it can respond to changes in wind on the fly, and it does not require tweaking after the fact. Neural-Fly performed as well in flight tests conducted outside the CAST facility as it did in the wind tunnel. Further, the team has shown that flight data gathered by an individual drone can be transferred to another drone, building a pool of knowledge for autonomous vehicles.

(L to R) Guanya Shi, Soon-Jo Chung, and Michael O'Connell, in front of the wall of fans at Caltech's Center for Autonomous Systems and Technologies

At the CAST Real Weather Wind Tunnel, test drones were tasked with flying in a pre-described figure-eight pattern while they were blasted with winds up to 12.1 meters per second (roughly 27 miles per hour), or a six on the Beaufort scale of wind speeds. This is classified as a "strong breeze" in which it would be difficult to use an umbrella. It ranks just below a "moderate gale," in which it would be difficult to move and whole trees would be swaying. This wind speed is twice as fast as the speeds encountered by the drone during neural network training, which suggests Neural-Fly could extrapolate and generalize well to unseen and harsher weather.

The drones were equipped with a standard, off-the-shelf flight control computer that is commonly used by the drone research and hobbyist community. Neural-Fly was implemented in an onboard Raspberry Pi 4 computer that is the size of a credit card and retails for around $20.

The Science Robotics paper is titled "Neural-Fly Enables Rapid Learning for Agile Flight in Strong Winds." Coauthors include Anandkumar and Yue, as well as Xichen Shi (PhD '21), and former Caltech postdoc Kamyar Azizzadenesheli, now an assistant professor of computer science at Purdue University. Funding for this research came from the Defense Advanced Research Projects Agency (DARPA) and Raytheon.


The race to digitization in logistics through machine learning – FreightWaves

A recent Forbes article highlighted the importance of increasing digital transformation in logistics and argued that many tech leaders should be adopting tech-forward thinking, execution and delivery in order to deliver with speed and keep a laser focus on the customer.

Since the COVID-19 pandemic, and even before, many logistics companies have been turning to technology to streamline their processes. For many, full digitization across the supply chain is the ultimate goal.

Despite many already taking steps toward advancing digitization efforts across supply chains, these processes are still fragmented due to all the moving parts and sectors of the industry such as integrators, forwarders and owners and the processes they each use.

Scale AI is partnering with companies in the logistics industry to better automate processes across the board and eliminate bottlenecks by simplifying integration, commercial invoicing, document processing and more through machine learning (ML).

ML is a subfield of artificial intelligence that allows applications to predict outcomes without having to be specifically programmed to do so.

The logistics industry has historically depended on mountains of paperwork, and this remains a bottleneck today. Many companies already use technologies like optical character recognition (OCR) or template-based intelligent document processing (IDP). Both are substandard systems that can process raw data but require human key entry, or engineers creating and maintaining templates, to make the data usable. This is costly and cannot be scaled easily. In a world where end users expect results instantly and at high quality, these methods take too long while providing low accuracy.

"In the industry of logistics, it is a race to digitization to create a competitive edge," said Melisa Tokmak, General Manager of Document AI at Scale. "Trying to use regular methods that require templates and heavily rely on manual key entry is not providing a good customer experience or accurate data quickly. This is making companies lose customer trust while missing out on the ROI machine learning can give them easily."

Scale's mission is to accelerate the development of artificial intelligence.

Scale builds ML models and fine-tunes them for customers using a small sample of their documents. It's this method that removes the need for templates and allows all documents to be processed accurately within seconds, without human intervention. Tokmak believes that the logistics industry needs this type of technology now more than ever.

"In the market right now, every consumer wants things faster, better and cheaper. It is essential for logistics companies to be able to serve the end user better, faster and cheaper. That means meeting [the end users] where they are," Tokmak said. "This change is already happening, so the question is how can you as a company do this faster than others, so that you are early in building a competitive edge?"

Rather than simply learning where on a document to find a field, Scale's ML models are capable of understanding the layout, hierarchy and meaning of every field of the document.

Document AI is also resilient to layout changes, shifting table boundaries and other irregularities that trip up traditional template-based systems.

Tokmak believes that because current OCR and IDP technology is not getting companies in the industry the results they need, the next step is partnering with companies like Scale to incorporate ML into their processes. Adopting this technology, Tokmak added, helps companies know more about the market and gain visibility on global trade, which in turn supports building new, relevant tech.

Flexport, a recognizable name in the logistics industry and customer of Scale AI, is what is referred to as a digital forwarder. Digital forwarders are companies that digitally help customers through the whole shipment process without owning anything themselves. They function as a tech platform to make global trade easy, looking end to end to bring both sides of the marketplace together and ship more easily.

Before integrating an ML solution, Flexport struggled to make more traditional means of data extraction, like template-based and error-prone OCR, work. Knowing its expertise was in logistics, Flexport partnered with Scale AI, an expert in ML, to reach its mission of making global trade easy and accessible for everyone more quickly, efficiently and accurately. Now Flexport prides itself on its ability to process information quickly and without human intervention.

As the supply chain crisis worsened, Flexport's needs evolved. It became increasingly important for Flexport to extract estimated times of arrival (ETAs) to provide end users more visibility. Scale's Document AI solution accommodated these changing requirements by retraining the ML models to extract additional fields in seconds, without templates, from unstructured documents, providing more visibility on global trade at a time when many were struggling to get this level of insight at all.

According to a recent case study, Flexport has more than 95% accuracy with no templates and a less than 60-second turnaround since partnering with Scale.

Tokmak believes that in the future, companies ideally should have technology that functions as a knowledge graph, a graph that represents entities such as objects, events, situations or concepts and the relationships among them, to make business decisions accurately and fast. As it pertains to logistics, Tokmak describes a "global trade knowledge graph" that would show where things are coming from and going to and how they are moving, with sensors all coming together to deliver users the best experience in the fastest way possible.

Realistically this will take time to fully incorporate and will require partnership from the logistics companies. "The trick to enabling this future is starting with what will bring the best ROI and what will help your company find the easiest way to build new cutting-edge products immediately," Tokmak said. "There is a lot ML can achieve in this area without being very hard to adopt. Document processing is one of them: a problem not solved with existing methods, but one that can be solved with machine learning. It is a high-value area, with the benefits of reducing costs, reducing delays and bringing one source of truth for organizations within the company to operate with."

Tokmak stated that many in the industry had been disappointed with previous methods and were afraid to switch to ML for fear of the same disappointment, but that has changed quickly in the last few years. Companies now understand that ML is different, and they need to get on this train fast to actualize the gains from the technology.

"It is so important to show people the power of ML and how every industry is getting reshaped with ML," Tokmak said. "The first adopters are the winners."
