
Looking into the crystal ball: Tech predictions for 2021 – ITProPortal

2020 was a year of evolution for many sectors of the tech industry due to the reality of the Covid-19 pandemic. Many tech leaders had to pivot their strategies, shifting to a largely remote-work model and navigating markets reshaped by unforeseen economic disruption.

The tech industry has changed substantially this year, and as a result many experts are predicting new or evolving trends for 2021. Below, several tech experts highlight their top predictions for the technology industry in the new year.

Krishna Subramanian, president and COO at Komprise:

In 2021, cloud storage costs will begin to overtake compute costs. For the past three years, cloud cost optimization has been a key priority for businesses. In fact, Gartner predicted that 80 percent of businesses would outspend their cloud budgets in 2020. The bulk of these costs so far has been in compute, since cloud object storage is relatively cost effective. But this is changing: cloud file storage is typically ten times more expensive than S3, and file data is far more voluminous than block data, all of which underscores the importance of using cloud file storage only when you need it. In 2021, enterprise IT organizations will begin adopting cloud data management solutions to understand how cloud data is growing and to manage its lifecycle efficiently across the various cloud file and object storage options.
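To make the storage economics concrete, here is a minimal sketch of how tiering cold file data to object storage changes a monthly bill. The per-GB prices are purely hypothetical; the only figure taken from the text is the roughly ten-to-one cost ratio between cloud file storage and object storage.

```python
# Illustrative sketch with hypothetical prices (not real vendor rates):
# compare the monthly cost of keeping rarely accessed file data on cloud
# file storage versus tiering it down to cheaper object storage.

FILE_STORAGE_PER_GB = 0.30    # assumed $/GB-month for cloud file storage
OBJECT_STORAGE_PER_GB = 0.03  # assumed $/GB-month for object storage (~10x cheaper)

def monthly_cost(total_gb: float, cold_fraction: float, tiered: bool) -> float:
    """Monthly bill with or without tiering the cold fraction to object storage."""
    cold_gb = total_gb * cold_fraction
    hot_gb = total_gb - cold_gb
    if tiered:
        return hot_gb * FILE_STORAGE_PER_GB + cold_gb * OBJECT_STORAGE_PER_GB
    return total_gb * FILE_STORAGE_PER_GB

# 100 TB of file data, 80% of it rarely accessed ("cold").
total, cold = 100_000, 0.8
before = monthly_cost(total, cold, tiered=False)
after = monthly_cost(total, cold, tiered=True)
print(f"without tiering: ${before:,.0f}/month, with tiering: ${after:,.0f}/month")
```

This is the kind of analysis a cloud data management tool automates: identify the cold fraction, then move it to the cheaper tier.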

Anshu Sharma, CEO and co-founder, Skyflow:

Every large company is on a long-term transition to digital and cloud, so they can compete effectively against the likes of Amazon and the Silicon Valley startups that are aiming for them. Companies like Nike, JP Morgan Chase, and Walgreens have been trying to transform, and in 2020 they all got an unintended boost in their push to the cloud, because they had to. The Fortune 500, having seen relative success with cloud and digital, are not going back. They are all doubling down. 2021 will be the year of the digital double down.

Patrick Harr, CEO, SlashNext:

Over the last 30 days, 10 percent of company users were phished, according to live data we compiled across more than 100 large and mid-sized enterprises. Every day, SlashNext Threat Labs detects 21,000 new phishing attacks, almost double the number of threats from a year ago, and we are seeing an alarming 50-75 percent of attacks getting past conventional phishing defenses to compromise enterprise networks. So, if you think your current defenses will keep you safe, think again. And in 2021, we anticipate this problem will get much, much worse.

Stowe Boyd, analyst, Gigaom:

The pandemic has accelerated the adoption of technologies that were popular before, but which are now essential. One example has been the combination of work chat tools and video conferencing, as typified by Microsoft Teams and Slack. Microsoft has seen a dramatic uptick in usage, and the release of Google's new take on the former G Suite, now known as Google Workspace, which also integrates work chat and video conferencing, represents another challenge for Slack. As the two leaders in what we might think of as 'business operating systems,' Google and Microsoft present a difficult challenge for Slack, since companies will not want to pay extra for functionality they already have access to in their communications and file storage platforms.

Saad Siddiqui, principal, Telstra Ventures:

The unrelenting pace of open source innovation will continue in 2021, particularly in the areas of data analytics and data infrastructure, leaving many Fortune 1000 businesses and other SMBs struggling to keep pace with, and integrate with, open source innovations. Unlike large tech players, who can afford to hire an army of engineers to constantly change things and add their own herbs and spices to improve their operations, most businesses and newer vendors don't have as many talented engineers or can't hire more engineers to keep pace with change. To address this talent wall, we're seeing the rise of a hybrid open source business model, whereby open source data analytics companies like Incorta and infrastructure companies like Rancher Labs monetize closed source, out-of-the-box capabilities that deliver open source innovations while requiring less time and resources for enterprises to derive value.

Cornelia Davis, CTO, Weaveworks:

2021 will see the emergence of common distributed operational patterns widely implemented across all industries, and with the coming of 5G and the edge, this could not be timelier. There are several signals that foreshadow this, such as the adoption of GitOps as a de facto standard and best practice for operating Kubernetes and its workloads. IT systems will not only enjoy greater resilience and security, but having GitOps in place also lays the foundation for massive scalability and growth.

Yiannis Antoniou, analyst, Gigaom:

Responsible AI / ML will become the hottest topic in the cloud ML industry. Given society's increased emphasis on combatting unfairness and bias, and the overall interest in better interpretability and explainability of machine learning models, cloud providers will invest in and enhance their ML offerings to provide a full suite of responsible ML / AI capabilities that aim to satisfy and reassure regulators, modelers, management and the market on the fair use of ML. Meanwhile, AI / ML will continue to see explosive growth and usage across the whole industry, with significant enhancements in ease-of-use and UX combining within a responsible AI / ML framework to drive the next growth spurt of this sector.


How can the cloud industry adapt to a post-COVID world? – IT PRO

One of the unexpected silver linings to the global coronavirus crisis has been the rapid growth the cloud industry has enjoyed. The shift to remote working during the various lockdowns that have taken place over the course of 2020 was largely, if not entirely, facilitated by cloud services. This has meant that while other sectors have struggled and there has been an overall economic downturn, cloud companies have performed relatively well financially.

Although they wouldn't want to characterise the past few months as profiting from the pandemic, the likes of Zoom and Microsoft Teams have surged in usage and revenue, with the latter surpassing 44 million users as early as March. This period has also accelerated many digital transformation projects, with engineers proving more than capable of carrying out projects at pace and scale, even in the traditionally lethargic public sector. This success, however, has been driven entirely by the effects of the pandemic, forcing the industry to question whether, and how, it can adapt once its services are no longer as highly sought after.

While we all rejoiced at the news that a potential COVID-19 vaccine may be available for distribution before the end of the year, shares in a handful of companies dropped sharply in response, including a reduction of at least 15% in the valuation of Zoom.

Whether things go back to the way they were, or cloud companies continue to play a more pivotal role than ever, is yet to be determined. For independent cloud consultant Danielle Royston, the goal of going back to normality in 2021 is misplaced. There's no point wasting time and energy trying to return to the halcyon days of pre-COVID, she says. Let's focus instead on some of the positive disruptions we've seen this year. In all the companies I've been at, I've promoted and in some cases fully converted to remote working. I saw this as the inevitable direction that work and society were going, as the cloud computing tools were already there. And it makes sense: a better quality of life for employees, ease of collaboration, cutting the costs of business travel.

This is a trend that Tom Wrenn, cloud investment expert and partner at private equity firm ECI Partners, predicts will continue well into next year, telling ITPro that COVID-19 forced many companies into rapidly adopting cloud-based operations. These shifts, driven by government-enforced lockdowns, allowed them to continue operating remotely. Now, having made a basic shift to cloud-based systems, he adds, 2021 will be the year of full cloud adoption, with businesses starting to optimise all of its benefits, such as data analytics and AI. If rapid investment was needed in 2020, next year businesses will want to see a return on that investment and will expect to see more from their cloud computing providers.

Although the recent transition to remote working is a trend sparked by COVID-19, the consensus is that it's the beginning of a wider cultural shift. Former IBM boss Ginni Rometty is among the latest to suggest as much, claiming mass remote working will continue in some form as part of a broader hybrid model in future. This may involve companies keeping some physical presence while establishing the infrastructure and equipment to allow workers to work remotely as and when desired.

Cisco CTO for UK and Ireland, Chintan Patel, agrees, telling IT Pro that remote working gained widespread acceptance during COVID-19, even in organisations where it was unthinkable before. This means cloud and software as a service (SaaS) tools will remain a crucial part of many setups, even though businesses will mostly return to a form of hybrid model. For remote working, cloud plays a central role; think secure cloud-based collaboration, accessing cloud-based business applications, and extending the security perimeter to thousands of devices, he explains. It's important to note, though, that cloud-based consumption models are not limited to remote working. As for those returning to offices, we see that technology can help make the workplace more secure and efficient. As and when companies prepare for a return to the office, they also need to optimise their space, address worker concerns about sanitation and social distancing, and plan how to communicate policies and information clearly.

Technology will play a major part in instigating the changes needed in future, with a key role to play for many of the firms that have enjoyed success during the pandemic. While demand for software such as video conferencing platforms may not be as sky-high as it was at the beginning of the pandemic, Wrenn argues the next big step is how cloud companies can eat further into the market share enjoyed by the traditional telephone industry. More and more businesses are using Microsoft Teams or Zoom to interact, he explains, where previously they would have used conference lines or even called a person directly, simply because it is more convenient. Cloud providers need to think about how they can make the most of this opportunity as the way in which people interact changes.

To some extent, we should all consider ourselves lucky the global pandemic happened when it did, given that cloud computing has only recently become as advanced as it is now. Thus, rather than profiting from the pandemic, this period has been the making of the industry. After all, cloud storage, processing, and compute facilities are already set up, and ready to expand easily and automatically as and when enterprises need, according to Royston, who claims this wouldn't have been the case ten to 15 years ago. It would've been an epic failure and caused even more disruption and long-term damage to global economies. This year, white-collar workers being able to quickly adapt to working from home in their millions is part of what's helped many sectors stay afloat. And it's because of the investment and ongoing work of hyperscalers over the past few years that businesses can support workers in doing this.

Connectivity, too, will continue to grow as organisations' reliance on SaaS tools increases, Patel adds, with firms expecting more from these companies beyond provision. With cloud infrastructures becoming increasingly diverse, especially as applications add more layers of complexity, businesses will be looking to strengthen their infrastructure. This will be achieved by gaining deeper visibility across their IT estates, ensuring workloads have continuous access to required resources, and running systems that connect and protect at scale, from on-prem to hybrid cloud configurations. This is in addition to using technologies such as machine learning to give customers tools to manage their ever-growing data lakes. This is where providers can step in to guide customers on their migration journeys.

As such, the greatest challenge facing cloud providers, in light of the above, will largely be customer retention, according to Tom Wrenn. If we take online meeting services as an example, historically businesses would have had to invest in a service, such as [Cisco] WebEx, which is often costly and comes with a lot of equipment, he says. Today, however, businesses are using Zoom and Teams for this and can just turn services on and off with little upfront investment. This means that customers aren't locked into providers in the way they once were. As a result, cloud computing providers will need to over-deliver for their clients, retaining a high level of customer service as well as ensuring that service levels don't decline as they undergo a huge period of growth.


The National Institute for Health Research on connecting through cloud to fight Covid-19 – ComputerWeekly.com

Working in partnership with the National Health Service, and funded by the Department of Health and Social Care (DHSC), the National Institute for Health Research (NIHR) collaborates with universities, local governments, research teams and the general public to carry out life-changing medical research projects.

The NIHR is one of those hidden gems inside the NHS, Justin Riordan-Jones, head of systems and information at the DHSC, tells Computer Weekly. And our job usually means people's lives get better in about 10 years' time, as a result of the work we do.

This year, though, the organisation has been actively working to improve the lives of the nation in a much shorter timeframe, as the NIHR and its stakeholders have worked tirelessly to help bring the Covid-19 coronavirus pandemic under control and help save lives.

The past 10 months have really highlighted just how important and vital research is to the health and wealth of the nation, and we were delighted to be able to be part of rolling out the vaccines, rolling out research and all that good stuff, he says.

The onset of the pandemic in early 2020 prompted a rapid shift in priorities at the NIHR, as it set about coordinating 50 urgent public health studies into the effects of Covid-19, including two focusing on possible vaccines.

The overarching aim of this work was to gather as much clinical and epidemiological evidence as possible to inform the UK government's response to the pandemic, while supporting efforts to create new diagnostic systems, treatments and vaccines to curb the spread of the novel coronavirus.

Research initiatives on this kind of scale typically take months, even years, to get off the ground, but the NIHR was able to do so this time around in a matter of weeks, with the help of its Google Cloud-based Digital Hub.

Described by Riordan-Jones as the fundamental backbone of the NIHR's operations, the Digital Hub provides the organisation's 8,000 employees with access to Google's portfolio of cloud-based communication, collaboration and productivity tools, formerly known as G Suite.

The setup was originally devised in 2014 to bring a little more order to the way the NIHR worked, both in-house and with its external research partners and stakeholders, by replacing the patchwork of data repositories and collaboration tools they relied on to work together.

We wanted a solution that would empower us to operate as a single corporate entity over multiple locations, multiple platforms and multiple scenarios, says Riordan-Jones. Our [previous] system worked in the beginning, but the lack of consistency was starting to slow down our progress.

Working with technology consultancy PA Consulting, the NIHR embarked on finding a suitable replacement for this patchwork of productivity tools, before deciding to press ahead with the deployment of Google G Suite, which has since been rebranded as Google Workspace.

Fundamentally, choosing Google came down to its development pathway, he says, following a market evaluation of four similar products, which resulted in Google Workspace emerging as a front runner along with one other.

At that stage, there was very little functionality difference between them. What was hugely different was peoples acceptance of how well [the two systems] worked, and the ease of operation so they could migrate swiftly [to Google], he adds.

But it was the fact we could already see a development pathway for two to three years at that point, which we knew would be beneficial to us. So it was the vision and clarity of where Google was going with the product that made us choose it.

The Google Workspace deployment provided the organisations employees with access to their own corporate email addresses for the first time via Gmail, as well as video-conferencing tools in the form of Google Meet, and cloud-based document storage and collaboration through Google Drive.

Rather than having to host multiple systems in different places and locations, and a [heterogeneous] technology stack that we're forever running around trying to keep up to date, we have something in place now that is much more compartmentalised and much easier to maintain, says Riordan-Jones.

The setup has also served to give its employees a greater sense of corporate identity, as well as make the productivity portion of its IT estate easier to manage and control.

We have seen a much greater corporate approach than we ever had before, so people now act as if they are the NIHR and not body x inside the NIHR, and we have reassurance and know that the hub is behaving the way we want it to, he continues.

We also know we are benefiting from the levels of security that Google wraps around it, and we have greater control and visibility over how its working [compared with the previous system].

Relying on a single system to fulfil its collaboration and productivity requirements has unlocked sizeable cost savings for the NIHR as well.

When we rolled this out in 2014-15, we were the biggest public sector implementation of G Suite technology at that stage, although we've since been overtaken because we've proved how well it works, says Riordan-Jones.

Over the years, [that has unlocked] 10-15m pounds' worth of savings, which has gone into research rather than technology endeavours.

Even so, six years is a long time in tech, and the Digital Hub has been subject to numerous tweaks to its functionality in that time, based on user feedback and the NIHR's wider organisational goals, says Riordan-Jones.

We don't sit there and say this is the definitive version, he says. We are constantly looking at how it needs to be improved, based on the feedback we get from the users and [our] interpretation of where we need to be going for our digital strategy.

The latest iteration of the Digital Hub has been in place since March 2020, with its original functionality bolstered by the inclusion of Google Cloud Search, which integrates with Google Workspace to make it faster and easier for users to locate data stored within its entire infrastructure.

The NIHR's workforce is distributed across various offices, universities and hospitals in the UK, which had led to some data being unintentionally siloed and isolated within these locations, making it inaccessible to some of the employees who needed it.

Over the course of 16 weeks, this data sharing pain point was addressed through a Google Cloud Search-focused redesign of the Digital Hub, which has been instrumental in enabling the NIHR to rapidly refocus and coordinate its Covid-19 research efforts.

The long-standing productivity and collaboration functionality of the Digital Hub has also come into its own during the pandemic, as employees have grappled with working remotely.

In fact, its employees have taken to this new way of working like ducks to water, with the NIHR reporting a 379% uptick in the use of Googles cloud-based video-conferencing service Meet during the first two months of the first UK lockdown. Furthermore, use of Google Drive, the search giants online storage and document collaboration offering, was up 198%.

With the reworked Digital Hub now firmly embedded in the NIHR, Riordan-Jones says the scene is set for the organisation to push its digital ambitions further than ever before, following the appointment of John Nother as its chief digital officer in March 2020.

One of Nother's top priorities since joining the organisation has been to set out a five-year digital strategy for the NIHR, which is geared towards streamlining data sharing processes within the organisation through the adoption of what Riordan-Jones describes as a do once and share approach.

As an example, Riordan-Jones cites the process research organisations have to go through to secure ethical approval from the Health Research Authority before they can start work, which requires them to submit details about the nature of the project they want to embark on.

So you tell this organisation all that information and then you come to the NIHR which helps you run that research inside the NHS, and we ask for all that same information again, he says.

It is very simple to say we should be sharing that data, but unfortunately, due to legal considerations and other factors, it is not easy to do so at the moment. But were working our way through it, so that somebody who arrives at the beginning of the research journey only gets asked the absolute questions that need to be asked, rather than having to repeat themselves over and over, he adds.

All this repetition slows down the pace at which the NIHR and its stakeholders are able to work, but as the organisation's rapid response to the pandemic has shown, it is possible to modify such processes to get where it needs to be faster.

Because of the way the response to the pandemic was organised, we were able to modify processes in some respects to do that. And we have learned a lot as a result about what we can do, how we can do it, and where we can do it, says Riordan-Jones.

Hopefully, we're going to take some of the lessons learned from this exercise now and roll those forward because we have proof, even in unfortunate circumstances, that it works and we can roll that forward.


Zoom recordings will be deleted after six months – The Daily Evergreen

Users can turn on a setting to notify them seven days before a recording is permanently deleted

LAUREN PETTIT

Before this change, the recordings were stored for nine months. Now, they are saved for six months to decrease the number of Zoom recordings stored in the WSU cloud.

WSU Zoom recordings stored in the cloud will be deleted if they are more than six months old.

Prior to this new policy, Zoom recordings were deleted after nine months, said Corey Oglesby, communications specialist and trainer for WSU Information Technology Services. The new policy went into effect Dec. 8.

Oglesby said the six-month timeline was necessary to help decrease the number of Zoom recordings stored in the WSU cloud storage space. The university was approaching the maximum number of recordings that can be stored in the cloud.

Zoom recordings dramatically increased in response to COVID-19, when we all went online in March, he said.

In March, the number of newly registered WSU Zoom users increased from 2,500 to 31,000, Oglesby said.

That's a crazy overnight increase, he said. Suddenly there were 30-40,000 meetings taking place every week, and a lot of times there were more than 600 meetings happening at the same time.

Oglesby said after recordings are deleted, they are moved to the user's trash bin in their Zoom account. Users then have 30 days to retrieve and download those recordings to their computers before they are permanently deleted.

Users can turn on a setting in Zoom that will notify them seven days before their cloud recordings are deleted from the trash bin, he said.

Zoom users can choose to be notified in the Email Notifications section in the settings for meetings, according to the WSU Zoom support team's self-help article.
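The retention schedule described above can be sketched as a small timeline calculation. This is illustrative only: it approximates a month as 30 days, and the actual scheduler used by WSU or Zoom may compute dates differently.

```python
from datetime import date, timedelta

# Sketch of the retention timeline described in the article (months
# approximated as 30-day periods; the real scheduler may differ).
RETENTION_DAYS = 6 * 30   # recording kept in WSU cloud storage
TRASH_DAYS = 30           # still recoverable from the trash bin
NOTICE_DAYS = 7           # optional email warning before permanent deletion

def retention_timeline(recorded: date) -> dict:
    """Key dates in a recording's lifecycle, given its recording date."""
    moved_to_trash = recorded + timedelta(days=RETENTION_DAYS)
    permanently_deleted = moved_to_trash + timedelta(days=TRASH_DAYS)
    notified = permanently_deleted - timedelta(days=NOTICE_DAYS)
    return {
        "moved_to_trash": moved_to_trash,
        "notified": notified,
        "permanently_deleted": permanently_deleted,
    }

# A recording made the day the policy took effect:
for event, when in retention_timeline(date(2020, 12, 8)).items():
    print(event, when.isoformat())
```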


Global Trend Expected to Guide Private Cloud Storage Market from 2020-2026: Growth Analysis by Manufacturers, Regions, Type and Application – Murphy’s…

The Private Cloud Storage Market report comprises a competitive analysis with a focus on key players and participants of the Private Cloud Storage industry, covering in-depth data related to the competitive landscape, positioning, company profiles, key strategies adopted, and product profiling, with a focus on market growth and potential.

The report includes a concise overview of the product or service, along with the various trends and factors affecting the Private Cloud Storage market. These variables have helped determine the behaviour of the market during the forecast period and enabled our specialists to make precise predictions about the market's future.

The report covers a forecast and an analysis of the Private Cloud Storage Market on a global and regional level. Historic data is given for 2013-2019 and the forecast period is 2020-2026, based on revenue.

The Private Cloud Storage market was valued at US$ 6,603.3 Mn in 2018 and is expected to reach US$ XX Mn by 2026, at a CAGR of 16% throughout 2020-2026.

Request for a Sample Copy of Private Cloud Storage Market Report @ https://www.alltheresearch.com/sample-request/412

Competitive Landscape Covered in the Private Cloud Storage Market Report:

This report includes a study of the marketing and development strategies, along with the product portfolios of the leading companies. The Private Cloud Storage market report elaborates insights on the Market Diversification (Exhaustive information about new products, untapped regions, and recent developments), Competitive Assessment (In-depth assessment of market shares, strategies, products, and manufacturing capabilities of leading players in the Private Cloud Storage market).

Top Players Covered in the Private Cloud Storage Market Study:

Private Cloud Storage Market Segmentation

The Private Cloud Storage market is split by Type and by Application. For the period 2020-2026, the growth among segments provides accurate calculations and forecasts for sales by Type and by Application, in terms of both volume and value. This analysis can help you expand your business by targeting qualified niche markets.

Market Segmentation by Type:

Market Segmentation by Applications:

For more Customization in Private Cloud Storage Market Report:https://www.alltheresearch.com/customization/412

Global Private Cloud Storage Market: Regional Segmentation

Impact of COVID-19 on Private Cloud Storage Market:

The report also examines the effect of the ongoing worldwide pandemic, COVID-19, on the Private Cloud Storage Market and what the future holds for it, offering an analysis of the pandemic's impact on the international market. The pandemic has immediately disrupted demand and supply chains, and the report assesses the economic effect on firms and financial markets. Futuristic Reports has gathered input from several industry representatives and drawn on primary and secondary research to provide customers with the strategies and data needed to combat industry struggles during and after the COVID-19 pandemic.

For More Details on Impact of COVID-19 on Private Cloud Storage Market:https://www.alltheresearch.com/impactC19-request/412

Research Objectives of the Private Cloud Storage Market Report:

The report is useful in providing answers to several critical questions that are important for the industry stakeholders such as manufacturers and partners, end-users, etc., besides allowing them in strategizing investments and capitalizing on market opportunities.

Key target audience:

Buy Full Report on Private Cloud Storage Market@ https://www.alltheresearch.com/buy-now/412

About AllTheResearch:

AllTheResearch was formed with the aim of making market research a significant tool for managing breakthroughs in the industry. As a leading market research provider, the firm empowers its global clients with business-critical research solutions. Our study of numerous companies that rely on market research and consulting data for their decision-making made us realise that it's not just sheer data points, but the right analysis, that creates a difference. While some clients were unhappy with the inconsistencies and inaccuracies of data, others expressed concerns over their experience in dealing with the research firm. A same-data-for-all-business-roles approach was also making research redundant. We identified these gaps and built AllTheResearch to raise the standards of research support.

For All Your Research Needs, Reach Out to Us:

Contact Name: Rohan S.

Email: [emailprotected]

Phone: +1 (407) 768-2028


What is Artificial Intelligence (AI)? | IBM

Artificial intelligence enables computers and machines to mimic the perception, learning, problem-solving, and decision-making capabilities of the human mind.

In computer science, the term artificial intelligence (AI) refers to any human-like intelligence exhibited by a computer, robot, or other machine. In popular usage, artificial intelligence refers to the ability of a computer or machine to mimic the capabilities of the human mind (learning from examples and experience, recognizing objects, understanding and responding to language, making decisions, solving problems) and to combine these and other capabilities to perform functions a human might perform, such as greeting a hotel guest or driving a car.

After decades of being relegated to science fiction, today, AI is part of our everyday lives. The surge in AI development is made possible by the sudden availability of large amounts of data and the corresponding development and wide availability of computer systems that can process all that data faster and more accurately than humans can. AI is completing our words as we type them, providing driving directions when we ask, vacuuming our floors, and recommending what we should buy or binge-watch next. And it's driving applications, such as medical image analysis, that help skilled professionals do important work faster and with greater success.

As common as artificial intelligence is today, understanding AI and AI terminology can be difficult because many of the terms are used interchangeably; and while they are actually interchangeable in some cases, they aren't in other cases. What's the difference between artificial intelligence and machine learning? Between machine learning and deep learning? Between speech recognition and natural language processing? Between weak AI and strong AI? This article will try to help you sort through these and other terms and understand the basics of how AI works.

The easiest way to understand the relationship between artificial intelligence (AI), machine learning, and deep learning is as follows:

Let's take a closer look at machine learning and deep learning, and how they differ.

Machine learning applications (also called machine learning models) are based on a neural network, which is a network of algorithmic calculations that attempts to mimic the perception and thought process of the human brain. At its most basic, a neural network consists of the following:

Machine learning models that aren't deep learning models are based on artificial neural networks with just one hidden layer. These models are fed labeled data (data enhanced with tags that identify its features in a way that helps the model identify and understand the data). They are capable of supervised learning (i.e., learning that requires human supervision), such as periodic adjustment of the algorithms in the model.

Deep learning models are based on deep neural networks: neural networks with multiple hidden layers, each of which further refines the conclusions of the previous layer. This movement of calculations through the hidden layers to the output layer is called forward propagation. Another process, called backpropagation, identifies errors in calculations, assigns them weights, and pushes them back to previous layers to refine or train the model.
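Those two passes can be sketched with a toy network in plain Python. This is a minimal illustration only: one hidden layer rather than many, a single input, and invented numbers, not IBM's implementation.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# One input feeding a hidden layer of two nodes, then one output node.
w_hidden = [random.uniform(-1, 1) for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(2)]

def forward(x):
    # Forward propagation: calculations move through the hidden layer
    # toward the output layer.
    hidden = [sigmoid(w * x) for w in w_hidden]
    out = sigmoid(sum(w * h for w, h in zip(w_out, hidden)))
    return hidden, out

def train_step(x, target, lr=1.0):
    # Backpropagation: the output error is pushed back through the
    # layers, adjusting each weight to refine (train) the model.
    hidden, out = forward(x)
    d_out = (out - target) * out * (1 - out)  # error gradient at the output
    for i in range(2):
        d_hidden = d_out * w_out[i] * hidden[i] * (1 - hidden[i])
        w_out[i] -= lr * d_out * hidden[i]
        w_hidden[i] -= lr * d_hidden * x

# Repeated forward/backward passes pull the prediction toward the target.
for _ in range(500):
    train_step(1.0, 1.0)
_, prediction = forward(1.0)
```

After a few hundred forward/backward passes the network's prediction for this input converges toward the target, which is the "refine or train" loop the paragraph describes.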

While some deep learning models work with labeled data, many can work with unlabeled data, and lots of it. Deep learning models are also capable of unsupervised learning: detecting features and patterns in data with the barest minimum of human supervision.

A simple illustration of the difference between deep learning and other machine learning is the difference between Apple's Siri or Amazon's Alexa (which recognize your voice commands without training) and the voice-to-type applications of a decade ago, which required users to train the program (and label the data) by speaking scores of words to the system before use. But deep learning models power far more sophisticated applications, including image recognition systems that can identify everyday objects more quickly and accurately than humans.

For a deeper dive into the nuanced differences between these technologies, read AI vs. Machine Learning vs. Deep Learning vs. Neural Networks: What's the Difference?

Weak AI, also called Narrow AI or Artificial Narrow Intelligence (ANI), is AI trained and focused to perform specific tasks. Weak AI drives most of the AI that surrounds us today. "Narrow" is a more accurate descriptor for this AI, because it is anything but weak; it enables some very impressive applications, including Apple's Siri and Amazon's Alexa, the IBM Watson computer that vanquished human competitors on Jeopardy, and self-driving cars.

Strong AI, also called Artificial General Intelligence (AGI), is AI that more fully replicates the autonomy of the human brain: AI that can solve many types or classes of problems and even choose the problems it wants to solve without human intervention. Strong AI is still entirely theoretical, with no practical examples in use today. But that doesn't mean AI researchers aren't also exploring (warily) artificial super intelligence (ASI), which is artificial intelligence superior to human intelligence or ability. An example of ASI might be HAL, the superhuman (and eventually rogue) computer assistant in 2001: A Space Odyssey.

As noted earlier, artificial intelligence is everywhere today, but some of it has been around for longer than you think. Here are just a few of the most common examples:

The idea of 'a machine that thinks' dates back to ancient Greece. But since the advent of electronic computing (and relative to some of the topics discussed in this article) important events and milestones in the evolution of artificial intelligence include the following:

IBM has been a leader in advancing AI-driven technologies for enterprises and has pioneered the future of machine learning systems for multiple industries. Based on decades of AI research, years of experience working with organizations of all sizes, and on learnings from over 30,000 IBM Watson engagements, IBM has developed the AI Ladder for successful artificial intelligence deployments:

IBM Watson products and solutions give enterprises the AI tools they need to transform their business systems and workflows, while significantly improving automation and efficiency. For more information on how IBM can help you complete your AI journey, explore IBM's portfolio of managed services and solutions.

Sign up for an IBMid and create your IBM Cloud account.

View original post here:
What is Artificial Intelligence (AI)? | IBM

Read More..

Special Operations Strives to Use the Power of Artificial Intelligence – Department of Defense

U.S. Special Operations Command hopes to increasingly use artificial intelligence and machine learning in all aspects of warfare, its commander said.

Army Gen. Richard D. Clarke spoke virtually today with Hudson Institute scholars.

Clarke noted that Project Maven jump-started the employment of AI. Project Maven was initially executed to automate the processing and exploitation of full-motion video collected by intelligence, instead of relying on humans to sort through all of it.

With AI's ability to sift quickly through terabytes of data to find relevant pieces of intelligence, it allows the human to make faster and better informed decisions, he said.

AI can also be incredibly effective at monitoring the information environment, he said.

Clarke noted that during a recent visit, a special operations commander in Afghanistan told him that influencing the population in a positive way can mean the difference between winning and losing.

Socom has been using AI for logistics, and the maintenance piece in particular, for more than two years now, he said. It saves money in terms of, for example, predicting engine life or failure on a tank or aircraft. And it allows better use of those assets.

AI-powered health care can predict injuries or point to treatments to get operators in the fight more quickly, he mentioned.

In the realm of mission command, AI will power the Joint All-Domain Command and Control system, which will allow commanders to better communicate and make decisions, he said.

While Socom is forging ahead quickly with AI, Clarke mentioned that his organization is also working closely with the military services and organizations like the Joint Artificial Intelligence Center, as well as with industry, allies and partners.

Clarke emphasized that it's important that commanders set the tone and set the conditions to allow innovation and encourage people to come up with great ideas.

Humans are more important than the hardware, he said. "It's the talented people that we have to help foster. You've got to invest the human capital into this space."

Looking to the future, Clarke said he is optimistic that AI will be successfully leveraged by the Defense Department to maintain the lead against peer competitors China and Russia. It will require updating policy and infrastructure, using cloud computing and having the right people who are enabled with the right leadership.

See original here:
Special Operations Strives to Use the Power of Artificial Intelligence - Department of Defense

Read More..

How the Role of Artificial Intelligence is Changing During the Pandemic – MarketScale

Seasoned technology executives host a weekly discussion to highlight how IT teams and leaders can prepare to be agile and scalable in an ever-changing world.

To say 2020 has been a year of change is a monumental understatement. Everything from the way we socialize, to how we receive health care, to how we work, has changed massively. The hosts of The Suite Spot, Carlos Vargas, Howard Holton, and Paul Lewis, spoke with Michael Davidson, Senior Data Scientist at Microsoft, about some of the changes that businesses had no choice but to make.

Davidson feels that these challenges provided learning opportunities for businesses. He presented companies shifting to remote work as an example: "The hardest gap to overcome is the one you mentioned, and that's accepting that it's even possible. And once you've done it a few times, it doesn't become rote. I mean, every project has its own challenges, but just the belief that you can do it, I think, is probably the biggest hurdle for most leadership teams to overcome," Davidson said.

He feels that once companies have seen the positive outcomes of change, it becomes easier to enact more change, especially when they have to react to some sort of catastrophic change in the operating environment. Davidson suspects that a lot of the workplace changes we've seen may remain, even once social distancing guidelines are a thing of the past. "As much as Covid is obviously a tragedy of proportions we haven't seen, at the same time, like anything in life, hopefully we can learn from it," Davidson noted. He finds it hard to believe that workers will be eager to resume long, inconvenient, and expensive commutes when working remotely has suited them just fine, especially in the tech industry. "It's nonsensical, right? We're showing most, many of these jobs can be done at home. And for those that can't, fair enough, but certainly in most of our lines of work, it's probably 95% of what it was," Davidson stated.

Read the original:
How the Role of Artificial Intelligence is Changing During the Pandemic - MarketScale

Read More..

What is machine learning? Here's what you need to know - Business Insider

Machine learning is a fast-growing and successful branch of artificial intelligence. In essence, machine learning is the process of allowing a computer system to teach itself how to perform complex tasks by analyzing large sets of data, rather than being explicitly programmed with a particular algorithm or solution.

In this way, machine learning enables a computer to learn how to perform a task on its own and to continue to optimize its approach over time, without direct human input.

In other words, it's the computer that is creating the algorithm, not the programmers, and often these algorithms are sufficiently complicated that programmers can't explain how the computer is solving the problem. Humans can't trace the computer's logic from beginning to end; they can only determine if it's finding the right solution to the assigned problem, which is output as a "prediction."

There are several different approaches to training expert systems that rely on machine learning, specifically "deep" learning that functions through the processing of computational nodes. Here are the most common forms:

Supervised learning is a model in which computers are given data that has already been structured by humans. For example, computers can learn from databases and spreadsheets in which the data has already been organized, such as financial data or geographic observations recorded by satellites.
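The labeled-data idea can be sketched with a toy nearest-centroid classifier. Everything here (the numbers, the labels, the helper names) is invented for illustration and is not from the article:

```python
# Labeled training data: (value, label) pairs, e.g. transaction
# amounts already tagged by humans as "normal" or "large".
training = [(5, "normal"), (9, "normal"), (12, "normal"),
            (80, "large"), (95, "large"), (110, "large")]

# "Training" here is just computing the mean of each labeled group.
groups = {}
for value, label in training:
    groups.setdefault(label, []).append(value)
centroids = {label: sum(vs) / len(vs) for label, vs in groups.items()}

def classify(value):
    # Predict the label whose group average is closest to the new value.
    return min(centroids, key=lambda label: abs(value - centroids[label]))

prediction = classify(70)
```

The human-supplied labels are what make this supervised: the structure of the answer is already in the data, and the model only learns how to map new inputs onto it.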

Unsupervised learning uses databases that are mostly or entirely unstructured. This is common in situations where the data is collected in a way that humans can't easily organize or structure it. A common example of unsupervised learning is spam detection, in which a computer is given access to enormous quantities of emails and it learns on its own to distinguish between wanted and unwanted mail.

Reinforcement learning is when humans monitor the output of the computer system and help guide it toward the optimal solution through trial and error. One way to visualize reinforcement learning is to view the algorithm as being "rewarded" for achieving the best outcome, which helps it determine how to interpret its data more accurately.
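One minimal sketch of that reward-driven trial and error is an epsilon-greedy bandit. The action names and reward probabilities below are invented for illustration and are not from the article:

```python
import random

random.seed(1)

# Two possible actions with hidden reward probabilities; the learner
# only ever sees the rewards, mimicking trial and error.
true_reward = {"A": 0.2, "B": 0.8}
estimates = {"A": 0.0, "B": 0.0}
counts = {"A": 0, "B": 0}

def choose(epsilon=0.1):
    # Mostly exploit the best-looking action, occasionally explore.
    if random.random() < epsilon:
        return random.choice(list(estimates))
    return max(estimates, key=estimates.get)

for _ in range(2000):
    action = choose()
    reward = 1.0 if random.random() < true_reward[action] else 0.0
    counts[action] += 1
    # Incremental average: nudge the estimate toward the observed reward.
    estimates[action] += (reward - estimates[action]) / counts[action]

best = max(estimates, key=estimates.get)
```

The "reward" signal is exactly the guidance the paragraph describes: over many trials the estimates converge on the action that pays off most often.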

The field of machine learning is very active right now, with many common applications in business, academia, and industry. Here are a few representative examples:

Recommendation engines use machine learning to learn from previous choices people have made. For example, machine learning is commonly used in software like video streaming services to suggest movies or TV shows that users might want to watch based on previous viewing choices, as well as "you might also like" recommendations on retail sites.
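A toy version of such a recommendation engine can be built from user overlap alone. The users, items, and scoring rule below are invented for illustration; real services use far richer signals:

```python
# Toy user-item history: 1 means the user watched and liked the genre.
ratings = {
    "ana":   {"drama": 1, "comedy": 1},
    "ben":   {"drama": 1, "scifi": 1},
    "carla": {"comedy": 1, "scifi": 1},
}

def similarity(u, v):
    # Count of items both users liked (a simple overlap score).
    return len(set(ratings[u]) & set(ratings[v]))

def recommend(user):
    # Score items the user hasn't seen by the similarity of the
    # users who liked them ("people like you also watched...").
    scores = {}
    for other in ratings:
        if other == user:
            continue
        for item in ratings[other]:
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0) + similarity(user, other)
    return max(scores, key=scores.get)

suggestion = recommend("ana")
```

For "ana", the only unseen genre is the one liked by the two users who overlap with her history, so it tops the score list, which is the "previous choices" signal the paragraph describes.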

Banks and insurance companies rely on machine learning to detect and prevent fraud through subtle signals of strange behavior and unexpected transactions. Traditional methods for flagging suspicious activity are usually very rigid and rules-based, which can miss new and unexpected patterns, while also overwhelming investigators with false positives. Machine learning algorithms can be trained with real-world fraud data, allowing the system to classify suspicious fraud cases far more accurately.

Inventory optimization, a part of the retail workflow, is increasingly performed by systems trained with machine learning. Machine learning systems can analyze vast quantities of sales and inventory data to find patterns that elude human inventory planners. These computer systems can produce more accurate probability forecasts of customer demand.

Machine automation increasingly relies on machine learning. For example, self-driving car technology is deeply indebted to machine learning algorithms for the ability to detect objects on the road, classify those objects, and make accurate predictions about their potential movement and behavior.

Continued here:
What is machine learning? Here's what you need to know - Business Insider

Read More..

Facebook, other partners using artificial intelligence to forecast future COVID-19 trends – WGRZ.com

Unlike data released by the federal or state government, Facebook is releasing potential COVID-19 trends at the county level.

BUFFALO, N.Y. Every day we are inundated with data as it pertains to COVID-19. Whether it be state, federal, or even county data, the data released on a daily basis only paints the picture of what has happened and doesn't necessarily speak to what may happen.

Facebook, along with partners at Carnegie Mellon, the University of Maryland, and many more, are using artificial intelligence, aggregated mobility statistics, and user surveys to forecast future COVID-19 trends.

"Our AI forecast is really just using artificial intelligence to predict what COVID-19 will look like two weeks out using both this information from public health systems," said Laura McGorman, public policy manager at Data For Good, a program spearheaded by Facebook.

McGorman says that Facebook uses readily available public health data as part of its AI-powered forecasting. The big question you probably have, however: are Facebook and its partners looking at your profile information to see if you went to Aunt Ginny's at Thanksgiving in order to lump you into their forecast?

The answer is no, not by default at least.

"It's completely separate, the Data For Good program takes sort of user control very seriously," McGorman said. "In the case of the mobility statistics that we generate for COVID-19, that's only being generated from people who choose to share their location information with Facebook when they use the app on their phone."

If you opt in to sharing your location while using the Facebook app, the app will track you. By tracking you Facebook will know if you're in a particular area that has seen recent spikes in COVID-19 cases. This information is only part of the complete picture that Facebook's AI is trying to paint.

"We train this model alongside 15 other forecasts," McGorman said. "On average, at a county level on a daily basis, we were within about 20 or so cases from the actual number."

Facebook is releasing this data in conjunction with the Humanitarian Data Exchange and Direct Relief.

The dashboard, updated multiple times a day, allows users to look at the two-week forecast for each county in the United States. According to McGorman, this allows local leaders to analyze the data to make decisions on responding to spikes, and what the future may hold.

"What we're able to offer Erie County is a view into their county in particular," McGorman said. "In terms of what case count is going to look like over the next two weeks, we are trying to be more detailed than the average forecast here."

2 On Your Side reached out to four Department of Health offices in Western New York. None of them knew about this tool or the data it contained, or they did not respond to our inquiry.

Read more here:
Facebook, other partners using artificial intelligence to forecast future COVID-19 trends - WGRZ.com

Read More..