
Here’s how AI is helping Indian Insurance industry improve customer experience – ETCIO.com

Having been in the business for the past 19 years, IFFCO Tokio General Insurance has adopted an omnichannel distribution strategy, propelled by a strong IT backbone, to reach different customer segments at their convenience.

But Artificial Intelligence is helping the insurer to go a step further.

"We leverage AI for image processing to analyse the extent of the damaged vehicle (personal cars), which helps us to generate a list of repairable and replaceable parts that were damaged in the accident. Within a few minutes, the assessment and the cost are given to the customer, which they can either accept or reject. If the assessment gets accepted, the payment is made within fifteen minutes directly to the customers bank account," Seema Gaur, ED & Head- IT, IFFCO Tokio General Insurance said. The company has automated claim settlement with Artificial Intelligence.

In the absence of AI, these photographs were assessed by claim officers; it was a tedious, manual job that took a few hours per claim. With the AI-based app, the list of damaged parts is generated within minutes, along with a cost. It is also more convenient for claim officers, who are now in a better position to justify the assessment to the claimant.
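
The insurer's actual model and parts catalogue are not public, so the sketch below is purely illustrative of the workflow described above: a hypothetical image model returns a per-part damage score, which is then mapped to repair-or-replace decisions and a cost estimate. All part names, costs and the `detect_damaged_parts` placeholder are assumptions.

```python
# Illustrative sketch only; the model and the catalogue values are hypothetical.
PARTS_CATALOGUE = {  # assumed repair/replacement costs
    "front_bumper": {"repair": 4_000, "replace": 12_000},
    "bonnet":       {"repair": 6_000, "replace": 18_000},
    "headlamp":     {"repair": 1_500, "replace": 7_000},
}

def detect_damaged_parts(image_bytes: bytes) -> dict[str, float]:
    """Placeholder for the image-processing model; returns part -> damage severity in [0, 1]."""
    return {"front_bumper": 0.85, "headlamp": 0.40}

def build_assessment(image_bytes: bytes, replace_threshold: float = 0.7) -> dict:
    """Turn per-part severity scores into a repair/replace list with an estimated total cost."""
    items, total = [], 0
    for part, severity in detect_damaged_parts(image_bytes).items():
        action = "replace" if severity >= replace_threshold else "repair"
        cost = PARTS_CATALOGUE[part][action]
        items.append({"part": part, "action": action, "estimated_cost": cost})
        total += cost
    return {"items": items, "total_estimate": total}

if __name__ == "__main__":
    print(build_assessment(b"photo-bytes-from-claim-app"))
```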

Artificial Intelligence has helped transform businesses. For large companies and industries, artificial intelligence is generating new avenues for growth and profitability.

For an old and highly regulated industry, insurance has taken up technology comparatively fast. The sector no longer seems steeped in slow, manual, paper-based processes. McKinsey estimates a potential annual value of $1.1 trillion if artificial intelligence is fully applied to the insurance industry.

Given the benefits and cost savings the technology has already delivered, organisations are keen to adopt it more widely in the times to come.

Take, for example, Kotak Life Insurance, the life insurance business of Kotak Mahindra Bank, which decided to automate the process of verifying whether a customer receiving an annuity from its insurance is still alive. The process was especially cumbersome for older people, who had to visit a branch to prove that they were still alive, hardly a good customer experience. But that has completely changed with the help of AI.

Explaining the process, Kirti Patil, Senior EVP-IT & CTO, Kotak Life Insurance, says, "To submit the certificate of existence, a user had to visit the branch. It was a time-consuming process for both customers and the company. We automated this process and made it digital by using Artificial Intelligence. Now, the user can simply submit a photograph through the application and the intelligent algorithms verify whether the person is alive or not."
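
Kotak has not described its algorithms, but one common way to verify a submitted photograph against the annuitant on record is to compare face embeddings. The sketch below assumes a hypothetical `embed_face` model (stubbed here with deterministic random vectors) and an assumed similarity threshold; a production system would also add liveness checks against spoofed photos.

```python
# Minimal sketch, assuming an external face-embedding model; embed_face is a stub.
import numpy as np

def embed_face(image_bytes: bytes) -> np.ndarray:
    """Placeholder for a face-embedding CNN; returns a unit-length 128-d vector."""
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

def verify_existence(submitted_photo: bytes, enrolled_embedding: np.ndarray,
                     threshold: float = 0.6) -> bool:
    """Accept the certificate of existence if the new photo matches the enrolled face."""
    similarity = float(np.dot(embed_face(submitted_photo), enrolled_embedding))
    return similarity >= threshold

enrolled = embed_face(b"photo-captured-at-enrolment")
print(verify_existence(b"photo-captured-at-enrolment", enrolled))  # True for a matching face
```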

Kotak is also working on analytics-based marketing communications and AI-based fraud analysis.

One key reason why AI will prove to be crucial is the ever-increasing datafication of business interactions, private life and public life. AI can be used to effectively assess drivers and trends in the insurance industry and secure efficiency gains.

According to Deloitte, AI is helping insurers by predicting risk with greater accuracy, customizing products and using enhanced foresight to rapidly deploy new products. It allows institutions to be more agile, enabling them to deploy new products in response to emerging risk.

Artificial Intelligence is not only helping organisations evaluate claims easily but is also helping them provide real-time assistance to their customers. Along similar lines, Max Life Insurance is trying to reduce the distance between itself and its customers with Artificial Intelligence.

"The moment you can reduce the distance between a company and the customer to zero, that is digital. So digital is not about technology from our point of view, digital is about the customer, digital is about reducing distance of communication and optimizing the ease of engagement. We are achieving it with the help of AI and bot capabilities with big data analytics, said Manik Nangia, Chief Digital Officer & Director Marketing, Max Life Insurance.

To serve its customers in an instant, accurate and efficient manner, Max Life Insurance has leveraged artificial intelligence technologies. Its website offers a live chat supported by a bot that is constantly learning and providing more information to customers in a timely and precise manner.

While insurance seems to be a space where companies have explored the capabilities of AI, there remain some businesses that are yet to find the right use cases. To leap from AI mystery to mastery, practitioners need to bridge the gap between understanding the technology's inner workings and its business value proposition.

Continued here:
Here's how AI is helping Indian Insurance industry improve customer experience - ETCIO.com

Read More..

Evolution of the Cloud Conversation – UC Today

Rye Austin, Sales and Marketing Director at Core, is used to talking to people about transitioning to cloud-based services. But over the past year or so, he's observed a significant alteration in the way that conversation is going.

"Today, people don't want to put servers in their offices. Even before the very recent huge drive towards remote working, the climate was changing. Having on-premises environments is high risk and expensive and needs more maintenance. It doesn't make sense for most customers now, so everyone's looking at the cloud," Austin explained.

Indeed, for any startup the way forward is obvious.

But for complex organisations with legacy systems it's not so straightforward, and preserving business continuity requires expert project management to enact the change smoothly.


"Every organisation has got different systems, different data and different needs. So we look to understand how an organisation would make the absolute most of the cloud. And that means looking at the systems they've got, because there are different levels of the cloud and different ways of doing things.

"So for one organisation, they might choose to have email hosted in a SaaS-style solution that makes the most sense for email. For other systems, like document management, they might choose a different platform, and other workloads might be set up on Azure or similar," he continued, illustrating the complexity and consultative nature of the change management process involved.

While Core is a Microsoft Gold partner and expert, its independence is a critical asset in putting together a plan which best meets the needs of each client on their journey to full cloud-based operations. "We see a lot of vendors that are trying a kind of land grab, to get customers onto their platform," said Austin. But this isn't serving customers well, nor demonstrating the transformative potential of cloud-based working:

"Just putting the same stuff in the cloud and doing things the same way, then ticking the box and saying we're done, is really not the best way of doing it. Salespeople have targets to meet, and might be looking to meet targets to win organisations to take that first step into the cloud.

"But the skill sets required to maintain cloud services are different. The tooling you need to maintain it and be secure with data is different. It's a big transition from how things are in the on-premises world."

When Core's consultants work with clients, they avoid the rush to roll out, and instead work through a structured process to co-create a change management plan.

"We start with education," Austin explained.

"We explain what's out there, what the options are, and the benefits of the different options. They all have pros and cons; we want to educate the customer first."

Once the customer is in a position to make an informed choice about the possibilities, the options for transition can be further discussed. "We look in detail at the business case, and how we'll be there to support what would be a large transformation for them. We need to examine how it's going to improve their organisation and enhance the bottom line."

Only at that point can the project itself be specified. "The third stage is to actually plan and roadmap the rollout and design. To say, here is where we think you should go, and here's how to get there."

So you start with the compass before you unroll the map, which makes a lot of sense on a complex journey to an exciting new destination.

For nearly 30 years, Core has been working with organisations to successfully implement transformative digital technologies which deliver defined business outcomes. Core's Technology Enablement services are designed to maximise user acceptance, engagement and, ultimately, the utilisation of the technology to realise business value.

Continue reading here:
Evolution of the Cloud Conversation - UC Today

Read More..

AI can better predict drug response to lung cancer therapies – ETCIO.com

New York: Researchers have used Artificial Intelligence (AI) to train algorithms that predict tumour sensitivity to three advanced non-small cell lung cancer therapies, which can help predict treatment efficacy more accurately at an early stage of the disease.

The researchers at Columbia University's Irving Medical Center analyzed CT images from 92 patients receiving drug agent nivolumab in two trials; 50 patients receiving docetaxel in one trial; and 46 patients receiving gefitinib in one trial.

To develop the model, the researchers used the CT images taken at baseline and on first-treatment assessment.

"The purpose of this study was to train cutting-edge AI technologies to predict patients' responses to treatment, allowing radiologists to deliver more accurate and reproducible predictions of treatment efficacy at an early stage of the disease," explained Laurent Dercle, associate research scientist at the Columbia University Irving Medical Center.

However, this type of evaluation can be limited, especially in patients treated with immunotherapy, who can display atypical patterns of response and progression.

"Newer systemic therapies prompt the need for alternative metrics for response assessment, which can shape therapeutic decision-making,"

Dercle said in a paper appeared in the journal Clinical Cancer Research.

The researchers used machine learning to develop a model to predict treatment sensitivity in the training cohort.

Each model could predict a score ranging from zero (highest treatment sensitivity) to one (highest treatment insensitivity) based on the change of the largest measurable lung lesion identified at baseline.
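
The paper does not publish its code, but the kind of model described, one that maps radiomics features of the largest lesion to a 0-to-1 insensitivity score, can be sketched with a standard classifier whose predicted probability serves as the score. The feature count and the synthetic data below are assumptions for illustration only.

```python
# A minimal sketch with synthetic data; feature names, cohort size and model choice are assumed.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))      # 8 synthetic radiomics features per patient
y = rng.integers(0, 2, size=200)   # 1 = treatment-insensitive, 0 = treatment-sensitive

model = GradientBoostingClassifier().fit(X, y)

# predict_proba yields a score in [0, 1]: near 0 suggests sensitivity, near 1 insensitivity.
new_patient = rng.normal(size=(1, 8))
score = model.predict_proba(new_patient)[0, 1]
print(f"treatment-insensitivity score: {score:.2f}")
```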

"We observed that similar radiomics features predicted three different drug responses in patients with advanced non-small cell lung cancer (NSCLC) ," Dercle said.

"With AI, cancer imaging can move from an inherently subjective tool to a quantitative and objective asset for precision medicine approaches," he added.

Go here to see the original:
AI can better predict drug response to lung cancer therapies - ETCIO.com

Read More..

Galvanizing the new age of IT with AI and hybrid cloud – ETCIO.com

By Amith Singhee

At the dawn of the Information Age in the 1970s, the role of Information Technology (IT) was limited to computing plumbing - to keep the networks and computers working. In the 90s and 2000s, it evolved into an enterprise shared-services model that was essential for operational efficiency, cost takeout and decision support.

Today, IT is witnessing another shift that increasingly requires the Chief Information Officer organization to act as a partner in defining business strategy and driving topline growth via IT-driven business transformation. To realize this, the IT delivery platform, which includes infrastructure, applications, processes and the roles of people, needs to be scalable and adaptable to keep pace with rapidly changing business and operational needs, and hence transform to a hybrid cloud IT architecture.

The transformation will involve four phases: Advice for Cloud, Move to Cloud, Build for Cloud and Manage on Cloud. Artificial Intelligence (AI) will play a fundamental role across all these phases. To understand this better, consider three illustrative scenarios from the Move, Build and Manage phases that are foundational to hybrid cloud adoption across industries: application modernization, DevSecOps and incident analysis and remediation.

Modernizing applications for the hybrid cloud

Consider, for instance, a retailer whose core application does not allow for API-based integration and is architected as a large monolith that would not efficiently scale to support dynamic loads on the system. Since enterprises have many such monolithic applications, they would like to modernize these applications to a scalable and modularized architecture, leveraging microservices and cloud technologies like Red Hat OpenShift, SaaS and IaaS, deployed across a hybrid cloud footprint. However, such modernization can take substantial manual effort, which can easily span one to two years.

By using AI, the retailer can substantially reduce the time and effort of modernization and achieve better outcomes. Here is how: beginning with the Discover and Translate stage of the modernization process, AI can analyse all application artefacts, such as source code, logs and architecture documents, to estimate a topological model of the application. Next, algorithmic and AI approaches can be used to do a Biz-Ops analysis and overlay business and operational KPIs on the components of the topological model. Finally, at the Optimize and Re-engineer stage, AI and optimization can be used to generate a Biz-Ops-optimized target design, including architecture, refactored source code, DevOps configuration and deployment specification. The architect can guide and orchestrate all the stages of this process in an interactive manner.
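
To make the "estimate a topological model" step concrete, here is a deliberately tiny, non-AI sketch: walking a Python codebase and extracting an import graph as a crude component topology. Real modernization tooling analyses far more (call graphs, logs, configuration, runtime traces); the source directory name `src` is an assumption.

```python
# Toy illustration of the discovery step: building a module dependency graph from imports.
import ast
from collections import defaultdict
from pathlib import Path

def import_graph(src_root: str) -> dict[str, set[str]]:
    """Map each module under src_root to the top-level modules it imports."""
    graph = defaultdict(set)
    for path in Path(src_root).rglob("*.py"):
        module = path.stem
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                graph[module].update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                graph[module].add(node.module.split(".")[0])
    return dict(graph)

if __name__ == "__main__":
    for mod, deps in import_graph("src").items():
        print(mod, "->", sorted(deps))
```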

DevSecOps in a Hybrid Cloud environment

DevSecOps integrates development, IT security and IT operations in a unified approach, where high levels of automation shorten the application development lifecycle, enable continuous delivery, remove agility barriers from security gates, and ensure high software quality. AI can be of tremendous value in enabling effective DevSecOps. During development, for instance, AI-enhanced static code analysis will detect security vulnerabilities and non-compliance issues in the development phase itself, as plugins within the developer's favourite Integrated Development Environment. This will substantially reduce the delays that occur today from post-development security checks.
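
The shift-left idea can be illustrated with a very small, hand-rolled static check that runs before code is merged and fails the pipeline stage on findings. This is not AI-enhanced analysis and not any particular vendor's tool; it only shows where such a gate sits in the development phase. The scanned directory is taken from the command line.

```python
# Minimal sketch of a pre-merge security gate: flag obviously risky calls in Python sources.
import ast
import sys
from pathlib import Path

RISKY_CALLS = {"eval", "exec"}  # illustrative rule set only

def scan(path: Path) -> list[str]:
    findings = []
    tree = ast.parse(path.read_text(encoding="utf-8"))
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append(f"{path}:{node.lineno}: use of {node.func.id}()")
    return findings

if __name__ == "__main__":
    issues = [f for p in Path(sys.argv[1]).rglob("*.py") for f in scan(p)]
    print("\n".join(issues) or "no findings")
    sys.exit(1 if issues else 0)  # non-zero exit fails the CI stage
```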

AI's role in incident analysis and remediation

In the operations domain, consider the situation when an incident has occurred, for example, a user webpage may be unresponsive, or a web service API returns a 502 error and that has triggered a ticket or alert. If a site reliability engineer (SRE) is tasked with resolving the incident, AI can assist with incident analysis and predict next best actions, understand causality to estimate root causes and recommend remediation approaches.

This is how it works: an AI system, trained on a corpus of historical incident data, can analyse all available structured and unstructured data related to the active incident, extract important incident features in the context of the deployed application topology, and diagnose the issue better.

Secondly, a variety of AI techniques can be used to predict the next best actions that the SRE can leverage to further diagnose the incident in collaboration with relevant subject matter experts. Updates to the active incident data can then iteratively be used to further advance the incident analysis, diagnosis and recommendations towards resolution of the incident.
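
One small ingredient of such a system can be sketched directly: retrieving the most similar historical incidents and reusing their resolutions as candidate next best actions. The incident texts and resolutions below are invented for the example; production systems would combine this retrieval with topology and causality analysis.

```python
# Minimal sketch: nearest-neighbour retrieval of similar past incidents (synthetic data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

history = [
    ("web API returns 502 after deploy", "roll back release; check upstream health probes"),
    ("user webpage unresponsive, high CPU on node", "scale out pods; inspect runaway process"),
    ("database connection pool exhausted", "raise pool limit; look for leaked connections"),
]

texts = [incident for incident, _ in history]
vectorizer = TfidfVectorizer().fit(texts)
index = NearestNeighbors(n_neighbors=2, metric="cosine").fit(vectorizer.transform(texts))

def suggest_actions(active_incident: str) -> list[str]:
    """Return the resolutions of the most similar historical incidents."""
    _, idx = index.kneighbors(vectorizer.transform([active_incident]))
    return [history[i][1] for i in idx[0]]

print(suggest_actions("API gateway returning 502 errors to users"))
```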

While these three scenarios are widely applicable to enterprises across industries, similarly significant application of AI is likely to be seen in most other parts of the IT delivery platform and application lifecycle.

The shift to the experiential age

The dramatic improvements in AI over recent years have been driven heavily by use cases and datasets from the consumer world: images, speech and text. With the emerging synergy between hybrid cloud and AI, we will witness tremendous innovation and business value in the enterprise IT world. Today, customers increasingly expect instant and seamless multi-channel engagement and data privacy, and enterprises need to meet or exceed these rapidly evolving customer expectations.

As businesses leverage technology, data and AI at an increasing scale and new business models rapidly emerge, they need to undergo business transformation at tremendous speed to stay competitive while continuously reducing costs. To conclude, AI and hybrid cloud are catalysing the journey of digital transformation by enabling a fundamental shift in the role IT plays in the business.

The author is Senior Manager Hybrid Cloud, IBM Research India.

See original here:
Galvanizing the new age of IT with AI and hybrid cloud - ETCIO.com

Read More..

London teacher with half a million hits on maths website during coronavirus outbreak shortlisted as among world’s best – Evening Standard


A London maths teacher whose free online learning platform is seeing over half a million hits a day as schools shut down worldwide due to coronavirus has been shortlisted to win the title of world's best teacher.

Dr Jamie Frost, 33, who holds a PhD in Computer Science from Oxford University and was previously an investment banker, now works over 90 hours a week running the website DrFrostMaths.com while holding down a full-time job at Tiffin School, a boys' grammar school in Kingston.

His free site, which earned him the nomination for the sixth Varkey Foundation $1 million Global Teacher Prize, offers interactive online quizzes and teaching slides, videos, and a bank of UK exam board questions for students to practice on - as well as learning resources for teachers.

Dr Frost says it has had nearly seven million downloads and is used by 5,500 schools worldwide, including over half of all UK secondaries.

Since schools began closing worldwide due to the coronavirus pandemic it has seen more than 100,000 additional visits per day, and Dr Frost told the Standard it's now going up.

He said: "I will easily exceed half a million today.

"The dedicated server is absolutely hammered because schools are closed. There's so much traffic it's struggling to cope, and the problem is going to get worse every day.

"I reached out to tech companies for help... Google has now reached out to me to offer $10,000 of free 'cloud credit' and potential support moving the site to their cloud servers. It is fantastic."

Tiffin School, which shut on Monday for deep cleaning over COVID-19, is using the site for Year 11 practice exams this week.

The teacher, who lives in Surbiton, added: "It's remarkable how even with schools shut down we are able to cope using technology."

Dr Frost started out as an investment banker, coding trading algorithms for Morgan Stanley in New York and Canary Wharf, but found it soul-destroying.

He realised he was happier sharing his knowledge with students, as he had as an adjunct teacher at Oxford.

He completed teacher training and began working at Tiffin School in 2013, and initially launched the site in a bid to help pupils who were struggling with maths.

The Global Teacher Prize will be awarded at the Natural History Museum on October 12.

It is awarded to a single school teacher who has made an outstanding contribution to the profession, and aims to shine a spotlight on the important role teachers play in society.

The winning educator is allowed to put their $1 million, awarded in equal instalments over 10 years, towards new projects and initiatives of their choosing.

The shortlist is compiled from over 40,000 entries worldwide by a panel of international education experts.

If he wins the prize, Dr Frost plans to use the funding to expand the site to offer questions and support on exam syllabuses in some of the site's most popular regions outside the UK, such as Malaysia, and to employ teachers there to help.

The site's resources have already been used to teach in a district of schools in Zimbabwe.


Dr Frost said: "I love teaching so much, being in the classroom and interacting with kids. I am just delighted to be shortlisted.

"It's absolutely fantastic, but it's not about the recognition. It's that if I was to win the money I would spend it all on expanding the platform more globally."

Tiffin School headteacher, Mike Gascoigne, said: "Dr Jamie Frost is an amazing teacher who fully deserves his shortlisting for this incredible award.

"As a Tiffin teacher, the school is proud to support his brilliant work.

"At this time of COVID-19 crisis and isolation, it will undoubtedly be even more useful to schools and pupils. He is a truly inspirational teacher making a huge impact in the world of maths."

Read this article:
London teacher with half a million hits on maths website during coronavirus outbreak shortlisted as among world's best - Evening Standard

Read More..

Surge in home working highlights Microsoft licensing issue: If you are not on subscription, working remotely is a premium feature – The Register

Working from home and want to access your PC at work? The best solution may cost thousands in additional Microsoft licensing costs.

In the scramble to migrate employees to home working, there are issues for businesses who normally have staff in an office working on desktop PCs, or accessing network file shares and intranet applications, or running applications that connect to an on-premises database.

This poses some difficult and potentially risky and expensive questions for organisations that are not already set up to have all or most of their staff working remotely. Business continuity is top of mind, but as security expert Bruce Schneier has observed: "Worrying about network security seems almost quaint in the face of the massive health risks from COVID-19, but attacks on infrastructure can have effects far greater than the infrastructure itself."

One area of concern is the risk from users on home PCs accessing corporate assets. "These systems are more likely to be out of date, unpatched, and unprotected. They are more vulnerable to attack simply because they are less secure," noted Schneier, and that is before taking into account the variety of websites visited and software installed by family members, including children.

Staff working at home could use a VPN to connect to the corporate network. VPNs have many advantages, but by putting the remote machine in effect on the internal business network, it also poses risks, for example if malware on the remote machine is able to damage business assets.

Microsoft has some solutions for remote access without a VPN, including a feature of Windows Server called Remote Desktop Gateway (RDS Gateway). Users can connect to the gateway over SSL (no VPN required) and use a Remote Desktop client to access their work PC, or a desktop session on Windows server, or a desktop application running on the server.

The snag here is that using RD Gateway requires a Remote Desktop Services Client Access License (CAL), as Microsoft makes clear in this document [PDF]:

"An RDS CAL is required to use any functionality included in the Remote Desktop Services role in Windows Server. For example, if you are using RDS Gateway and/or Remote Desktop Web Access to provide access to a Windows client operating system on an individual PC, both an RDS CAL and Windows Server CAL are required."

An RDS CAL can cost over £100 per user: we found a single CAL on sale from Microsoft for £186.53, though you can do better from other resellers and with bulk licensing deals.

Just the thing for a pandemic: an extra fee to have your users work remotely

In addition, some vendors have curious rules about remote access to their applications that incur additional fees. Rich Gibbons, a licensing trainer at IT Asset Management (ITAM), noted that Citrix and/or Remote Desktop Services are "the easiest way to quickly become non-compliant with a LOT of vendors."

The problem here is a fundamental one, which is that companies including Microsoft are in a hybrid world, part based on cloud concepts where everything is on the internet and easily accessed from all kinds of devices, and part based on traditional business networks with servers, locally installed software and desktop PCs. The licensing model for these two types of environment is different, with the traditional environment generally being more complex. Customers who license applications like Windows and Office on a per-user subscription are much better placed than those with perpetual per-device licenses.

Wes Miller is a research analyst and licensing specialist at Directions on Microsoft, based in Kirkland near Seattle. He told us that Microsoft is trying to move to the cloud model, but "a lot of customers are stagnating; they don't want to pay for the subscription. It's on-premises versus cloud. It's per-device vs per-user. And it's perpetual licensing vs subscription licensing. You have to put all three of them together.

"What we've got is a lot of customers, especially regulated and institutional customers, who either for regulatory or cost reasons don't want to go into the cloud and subscriptions, they want to sit on-premises with perpetual product, and they don't have a good story of how to help their employees go remote today."

Security is one thing, but in the midst of a global pandemic, does anyone care about licensing? "You're not going to get audited right now," Gibbons told The Reg. "But in six months or whenever this is over, you need to know what you've done."

Miller concurs. "I think we're going to see a grace period here. When those audits do start back up, I think businesses may well be able to ask for some sort of grace period or leeway, but the reality is, if you put something in place to meet these new work scenarios, you need to expect to pay for whatever Microsoft's current licensing model is for that."

If you need Windows and Office and are on a Microsoft 365 licence (not just Office 365), it is worth noting that Windows Virtual Desktop (WVD) running Windows 10 can be used without an additional licence or RDS CAL. You still need an RDS CAL for accessing Windows Server desktops and apps, if needed. Optimising licensing is a specialist task but can make a big difference to costs.

Miller says a virtual desktop environment is a good answer. "Solutions like Windows Virtual Desktop (WVD), or using the server-based variant of Amazon WorkSpaces, have been license-proven and the technology is proven, at least in the case of Amazon WorkSpaces, WVD being newer. The biggest thing is, approach it in context with either what you own, or what you are willing to buy, which is weird to think about given the current time we're in."

For customers not on subscriptions, Microsoft's habit of treating remote access as a premium feature looks out of date, and in the current circumstances particularly unwelcome. These problems do not exist if you are using born-in-the-cloud solutions like Google's G Suite, and largely disappear if you are on a Microsoft 365 subscription. Remote access is now the norm, which means Microsoft should give up its addiction to things like RDS CALs. Customers too will have to adapt, with subscription licensing now hard to avoid.


Read the original post:
Surge in home working highlights Microsoft licensing issue: If you are not on subscription, working remotely is a premium feature - The Register

Read More..

Google Teams Up with Solo.io to Extend Istio – Container Journal

Google and Solo.io are now collaborating to make open source Istio service mesh more extensible by adding support for WebAssembly (WASM), which was created under the auspices of the World Wide Web Consortium (W3C) and provides a portable target for compiling more than 30 high-level languages.

Solo.io has been working to marry WASM with Envoy, an open source proxy server being developed under the auspices of the Cloud Native Computing Foundation (CNCF). The Istio service mesh is built on top of Envoy, so now Google and Solo.io are working toward providing WASM support for Istio.

That capability should make it easier for DevOps teams to extend Istio to add additional services, such as a web application firewall (WAF), as a filter to the Istio service mesh.

Solo.io CEO Idit Levine says that as part of that effort the company will focus on improving the overall developer experience for Istio, while Google and the rest of the contributors to the service mesh continue to enhance the core platform. The goal is to create an ecosystem of filters that extend the capabilities of Istio and Envoy in a way that allows developers to programmatically add new capabilities without having to deploy a dedicated appliance, says Levine.

Those filters can be added to either Istio or Envoy without requiring IT teams to recompile either platform. In the case of Istio, extensions to proxies can be made without having to restart the service mesh.

The ability to extend Istio using WASM requires organizations to have version 1.5 of the service mesh installed. While Istio has gained some traction among organizations that have adopted Kubernetes, the service mesh is not especially accessible to the average IT team. WASM will also provide a means of making it much more feasible for IT teams to manage Istio deployments as the ecosystem continues to evolve and expand, says Levine.

To help foster that emerging ecosystem, Solo.io has already made available WebAssembly Hub, a service for building, sharing, discovering and deploying WASM extensions. Those WASM extensions can then be deployed as containers.

Istio, of course, is not the only service mesh that has been created for cloud-native computing environments. Levine says she expects that the filters created for Istio using WASM will be portable to other service mesh platforms. That capability will not only preserve the flexibility of IT organizations in terms of which service mesh to employ when, but also ultimately serve to lower the total cost of computing as more appliances are transformed into filters.

It may be a while before Istio and Envoy achieve enough critical mass to turn that vision into reality. However, as containers and Kubernetes continue to gain traction, it's only a matter of time before more organizations rely on service meshes to bring some order to the microservices chaos that is likely to ensue. Once that occurs, it then becomes only a matter of time before organizations also realize that service meshes are programmable platforms that can be used to deliver all kinds of network and security services in a much lighter fashion.


See the rest here:
Google Teams Up with Solo.io to Extend Istio - Container Journal

Read More..

Running These Workloads? You Should Take A Look At The IBM Z15 – Forbes

Four frame IBM z15

Talk to any IT administrator about what they look for in a server platform and performance, openness, and security will top the list. Specifically: run my most demanding applications fast, enable my organization to support the range of workloads that power my modern business, secure my environment and protect my data. Ask those same IT administrators who can deliver on all of these and a combination of vendors will be quoted back, with a third party integrating. However, I believe these characteristics are, in essence, what IBM's z15 delivers.

Dispel the outdated notion of mainframe technology. I believe IBM Z is a new class of compute platform that is open and well suited to drive the digitally transformed business. In one system, legacy back-office co-resides and integrates with cloud native. Likewise, structured and unstructured data types are integrated to deliver all of the necessary intelligence to the business in a secure, cloudified way. And the z15 is the latest generation platform.

Where the IBM Z brings it

Before getting into the workloads supported, let's cover what I think are the four features that make IBM Z a smart choice for any enterprise IT organization.

Reliability: There is a reason why IBM is the core backbone to the industries that make such large investments in the management and utility of data. Banking, financial services, and healthcare, to name just a few. It's because of the confidence around the availability of systems (and the data residing on these systems). Seven 9s availability, serviceability without disruption of operations, and consistency of performance all equal SLA adherence that IBM Z can deliver.

Performance, and consistency of performance: organizations need to be able to rely on their systems to be both performant and consistent in that performance profile. In multi-tenant virtualized and cloud environments, this concept is easier to promise than to achieve. Because of the IBM Z architecture and system software, features like VM isolation deliver not just security, but protection against performance disruptions often associated with noisy neighbors.

Security: Security is more complex than locking down perimeters or user access management. It's about protecting the systems and data that make the business run. Data must be protected in all phases of its existence: at rest, at work, and in flight. IBM's pervasive encryption is a hardware-based crypto engine that ensures data is protected both at rest and in flight with zero impact to system performance. And Data Privacy Passports is an IBM technology that allows an organization to protect data even as it is shared cross-organization and cross-cloud with partners, customers and others. Imagine being able to revoke access to sensitive customer information that was shared with a third party directly from your console. This is Data Privacy Passports, and for any company that is concerned with Sensitive Personal Identifying Information (SPII), this feature alone should make the z15 from IBM worth evaluating (I detail this and more in this review of the z15).

Openness: Mainframes populate the vast majority of enterprises and house the applications and data that are considered mission critical. At the same time, these systems and the applications and data housed on them must be able to integrate with newer application and data architectures and span the on-prem to cloud compute model employed by virtually every company. Throw in the thousands of virtual servers and countless containers that drive the everyday functions of the enterprise, and IT becomes very messy.

Contrary to the outdated view some hold of mainframes, IBM Z is perhaps the most open compute platform on the market as it can support this very messy environment. Legacy? Check. Linux? Check. Containerized? Yup. Virtualized? Yup.

So, what are those workloads?

Based on what IBM Z delivers in terms of performance and security, it's natural to assume that any workload would benefit. And this is pretty much true. For organizations that run a mixed environment, there are a few workloads in particular that really shine on the Z system, even if not obvious to an IT administrator. Transaction processing and analytics processing (OLTP, OLAP), real-time analytics, and massive container and VM consolidation are ideal workloads for IBM Z. Let's dig into these a little bit.

On-Line Transaction Processing (OLTP) and On-Line Analytics Processing (OLAP): Real time responsiveness to customer orders and inquiries is critical to customer retention. And in this digital economy, real time means, in fact, real time. Because of this, the infrastructure supporting these workloads must be able to vertically scale with needs of the applications that serve customers. This is true for both long-term incremental growth and shorter term bursty and elastic requirements that are more event driven.

Further, the ability to scale appropriately must not disrupt the current application performance. The abstraction of compute from I/O in the Z system allows for this scale without disruption requirement. And Z CPU performance (up to 5.2GHz) with access to massive memory delivers on the requirement for real world performance.

Finally, the Z system tool chain enables IT organizations to better integrate the multiple data types coming from multiple sources. COBOL and Java are not mutually exclusive in the datacenter and any IT professional or data architect that has worked on a project of scale knows that the ability to seamlessly integrate across legacy and modern is an absolute necessity to successfully drive digital transformation.

Real Time Analytics: Real time is the operative element of real time analytics. In order for real time analytics engines to be effective, they must be able to access multiple data sources simultaneously. And in the age of edge and cloud computing, security is a must.

While the z15 hardware specifications are impressive, what I find most interesting about IBM's approach to enabling real-time analytics across the enterprise is the tools developed for IT organizations. The company developed tools to execute real-time queries against standard SQL environments and the mass of unstructured data flooding the business, and provides an environment where data scientists can quickly develop and deploy on the tools they are familiar with: Java, Scala and Python with Apache Spark, Anaconda, and the like. Sounds like a win.

Enterprise VM Consolidation: Consolidation of Linux VMs on the z15? Serious? Yeah. Running Linux on IBM z15 enables organizations to literally consolidate hundreds of Linux-based VMs on a single server, reducing the space requirements and management headaches enterprise IT faces on a daily basis, while at the same time cutting roughly in half the power required to run the same workloads. What is LinuxONE? Think the z15 running Linux to support your enterprise needs. It's worth checking out.

IBM makes some incredible claims around the z15 supporting virtualized environments: 7x better per-core performance (versus x86) and 4.8x better workload performance with z/VM memory. What this adds up to is near-native performance. But equally compelling to me is the way this virtualized environment is made more performant and resilient through the VM security and isolation that results from IBM's Pervasive Encryption technology, disk mirroring and overall system design.

Am I forgetting a couple workloads?

You may notice a couple of obvious workloads missing from the above list. Specifically, digital asset management (DAM) and Artificial Intelligence/Machine Learning (AI/ML). I really wanted to highlight a few of the workloads that were maybe not the obvious candidates for IBM's Z system. However, the support for DAM and AI/ML has never been stronger than on the z15.

The integration of compression and pervasive encryption combined with Data Privacy Passports make the z15 the most formidable IBM Z platform for digital asset management and custody as they enable the highest levels of asset protection without compromising on performance.

Likewise, IBM has long been on the forefront of the artificial intelligence (AI) market (raise your hand if you've seen a Watson commercial). The company has built a framework and toolchain that enables enterprise IT to quickly deploy Watson machine learning (ML) on IBM Z to better enable real-time operations such as fraud detection for banking and finance.

Wrapping up

Enterprises have long viewed the mainframe as critical to the business due to its reliability, resilience, scale and performance supporting line of business applications. As IBM developed the z15 to align the evolving needs of the enterprise transformation, openness and security became key capabilities to expand. And with the z15, the company has done just that. Perhaps this is why IBM saw Z system revenue increase by 60% in its latest quarterly earnings.

The world runs on the mainframe. Whether well-established banks, healthcare providers, government organizations or the partners and customers that digitally interact with these organizations. Even more sophisticated startups and newer companies should consider deploying Z system, as I do believe all will benefit from how IBM has embraced openness and the tightest security in the z15. From a die-hard server-only guy for nearly 30 years, I hope that means something.

Note: This analysis contains contributions from server and compute analyst Matt Kimball.

Disclosure: Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including Amazon.com, Advanced Micro Devices, Apstra, ARM Holdings, Aruba Networks, AWS, A-10 Strategies, Bitfusion, Cisco Systems, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Foxconn, Frame, Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google, HP Inc., Hewlett Packard Enterprise, Huawei Technologies, IBM, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MACOM (Applied Micro), MapBox, Mavenir, Mesosphere, Microsoft, National Instruments, NetApp, NOKIA, Nortek, NVIDIA, ON Semiconductor, ONUG, OpenStack Foundation, Panasas, Peraso, Pixelworks, Plume Design, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Samsung Electronics, Silver Peak, SONY, Springpath, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, Tenstorrent, Tobii Technology, Twitter, Unity Technologies, Verizon Communications, Vidyo, Wave Computing, Wellsmith, Xilinx, Zebra, which may be cited in this article.

Go here to read the rest:
Running These Workloads? You Should Take A Look At The IBM Z15 - Forbes

Read More..

Why AI might be the most effective weapon we have to fight COVID-19 – The Next Web

If not the most deadly, the novel coronavirus (COVID-19) is one of the most contagious diseases to have hit our green planet in the past decades. In little over three months since the virus was first spotted in mainland China, it has spread to more than 90 countries, infected more than 185,000 people, and taken more than 3,500 lives.

As governments and health organizations scramble to contain the spread of coronavirus, they need all the help they can get, including from artificial intelligence. Though current AI technologies are far from replicating human intelligence, they are proving to be very helpful in tracking the outbreak, diagnosing patients, disinfecting areas, and speeding up the process of finding a cure for COVID-19.

Data science and machine learning might be two of the most effective weapons we have in the fight against the coronavirus outbreak.

Just before the turn of the year, BlueDot, an artificial intelligence platform that tracks infectious diseases around the world, flagged a cluster of unusual pneumonia cases happening around a market in Wuhan, China. Nine days later, the World Health Organization (WHO) released a statement declaring the discovery of a novel coronavirus in a hospitalized person with pneumonia in Wuhan.

BlueDot uses natural language processing and machine learning algorithms to peruse information from hundreds of sources for early signs of infectious epidemics. The AI looks at statements from health organizations, commercial flights, livestock health reports, climate data from satellites, and news reports. With so much data being generated on coronavirus every day, the AI algorithms can help home in on the bits that can provide pertinent information on the spread of the virus. It can also find important correlations between data points, such as the movement patterns of the people who are living in the areas most affected by the virus.

The company also employs dozens of experts who specialize in a range of disciplines including geographic information systems, spatial analytics, data visualization, computer sciences, as well as medical experts in clinical infectious diseases, travel and tropical medicine, and public health. The experts review the information that has been flagged by the AI and send out reports on their findings.

Combined with the assistance of human experts, BlueDot's AI can not only predict the start of an epidemic, but also forecast how it will spread. In the case of COVID-19, the AI successfully identified the cities where the virus would be transferred to after it surfaced in Wuhan. Machine learning algorithms studying travel patterns were able to predict where the people who had contracted coronavirus were likely to travel.
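
BlueDot's actual models are proprietary, but the underlying intuition of using travel patterns can be reduced to a toy example: cities receiving the most outbound travellers from the epicentre are flagged as the highest-risk destinations. The passenger volumes below are invented purely for the sketch.

```python
# Illustrative toy only, not BlueDot's method; travel volumes are hypothetical.
outbound_passengers_from_wuhan = {
    "Bangkok": 105_000,
    "Hong Kong": 98_000,
    "Tokyo": 64_000,
    "Singapore": 61_000,
    "Seoul": 57_000,
}

def rank_at_risk_cities(volumes: dict[str, int], top_n: int = 3) -> list[str]:
    """Cities receiving the most travellers from the epicentre are flagged first."""
    return sorted(volumes, key=volumes.get, reverse=True)[:top_n]

print(rank_at_risk_cities(outbound_passengers_from_wuhan))
```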

Coronavirus (COVID-19) (Image source: NIAID)

You have probably seen the COVID-19 screenings at border crossings and airports. Health officers use thermometer guns and visually check travelers for signs of fever, coughing, and breathing difficulties.

Now, computer vision algorithms can perform the same screening at large scale. An AI system developed by Chinese tech giant Baidu uses cameras equipped with computer vision and infrared sensors to predict people's temperatures in public areas. The system can screen up to 200 people per minute and detect their temperature within a range of 0.5 degrees Celsius. The AI flags anyone who has a temperature above 37.3 degrees. The technology is now in use in Beijing's Qinghe Railway Station.

Alibaba, another Chinese tech giant, has developed an AI system that can detect coronavirus in chest CT scans. According to the researchers who developed the system, the AI has a 96-percent accuracy. The AI was trained on data from 5,000 coronavirus cases and can perform the test in 20 seconds, as opposed to the 15 minutes it takes a human expert to diagnose patients. It can also tell the difference between coronavirus and ordinary viral pneumonia. The algorithm can give a boost to the medical centers that are already under a lot of pressure to screen patients for COVID-19 infection. The system is reportedly being adopted in 100 hospitals in China.

A separate AI developed by researchers from Renmin Hospital of Wuhan University, Wuhan EndoAngel Medical Technology Company, and the China University of Geosciences purportedly shows 95-percent accuracy in detecting COVID-19 in chest CT scans. The system is a deep learning algorithm trained on 45,000 anonymized CT scans. According to a preprint paper published on medRxiv, the AI's performance is comparable to that of expert radiologists.
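
A generic sketch of the kind of binary CT-scan classifier these papers describe is shown below, assuming TensorFlow/Keras is available. The architecture, input size and the commented-out training data are illustrative assumptions; the published systems are far larger and trained on clinical datasets.

```python
# Minimal sketch of a binary CT-slice classifier; architecture and shapes are assumptions.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 1)),      # one grayscale CT slice
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(COVID-19) vs. other pneumonia
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# train_ds / val_ds would be tf.data pipelines of labelled, anonymized slices, e.g.:
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```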

One of the main ways to prevent the spread of the novel coronavirus is to reduce contact between infected patients and people who have not contracted the virus. To this end, several companies and organizations have engaged in efforts to automate some of the procedures that previously required health workers and medical staff to interact with patients.

Chinese firms are using drones and robots to perform contactless delivery and to spray disinfectants in public areas to minimize the risk of cross-infection. Other robots are checking people for fever and other COVID-19 symptoms and dispensing free hand sanitizer foam and gel.

Inside hospitals, robots are delivering food and medicine to patients and disinfecting their rooms to obviate the need for the presence of nurses. Other robots are busy cooking rice without human supervision, reducing the number of staff required to run the facility.

In Seattle, doctors used a robot to communicate with and treat patients remotely to minimize exposure of medical staff to infected people.

At the end of the day, the war on the novel coronavirus is not over until we develop a vaccine that can immunize everyone against the virus. But developing new drugs and medicine is a very lengthy and costly process. It can cost more than a billion dollars and take up to 12 years. That's the kind of timeframe we don't have as the virus continues to spread at an accelerating pace.

Fortunately, AI can help speed up the process. DeepMind, the AI research lab acquired by Google in 2014, recently declared that it has used deep learning to find new information about the structure of proteins associated with COVID-19. This is a process that could have taken many more months.

Understanding protein structures can provide important clues to the coronavirus vaccine formula. DeepMind is one of several organizations engaged in the race to unlock the coronavirus vaccine. It has leveraged the results of decades of machine learning progress as well as research on protein folding.

"It's important to note that our structure prediction system is still in development and we can't be certain of the accuracy of the structures we are providing, although we are confident that the system is more accurate than our earlier CASP13 system," DeepMind's researchers wrote on the AI lab's website. "We confirmed that our system provided an accurate prediction for the experimentally determined SARS-CoV-2 spike protein structure shared in the Protein Data Bank, and this gave us confidence that our model predictions on other proteins may be useful."

Although it's too early to tell whether we're headed in the right direction, the efforts are commendable. Every day saved in finding the coronavirus vaccine can save hundreds, or even thousands, of lives.

This story is republished from TechTalks, the blog that explores how technology is solving problems and creating new ones.

Published March 21, 2020 17:00 UTC

Continued here:
Why AI might be the most effective weapon we have to fight COVID-19 - The Next Web

Read More..

Emerging Trend of Machine Learning in Retail Market 2019 by Company, Regions, Type and Application, Forecast to 2024 – Bandera County Courier

The latest report, titled "Global Machine Learning in Retail Market 2019 by Company, Regions, Type and Application, Forecast to 2024", unveils the value at which the Machine Learning in Retail industry is anticipated to grow during the forecast period, 2019 to 2024. The report covers CAGR analysis, competitive strategies, growth factors and the regional outlook to 2024. It is a rich source of an exhaustive study of the driving elements, limiting components, and different market changes. It states the market structure and then forecasts several segments and sub-segments of the global market. The market study is provided on the basis of type, application, manufacturer as well as geography. Different elements such as opportunities, drivers, restraints, challenges, market situation, market share, growth rate, future trends, risks, entry limits, sales channels and distributors are analyzed and examined in this report.

Exploring The Growth Rate Over A Period:

Business owners who want to expand their business can refer to this report, as it includes data regarding the rise in sales within a given consumer base for the forecast period, 2019 to 2024. The research analysts have provided a comparison between the Machine Learning in Retail market growth rate and product sales to allow business owners to discover the success or failure of a specific product or service. They have also added driving factors such as demographics and revenue generated from other products to offer a better analysis of products and services by owners.

DOWNLOAD FREE SAMPLE REPORT: https://www.magnifierresearch.com/report-detail/7570/request-sample

Top industry players assessment: IBM, Microsoft, Amazon Web Services, Oracle, SAP, Intel, NVIDIA, Google, Sentient Technologies, Salesforce, ViSenze,

Product type assessment based on the following types: Cloud Based, On-Premises

Application assessment based on application mentioned below: Online, Offline

Leading market regions covered in the report are: North America (United States, Canada and Mexico), Europe (Germany, France, UK, Russia and Italy), Asia-Pacific (China, Japan, Korea, India and Southeast Asia), South America (Brazil, Argentina, Colombia), Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria and South Africa)

Main Features Covered In Global Machine Learning in Retail Market 2019 Report:

ACCESS FULL REPORT: https://www.magnifierresearch.com/report/global-machine-learning-in-retail-market-2019-by-7570.html

Moreover, the report covers supply chain analysis, regional marketing type analysis, international trade type analysis by the market, as well as consumer analysis of the Machine Learning in Retail market. Further, it examines manufacturing plants and technical data, capacity and commercial production dates, R&D status, manufacturing area distribution, technology sources, and raw materials sources. It also depicts sales, merchants, brokers, wholesalers, research findings and conclusions, and information sources.

Customization of the Report: This report can be customized to meet the client's requirements. Please connect with our sales team (sales@magnifierresearch.com), who will ensure that you get a report that suits your needs. You can also get in touch with our executives on +1-201-465-4211 to share your research requirements.

See the original post here:
Emerging Trend of Machine Learning in Retail Market 2019 by Company, Regions, Type and Application, Forecast to 2024 - Bandera County Courier

Read More..