Category Archives: Data Science
New grants awarded in Agricultural Genome-to-Phenome research – College of Agriculture and Life Sciences
Ames, IA – The Agricultural Genome to Phenome Initiative has awarded nine grants to 27 institutions in the third and final round of its seed grant competition.
The Agricultural Genome to Phenome Initiative (AG2PI), a three-year project ending in 2023, is funded by the U.S. Department of Agriculture's National Institute of Food and Agriculture. The goal of AG2PI is to connect crop and livestock scientists to each other and to those working in data science, statistics, engineering and social sciences to identify shared problems and collaborate on solutions.
The awarded projects help advance multidisciplinary crop and livestock research by addressing genome-to-phenome challenges, developing solutions for research infrastructure needs, or sharing solutions across kingdoms.
The seed grant projects awarded in Round 3 will run from 6 to 12 months and span three funding levels: emerging, enabling and establishing.
Two of the new grants, which support critical needs in research data management and standardization, involve Iowa State University faculty.
An enabling-level grant, "Creating a FAIR Data Ecosystem for Incorporating Single Cell Genomics Data into Agricultural G2P Research," is led by Christopher Tuggle, professor in the Department of Animal Science at Iowa State University. The international team includes: Christine Elsik, University of Missouri; Peter Harrison, EMBL-European Bioinformatics Institute, Cambridge, UK; Nicholas Provart, University of Toronto; and collaborators Tony Burdett, EMBL-European Bioinformatics Institute; Tim Tickle, the Broad Institute of MIT and Harvard; Marc Libault, University of Nebraska-Lincoln; Wes Warren, University of Missouri; Ben Cole, Lawrence Berkeley National Laboratory; and James Koltes, Iowa State.
Through deep international collaboration, the project aims to create an agricultural equivalent to the Human Cell Atlas Data Coordination Platform (HCA DCP), which will improve the availability of single-cell RNA sequencing data to crop and livestock researchers. The project will provide training to early career scientists under Tuggle's and Elsik's advisement; these scientists will work directly with scientists at the HCA DCP and test the agricultural metadata standards for usability.
An emerging-level grant, led by Hao Cheng at the University of California-Davis, also includes ISU faculty as well as international collaborators. Tuggle and Distinguished Professor Jack C.M. Dekkers, also in the ISU Department of Animal Science, are co-investigators; collaborators Richard Mott and Lingzhao Fang, both based in the UK, round out the team. The project, "Homomorphic Encryption to Enable Sharing of Confidential Data," will test and evaluate an approach that allows multiple researchers to access and analyze encrypted datasets while protecting intellectual property and ensuring FAIR (Findable, Accessible, Interoperable, and Reusable) data use principles.
Other project teams receiving AG2PI Round 3 seed grants include:
Emerging Grants:
Enabling Grants:
Establishing Grants:
For more information on all the awarded AG2PI seed grants as well as other grant opportunities, visit the webpage: https://www.ag2pi.org/seed-grants/.
The AG2PI is funded by the U.S. Department of Agriculture's National Institute of Food and Agriculture. The goal of AG2PI is to build communities that address the challenges of genome-to-phenome research across crops and livestock. The AG2PI partners include Iowa State University, University of Nebraska, University of Arizona, University of Idaho and the Iowa Corn Promotion Board.
Predibase Takes Declarative Approach to AutoML The New Stack – thenewstack.io
It's no secret that creating and deploying machine learning models takes too long. In Algorithmia's 2021 Enterprise Trends in Machine Learning report, 25% of respondents said creating a model took one week to one month, while 24% put that time at one month to one quarter. And 37% said it took one quarter to one year to deploy a model.
At Uber, an intent classification system that involved 1,500 lines of TensorFlow took five months to create and seven months to deploy.
A second machine learning project, fraud detection, with 900 lines of PyTorch, took five months to create and four months to deploy.
A product recommendation tool with 1,200 lines of PyTorch took six months to create and seven months to deploy.
San Francisco-based startup Predibase is out to change that by providing a low-code declarative ML platform that both data scientists and non-experts can use, easing the pressure on organizations to hire more scarce and expensive data scientists. Users simply state what they want to do, starting with as few as six lines of Python code, and let the system figure out how to do it and what infrastructure is required.
It's built atop two machine learning technologies created by the Predibase founders at Uber: Ludwig and Horovod. Ludwig is an open source, declarative machine learning framework that provides the simplicity of an AutoML solution with the flexibility of writing your own PyTorch code. Horovod, an open source component of Uber's Michelangelo deep learning toolkit, makes it easier to start and speed up distributed deep learning projects with TensorFlow.
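To give a concrete sense of what "declarative" means in practice, here is a minimal sketch using Ludwig's open source Python API. The dataset file and column names ("intents.csv", "utterance", "intent") are illustrative assumptions, not Uber's actual intent classifier or a Predibase example.

```python
# Minimal sketch of a declarative Ludwig pipeline (assumed toy dataset:
# a CSV with "utterance" and "intent" columns; names are illustrative).
from ludwig.api import LudwigModel

config = {
    "input_features": [{"name": "utterance", "type": "text"}],
    "output_features": [{"name": "intent", "type": "category"}],
}

model = LudwigModel(config)                        # Ludwig fills in sensible defaults
train_stats = model.train(dataset="intents.csv")   # preprocessing + training + eval
model.predict(dataset="intents.csv")               # batch predictions on the same file
```

Everything not specified in the config (encoders, preprocessing, training schedule) is chosen automatically, which is the behavior the article describes.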
"The experience is that data science organizations have to basically reinvent the wheel and create a bespoke solution for every single one of these products, and there's not much in common among them. Because of that, the whole organization becomes a bottleneck for machine learning adoption," said Predibase CEO Piero Molino. "The result is that it just takes too long for machine learning models to bring value to an organization."
In contrast, he compares a declarative configuration system to what Kubernetes has done for infrastructure.
"Our vision is to make machine learning as easy as writing a SQL query," Molino said.
The basic idea is to let users specify entire model pipelines as configurations: they spell out the parts they care about, and the rest is automated.
"Traditional machine learning projects involve a complicated ML life cycle that spans feature and data engineering; model development and training; and model production and governance. Cross-functional data science teams struggle to manage these phases in a coherent and sustainable way," said Kevin Petrie, vice president of research at Eckerson Group.
"Predibase represents a level of innovation to simplify the ML life cycle. Predibase proposes to let data science teams specify the desired inputs and outputs for their ML model. That is, they create configuration files that Predibase then figures out how to implement. Data science teams still can customize as many parameters as they like by making modular changes to meet new or changing customer requirements.
"In short, Predibase proposes to minimize the complexity of the ML life cycle, which is the biggest barrier to success with data science projects."
"It's easy to get started. That Uber intent classification system could be created, for example, with six lines of code. You get something that is readable and reproducible and shareable," Molino said.
"But one of the advantages is that you retain all the flexibility and control that an expert needs. So you can specify through the configuration all the details about the models: choosing among different model architectures, training parameters, the preprocessing of the data. It's all accessible through a parameter in the configuration, which makes it easy to iterate and improve models. Make changes with just a new configuration.
"It's also extensible. So if you're an expert developer, you can add your own keys to the configuration. You can extend this by adding your own piece of PyTorch, for instance, and then it can be referenced from the configuration."
The company has deep expertise in machine learning.
Molino, the creator of Ludwig, previously was staff research scientist at Stanford University and co-founder and senior research scientist at Uber AI.
Fellow Predibase co-founders are:
Predibase enables users to easily connect to structured and unstructured data stored anywhere on the cloud data stack; write model pipeline configurations and run on a scalable distributed infrastructure to train models as easily as on a single machine; deploy model pipelines with the click of a button and query them immediately.
"Predibase is building the first declarative ML platform that enables enterprises to develop and operationalize models, from data to deployment, without having to choose between simplicity and the power of fine-grained controls. The rapid success of both the open source foundations and the beta of its commercial platform in the Fortune 500 has been incredibly exciting," Greylock partner Saam Motamedi said at the recent announcement of a $16.25 million Series A round.
Still in private beta with Fortune 500 customers, Predibase is looking toward a general release in the second half of this year.
Customers have been using datasets of about 1 billion to 2 billion rows, about 100 to 200 columns and several hundred gigabytes. Internal benchmarking has run up to 2 terabytes. Ludwig and Horovod, however, have been tested on much larger datasets than that, according to Predibase co-founder Devvret Rishi.
The company maintains it takes a different approach than other automated machine learning products.
"Thinking of something like DataRobot or Google Cloud AutoML, for example, [they] provide these interfaces where you kind of bring in data, click a button and you get models out," explained Molino. "We found that that's actually pretty unsatisfying for a lot of users and customers because they tend to be black boxes that don't have any configurability or control. So the minute that the platform doesn't give you a good out-of-the-box model, you're kind of stuck, and you end up graduating out."
Users can access the capabilities in Predibase purely through Python, through the UI or through PQL (Predictive Query Language), an extension of SQL.
"The PQL extension includes predicates that allow you to bring machine learning and data together," Rishi explained. Its flexibility puts machine learning in the language of data users, so they can use filter, group by, aggregate, join or any other commands they are familiar with in SQL. It's extensible: simply add new features as an additional predicate. Predibase makes it just as easy to use text, image and other types of fields as standard tabular fields.
"This is really simple. It brings machine learning into the hands of a broader set of users that are familiar with SQL, but at the same time, behind the scenes, the power and flexibility of the Ludwig configuration system provide state-of-the-art performance on both structured and unstructured data, and the combination of the two," Molino said.
"And finally, we also abstract away the infrastructure based on Horovod, [so] they can train and deploy models at scale. And it's basically a big-tech-level infrastructure without the need to have a big-tech-level engineering team to build it, right? It's already built for you."
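Horovod's public Keras API hints at what that infrastructure abstraction builds on. A minimal sketch, assuming a toy Keras model and the MNIST dataset (neither comes from the article), launched with something like `horovodrun -np 4 python train.py`:

```python
# Minimal Horovod + Keras sketch: each process trains on one GPU/worker.
# The model and dataset are placeholders, not from the article.
import horovod.tensorflow.keras as hvd
import tensorflow as tf

hvd.init()  # one process per worker; started via horovodrun/mpirun

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Scale the learning rate by the number of workers, then wrap the optimizer
# so gradients are averaged across workers on every step.
opt = hvd.DistributedOptimizer(tf.keras.optimizers.SGD(0.01 * hvd.size()))
model.compile(optimizer=opt, loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

callbacks = [hvd.callbacks.BroadcastGlobalVariablesCallback(0)]  # sync initial weights
model.fit(x_train, y_train, epochs=1, callbacks=callbacks,
          verbose=1 if hvd.rank() == 0 else 0)  # log from one worker only
```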
Models can be queried as REST APIs, through the Python SDK and through the PQL language. Though the entire process is encapsulated in the platform, the models also can be exported, should the user need to run them elsewhere.
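The article does not document the API itself, so the following is only a hypothetical illustration of the REST pattern it describes; the endpoint URL, auth header and payload fields are assumptions, not Predibase's actual interface.

```python
# Hypothetical sketch of querying a deployed model over REST.
# The URL, token and field names are placeholders.
import requests

resp = requests.post(
    "https://models.example.com/v1/fraud-detector/predict",   # placeholder URL
    headers={"Authorization": "Bearer <API_TOKEN>"},
    json={"records": [{"amount": 129.99, "merchant": "acme", "country": "US"}]},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # e.g. a predicted label and probability per record
```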
A model repositories page summarizes each model as its configuration, making it easy to compare model versions.
The company is spending the first half of this year making the product enterprise-ready, with robustness, enterprise-grade security and support for multicloud deployments, Molino said. After a GA launch, it wants to pursue integrations with the wider ML ecosystem, with tools like dbt, for instance, and eventually make Predibase self-service.
Feature image via Pixabay
Ventiv Technology Appoints Mark Tainton as New Global Head of Advanced Analytics – Business Wire
ATLANTA--(BUSINESS WIRE)--Ventiv Technology, the leading provider of advanced analytics for risk, claims and underwriting solutions, today announced the appointment of Mark Tainton to lead its Advanced Analytics product division.
The addition of Tainton to the Ventiv team is another strategic move to accelerate the company's leadership position in decision science-based solutions for the risk, claims administration, and insurance industry. Today, Ventiv's suite of analytics solutions includes Data Discovery, Data Exploration, Geospatial, Ventiv Predict, and its Data Science Platform.
"The breadth and depth of our analytical tools and capabilities today are already helping Ventiv clients lower their total cost of risk. We have the strong analytical foundation to bring new capabilities to market rapidly," says Tainton. "We are working to develop cutting-edge AI automation, straight-through processing, and predictive scoring to enhance decision-making for our growing customer base."
Tainton, known for fostering business, data, and technology innovation, has a rich history of leading, building, and mentoring Data Analytics & Data Science teams. His past experiences include serving as Global Head of Business Intelligence and Management Information with Aon Risk Services, Vice President of Global Business Intelligence with Arthur J. Gallagher, and Head of Data Analytics with Calamos Investments.
"We have analytics capabilities that differentiate both our RMIS platform and our Claims, Billing, and Policy Administration solutions in the market, and as a company, we've accelerated innovation of new decision science solutions across our product set," says Salil Donde, Ventiv Technology CEO. "Tainton's addition to the team will accelerate these capabilities and solutions for our customers."
About Ventiv Technology
Ventiv Technology is a leading global provider of risk management information systems (RMIS), enterprise risk management (ERM), insurance claims, billing, policy, and administration technology integrated with its market-leading analytics and predictive models. Ventiv Technology's SaaS and on-premise solutions are deployed by insurers, brokers, insured corporate entities, federal and regional governments, public entities, third party claims administrators (TPAs), and risk pools across a variety of industries, including transportation & logistics, retail, financial services, leisure & hospitality, energy, aviation and manufacturing, among several others.
With over 45 years' experience, Ventiv serves 450 customers and more than 450,000 users in over 40 countries. Ventiv's global footprint and experienced team of industry veterans deliver insights that allow customers to predict, manage, and respond to risk.
For more information, please visit http://www.ventivtech.com.
Resurgent COVID-19, Flu and Other Viruses are Pushing New Zealand’s Health System to the Limit – Global Biodefense
As Aotearoa New Zealand heads into the colder winter months, the pressures on our health system and staff are growing significantly.
On top of the ongoing impact of COVID-19, flu cases have begun to spike.
Conditions are also primed for potential outbreaks of other illnesses including measles, whooping cough and respiratory syncytial virus (RSV).
If we are to weather the coming storm, there will need to be a recommitment to public health measures that slow the spread of respiratory infections, as well as a renewed drive for widespread vaccination.
The first wave of Omicron swept through Aotearoa New Zealand in late February and March.
Unfortunately, as seen in many other countries, the fall in case numbers has been much slower than the rise, with infections reaching a plateau in all age groups.
Case numbers have been driven by a high number of infections in young people between the ages of 10 and 29 years old. But the elderly have borne the brunt of hospitalisations, largely due to the higher risk of severe outcomes for older adults.
Age stratification aside, persistent inequities have also left Māori and Pasifika at the sharp end of the outbreak, both in terms of cases and severe outcomes.
Hospitalisation rates and reinfections are rising in many age groups, mirroring trends seen elsewhere.
New Zealand can expect another resurgence of COVID-19 this winter.
While 95% of New Zealand has received the second dose of the vaccine, one of the highest rates in the world, fewer have received a booster. We also have lower than optimal levels of childhood vaccination.
Long COVID will add a layer of complication for our medical services.
A recent report by the US Centers for Disease Control and Prevention (CDC) suggests one in five COVID-19 survivors aged 18 to 64 years old, and one in four survivors aged 65 years and above, experienced at least one condition that might be attributable to previous COVID-19 infection.
Despite being labelled as one of this generation's disability challenges, there is currently no test for long COVID.
Worryingly, COVID-19 deaths in Australia have started to trend upwards. Evidence from Australia has shown that the overwhelming majority of people are dying from, not with, COVID-19.
Health professionals are not just worried about COVID-19. The flu and other viruses are also expected to hit hard this year.
Thanks to closed borders, managed isolation and quarantine, and lockdowns, the last time New Zealand experienced a flu season was in 2019.
We are now more vulnerable to the virus. There has already been a reported surge in Dunedin.
In response, the government has made two million vaccines available and has widened eligibility for people to get vaccinated for free.
Unfortunately, there is growing concern that part of the population may not get vaccinated due to immunisation fatigue, or may be unable to due to structural inequities in access to vaccines.
As with COVID-19, looking across the Tasman can help us understand what is likely to happen in New Zealand.
Much like New Zealand, flu rates in Australia have, until now, been very low due to closed borders.
The latest Australian national surveillance for influenza shows a steep rise in rates of the flu, as well as rising hospital and ICU admissions.
Before the COVID-19 pandemic even started, our research highlighted declines in childhood immunisation for vaccine-preventable diseases.
Public health officials are now noticing further significant declines in routine childhood immunisations.
In April, the World Health Organization reported a 79% increase in measles cases in the first two months of 2022.
Meaningfully addressing long-standing inequities in childhood vaccination programmes takes on new urgency in the face of these vaccination gaps.
Lessons can also be learned from the COVID-19 vaccination programme regarding the success of handing leadership to Māori and Pasifika community providers to improve vaccination rates.
We have long been warned that an underfunded health system might struggle with a seasonal surge in demand.
Pressure points have appeared across the country. On May 23, Dunedin Hospital's COVID-19 ward was at capacity. Two days later, Nelson Hospital also hit capacity limits, leading to temporary ambulance ramping at the emergency department.
Canterbury District Health Board, Hawke's Bay District Health Board, and MidCentral District Health Board have recently urged people to consider alternative care for minor conditions to help alleviate the pressure.
Community health providers are also struggling to meet demands.
During the winter, we spend more time in indoor spaces with inadequate ventilation. We are also becoming more complacent with our mask wearing as policies relax.
In the future, vaccines will need to improve.
But for now, it's important to remember that three doses of the COVID-19 vaccine remain effective against hospitalisation, even for newer variants, as well as lowering the risk of infection.
But there are things we can all do to avoid the worst this winter has to offer, including to:
Finally, workplaces should continue to support people to stay home and isolate if required.
ABOUT THE AUTHORS
Matthew Hobbs, Senior Lecturer in Public Health and Co-Director of the GeoHealth Laboratory, University of Canterbury. As a researcher he is interested in modifying environments to improve behaviour and health. His research uses geospatial and population health methods to inform local and national public health policy to reduce health inequities. In 2019 he received the European Association for the Study of Obesity's New Investigator in Public Health Award, which recognises rising stars in the field of obesity for their wider contributions to public health and is highly regarded throughout the scientific community.
Alex Kazemi, Intensive Care specialist, University of Auckland. Dr. Alex Kazemi has a background in clinical medicine but is currently pursuing a master's degree in Public Health at the University of Auckland. He completed his MBBS at King's College London, graduating as a doctor in 1996 from King's College Medical School. He subsequently completed specialist training in Aotearoa with fellowships in Emergency Medicine and Intensive Care, and worked as an Intensive Care Specialist at Middlemore Hospital, Auckland, for ten years. He has interests in the use of data analytics for public health and health surveillance, as well as in the communication of public health data.
Lukas Marek, Researcher and lecturer in Spatial Data Science, University of Canterbury. I received my PhD in 2015 from the Department of Geoinformatics at Palacký University in the Czech Republic. In my thesis, I conducted a nationwide research study on the spatial and spatiotemporal analysis of gastroenteritis. My main research interest is spatial analyses and (geo)visualisation of health- and wellbeing-related data using spatial data science.
This article is courtesy of The Conversation.
Devo Launches New Security Research Team to Accelerate Innovation and Protection for Organizations – GlobeNewswire
CAMBRIDGE, Mass., June 02, 2022 (GLOBE NEWSWIRE) -- Devo Technology, the cloud-native logging and security analytics company, today announced a newly established team of security researchers and data scientists to form Devo SciSec. With Chief Technical Officer Gunter Ollmann at the helm, SciSec will bring together threat research, advanced data science, and machine learning expertise to enable organizations to preemptively detect and mitigate entire classes of threats.
As a result of inadequate repeatable threat management methodologies and the continually changing and expanding threat landscape, security teams struggle to stay ahead of threat actors. Compounding this, security operations centers (SOCs) can't find and retain the talent they need to effectively adapt their security defenses. The SciSec team sets out to assist Devo customers by revolutionizing threat research and providing them with actionable intelligence and security content. Devo SciSec arms customers with expert-built detections, investigation tactics, and security analytics to complement analysts' skill sets and greatly improve SOC efficacy.
"Security has historically been treated as an art and not a science, which has made approaches to threat protection very reactionary and more like firefighting, rather than a proactive activity. This instantly puts organizations at a disadvantage," said Gunter Ollmann, CTO at Devo. "Devo SciSec hopes to change how we fundamentally approach threat research by providing collective intelligence built by data scientists that forecasts a global view of threats and adversaries, paired with optimal response strategies."
The addition of SciSec lends itself to the autonomous SOC, a concept introduced by the company last month when announcing the acquisition of AI-powered threat hunting company Kognos. The autonomous SOC establishes complete visibility, automation, analytics, and open access to community expertise and content to enable the SOC to eliminate the repetitive manual tasks that lead to analyst burnout and SOC inefficiency. SciSec plays a key part in arming Devo with the insights that help security teams work smarter.
Since its formation, SciSec has used its innovative approaches to deliver several value-adding capabilities for Devo customers.
The launch of SciSec coincides with Devo's announcement of $100 million in Series F funding at a valuation of $2 billion, led by Eurazeo.
The Devo SciSec team will be in attendance at the 2022 RSA Conference, June 6-9. For more information visit here, or come by booth #3241.
About Devo
Devo is the only cloud-native logging and security analytics platform that releases the full potential of your data to empower bold, confident action. With unrivaled scale to collect all of your data without compromise, speed to give you immediate access and answers, and clarity to focus on the signals that matter most, Devo is your ally in protecting your organization today and tomorrow. Headquartered in Cambridge, Massachusetts, with operations in North America, Europe and Asia-Pacific, Devo is backed by Insight Partners, Georgian, TCV, General Atlantic, Bessemer Venture Partners, Kibo Ventures and Eurazeo. Learn more at http://www.devo.com.
Devo PR Contact: Shannon Todesca, shannon.todesca@devo.com, +1 (781) 797-0898
NLT & Canopy Weather Partner to Provide FEMA Crucial Disaster Information in Response of Deadly Tornadoes – PR Newswire
WASHINGTON, June 2, 2022 /PRNewswire/ -- New Light Technologies (NLT) and Canopy Weather, a leading provider of weather data products, recently announced a collaboration to provide the Federal Emergency Management Agency (FEMA) and other organizations critical data for disaster response during major tornadoes. The firms are providing the agency with near-real-time tornado tracks that support enhanced situational awareness for effective disaster response when major tornadoes hit.
About 1,200 tornadoes hit the U.S. every year, causing, on average, 70 fatalities, 1,500 injuries, and significant damage and destruction. Wind from tornadoes can reach more than 300 miles per hour, and damage paths can be more than one mile wide and 50 miles long. It is likely that with the impacts of climate change, the frequency and severity of deadly tornadoes will increase, putting vulnerable populations and critical infrastructure at risk. When a tornado makes landfall, FEMA must rapidly gather information on the size, scope, extent and potential impacts of the tornado in order to efficiently deliver resources to the communities most in need.
Through this collaboration, NLT, a renowned leader in the development of disaster analytics and delivery of real-time decision-support systems for emergency management, works directly with FEMA to provide the agency with accurate swaths of all tornadoes touching down in the continental US and its territories within minutes of touchdown. Relying on innovative proprietary technologies and data, Canopy Weather's team of meteorologists and scientists are able to record the exact path of any tornado and generate an accurate, granular swath immediately upon impact.
"This collaboration with NLT allows us to serve FEMA and first responders, which enables them to respond with unparalleled speed and precision, clearing the 'information fog' within minutes after a tornado strikes," said Matt Van Every, CEO of Canopy Weather. "In real time, tornadoes are incredibly difficult to pin down at the street or building level, let alone which side of a town was hit. Our team of meteorologists have been working toward this tornado capability for almost a decade. Previously, it typically took days to gather information about what was or was not hit; now it's mere hours."
On December 10, 2021, a swarm of tornadoes tore a 200-mile path through the U.S. Midwest and South, causing large-scale damage and claiming close to 80 victims, including in Mayfield, Kentucky. More than $64 million in federal assistance has been approved for Kentucky homeowners, business owners and renters. Immediately after the incident, FEMA's Response Geospatial Office (RGO) received through NLT the precise storm track and tornado swath, which were used to prioritize areas that had likely been impacted and where local populations are especially vulnerable. The derived analytics were used to inform FEMA's leadership as well as President Biden about the potential impacts of the disaster.
According to Dr. Ran Goldblatt, NLT's Chief Scientist, this data was vital for FEMA's situational awareness during the historic December 10-11 storms and tornadoes. "Knowing the storm tracks and tornado swaths in close to real time is crucial for rapid and efficient situational awareness," said Goldblatt. "When automatically integrated with other tools, such as the Tool for Emergency and Prioritizing Operations (TEMPO), FEMA can better understand the potential impacts of deadly tornadoes upon vulnerable communities and critical community functionalities."
For over 20 years NLT has been providing organizations with all-hazard, risk-based geospatial and data analytic modeling and simulation services. "Our partnership with Canopy Weather is an example of a fruitful collaboration between leading data and service providers that leads to real-time analytics converting data into life-saving actions," said Rob Pitts, NLT's Program Manager. "Our goal is to leverage the best available disaster-related data and to combine it with NLT's advanced modeling tools to provide FEMA with the necessary insights to improve disaster operations," said Pitts.
About New Light Technologies
New Light Technologies Inc. (NLT) is a leading provider of integrated science, technology, and mission services with over 20 years of experience serving government and commercial organizations. Offering renowned expertise in cloud, agile software development, cybersecurity, data science, geospatial and remote sensing, NLT provides comprehensive consulting, research, digital transformation services, and fit-for-purpose analytics solutions for a range of industries from emergency management to economics to health. NLT offers distinctive capabilities in the delivery of secure cloud-native platforms and web-based decision support tools and has pioneered predictive disaster risk analytics in support of homeland security missions. To learn more visit http://www.NewLightTechnologies.com.
Media Contact: Tim Kuhn, [email protected], (202) 630-0497
SOURCE New Light Technologies and Canopy Weather
Why AI and ML is Reshaping the Fintech Industry – FinTech Magazine
Andreas Braun is the Director of Artificial Intelligence & Data Science at PwC Luxembourg. He is passionate about technologies that put the human factor at the forefront. Biometrics, in particular its use of machine learning and AI methods, is his area of expertise. We caught up with him to find out how his role has developed, and why AI is at the forefront of the most innovative fintech solutions.
I am the AI lead at PwC Luxembourg, which gives me the great opportunity to have an impact on all the different sectors that are touched by AI, which pretty much means all of them.
While Luxembourg is first of all known as a prime destination for the fund industry, it also has a very healthy start-up scene, with a strong focus on fintech. We are involved as mentors and partners for many of the initiatives in this area. AI plays a key role in many innovative products and services provided by fintech companies.
The last decade in AI has been truly exceptional. From humble attempts to improve document scanning, deep learning and subsequent developments have been truly exciting. They have enabled computers to overtake humans in many tasks, from object recognition and face detection to many predictive tasks that are highly relevant in the financial sector.
There are, however, limitations for the foreseeable future. Just as our cars are not fully self-driving yet, we still need the human factor in many investment and financial decisions, and for the foreseeable future this will not change. There is a lot of opportunity in combining human ingenuity with AI's capacity to learn rapidly from large amounts of data.
Pretty much all fintechs use data, and most of them AI. So let me just pick two examples. Know Your Customer (KYC) and Anti Money Laundering (AML) are areas where AI-powered robots are able to rapidly screen massive amounts of information and find links between globally distributed information sources. Imagine a fintech that would have predicted (and maybe prevented) the Wirecard scandal if it happened just a few years later.
RegTech is exploding using AI technologies to deal with the increasing cost and complexity of compliance. AI systems assess the most recent regulations and legal communications from governments around the world and compare it with your procedures. The cost savings can be massive. And of course, fintechs could help regulators create more effective regulation, too.
Maybe let's just focus on one key regulation, the European AI Act, which has been in the making for almost two years. It foresees a risk-based approach towards the use of AI when dealing with customer data, or when making decisions that may impact your customers, which covers a large chunk of AI applications. Credit risk for persons and claims processing, two areas in which many fintechs are active, are specifically mentioned as high risk. However, the regulation is still under discussion, so changes may still apply in the future.
The smart incumbent banks are looking to fintechs for inspiration or see them as a potential investment opportunity. In terms of personal banking, many have rolled out features that leverage AI to analyse personal spending habits, also in order to suggest suitable products. Others acquire teams in order to foster their own innovation capacity.
The sector is, however, still hampered by a long legacy of data processing and storage, meaning that banks are partially working with very old infrastructures, dissociated data sources and manual processes that prevent the effective use of data and, in turn, AI.
Looking into the crystal ball is always challenging, but one topic will be hyper personalisation of services, which becomes more feasible with advanced AI techniques. The capacity to provide your client with the perfect service is bound to be a significant competitive advantage. This has driven the success of many shopping and streaming services and the AI investments and advances are massive. A fintech that cracks this case will have a chance to become the next unicorn.
One can see this as a continuation of existing trends. Mobile-first, multi-channel services have already been shaping the financial industry over the past years. These technologies will allow our mobile devices to become more powerful and our access to online financial services to become more immersive. Personally, I am not sure if I want to check how my stocks are doing through VR goggles, but it will enable more services to be available 24/7. Honestly, who really wants to spend two hours in traffic and the waiting room to get a single signature from their agent?
Curiosity has been driving me in both my research and professional career. The fintech sector has an exceptional capacity to pique my curiosity. I never cease to be amazed by the new means and ways in which fintech companies are using the latest innovations from research, find new applications for old stuff and come up with amazing solutions to many of our main pressing global challenges.
Predicting the Future with Business Analytics | Royal News: June 1 2022 – Scranton
The annual median pay for a career in data analytics can range from $82,000 to $114,000. Additionally, market-related analytics is one of the fastest growing fields, with Forbes estimating a growth rate of 67% over the past year and 136% over the past three years, and the U.S. Bureau of Labor Statistics projecting employment growth of up to 27% from 2016 to 2026.
The University of Scranton's Bachelor of Science in business analytics is designed to help students learn the variety of skill sets needed to be successful in this interdisciplinary field, where they will be asked to conduct, analyze and evaluate data in order to make better business decisions. Potential job titles one can pursue with this degree include management consultant, data analyst, operations research analyst, and more.
As part of the Kania School of Management (KSOM), Scranton's business programs are accredited by the rigorous standards of the Association to Advance Collegiate Schools of Business (AACSB), an elite accolade that fewer than five percent of business schools worldwide hold.
Scranton's business analytics program includes courses in data mining, database management systems, and simulation, as well as the communication, managerial and decision-making skills needed in the field. Students will be introduced to tools like Structured Query Language (SQL) and Visual Basic for Applications (VBA) as they relate to the use of analytics in decision making.
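As a purely illustrative example of the kind of SQL-driven analysis such a program teaches (toy data only, not course material), even a simple aggregation can inform a business decision:

```python
# Toy example: which product/region combinations drive the most revenue?
# Data and column names are made up for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (product TEXT, region TEXT, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("widget", "east", 1200.0), ("widget", "west", 950.0),
    ("gadget", "east", 400.0), ("gadget", "west", 1700.0),
])

query = """
    SELECT product, region, SUM(revenue) AS total
    FROM sales
    GROUP BY product, region
    ORDER BY total DESC
"""
for product, region, total in con.execute(query):
    print(product, region, total)
```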
Scranton also offers a master's degree in business analytics in an online or on-campus format, as well as a business analytics certificate.
dunnhumby announces winners of third annual coding event with INR 4.5 Lacs prize pool – PR Newswire
In this edition, dunnhumby presented challenges to participants in two important technology streams in a time-controlled live coding environment. The React track tested participants' skills in creating fast and scalable user interfaces for large web applications, whereas the .Net track gave a problem statement to the participants that tested their application development skills. Through a series of rounds, the challenges judged multiple skills including domain proficiency, problem solving, and business understanding along with scalability, innovation and effectiveness of the solution.
Mujaffar Husain, a software engineer from Noida, won the grand prize of INR 1,00,000 in the React challenge. Following Mujaffar, Sai Sharan Kuchalakanti from Hyderabad and Sushant Sawarn from Bengaluru won the first runner-up and second runner-up prizes.
The winner of the .Net challenge with the same prize money was Deepak Mishra, an application developer based in Hyderabad, followed by Sandeep Arra from Hyderabad and Divya Parakh from Udaipur who came second and third respectively.
Nitesh Maan, Director of Media Engineering at dunnhumby, explains, "It's great to see the enthusiasm around our third edition of Code Combat featuring first ever React and .NET coding challenge. We had some great submissions for both, from all parts of the country. These technologies have become popular choices amongst the developer community for building cutting-edge software applications that provide a rich user experience. At dunnhumby, React and .NET are amongst the many technologies used by our software engineers to build high performing, self-serve, enterprise grade products that enhance the shopping experience of billions of customers worldwide."
The annual nationwide event is part of dunnhumby's continued efforts to promote a culture of innovation and problem solving skills. With a workforce of more than 750 and strong double digit growth in recent years, it is one of the fastest growing customer data science companies in India. Over the past few years, the company has expanded its engineering and product development capabilities in India significantly, while also creating a workplace with greater gender diversity and high employee satisfaction scores.
Manoj Madhusudanan, Head of dunnhumby India, concludes, "With this third edition, Code Combat has become a household name amongst coding enthusiasts and serves as a symbol of our pursuit for innovative, high quality engineering solutions. We are looking forward to continuing with coding challenge in varied technology streams, to nurture the tremendous talent we have in the country."
About dunnhumby India
Established in 2008, dunnhumby India is a hub of Data Engineering, Data Science, and Product Development with deep expertise in Price & Promotions, Category Management, Customer Knowledge, Customer Engagement, and Media delivery. The teams in India play a key role for clients spread globally at different stages of their journey with dunnhumby.
With a long history of Data Science and software development and the ability to attract exceptional talent through encouraging a culture of innovation, agility, and flexibility, the India office sits at the heart of global dunnhumby, influencing the success of our entire client network. It is at the forefront of developing the best products and science using a variety of techniques and tools, including highly scalable cloud-hosted models, Machine Learning, and Artificial Intelligence.
Learn more at www.dunnhumby.com
Photo: https://mma.prnewswire.com/media/1830305/codecombat_winners.jpg
Logo: https://mma.prnewswire.com/media/1216769/Dunnhumby__Logo.jpg
SOURCE dunnhumby
Analytics and Data Science News for the Week of May 27; Updates from Amplitude, Gartner, TigerGraph, and More – Solutions Review
The editors at Solutions Review have curated this list of the most noteworthy analytics and data science news items for the week of May 27, 2022.
Keeping tabs on all the most relevant data management news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last month in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy data science and analytics news items.
With the Presto Query Analyzer, data platform teams can get instant insights into their Presto clusters including query performance, bandwidth bottlenecks, and much more. The Presto Query Analyzer was built for the Presto community and is free to use. Presto has become the SQL query engine of choice for the open data lakehouse.
Read on for more.
DSML engineering platforms focus primarily on the development of machine learning models that can drive business across varying systems. As a result, tools in this market have evolved from supporting a core data science audience with code-driven model development to now also supporting data engineering, application development, and infrastructure user personas. Gartner recommends selecting a provider by identifying gaps in current model development practices and paying attention to model deployment, management, and governance capabilities.
Read on for more.
Multipersona data science and machine learning tools enable more people in an organization to utilize key capabilities for advanced analytics. Gartner is quick to recommend against limiting these platforms to model prototyping and development. Data and analytics leaders should put these products to use to fully support the deployment of data science and machine learning models. This should cover not only technical aspects but also governance, risk management, and responsible AI ethics.
Read on for more.
The ML Workbench is a Jupyter-based Python development framework that allows data scientists to quickly build powerful deep learning AI models using connected data. The ML Workbench enables organizations to unlock even better insights and greater business value on node prediction applications, such as fraud, and edge prediction applications, such as product recommendations.
Read on for more.
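The ML Workbench itself is proprietary and its API is not described in the announcement, so as a generic illustration of what node prediction on connected data involves (not TigerGraph code), here is a minimal graph neural network sketch using PyTorch Geometric, with a made-up transaction graph and hypothetical fraud labels:

```python
# Generic node-classification sketch (not the TigerGraph ML Workbench API).
# Nodes are accounts, edges are transactions, labels are hypothetical fraud flags.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
x = torch.rand((4, 8))             # 8 made-up features per account
y = torch.tensor([0, 0, 1, 1])     # hypothetical fraud labels
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(100):                     # tiny training loop on the toy graph
    opt.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    opt.step()
print(model(data).argmax(dim=1))         # predicted class per node
```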
For consideration in future data analytics news roundups, send your announcements to tking@solutionsreview.com.
Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.