Category Archives: Data Science
Top 5 trends in business analytics: Here's how ISB Executive Education's Applied Business Analytics programme – Times of India
It is no secret that data has the potential to power businesses of all kinds. It can give organisations a competitive edge in solving problems, increasing efficiency, boosting revenue, and cutting risks, along with other preventive and predictive benefits. The power of data and analytics is so immense that today businesses, irrespective of their size, are leveraging analytics tools to determine how data can be used for innovation to stay ahead in a constantly changing corporate landscape. To see how business analytics works in real life, consider the Covid-19 pandemic, when data-adept businesses stayed ahead because they could minimise risk and monitor operations remotely. A recent report published by The Economic Times also states that 83% of India's data-driven companies were more resilient and confident during the pandemic than non-data-driven companies. This indicates that analytics is one of the biggest buzzwords in the business world right now, and it is set to become more significant in the coming years. With customers becoming more data-aware, there is additional pressure on businesses to come up with smart analytics solutions to stay ahead of the rest. This puts the spotlight on the need for data analysts and business analysts who can drive an organisation's growth by tapping into the various aspects of its data. These professionals are expected to help companies create better roadmaps by predicting what their customers want and helping them become more responsive.
Data Security: Providing secure solutions is one of the foremost ways to gain a customer's trust and move up the success ladder. This explains why many companies invest in security solutions to stay competitive and relevant. Data security is a huge asset for both small and large organisations as it adds to their overall reliability and integrity. It also prevents unauthorized access to a company's equipment and systems and helps avoid the expenditure that data loss often entails.
Natural Language Processing (NLP): NLP is another big trend that data professionals should pay close attention to in order to develop optimized business solutions. When applied together with business analytics tools, NLP can allow users to interpret data in their native languages. Through this, NLP brings together people, data, and analytics tools to provide organisations with insights for the best business results.
Real-time Data and Analytics: This trend has tremendous potential in the post-Covid reality, where things are continuously in flux. There is an immense need for real-time, accurate updates when developing solid business strategies to respond to unpredictable situations. This is why businesses have to be adept at accessing real-time data to stay on top of changes and challenges. Immediate access to relevant data whenever the need arises can go a long way in cutting risks and shaping effective business responses. High-speed, real-time data access is one of the best ways for companies to pull ahead of competitors.
Data Literacy: As data becomes the foundation of success for businesses of all sizes, the need to use and understand data as a collaborative tool is at an all-time high. This makes data literacy a top requisite for both professionals and companies, as it is a critical tool for success. By textbook definition, data literacy is the ability to read, understand, write, and communicate data in the context of organisational benefit. As per Gartner, the lack of data literacy skills is currently one of the biggest roadblocks for companies and for professional growth. As trends like predictive analysis become popular in the world of data, analytics-related work will no longer be done only by experts, thus emphasising the need for data literacy.
In this business landscape, professionals need to be aware of the latest data analytics trends, and one of the best ways to do so is with ISB Executive Education's Applied Business Analytics programme. The 12-week programme is designed in collaboration with Eruditus Executive Education to help you explore in-demand ML techniques in business and their applications. No prior coding experience is required, and by the end of the programme you will gain actionable insights from an industry-oriented curriculum and real-world examples taught via video lectures from the renowned faculty at ISB. The programme also includes live online sessions with Eruditus programme leaders.
Some of the ways in which ISB Executive Education's Applied Business Analytics programme will help you:
Designed for a wide range of professionals: this cutting-edge programme is best suited for professionals who are:
1. Looking to upskill in data analytics to manage data science and analytics teams and improve functional performance through analytics.
2. Interested in leveraging business analytics to outpace the competition and develop data-driven growth strategies.
3. Seeking to sharpen their strategy offering for clients by providing data-driven solutions to their problems.
A chance to be a part of the ISB Executive Network
The ISB Executive Network comes with various benefits, such as:
1. ISB newsletters with the latest updates.
2. Learning and networking opportunities via the ISB Executive Education Network Group on LinkedIn.
3. Exclusive invitations to online and offline events such as webinars, conferences, and master classes.
4. Scope to participate as a guest speaker at ISB Executive events.
5. Preferential pricing, with a 10% discount on the programme fee for ISB Executive Education open programmes.
6. Upgrade to the ISB Executive Alumni Group with 100+ learning hours.
Learn a range of data-related subjects: this programme offers 12 modules, starting from an introduction to Business Analytics and Data Preliminaries for Analytics. These modules take you from the basics to more advanced Business Analytics applications. Other modules cover topics such as Regression for Descriptive and Predictive Analytics, Visualisation and Perceptual Mapping for Business Analytics, Decision Trees for Business Analytics, Network Analytics for Business Analytics, and Experiments and Causal Inference.
Postdoctoral Research Fellow, DARE Training Centre job with UNIVERSITY OF SYDNEY | 275911 – Times Higher Education (THE)
About the opportunity
Funded by the ARC Industrial Transformation Research Program, the DARE Training Centre is a group of multi-disciplinary, world-class data scientists. We are working to develop and apply data science methodologies for the stewardship of Australia's natural resources, with water, minerals and biodiversity as the three pillars of research. These methodologies will focus on the quantification of uncertainty in complex models of natural systems at scale.
We are currently seeking a postdoctoral research associate to join the team and work towards meeting the DARE objectives. In addition, you will be provided with opportunities to pursue your own research interests, grants and fellowships whilst working alongside some of Australia's best data scientists, industry partners and government agencies.
For more information on DARE, please click here
About you
The University values courage and creativity; openness and engagement; inclusion and diversity; and respect and integrity. As such, we see the importance of recruiting talent aligned to these values and are looking for a Postdoctoral Research Associate who has:
To apply for this position, please address the above criteria in a cover letter which you should attach to your application.
Closing date for applications
11.59 pm, Sunday 23 January 2022
Please note: The University of Sydney will be closed from 24 December 2021 until 9 January 2022 inclusive.
To keep our community safe, please be aware of our COVID safety precautions which form our conditions of entry for all staff, students and visitors coming to campus.
Sponsorship / work rights for Australia
Please note: Visa sponsorship is only available for this position for candidates who are currently located within Australia.
Pre-employment checks
Your employment is conditional upon the completion of all role required pre-employment or background checks in terms satisfactory to the University. Similarly, your ongoing employment is conditional upon the satisfactory maintenance of all relevant clearances and background check requirements. If you do not meet these conditions, the University may take any necessary step, including the termination of your employment.
EEO statement
The University of Sydney is committed to diversity and social inclusion. Applications from people of culturally and linguistically diverse backgrounds; equity target groups including women, people with disabilities, people who identify as LGBTIQ; and people of Aboriginal and Torres Strait Islander descent, are encouraged.
How to apply
Applications (including a cover letter, CV, and any additional supporting documentation) can be submitted via the Apply button at the top of the page.
For employees of the University or contingent workers, please login into your Workday account and navigate to the Career icon on your Dashboard. Click on USYD Find Jobs and apply.
For a confidential discussion about the role, or if you require reasonable adjustment or support filling out this application, please contact Rebecca Astar or Linden Joseph, Recruitment Operations, by email to recruitment.sea@sydney.edu.au
The University reserves the right not to proceed with any appointment.
Click to view the Position Description for this role.
We’re starting to see a national response to ransomware, says Mandiant CEO – CNBC
Kevin Mandia testifies during a Senate Intelligence Committee hearing on Capitol Hill on February 23, 2021 in Washington, DC.
Drew Angerer | Getty Images
As the recent Log4j breach demonstrates, U.S. businesses and government organizations have been taking a pounding from cybercriminals. It's coming in the form of ransomware, data breaches, distributed denial-of-service (DDoS), and other damaging attacks.
Now, many are saying enough is enough.
"I think more people are taking advantage of the United States and our openness and our true global workforce than in any other nation," said Kevin Mandia, CEO of cyber security company Mandiant, in a session on cybersecurity at CNBC's recent Technology Executive Council (TEC) Summit in New York.
Rather than simply bolstering traditional defenses such as firewalls and waiting to be the next potential victim of a cyber assault, companies are beginning to take a more proactive approach to security. They're going on the offensive, actively seeking out cyber threats and disabling them before they can wreak havoc on systems and networks.
The increase in ransomware, one of the more insidious and damaging types of attacks, is a major driver for going on the offensive. An August 2021 report from research firm International Data Corp. showed that more than one-third of organizations worldwide had experienced a ransomware attack or breach that blocked access to systems or data in the previous 12 months.
Weary of the ongoing assaults, organizations are fighting back.
"What you're starting to see is a coordinated national response, maybe even a coordinated international response, because [of] ransomware," Mandia said. "Quite frankly, everybody hates it except for the people doing it and the people harboring those who do it."
A good example of an effective coordination was the takedown earlier this year of REvil, a ransomware-as-a-service operation linked to Russia. A group of countries and law enforcement organizations used technical and legal methods to knock the operation off the Internet.
While it's uncertain exactly how REvil was taken out of commission, the collaboration by multiple entities is a positive development in the effort to minimize or eliminate threats, Mandia said. With ransomware becoming a national security issue as well as a criminal one, the U.S. needs to consider bringing military assets to bear in the fight to stop these attacks, he said.
"We can do a lot of different things rather than just constantly making it a clean-up on aisle nine after the crime," he said. Military action "doesn't mean drone strikes, it means proportional response" to the attacks, he added. That can only happen when the sources of the attack are identified.
A strong step would be the creation of a national "doctrine" that states how the U.S. will deal with creators of ransomware and other cyber threats, as well as the nations that harbor them, Mandia said.
"There could be some vagueness to that doctrine, but people need to know that the nation is going to have a coordinated response" to attacks, he said. "There comes a time where you just can't stand there and take it anymore."
Technology executives expect the high level of external threats to continue, with TEC members responding to a recent CNBC survey saying that state-sponsored cyber warfare (32%) and criminal organizations (25%) remain the most dangerous cyber threats. They give the Biden administration decent marks in its cybersecurity efforts so far, with less than 5% of TEC members saying Biden has done a "poor" job on cybersecurity during his first year in office. Thirty-nine percent of respondents said the Biden administration has done a "good job," while another 9% described its efforts as "excellent." Another 35% said the administration has done an "average job" when it comes to cybersecurity.
On an individual basis, businesses can take steps to get ahead of cybercriminals. For example, they can deploy threat hunting and threat intelligence tools or services.
With threat hunting, companies' security teams proactively search through networks and systems to find and isolate advanced threats that can evade older security tools such as firewalls, intrusion detection systems, and security information and event management products. The latest threat-hunting offerings can be at least partially automated via technologies such as machine learning, so companies don't need to rely on time-consuming manual hunting processes.
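The machine-learning-assisted hunting described above can be sketched with a simple anomaly detector. The example below is purely illustrative: it uses scikit-learn's Isolation Forest on invented network-flow features, not any vendor's actual threat-hunting product.

```python
# A minimal sketch of automated threat hunting: flag anomalous
# network-flow records with an Isolation Forest. All feature values
# here are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Normal traffic: (bytes transferred, connection duration in seconds)
normal = rng.normal(loc=[500.0, 2.0], scale=[100.0, 0.5], size=(200, 2))
# A few suspicious flows: large transfers over long-lived connections
suspicious = np.array([[9000.0, 60.0], [12000.0, 45.0]])

flows = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.02, random_state=0).fit(flows)
labels = model.predict(flows)  # -1 = anomaly, 1 = normal

anomalies = flows[labels == -1]
print(f"{len(anomalies)} flows flagged for analyst review")
```

In practice the flagged records would feed an analyst queue rather than trigger automatic blocking, which is why the article stresses "partially automated" hunting.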
Threat intelligence is also effective at detecting and stopping cyber attacks. Security teams can tap resources such as open source intelligence, social media intelligence, and services from a number of vendors to track existing and emerging threats.
Threat intelligence lets companies develop a more proactive approach to cyber security, in large part because it's predictive. Another key benefit of using such resources is that they promote the sharing of knowledge and experiences among the cyber security community, which supports the idea of a broad effort to "attack" cybercriminals before they strike.
Identity technologies such as multi-factor authentication (MFA) and advanced endpoint security tools, both of which support the concept of "zero trust" security, can also help companies be more proactive.
"Multi-factor authentication is a phenomenal thing for breaking lateral movement," Mandia said. MFA is the first step toward a zero-trust network, he says.
The recent TEC survey finds more than half (52%) of respondents saying their firms are in the initial stages of implementing zero trust security; another 27% say they already have implemented the approach and are seeing the benefits.
Monitoring of networks and endpoints, such as mobile devices, will continue to get increasingly sophisticated and effective with the addition of AI, machine learning, and data science, because security teams can better predict when suspicious activities are actually the beginning of an attack.
Aside from technology, employee training in security awareness is essential for building a proactive security program.
Tactics such as phishing, malware, and social engineering in many cases succeed because employees are not properly trained or not trained at all in how to recognize these incidents.
With the proper mindset, tools, and training, companies can create a proactive cyber security program that thwarts cybercriminals before they can do damage.
Bob Violino, special to CNBC.com
Amplitude Ranks #1 in Three Analytics Categories by G2, Announces 2021 Datamonsters of the Year Winners – Business Wire
SAN FRANCISCO--(BUSINESS WIRE)--Amplitude, Inc. (Nasdaq: AMPL), the pioneer in digital optimization, today announced that it ranked #1 across six categories in the G2 Winter 2022 Report, showcasing the ease of doing business with Amplitude and its product superiority. This is the fifth consecutive quarter that G2 users have ranked Amplitude as the #1 Product Analytics solution. Amplitude also announced the 2021 winners of its Datamonsters of the Year awards, highlighting the top global leaders leveraging Amplitude to build data-informed cultures for product-led growth. Together, the G2 rankings and the Datamonsters of the Year awards highlight customers' deep trust in Amplitude's award-winning technology as a strategic part of their digital growth stack.
G2 Winter 2022 Report: Leading Product Analytics in the Enterprise
As the world's largest software review site, G2 recognizes the top picks in business software and services based on feedback from thousands of G2 users, from product managers to marketers and executives. In the Winter 2022 Report, G2 users rated Amplitude as the #1 Product Analytics solution overall as well as the top Product Analytics solution for enterprises, signaling strong enterprise adoption of Amplitude's Digital Optimization System.
G2 users also rated Amplitude #1 in both Mobile Analytics and Mobile App Analytics. According to data from Sensor Tower, consumer app spending is expected to reach $270 billion by 2025, and Amplitude's rankings demonstrate businesses' deepening understanding of the value of product data in driving customer engagement, retention, and revenue. In addition to the product categories, G2 users also ranked Amplitude #1 in Satisfaction for Product Analytics, #1 in its Enterprise Relationship Index, which tracks ease of doing business, likelihood to recommend, and quality of support, and #1 in Mid-Market Usability.
"Digital optimization helps teams gain actionable insights into product behavior to help maximize growth. The first step in this journey is product analytics," said Jennifer Johnson, chief marketing and strategy officer at Amplitude. "Users have spoken, and after five reports as the #1 solution, it's clear we are the leader. We congratulate our Datamonsters of the Year award winners for leading the way in the industry, and we thank our customers for their continued trust and support."
2021 Datamonsters of the Year
Today, Amplitude also unveiled the winners of its Datamonsters of the Year awards, an annual celebration of the top customers leveraging product data to drive strategy and business growth. These Amplitude customers continuously created and shared the most insights with their teams over the past year. Amplitude's 25 winners come from product, data science, marketing, and analytics teams, demonstrating the democratization of product data across organizations to help drive growth. In fact, more marketing and customer success roles use Amplitude than data scientists, according to its 2021 Product Report.
The top five 2021 Datamonsters of the Year include:
"As we aim to provide our customers with world-class content and digital experiences, Amplitude equips us with self-serve behavioral insights that enable our teams with a real-time view and understanding of our customers' changing preferences and needs," said Robbin Brillantes, Data Analytics Head, ABS-CBN Global Ltd. "Amplitude makes it easy to share learnings across our organization, which has not only created a data-informed and collaborative culture that brings us closer to our customers, but continues to fuel curiosity, creativity, and experimentation, driving continuous improvement to all ABS-CBN digital offerings for our customers to enjoy."
View the complete list of winners, including two returning winners from 2020, here.
To learn more about Amplitude, request a custom demo today and download the complete G2 Winter 2022 report here.
About Amplitude
Amplitude is the pioneer in digital optimization software. More than 1,400 customers, including Atlassian, Instacart, NBCUniversal, Shopify, and Under Armour rely on Amplitude to help them innovate faster and smarter by answering the strategic question: How do our digital products drive our business? The Amplitude Digital Optimization System makes critical data accessible and actionable to every team unifying product, marketing, developers, and executive teams around a new depth of customer understanding and common visibility into what drives business outcomes. Amplitude is the best-in-class product analytics solution, ranked #1 by G2. Learn how to optimize your digital products and business at amplitude.com.
From Strategy to Action: How to Break the Code of Analytics at Scale in Retail and CPG – Global Banking And Finance Review
By Chris Hillman, Teradata's Data Science Director EMEA, and Chris Newbery, Teradata's Industry Consultant Retail/CPG EMEA
A recent McKinsey & Co article outlines the challenges retail and CPG companies face when trying to break the code of digital analytics. They, in common with our own consultants, see that the retail and CPG leaders of the future must successfully leverage analytics at speed and scale to drive performance. By their estimates, leaders in this area have already delivered more than triple the total return to shareholders compared with the laggards.
REALISING VALUE FROM ANALYTICS IN RETAIL
We all recognise the drivers: unprecedented changes to consumer behaviour, radically intensified competition, highly pressured margins, and rapidly evolving sales channels, all accelerated by the COVID pandemic. McKinsey reflects our thinking on two further points of specific relevance to the Chief Data Officer: the need to create a simplified enterprise data architecture, and the need for flexibility to quickly deploy and reuse analytics across the organisation. However, the challenge we encounter every day is that moving from understanding the requirements to implementing solutions is never as easy as companies hope. We see a big discrepancy in the industry between recognition of the need to change (which most understand is required) and the ability to deliver the transformation (with many unable to do this anywhere near quickly enough). McKinsey suggests that only 20% of CPG businesses are realising value from analytics at scale, and that reticence to invest and scale up quickly is at least partly to blame.
HARMONISING DATA AT SCALE
Looking at just two dimensions of McKinsey's matrix illustrates that, with a strategic road map built on proven frameworks, the CDO can quickly show significant return on investment and provide the basis for multiplying benefits. Many retailers and CPGs have difficulty harmonizing data at scale and creating production pipelines that unify data to deliver reusable features. Simply put, too much data resides in silos, only feeding individual departmental insights. Data is seldom shared or reused, and looking for data, even within a specific area of the business, is difficult and time-consuming. Attempting to integrate data sets across stores, geographies, ERP systems, product lines, or departments is often all but impossible. These factors are why highly skilled and in-demand Data Scientists spend up to 80 percent of their time just preparing data. Worse, once all that effort has been made, the results are often forgotten, and the next project starts from scratch all over again.
An Enterprise Feature Store implemented on Teradata Vantage can help retailers and CPGs overcome these bottlenecks. Prepared, integrated, and performant data features are catalogued and stored in a referenceable library from where they can be reused by other Data Scientists across the entire business, reducing huge amounts of time, money, and data duplication, as well as facilitating a more efficient process that will deliver wider implementations of AI and Machine Learning to support multiple business cases.
Crucially, the Enterprise Feature Store approach accumulates value. Start with a small discrete project which will prove the concept and establish the first features in the store. Subsequent projects can then reuse these as they build out the store. The more projects and the more features, the more value is driven. The more value is driven, the more they get developed, and you are rewarded with a virtuous cycle of return on investment.
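The accumulating-value idea can be illustrated in a few lines. The toy in-memory registry below is an invented sketch of the feature-store pattern, not Teradata's actual Enterprise Feature Store API: one project computes and registers a feature, and a later project looks it up instead of rebuilding the pipeline.

```python
# Toy sketch of a feature store: compute once, register, reuse.
import pandas as pd

class FeatureStore:
    """Minimal in-memory registry of named, precomputed features."""
    def __init__(self):
        self._features = {}

    def register(self, name, series):
        self._features[name] = series

    def get(self, name):
        return self._features[name]

    def catalogue(self):
        return sorted(self._features)

# Project 1 computes a customer-level feature and registers it.
orders = pd.DataFrame({"customer": ["a", "a", "b"],
                       "spend": [10.0, 20.0, 5.0]})
store = FeatureStore()
store.register("avg_spend", orders.groupby("customer")["spend"].mean())

# Project 2 reuses the feature without re-running the pipeline.
avg_spend = store.get("avg_spend")
print(store.catalogue())
```

A production feature store adds the pieces this sketch omits: persistence, versioning, access control, and the governance the article mentions.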
ALIGN AND ACCELERATE ANALYTICS DEPLOYMENT
The second of McKinsey's dimensions to highlight is the need to align analytics vision, talent, and tools. Many in the sector continue to struggle to deploy analytic models with the flexibility required. Different Data Scientists use different languages to create their models, relying on bespoke pipelines that create technical debt (what happens when they leave?) and make it hard to deploy at scale. Instead, huge swathes of data are copied and moved to silos where they increase overall costs and quickly become out of date.
The Teradata Analytics 123 approach provides a formula to overcome these hurdles. The Enterprise Feature Store provides a trusted repository of proven features, while delivering the security, privacy, and governance required to build trust. Data Scientists can then use their preferred modelling language(s) to develop analytics using trusted features, and seamlessly deploy and execute them in Teradata Vantage on live data. Data movements are minimised, and real-time dashboards, automated actions, and all other advanced analytics can run from live data, placed at the heart of the decision-making process.
SIMPLE, EFFICIENT, REUSABLE
Working closely with the biggest retailers and CPGs across the world, Teradata has helped them take these vital steps. A typical engagement starts with using integrated data from two or three functions or systems to answer a specific business issue for example, Teradata worked with a French grocer to better understand price drivers on a selection of SKUs and quickly increased profit margins by 5-10%. Once proven, the concept can then be extended to additional lines and locations. The features created are then re-used to drive complementary business analysis and so the benefits multiply. In many cases these solutions leverage existing data, technologies and platforms simply by providing a better unified and more efficient analytics pipeline.
It is clear to most retail and CPG businesses that their current manual and fractured approaches to analytics are not sustainable. Mainstream retail/CPG businesses are deploying a few million predictive models at best, but to compete in this new environment they will need to scale and deploy hundreds of millions of models in production. Breaking the code of analytics is not a once-and-done action; it is not a sprint, but more of a relay race. Sequential projects will piece together the crucial elements of an enterprise-wide data platform and enable fast, flexible deployment of analytics across the business.
Subex: What is AutoML and how is it democratizing AI? – marketscreener.com
At a time when businesses are looking to adopt Artificial Intelligence (AI) not just for competitive advantage but for mere survival, it is increasingly challenging to build a successful AI practice amid an acute shortage of data scientists. Machine Learning (ML), moreover, involves laborious tasks such as cleaning data, preparing data, training ML algorithms, and validation. There is, however, a continuous effort to automate these tasks by building more intelligent ML procedures and algorithms. AutoML, as it is called, can democratize ML by allowing even business users to develop and execute their own data models with little to no training in data science. Beyond bridging the skills gap, automation in ML processes can also eliminate data biases, a major concern today, and reduce human errors while improving overall efficiency. Moreover, AutoML frees domain experts and technical experts such as data scientists to keep their focus on business value.
The need for AutoML - Challenges with traditional ML processes
The growing interest in AI and ML means that there is a crippling shortage of data scientists. There were over 2.7 million open positions for data science and analytics jobs, according to a report by the Business-Higher Education Forum. As per the US Bureau of Labor Statistics, the number of jobs in the data science field will grow by 26 percent through 2026, adding nearly 11.5 million new jobs.
However, demand for data scientists vastly outpaces supply, given how challenging the field has been to enter for decades. It is impossible to produce hundreds of thousands of new data scientists in an instant, making it tough for organizations to implement their data science plans. The lack of these skillsets is one of the biggest factors holding back thousands of companies from starting their AI journey. That said, automation is rapidly addressing this problem by making data science more accessible, even to those without years of data science experience or a degree in the subject.
Even so, a lack of the required skills is not the only challenge that organizations looking at machine learning face today. Even if an organization has the right skills, they may still be highly under-utilized because of the sheer amount of time it takes just to clean the data: data scientists spend as much as two-thirds of their time on this alone. Imagine the fillip the domain would receive if this were automated.
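To make the cleaning burden concrete, here is a hedged sketch of the kind of routine steps an automated pipeline might apply: normalising column names, dropping duplicates, and imputing missing numeric values. The dataset and function name are invented for illustration.

```python
# A miniature "automated cleaning" pass of the sort AutoML tools bundle.
import pandas as pd

def auto_clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Normalise column names to snake_case.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Remove exact duplicate rows.
    df = df.drop_duplicates()
    # Impute missing numeric values with the column median.
    for col in df.select_dtypes("number"):
        df[col] = df[col].fillna(df[col].median())
    return df

raw = pd.DataFrame({
    "Customer ID": [1, 1, 2, 3],
    "Monthly Spend": [100.0, 100.0, None, 300.0],
})
clean = auto_clean(raw)
print(clean.columns.tolist())
```

Real cleaning is messier (inconsistent encodings, outliers, schema drift), which is exactly why automating even these basics frees so much of a data scientist's time.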
Further, data scientists often lack domain and business expertise. And even when they do bring domain and business understanding, they end up spending most of their time ingesting and processing data to make the models relevant. As a result, specific business context often gets lost, leading to unsuccessful adoption of AI/ML.
Traditional ML processes are also highly dependent on human expertise, given the amount of customization that each ML model requires for the specific problem at hand. This makes the entire process inherently time-consuming. To build a new ML model, you still have to go through the rigours of data preparation, feature engineering, model training, evaluation, and selection.
Biases in AI and ML models are also a major subject of debate today. Biases often creep in because of manual interventions and the inability of humans to analyze massive data sets for possible biases. The complexity of current ML models has turned them into black boxes, with very little visibility into what goes on inside and what is impacting the final results. It is therefore vital to automate the process of machine learning to get better visibility into the models, eliminate biases, and improve overall efficiency.
What is AutoML?
While machine learning continues to evolve, Automated Machine Learning (AutoML) goes beyond automation to accelerate the process of building ML and deep learning models. It automates several aspects of the ML processes, including the identification of the best performing algorithm from the available universe of features, algorithms and hyperparameters.
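The search over algorithms and hyperparameters that AutoML automates can be sketched in miniature: score a handful of candidate model configurations by cross-validation and keep the best. Real AutoML frameworks (auto-sklearn, TPOT, and the like) do this at far larger scale with smarter search strategies; the candidates below are arbitrary choices for illustration.

```python
# Minimal model-and-hyperparameter search, the core idea behind AutoML.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A tiny "search space": two algorithms, three configurations.
candidates = {
    "logreg_c1": LogisticRegression(C=1.0, max_iter=1000),
    "logreg_c10": LogisticRegression(C=10.0, max_iter=1000),
    "tree_d3": DecisionTreeClassifier(max_depth=3, random_state=0),
}

# Evaluate each candidate with 5-fold cross-validation, keep the best.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

An AutoML system wraps this loop with automated feature engineering and a search strategy (Bayesian optimisation, evolutionary search) so the space explored can be vastly larger than a hand-written grid.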
How Does AutoML Help?
By eliminating repetitive tasks such as data cleaning, AutoML frees up highly valued human resources for value-adding analysis and more in-depth evaluation of the best-performing models. This allows enterprises to significantly cut down the time-to-market for products and solutions built on these ML models.
However, complete automation also has its own set of challenges. Tesla founder Elon Musk famously said, "AI is far more dangerous than nukes." Apart from Musk, technology leaders like Bill Gates and Steve Wozniak have expressed concern about the dangerous aspects of AI. For instance, anyone with malicious intent can program AI systems to carry out mass destruction. Any powerful technology can be misused, and AI is no different. The truth is that as long as AI systems remain black boxes, they will remain a threat.
Some new-age solutions are changing that equation by bringing in transparency and making it easier for users to interact with AI systems. HyperSense AI Studio, for example, is built with guided analytics capabilities, a combination of automated ML and interactive ML. This allows users to develop applications with a mix of automation and human interaction at any stage of the data science cycle, based on the task and business user requirements. The solution also generates alerts and gives recommendations to users as they create a pipeline.
The process eliminates biases that might have crept in and ensures that the system is not seen as a Black Box by providing details of how it functions and arrives at the results.
Through AutoML, the user can easily automate tasks like data pre-processing, feature engineering and hyper-parameter tuning. Moreover, it allows features to be reused instead of rebuilt from scratch for different models, driving AI at scale.
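For a concrete flavour of automated hyper-parameter tuning, here is a hedged sketch using scikit-learn's GridSearchCV; the model, dataset and parameter grid are illustrative choices, not HyperSense specifics:

```python
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)

# Pre-processing (scaling) and the model live in one pipeline,
# so the search tunes both together without data leakage
pipe = Pipeline([("scale", StandardScaler()), ("svc", SVC())])
grid = {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", "auto"]}

# The search tries every combination and cross-validates each one
search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

An AutoML platform extends this same idea to the full pipeline, including which pre-processing and feature-engineering steps to apply in the first place.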
What's trending?
With AutoML, several machine learning processes no longer require human intervention, allowing domain experts to work on building AI models instead of depending solely on data scientists.
Data scientists, however, no longer have to be a rare commodity. Just as the mobile phone camera made citizen journalism possible, the power of AutoML is now creating citizen data scientists. This new breed of professionals will be able to build their own AI models without any formal education in machine learning or AI. Anyone familiar with Excel and interested in data analysis can potentially become a citizen data scientist.
The role of citizen data scientists will be critical to the growth of AI, because scaling AI requires a massive number of data scientists. Moreover, citizen data scientists don't just fill the skills gap. The biggest mismatch in ML initiatives is that projects often suffer from a lack of domain expertise: data scientists are great at working with data, but they don't necessarily come with a good understanding of your business or industry. Connecting domain expertise and data expertise has been a massive challenge for many firms.
However, by putting the ability to build a data model into the hands of a business user, AI projects can move towards newer dimensions that can only be perceived by a business domain expert.
What are the benefits of AutoML?
Other than democratizing machine learning, AutoML has several other advantages. Automating machine learning processes, for example, can tremendously accelerate the speed of training multiple models while also improving accuracy. In addition, AutoML reduces biases in datasets by limiting human intervention and automating most of the ML pipeline. The reduced human intervention also cuts down on human errors in the process.
Automation also makes ML more scalable by enabling multiple ML models to be trained simultaneously, and in doing so, it also optimizes the overall ML processes to a great extent.
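One simple way to picture training multiple models simultaneously is parallel cross-validation with joblib (a scikit-learn dependency); the models and dataset below are arbitrary examples, not a description of any particular platform:

```python
from joblib import Parallel, delayed
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)
models = [
    RandomForestClassifier(n_estimators=50, random_state=0),
    LogisticRegression(max_iter=2000),
    GaussianNB(),
]

def evaluate(model):
    # Train and score one candidate; each call can run in its own worker
    return type(model).__name__, cross_val_score(model, X, y, cv=3).mean()

# Each model is trained and scored in parallel rather than one after another
results = Parallel(n_jobs=3)(delayed(evaluate)(m) for m in models)
for name, score in results:
    print(f"{name}: {score:.3f}")
```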
HyperSense AI Studio is an example of an AutoML platform. It enables enterprises to build and operationalize AI successfully using automated machine learning, and it increases the efficiency of data scientists by allowing them to focus on higher-value tasks. It automates every step of the data science lifecycle, including feature engineering, algorithm selection, and hyper-parameter tuning.
By leveraging HyperSense AI Studio, data scientists and domain experts can easily build ML models with greater scale, productivity, and efficiency while sustaining model quality. By automating a large part of the ML process, the platform accelerates the path to production-ready models. It also reduces the human errors that stem from manual steps in building ML models.
It also makes data science accessible to all, enabling both trained and non-trained resources to rapidly build accurate and robust models, thus fostering a decentralized process. Further, it enhances collaboration between domain and technical experts, keeping the focus on business value rather than on the technical part of the implementation. This helps break down silos and promotes collaboration in other areas as well.
The quality of a machine learning model depends not only on code but also on the features used to run the model. Around 80% of data scientists' time goes into creating, training, and testing data. HyperSense AI Studio comes with a built-in feature store that allows features to be registered, discovered, and used as part of an ML pipeline, so features can be reused instead of rebuilt from scratch for different models, driving AI at scale.
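HyperSense's actual feature-store API is not documented in this article, so the register/discover/reuse idea is illustrated below with a toy in-memory registry; every name in this sketch is hypothetical:

```python
import pandas as pd

class FeatureStore:
    """Toy registry: register named feature functions, reuse them across models."""
    def __init__(self):
        self._features = {}

    def register(self, name, fn):
        # fn maps a raw DataFrame to a single feature column
        self._features[name] = fn

    def discover(self):
        # List registered feature names (what "discovery" means here)
        return sorted(self._features)

    def build(self, df, names):
        # Apply each requested feature function to produce a feature matrix
        return pd.DataFrame({n: self._features[n](df) for n in names})

store = FeatureStore()
store.register("spend_per_visit", lambda d: d["spend"] / d["visits"])
store.register("is_heavy_user", lambda d: (d["visits"] > 10).astype(int))

raw = pd.DataFrame({"spend": [100.0, 30.0], "visits": [20, 3]})
features = store.build(raw, store.discover())
print(features)
```

The point of a real feature store is that the same registered definitions feed many pipelines, so a feature engineered once never has to be rebuilt for the next model.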
Key Takeaway
AI projects have long been stuck at the pilot stage due to several challenges, including a lack of data scientists, slow progress in ML processes, and a lack of coordination between business and data teams. According to a Gartner study, about 75 percent of organizations will shift from piloting to operationalizing AI by the end of 2024, and 50 percent of enterprises will devise AI orchestration platforms to operationalize AI. This, however, wouldn't be possible without leveraging AutoML.
AutoML has the potential to democratize AI and machine learning and finally take AI projects from mere pilots to scaled deployments. AutoML platforms like HyperSense AI Studio increase the efficiency of data scientists by allowing them to focus on higher-value tasks. The platform automates every step of the data science lifecycle, including feature engineering, algorithm selection, and hyper-parameter tuning, ensuring enhanced operational efficiency. In addition, its built-in feature store allows features to be registered, discovered, and used as part of an ML pipeline, and reused instead of rebuilt from scratch for different models, driving AI at scale.
Get better results from your data with HyperSense AutoML
Try AI Studio for Free
Tharika Tellicherry is an Associate Marketing Manager at Subex. She has extensive experience in Product Marketing, Content Creation, PR, and Corporate Communications. She is an avid blogger and enjoys writing about technology, SaaS products, movies, and digital customer experience.
See the article here:
Subex : What is AutoML and how it is democratizing AI? - marketscreener.com
Multidisciplinary Data Science Scholarship Program to Host Information Session – University of Arkansas Newswire
The College of Engineering is pleased to announce that the Multidisciplinary Data Science Scholarship program will host an in-person information session 5-6 p.m. Jan. 25, 2022.
Sponsored by a grant from the National Science Foundation, the MDaS Scholarship program is designed to increase the graduation rate of underrepresented STEM undergraduate students interested in careers in data science.
The program is open to all STEM disciplines (science, technology, engineering and mathematics), but preference will be given to students enrolled in the data science program and disciplines related to data science (industrial engineering, computer science and computer engineering, physics, biology, and mathematics with a specialization in statistics).
MDaS aims to establish recruitment and retention programs for new data science majors and other related STEM disciplines to help meet growing workforce needs for qualified STEM graduates with data science skills. The program brings faculty and business leaders together to mentor students interested in data science and other related STEM disciplines in career development initiatives. It also helps provide sought-after skills, with the aim of improving the graduation rate of students interested in careers related to data science.
Data science combines the fields of computer science, mathematics, statistics and information systems with a focus on the generation, organization, modeling and use of data to make scientific and business decisions. To learn more about data science at the University of Arkansas, please visit datascience.uark.edu.
To claim your spot for the Tuesday, Jan. 25, information session about the scholarship program, register here.
Read more:
Smart SNFs: The ROI of Data Analytics – Skilled Nursing News
How SNFs approach and calculate their return on investment for advanced analytics technology
Health care providers know they need to invest in technology for the many benefits it provides, from predictive patient analytics to operations analysis and optimization. But particularly in skilled nursing settings, where margins are already compressed and staffing challenges are persistent, justifying the investment in new technology can be challenging. Leaders often want to know specifically what return on investment they will see when implementing a new technology solution, so that they can quickly and easily prove the value of that technology to their organizations and investors.
Those data-enabled SNFs successfully utilizing analytics technology today recognize there are a few ways to approach the question of ROI, beginning with identifying two kinds of metrics: hard ROI and soft ROI. Consideration of both will effectively prove the value of the technology, and an understanding from the outset of what is feasible in terms of calculating ROI can be critical.
Shortly after implementation of data analytics, for example, users will likely be able to see some soft return on investment in the form of improved clinical outcomes. Residents may have fewer hospitalizations, longer length of stay, or more appropriate care planning for their specific conditions. Yet the organization may not have the full scope of historical data for a hard ROI calculation to be validated.
A combined approach can help organizations truly show the value of their investment by seeing ROI from both perspectives and understanding the benefits that both offer.
"A soft to hard ROI transition doesn't mean you stop focusing on clinical quality improvement," says Kevin Keenahan, SVP of business development for PointRight Analytics parent company Net Health. "It's an additive process where analytics continues to collect high-quality data and the product demonstrates clinical outcome improvement in real time. It's important to set milestones where you can calculate reimbursement, lost revenue, and/or cost savings."
Soft ROI calculations for SNFs
On a basic level, soft ROI describes outcomes that clearly have value to an organization, but may not be directly tied to a dollar amount.
"Softer ROI is the value of having more accurate data that gets reported to the Centers for Medicare and Medicaid Services, and having accurate publicly reported quality measures," says Janine Savage, VP of product management, analytics and business intelligence for PointRight Analytics. "And many of the quality measures impact star ratings, which is far-reaching for public perception of the care provided, as well as for meeting Accountable Care Organization and payer requirements for skilled patient referrals."
In the case of PointRight, the solution offers several different analysis tools. Its MDS assessment tool allows SNFs to evaluate their data and gain a real-time perspective on their performance and outcomes, rather than a look-back based on past data submitted to CMS. The value of this accuracy may not be specifically measurable, but the soft return on investment can come in the form of new partnerships, additional referrals and greater public perception of the SNF and organization.
PointRight users also see better regulatory performance, including an average of 12% fewer total deficiencies, 37% fewer substandard quality of care deficiencies, and 36% fewer widespread deficiencies.
For some organizations, these outcomes and a soft ROI approach suffice, in that a SNF's leadership knows the analysis ultimately improves quality.
"Analytics unquestionably improves the quality of what we do in a variety of ways," says Mitch Marsh, senior vice president, residential services, for ArchCare, a multi-site provider of health care services, including five skilled nursing facilities in New York State, that utilizes PointRight. "We know that, and we're not looking for a demonstrable dollar amount."
Hard ROI calculations for SNFs
Yet for other organizations, a hard calculation will be necessary in justifying the investment upfront and following implementation. This more traditional approach to ROI can be applied for SNFs once they have gathered a sufficient amount of data.
"We look over time on average for those who use our solutions, and they see an increase of $4.21 in per diem reimbursement rate on average," Savage says. "Multiplied by an average stay, it could be a difference of $100,000 or more. We are able to show ROI very objectively."
Other areas where SNFs can determine a measurable ROI include:
Fall prediction: PointRight analysis has determined the cost savings to a medium-sized hospital with average hip fractures as $69,300. This is based on the number of hip fractures per year and reduction of hip fractures due to fall prediction. A SNF can demonstrate its value to referring hospitals and payers by decreasing fall rates, thereby reducing healthcare costs while significantly improving the patient experience.
Hospital readmissions: Knowing the likelihood that a SNF resident will be rehospitalized and adapting care planning to prevent readmissions is directly translatable into a reimbursement dollar figure. Empty beds equal lost revenue.
Reimbursement mix: By analyzing reimbursement-related data and correcting for errors, PointRight users achieve the reimbursement levels to which they're entitled for the care they're providing. On average, facilities see an increase in their PDPM per diem rate of $4.21 as a result of more accurate MDS coding.
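The back-of-the-envelope arithmetic behind the per diem figure can be made explicit. The $4.21 uplift comes from the article; the stay length and admission volume below are assumptions chosen purely for illustration:

```python
# Hypothetical SNF reimbursement-uplift calculation
per_diem_uplift = 4.21     # dollars per resident-day (figure from the article)
avg_stay_days = 25         # assumed average skilled stay length
annual_admissions = 1000   # assumed admissions per year across a facility group

# Annual uplift = uplift per day x days per stay x stays per year
annual_uplift = per_diem_uplift * avg_stay_days * annual_admissions
print(f"${annual_uplift:,.0f} additional reimbursement per year")
```

Under these assumed volumes the uplift lands just above $100,000 per year, consistent with the "difference of $100,000 or more" quoted above.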
Ultimately, the use of data analytics over time can improve reimbursement significantly.
"It could be the difference between having beds empty and getting a very high reimbursement rate," Savage says.
Setting ROI expectations and finding added value
In addition to balancing the measurable value of technology with those outcomes that are less easily tied to dollar amounts, organizations are best served by setting ROI expectations from the onset of implementing analytics technology.
Organizations tend to focus on areas where performance could be better, for example, when instead they can realize significant value in areas where they are succeeding.
"Don't obsess about the failures," Marsh says. "Instead, investigate and clone the successes."
Having unrealistic expectations can also be a major hurdle.
"Don't be too ambitious in the early days," Keenahan says. "As we think about building ROIs, ideally you will have a product that has a very compelling soft ROI or hard ROI, and may not have both."
Utilizing a data analytics partner that understands ROI, and what SNFs specifically can achieve in terms of both types of ROI, is another consideration for SNF organizations pursuing analytics technology for the first time, or those seeking a new vendor. SNF leaders should ask questions about the partner's expertise in data science, their capabilities beyond transactional and episodic reporting, and their deep understanding of analytics, statistics and data science, in order to explain the data as it relates to key organizational decisions.
"If you are really going to engage with analytics and operationalize them, you have to have the underlying belief that there are going to be benefits," Savage says. "In the end, what makes something valuable to an organization is what their end users feel about it."
To learn more about how PointRight Analytics, a Net Health Company, can help optimize clinical outcomes in your SNF environment, visit PointRight's solutions for SNFs.
See the original post:
Smart SNFs: The ROI of Data Analytics - Skilled Nursing News
31 Data Science and Analytics Predictions from 19 Experts for 2022 – Solutions Review
We polled 19 experts and received 31 data science and analytics predictions for 2022, in an attempt to help you make the best business decisions.
As part of Solutions Review's third-annual #BIInsightJam, we called for the industry's best and brightest to share their data science and analytics predictions for 2022. The experts featured here represent the top data science and analytics solution providers with experience in this niche. Data science and analytics predictions have been vetted for relevance and ability to add business value as well. These are the best predictions from the dozens we received. We believe these are actionable and may impact a number of verticals, regions, and organization sizes.
Note: Data science and analytics predictions are listed in the order we received them.
Mainstream AI and Deep Learning
As the toolset for AI applications continues to evolve, machine learning and deep learning platforms have entered the mainstream and will attain the same level of maturity as specialized data analytics. Just as we currently see a plethora of fully integrated managed services based on Apache Spark and Presto, in 2022 we will see vertical integrations emerging based on the likes of PyTorch and TensorFlow. MLOps for pipeline automation and management will become essential, further lowering the barriers and accelerating the adoption of AI and ML.
Data in Motion is the Next Automation Holy Grail
For automated decisions and machine learning, both AI technologies that rely on the input of data, the data itself remains far from malleable. All too often, massive amounts of enterprise data are difficult to scale, store, and use in an actionable way. Until AI and automation technologies can better master the flow of data, advancements will be slow-moving. In the years ahead, as enterprises master how to tap into data in motion, we will see greater innovation in automation that enables decision-making based on real-time, data-backed insights.
Predictive Analytics Will Drive New, Emerging Use Cases Around the Next Generation of Digital Applications
The technology will become more immersive and embedded, where predictive analytics capabilities will be blended seamlessly into the systems and applications with which we interact. Predictive analytics will drive use cases in next-gen apps like metaverse applications (convergence of digital and physical worlds, powered by technologies such as IoT, digital twins, AI/ML, and XR) and the next generation of composable applications.
Enterprises Will Employ Adaptive Learning in BotVille
Bots and automation are all the rage. However, while many companies have become pretty good at automating existing processes, few have learned from that automation. You need a heads-up display for RPA with AI-driven insights. Adaptive, incremental, dynamic learning techniques are growing fields of AI/ML that, when applied to the RPA's exhaust, can make observations on the fly. Patterns of behavior are continuously observed. These dynamic learning technologies help business users see and act on "aha" moments and make smarter decisions.
Enterprises Will Discover the Big AI Lie
92% of companies invested more in AI in 2021, yet just 12% are deploying it at scale, down from last year. What's going on? How can companies be spending MORE on AI but getting LESS from it? There are many non-obvious factors at play: culture, tools, bias concerns, fear, and automation grace the top of the list. In 2022, firms must meet these challenges head-on with a cultural approach to model operationalization to better manage, track and optimize algorithms. Only then will data science move from the playground to the battleground.
ModelOps is Hot
Working from home in the pandemic has accelerated collisions and collaborations between teams of data scientists, DevOps, and model ops developers to get data science apps into production. Emerging from this is a focus on converting ad-hoc processes into a controlled environment for managing low code and code first components, processes for data flows and model connections, along with rules, actions and decisions. Continuous analysis of models actually in operations is also in focus to assess ROI of the data science app, model drift and model rebasing.
ML engineers are now in the middle of this, configuring deployment scenarios in hybrid cloud environments and working with data scientists, data engineers, business users, DevOps teams, and app dev and design teams.
Analytics Must Move Beyond Insights and Into Actions and Decisions
Today's fast-changing business climate demands real-time visibility and up-to-the-minute recommendations from data and analytics. To survive the post-pandemic world, organizations need to be able to predict what's going to happen next based on the data they have, and develop more discipline around decisions and actions. Processes for measuring impact and closing the decision intelligence loop will sharpen their focus.
More Open-Source Behind Analytics & AI
As the momentum behind the Open Data Lake Analytics stack to power analytics and AI applications grew over the past year, we'll see a bigger focus on leveraging open source to address the limitations around flexibility and cost that come with traditional enterprise data warehouses. Open-source, cloud-native technologies like Presto, Apache Spark, Superset, and Hudi will power AI platforms at a larger scale, opening up new use cases and workloads that aren't possible on the data warehouse.
Augmented BI Will Make Data Democratization a Reality
The biggest barrier to data democratization has been that business and non-technical users needed to work with raw data for analysis. Augmented analytics capabilities such as Natural Language Processing (NLP) and Natural Language Querying (NLQ) will allow business users to get answers to important questions without having to work with the data directly. This helps companies bypass the complexities of issuing and managing user-level permissions to raw data.
Contextual Embedded Analytics Will be Key to Successful Analytics Implementation
The chances of an organization acting upon insight are much higher when it is presented directly within the business application than when it is presented in standalone BI software. This is mainly because of two aspects: contextual availability and a short reach to the decision-making audience. For example, when insight on project efficiency is present right within project management software, it is easy for project managers to relate it to their daily work and put measures in place to fix inefficiencies.
Unstructured Analytics
In 2022, with powerful technologies available to them, organizations will invest more in unstructured analytics. To date, most business intelligence has been conducted using structured data; however, there are countless problems that cannot be answered by these clean-cut numbers. Burgeoning people analytics teams are offered a new means of assessing uniquely human situations (talent acquisition, workforce sentiment, productivity, etc.) by analyzing the textual, conversational, and communicative data created by the workforce each day. These emails, files, and collaboration data speak to the human side of the enterprise that has long remained out of reach.
Unstructured Data Analytics Workflow Solutions Will Emerge
Processing and indexing petabytes of unstructured data is today largely a manual effort. Large organizations employ legions of data professionals to search, catalogue and move this data so it can be ingested by analytics tools and manipulated. There's a dire need to simplify and automate these processes. Solutions that index files easily across multiple file and cloud silos and automate systematic data movement will be on the rise.
Also, data analytics solutions for unstructured data may be verticalized, making them sector-specific or application-specific. For instance, interpreting medical images is a contextual task requiring specific knowledge of clinical data sets. Organizations are creating custom workflows consisting of cloud-based analytics tools like Amazon Comprehend for PII detection, along with manual data movement and data lakes. The time is ripe for commercial data management solutions that enable easy search of specific data sets across a global enterprise and stream this data continually to systematically automate the workflow of unstructured data analytics.
The CAO Will Eclipse the CDO
While many companies today have a chief data officer, in 2022 we will see more enterprises establish chief analytics officer or chief data and analytics officer roles. Elevating analytics reflects an evolving understanding of data science and machine learning as the ultimate functions that turn data into business value, and increasingly core to company-wide strategy.
Democratization of ML Through Up-Skilling Will Make More Analysts Comfortable with Code
For over 20 years, different products have promised to enable advanced analytics with no-code or drag-and-drop user interfaces. The latest wave of this trend will lose enthusiasm in favor of companies investing to upskill their workforce. Analytical programming languages like Python and R will become more table stakes (especially with the rise of data science degree programs in secondary education), just as Excel and SQL became a decade ago.
Unpredictable Business Conditions Will Accelerate Adoption of Model Monitoring
Model monitoring, already critical in a post-pandemic economy, will become essential. The continued volatility of unpredictable business factors, from supply chains to extreme weather, will greatly accelerate the need for businesses to continuously monitor how well their models reflect the real and rapidly changing world of their customers.
Organizations Will Redefine What it Means to Build a Culture of Analytics
For too long, business leaders have assumed that upskilling their workforce with data classes/certifications and investing in self-service tools would lead to a data-driven organization. They are finally ready to admit that it's not working. Self-service BI does not close the skills gap. Not everyone has time or interest in becoming a data analyst or data literate, especially now in today's post-COVID landscape, where teams are understaffed and people are valuing their time differently in and outside of work. In 2022, organizations will redefine what it means to build a culture of analytics and change the paradigm by bringing insights to workers in a more digestible way, turning to methods and solutions like embedded analytics that won't require them to learn new skills or invest additional time.
The Most Data-Driven Organizations Will Combat Tool Fatigue by Bringing Data to Workers Where They Are
The rise of work-from-home and the digital acceleration brought on by the pandemic mean that more people than ever are using different tools in different places to do their jobs, from email to collaboration software like Slack and Teams to the many point solutions needed to get work done across departments. As a result, workers everywhere are experiencing tool fatigue, distractions and inefficiencies from jumping from software to software or being forced to use tools that don't fit into their personal workflow. Rather than investing in data/analytics solutions that add yet another tool to the mix, we'll start to see more organizations in 2022 delivering insights to employees directly within their workflows via embedded analytics (for example, directly within Slack, Teams, etc.). In this environment, workers can make data-driven decisions without thinking twice and without any disruptions.
Automation Turns Prescriptive Analytics Into Prescriptive Guidance
For years we heard that the future of analytics will go beyond descriptive analytics (what happened) and predictive analytics (what will happen) to prescriptive guidance (what to do about it). AI combined with automation will finally make this possible by dynamically combining relevant data and alerting knowledge workers to take action, in advance, before an event occurs. Customer Service reps will be notified to reach out to potentially angry customers before they even call in. Sales leaders will react immediately to dips in revenue pipeline coverage due to upstream activities without waiting until the end of the quarter. Retail managers can optimize inventory before items sell out by combining more than just sales data, such as purchasing patterns of other items, external market trends, and even competing promotional campaigns. Prescriptive analytics will finally evolve from telling us just where the numbers are going, to helping us make smarter, proactive decisions.
Decision Intelligence Makes Inroads for Enterprise-Wide Decision Support
Organizations have been acquiring vast amounts of data and need to leverage that information to drive business outcomes. Decision intelligence is making inroads across enterprises, as regular dashboards and BI platforms are augmented with AI/ML-driven decision support systems.
In 2022, decision intelligence has the potential to make assessments better and faster, given that machine-generated decisions can be processed at speeds humans simply cannot match. The caveat: machines still lack consciousness and do not understand the implications of the decision outcome. Look for organizations to incorporate decision intelligence into their BI stack, continuously measuring the outcome and tweaking the decision parameters accordingly to avoid unintended consequences.
Organizations Embrace Composable Data and Analytics to Empower Data Consumers
Monolithic architectures are already a thing of the past, but expect even smaller footprints. As global companies deal with distributed data across regional, cloud and data center boundaries, consolidating that data in one central location is practically impossible. That's where composable data architecture becomes paramount and brings agility to data infrastructure. Data management infrastructure is extremely diverse, and usually every organization uses multiple systems or modules that together constitute their data management environment. Being able to build a low-code, no-code data infrastructure provides flexibility and user friendliness, as it empowers business users to put together their desired data management stack and makes them less dependent on IT.
In 2022, expect organizations to accelerate building composable data and analytics environments that can bring faster business value and outcomes.
Small and Wide Data Analytics Begin to Catch On
AI/ML is transforming the way organizations operate, but to be successful, it is also dependent on historical data analytics, aka big data analytics. While big data analytics is here to stay, in many cases this old historical data continues to lose its value.
In 2022, organizations will leverage small data analytics to create hyper-personalized experiences for individual customers and to understand customer sentiment around a specific product or service within a short time window. Wide data analytics is a comparatively new concept that has yet to find widespread adoption, but given the pace at which organizations are making use of unstructured and structured data together, expect small and wide data analytics to gain better traction across organizations as we enter 2022.
The Rise of the Just in Time Data Analytics Stack
There's a small but fast-growing segment of the data analytics space focused on new approaches to the enterprise stack, including continuing to move everything to the cloud. However, the hybrid multicloud imposes requirements of its own, most notably the ability to manage and analyze data no matter where it lives in the hybrid multicloud environment.
Startups like Starburst, Materialize.io, Rockset, and my own company Stardog develop platforms designed to query, search, connect, analyze, and integrate data where it lies, without moving or copying it, in a just-in-time fashion. In a world where the number of places data may reside is increasing rather than decreasing, expect enterprises to reach for data analytics solutions that are not coupled to where data lives. This trend will accelerate in 2022 as data movement between storage systems continues to be removed from the stack in order to accelerate time to insight.
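The federation pattern behind "query the data where it lies" can be sketched in a few lines. This is not any of the named vendors' APIs; the `Source` class and `federated_query` helper are hypothetical, showing only the shape of the idea: each source answers a query in place, and results are merged on the fly instead of being copied into a central store first.

```python
# Hypothetical sketch of just-in-time federation: query each source
# where the data lives and merge results lazily, with no central copy.
class Source:
    def __init__(self, name, rows):
        self.name, self._rows = name, rows

    def query(self, predicate):
        # A real engine would push the predicate down to the remote
        # system; here we simply filter the local stand-in data.
        return [r for r in self._rows if predicate(r)]

def federated_query(sources, predicate):
    """Yield matching rows from every source, tagged with their origin."""
    for src in sources:
        for row in src.query(predicate):
            yield {**row, "_source": src.name}

cloud = Source("cloud_dw", [{"sku": "A1", "qty": 5}])
onprem = Source("erp", [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 0}])
hits = list(federated_query([cloud, onprem], lambda r: r["qty"] > 0))
print(hits)
```

Because the federator holds no data of its own, adding a new storage system means registering one more `Source`, not building one more pipeline, which is why this approach decouples analytics from data location.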
Knowledge Graph-Enabled Data Fabrics Become the Connective Tissue for Maximizing Analytics Value
Gartner indicates that data fabric is the foundation of the modern data management platform, with capabilities for data governance, storage, analytics, and more. Relying on traditional integration paradigms that involve moving data and manually writing code is no longer acceptable, as data scientists and data engineers spend almost 80 percent of their time wrangling data before any analytics are performed. Shrewd organizations adopting this approach are realizing that the centerpiece of a properly implemented data fabric is an enterprise knowledge graph, which compounds the data fabric's value for better, faster, lower-cost analytics while clearing the data engineering hurdles obstructing them.
2022 will be the year organizations adopt enterprise knowledge graph platforms to support data fabrics that use a combination of graph data models, data virtualization, and query federation, along with intelligent inferencing and AI, to eliminate this friction by simplifying data integration, reducing data preparation costs, and improving the cross-domain insights generated from downstream analytics.
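A toy example makes the knowledge-graph-plus-inferencing claim concrete. None of this is from the article: real platforms store RDF triples and apply rule engines or OWL reasoning, while this sketch uses plain Python tuples and a single hand-rolled transitive rule over a hypothetical `located_in` relation, just to show how inference lets a query cross domains without hand-written joins.

```python
# Minimal knowledge-graph sketch: facts from different systems live as
# (subject, predicate, object) triples; one transitive inference rule
# derives cross-domain facts no single source system holds.
triples = {
    ("ORD-9", "placed_by", "cust_42"),    # from the order system
    ("cust_42", "located_in", "Berlin"),  # from the CRM
    ("Berlin", "located_in", "Germany"),  # reference data
}

def infer_located_in(facts):
    """Naive fixpoint: transitive closure of the located_in predicate."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for s, p, o in list(facts):
            if p != "located_in":
                continue
            for s2, p2, o2 in list(facts):
                if p2 == "located_in" and s2 == o and (s, p, o2) not in facts:
                    facts.add((s, "located_in", o2))
                    changed = True
    return facts

facts = infer_located_in(triples)
print(("cust_42", "located_in", "Germany") in facts)  # True
```

No source system ever recorded that customer 42 is in Germany; the graph derived it, which is the kind of cross-domain insight the trend attributes to knowledge-graph-enabled fabrics.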
Business Users Will be Empowered to Become Data Analysts
Enterprises will empower business users to become data analysts by applying well-trained natural language processing (NLP) and machine learning technologies and by implementing richly curated data catalogs to unleash the power of complex analytics. Organizations with integrated data strategies will give their employees tools that grant them data analyst superpowers, letting them tap into vast amounts of data and drive business results. This improves the productivity of business users and eliminates the bottlenecks caused by relying on data analysts to find and analyze trusted data, a process often more prolonged and arduous than necessary.
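To show the flow this prediction describes, here is a deliberately crude sketch of a natural-language layer over a dataset. It is not from the article, and real products use trained NLP models rather than keyword matching; the `sales` table and the `answer` function are hypothetical stand-ins that only illustrate how a question in plain English can become a filter and an aggregation.

```python
# Toy natural-language-to-analytics layer: keyword matching stands in
# for the trained NLP models real self-service BI tools would use.
sales = [
    {"region": "EMEA", "revenue": 120},
    {"region": "EMEA", "revenue": 80},
    {"region": "APAC", "revenue": 200},
]

def answer(question: str):
    """Map a plain-English question to a filter plus an aggregation."""
    q = question.lower()
    rows = sales
    for region in ("emea", "apac"):          # detect a region filter
        if region in q:
            rows = [r for r in rows if r["region"].lower() == region]
    if "total" in q or "sum" in q:           # detect an aggregation
        return sum(r["revenue"] for r in rows)
    return rows                               # fall back to raw rows

print(answer("What is the total revenue for EMEA?"))  # 200
```

The point is the division of labor: the business user supplies domain intent in plain language, and the system translates it into the query a data analyst would otherwise have to write.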
Move from Dashboards to Data-Driven Apps
If humans, even augmented by real-time dashboards, are the bottleneck, then what is the solution? Data-driven apps that, armed with real-time data, can provide personalized digital customer service and automate many operational processes. In 2022, look for many companies to rebuild their processes for speed and agility, supported by data-driven apps.
Industrial Data Scientists Emerge to Facilitate Industrial AI Strategy
The generational churn occurring in the industrial workforce will inspire another trend: the widespread emergence of industrial data scientists as central figures in adopting and managing new technologies like industrial AI and, just as importantly, the strategies for deploying and maximizing these technologies to their full potential. New research revealed that while 84 percent of key industrial decision-makers accepted the need for an industrial AI strategy to drive competitive advantage, and 98 percent acknowledged that failing to have one could present challenges to their business, only 35 percent had actually deployed such a strategy so far.
With one foot in traditional data science and the other in unique domain expertise, industrial data scientists will serve a critical role in being the ones to drive the creation and deployment of an industrial AI strategy.
The Conversation Around Data for AI Will be Prioritized
The discussions around data for AI have started, but they haven't received nearly enough attention. Data is the most critical ingredient for building AI systems, and we are only now starting to think about the systems needed to acquire, prepare, and monitor data to ensure performance and guard against bias. In 2022, organizations will have to prioritize a data-first approach within the enterprise architecture to enable AI and analytics to solve problems and facilitate new revenue streams.
The Need for Data in Decision Making Has Never Been Greater
As demand for business intelligence (BI) software rises, so do new advancements that give users the ability to analyze data and make intelligent decisions without any programming knowledge. Not only can enterprises gain a competitive advantage; today's BI is being used to address supply chain issues and save lives. The pandemic emphasized the importance of relying on data rather than hunches, as the world became dependent on COVID-19 visualizations to steer us out of the crisis. Government agencies and health experts are using big data analytics tools to understand, track, and reduce the spread of the virus. BI helps health experts identify vaccine supply chain issues, virus hotspots, COVID-19 rates, and more, all in real time. Next-gen BI may change the way we identify trends and, ultimately, may even be able to predict the future.
AI Technologies Associated with Data Science Will be Used Increasingly by Data Engineers
Data engineers will increasingly use AI-based tools in their day-to-day work. To support this, more analytics vendors will incorporate AI programmatic capabilities in their platforms, opening up new opportunities for data engineers. This will also blur the line between data engineering and data science, providing new opportunities for innovation.
The Most Transformational Analytics Use Cases Will Come From Citizen Analysts
Due to their domain expertise, proximity to the business, and the availability of new tools and technologies, citizen data analysts will become the most important and influential individuals working with data. This will lead to an explosion of new ideas and practical applications for data, marking the next big turning point for the industry.
Just-in-Time Supply Chain Failures Will Fuel Meteoric Rise of Just-in-Time Data Analytics
Faced with a full-blown supply chain crisis, companies will have to address long-standing issues in their data pipelines: the bottlenecks and other fragilities that prevent teams from gaining the visibility into supply chains they need to survive the decade. No longer held back by the gravity of legacy models, systems, and approaches, companies will embrace innovative new solutions in a bid to make just-in-time data analytics a reality for their business.
Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.
Read the rest here:
31 Data Science and Analytics Predictions from 19 Experts for 2022 - Solutions Review