Category Archives: Machine Learning
Machine learning-guided determination of Acinetobacter density in … – Nature.com
A descriptive summary of the physicochemical variables and Acinetobacter density of the waterbodies is presented in Table 1. The mean pH, EC, TDS, and SAL of the waterbodies were 7.76 ± 0.02, 218.66 ± 4.76 µS/cm, 110.53 ± 2.36 mg/L, and 0.10 ± 0.00 PSU, respectively. While the average TEMP, TSS, TBS, and DO of the rivers were 17.29 ± 0.21 °C, 80.17 ± 5.09 mg/L, 87.51 ± 5.41 NTU, and 8.82 ± 0.04 mg/L, respectively, the corresponding DO5, BOD, and AD were 4.82 ± 0.11 mg/L, 4.00 ± 0.10 mg/L, and 3.19 ± 0.03 log CFU/100 mL, respectively.
The bivariate correlation between paired PVs varied significantly, from very weak to perfect/very strong positive or negative correlation (Table 2). In the same manner, the correlation between the various PVs and AD varied. For instance, a negligible but positive very weak correlation existed between AD and pH (r = 0.03, p = 0.422) and SAL (r = 0.06, p = 0.184), as well as a very weak inverse (negative) correlation between AD and TDS (r = −0.05, p = 0.243) and EC (r = −0.04, p = 0.339). A significant positive but weak correlation occurred between AD and BOD (r = 0.26, p = 4.21 × 10⁻¹⁰), TSS (r = 0.26, p = 1.09 × 10⁻⁹), and TBS (r = 0.26, p = 1.71 × 10⁻⁹), whereas AD had a weak inverse correlation with DO5 (r = −0.39, p = 1.31 × 10⁻²¹). While there was a moderate positive correlation between TEMP and AD (r = 0.43, p = 3.19 × 10⁻²⁶), a moderate but inverse correlation occurred between AD and DO (r = −0.46, p = 1.26 × 10⁻²⁹).
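The very weak to moderate relationships reported above are standard Pearson correlation coefficients. As an illustrative sketch (the numbers below are hypothetical, not the study's measurements), the coefficient between a physicochemical variable and Acinetobacter density can be computed as:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired observations: water temperature (°C) vs. AD (log CFU/100 mL)
temp = [15.1, 16.2, 17.0, 17.8, 18.5, 19.3]
ad = [2.9, 3.0, 3.1, 3.2, 3.3, 3.5]
r = pearson_r(temp, ad)  # strong positive correlation for this toy data
```

A value of r near +1 or −1 indicates a strong linear relationship; the p-values quoted in the text additionally test whether r differs significantly from zero.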
The AD predicted by the 18 ML regression models varied in both average value and coverage (range), as shown in Fig. 1. The average predicted AD ranged from 0.0056 log units by M5P to 3.2112 log units by SVR. The average AD prediction declined from SVR [3.2112 (1.4646–4.4399)], DTR [3.1842 (2.2312–4.3036)], ENR [3.1842 (2.1233–4.8208)], NNT [3.1836 (1.1399–4.2936)], BRT [3.1833 (1.6890–4.3103)], RF [3.1795 (1.3563–4.4514)], XGB [3.1792 (1.1040–4.5828)], MARS [3.1790 (1.1901–4.5000)], LR [3.1786 (2.1895–4.7951)], LRSS [3.1786 (2.1622–4.7911)], GBM [3.1738 (1.4328–4.3036)], Cubist [3.1736 (1.1012–4.5300)], ELM [3.1714 (2.2236–4.9017)], KNN [3.1657 (1.4988–4.5001)], ANET6 [0.6077 (0.0419–1.1504)], ANET33 [0.6077 (0.0950–0.8568)], and ANET42 [0.6077 (0.0692–0.8568)], to M5P [0.0056 (−0.6024 to 0.6916)]. However, in terms of range coverage, XGB [3.1792 (1.1040–4.5828)] and Cubist [3.1736 (1.1012–4.5300)] outperformed the other models, although both overestimated AD at lower values and underestimated it at higher values when compared with the raw data [3.1865 (1–4.5611)].
Comparison of ML model-predicted AD in the waterbodies. RAW raw/empirical AD value.
Figure 2 represents the explanatory contributions of the PVs to AD prediction by the models. Subplots A–R give the absolute magnitude (representing parameter importance) by which a PV instance changes each model's AD prediction from its mean value, presented on the vertical axis. In LR, an absolute change from the mean value of pH, BOD, TSS, DO, SAL, and TEMP corresponded to an absolute change of 0.143, 0.108, 0.069, 0.0045, 0.04, and 0.004 units in LR's AD prediction response/value. Also, an absolute response flux of 0.135, 0.116, 0.069, 0.057, 0.043, and 0.0001 in the AD prediction value was attributed to pH, BOD, TSS, DO, SAL, and TEMP changes, respectively, by LRSS. Similarly, absolute changes in DO, BOD, TEMP, TSS, pH, and SAL achieved 0.155, 0.061, 0.099, 0.144, and 0.297 AD prediction response changes by KNN. In addition, the most important PV, whose change most influenced the AD prediction response, was TEMP (increasing or decreasing the response by up to 0.218) in RF. In summary, AD prediction response changes were highest and most significantly influenced by BOD (0.209), pH (0.332), TSS (0.265), TEMP (0.6), TSS (0.233), SAL (0.198), BOD (0.127), BOD (0.11), DO (0.028), pH (0.114), pH (0.14), SAL (0.91), and pH (0.427) in XGB, BRT, NNT, DTR, SVR, M5P, ENR, ANET33, ANET64, ANET6, ELM, MARS, and Cubist, respectively.
PV-specific contribution to eighteen ML models forecasting capability of AD in MHWE receiving waterbodies. The average baseline value of PV in the ML is presented on the y-axis. The green/red bars represent the absolute value of each PV contribution in predicting AD.
Table 4 presents the eighteen regression algorithms' performance in predicting AD given the waterbodies' PVs. In terms of MSE, RMSE, and R2, XGB (MSE = 0.0059, RMSE = 0.0770, R2 = 0.9912) and Cubist (MSE = 0.0117, RMSE = 0.1081, R2 = 0.9827) ranked first and second, respectively, outmatching the other models in predicting AD. While the MSE and RMSE metrics ranked ANET6 (MSE = 0.0172, RMSE = 0.1310), ANET42 (MSE = 0.0220, RMSE = 0.1483), ANET33 (MSE = 0.0253, RMSE = 0.1590), M5P (MSE = 0.0275, RMSE = 0.1657), and RF (MSE = 0.0282, RMSE = 0.1679) in the 3rd, 4th, 5th, 6th, and 7th positions among the MLs in predicting AD, M5P (R2 = 0.9589) and RF (R2 = 0.9584) recorded better performance among these 5 models in terms of the R-squared metric, as did ANET6 (MAD = 0.0856) and M5P (MAD = 0.0863) in terms of the MAD metric. Overall, Cubist (MAD = 0.0437) and XGB (MAD = 0.0440) performed best in terms of the MAD metric.
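The four ranking metrics above are all functions of a model's residuals. A minimal sketch, assuming MAD here denotes the mean absolute deviation of predictions from actual values (the arrays are illustrative, not the study's data):

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute MSE, RMSE, R-squared, and MAD for one model's predictions."""
    n = len(y_true)
    residuals = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in residuals) / n            # mean squared error
    rmse = math.sqrt(mse)                              # root mean squared error
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)    # total sum of squares
    r2 = 1 - (mse * n) / ss_tot                        # coefficient of determination
    mad = sum(abs(e) for e in residuals) / n           # mean absolute deviation
    return {"MSE": mse, "RMSE": rmse, "R2": r2, "MAD": mad}

# Hypothetical actual vs. predicted AD values (log CFU/100 mL)
metrics = regression_metrics([3.1, 3.2, 3.0, 3.4], [3.1, 3.25, 2.95, 3.35])
```

Lower MSE, RMSE, and MAD and an R2 closer to 1 indicate a better fit, which is the basis on which XGB and Cubist rank ahead of the other sixteen models.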
The feature importance of each PV over permutational resampling on the predictive capability of the ML models in predicting AD in the waterbodies is presented in Table 3 and Fig. S1. The identified important variables ranked differently from one model to another, with temperature ranking in the first position in 10/18 of the models. In these 10 algorithms/models, temperature was responsible for the highest mean RMSE dropout loss, with temperature in RF, XGB, Cubist, BRT, and NNT accounting for 0.4222 (45.90%), 0.4588 (43.00%), 0.5294 (50.82%), 0.3044 (44.87%), and 0.2424 (68.77%), respectively, while 0.1143 (82.31%), 0.1384 (83.30%), 0.1059 (57.00%), 0.4656 (50.58%), and 0.2682 (57.58%) RMSE dropout loss was attributed to temperature in ANET42, ANET10, ELM, M5P, and DTR, respectively. Temperature also ranked second in 2/18 models, including ANET33 (0.0559, 45.86%) and GBM (0.0793, 21.84%). BOD was another important variable in forecasting AD in the waterbodies and ranked first in 3/18 and second in 8/18 models. While BOD ranked as the first important variable in AD prediction in MARS (0.9343, 182.96%), LR (0.0584, 27.42%), and GBM (0.0812, 22.35%), it ranked second in KNN (0.2660, 42.69%), XGB (0.4119, 38.60%), BRT (0.2206, 32.51%), ELM (0.0430, 23.17%), SVR (0.1869, 35.77%), DTR (0.1636, 35.13%), ENR (0.0469, 21.84%), and LRSS (0.0669, 31.65%). SAL ranked first in 2/18 (KNN: 0.2799; ANET33: 0.0633) and second in 3/18 (Cubist: 0.3795; ANET42: 0.0946; ANET10: 0.1359) of the models. DO ranked first in 2/18 (ENR [0.0562, 26.19%] and LRSS [0.0899, 42.51%]) and second in 3/18 (RF [0.3240, 35.23%], M5P [0.3704, 40.23%], LR [0.0584, 27.41%]) of the models.
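The RMSE dropout loss used here is the standard permutation-importance idea: shuffle one predictor column, re-score the model, and record how much the RMSE worsens. A sketch under hypothetical data (the toy model and feature values below are illustrative, not the study's fitted models):

```python
import math
import random

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def permutation_dropout(model, X, y, col, n_repeats=20, seed=0):
    """Mean RMSE increase ("dropout loss") after shuffling one feature column."""
    rng = random.Random(seed)
    base = rmse(y, [model(row) for row in X])
    losses = []
    for _ in range(n_repeats):
        shuffled = [row[:] for row in X]
        values = [row[col] for row in shuffled]
        rng.shuffle(values)                       # break the feature-target link
        for row, v in zip(shuffled, values):
            row[col] = v
        losses.append(rmse(y, [model(row) for row in shuffled]) - base)
    return sum(losses) / n_repeats

# Hypothetical model: AD depends strongly on feature 0 ("TEMP"), not on feature 1
model = lambda row: 0.2 * row[0]
X = [[t, s] for t, s in zip(range(10, 30), range(20))]
y = [model(row) for row in X]
# Shuffling TEMP inflates RMSE; shuffling the unused feature does not.
```

A large dropout loss (as reported for temperature in most of the eighteen models) means the model leans heavily on that variable.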
Figure 3 shows the residual diagnostic plots comparing actual AD and the AD values forecasted by the models. The results showed that the actual and predicted AD values in the case of LR (A), LRSS (B), KNN (C), BRT (F), GBM (G), NNT (H), DTR (I), SVR (J), ENR (L), ANET33 (M), ANET64 (N), ANET6 (O), ELM (P), and MARS (Q) were skewed, and the smoothed trends did not overlap. However, the actual and predicted AD values aligned more closely, with an approximately overlapping smoothed trend, in RF (D), XGB (E), M5P (K), and Cubist (R). Among the models, RF (D) and M5P (K) both overestimated and underestimated predicted AD at lower and higher values, respectively, whereas XGB and Cubist both overestimated AD at lower values, with XGB closer to the smoothed trend than Cubist. Generally, a smoothed trend overlapping the gradient line is desirable, as it shows that a model fits all values accurately/precisely.
Comparison between actual and predicted AD by the eighteen ML models.
The comparison of the partial-dependence profiles of the PVs on AD prediction by the 18 models, using a unitary model-by-PV presentation for clarity, is shown in Figs. S2–S7. The partial-dependence profiles took (i) a form where an average increase in AD prediction accompanied a PV increase (upward trend); (ii) an inverse trend, where an increase in a PV resulted in a decline in AD prediction; (iii) a horizontal trend, where an increase/decrease in a PV had no effect on AD prediction; and (iv) a mixed trend, where the shape switches between two or more of i–iii. The models' responses varied with a change in any of the PVs, especially changes beyond the breakpoints, which could decrease or increase the AD prediction response.
The partial-dependence profile (PDP) of DO had a downward trend, either from the start or after breakpoint(s) (natures ii and iv), except for ELM, which had an upward trend (i; Fig. S2). The TEMP PDP had an upward trend (i and iv), in most cases filled with one or more breakpoints, but had a horizontal trend in LRSS (Fig. S3). SAL had a PDP with a typical downward trend (ii and iv) across all the models (Fig. S4). While pH displayed a typical downward-trend PDP in LR, LRSS, NNT, ENR, and ANET6, a downward trend filled with different breakpoint(s) was seen in RF, M5P, and SVR; other models showed a typical upward trend (i and iv) filled with breakpoint(s) (Fig. S5). The PDP of TSS showed an upward trend that returned to a plateau (DTR, ANET33, M5P, GBM, RF, XGB, BRT) after a final breakpoint, or a declining trend (ANET6, SVR; Fig. S6). The BOD PDP generally had an upward trend filled with breakpoint(s) in most models (Fig. S7).
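A partial-dependence profile is built by sweeping one PV across a grid while holding the other features at their observed values and averaging the model's predictions at each grid point. A sketch under hypothetical data (the model and observations below are illustrative only):

```python
def partial_dependence(model, X, col, grid):
    """Average prediction as one feature sweeps a grid; others stay at data values."""
    profile = []
    for g in grid:
        preds = []
        for row in X:
            row2 = row[:]
            row2[col] = g          # force the swept feature to the grid value
            preds.append(model(row2))
        profile.append(sum(preds) / len(preds))
    return profile

# Hypothetical response: rises with feature 0 (upward trend i), falls with feature 1 (trend ii)
model = lambda row: 0.3 * row[0] - 0.1 * row[1]
X = [[17.0, 0.1], [18.0, 0.2], [19.0, 0.1]]
upward = partial_dependence(model, X, 0, [15, 17, 19, 21])
downward = partial_dependence(model, X, 1, [0.0, 0.1, 0.2])
```

The four trend shapes described above (upward, inverse, horizontal, mixed) are simply the shapes these averaged curves take; breakpoints appear where the profile's slope changes abruptly.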
See the original post:
Machine learning-guided determination of Acinetobacter density in ... - Nature.com
AI and Machine Learning will help to Build Metaverse Claims Exec – The Coin Republic
According to one of the executives at Facebook, reports related to the Metaverse's demise have been greatly exaggerated.
Meta hosted a press event in New York on 11 May, announcing a new generative AI Sandbox tool for advertisers. Nicola Mendelsohn, Meta's Head of Global Business, expressed that the company is still very much interested in the Metaverse and reiterated that Mark Zuckerberg is very clear about that.
Responding to various reports by news media organizations claiming that Meta is no longer interested in the Metaverse, Mendelsohn explained that the company remains genuinely interested in it. She told attendees that the whole Metaverse effort could take 5-10 years before the vision they're talking about is realized.
Mendelsohn's comments come as a defense against growing speculation that Meta has focused more on artificial intelligence than on the Metaverse in recent months, a shift from the period when the social media giant, Facebook Inc., rebranded itself as Meta and couldn't stop talking about the Metaverse.
The recent surge in reports suggesting that Meta is moving away from the Metaverse stems from AI tools dominating headlines. Meta's rebranding announcement quickly faded from view once artificial intelligence started making headlines, leading some analysts and critics to think that Meta is chasing the latest buzz trend and drifting farther away from the Metaverse.
Mendelsohn's stance comes despite the fact that Meta's Reality Labs lost $3.9 billion in the first quarter of 2023, $1 billion more than in the first quarter of 2022.
Meta explained that to build the Metaverse and to make Quest virtual reality headsets, generative AI will play a huge part and will be used by brands and creators.
The newly launched AI Sandbox will leverage generative AI to create text for ad copy aimed at different demographics, automatically crop photos and videos, and turn text prompts into background images for ads on Facebook and Instagram. Andrew Bosworth, Meta's CTO, previewed the first incoming tools in March.
Mendelsohn explained that building a virtual world is very difficult for a company to do on its own, but said that with the help of machine learning and generative AI, it can be done. John Hegeman, VP of Monetization at Meta, said that AI will help them build the Metaverse more effectively. He further added, "The Metaverse will be another great opportunity to create value for folks with AI."
Oncyber, a 3D world-building platform, launched an AI tool powered by OpenAI's ChatGPT that lets users customize their digital environments via text commands. Mendelsohn feels that the company's full vision for the metaverse could be challenged by Apple's mixed reality headset, which is set to be announced soon.
Nancy J. Allen is a crypto enthusiast and believes that cryptocurrencies inspire people to be their own banks and step aside from traditional monetary exchange systems. She is also intrigued by blockchain technology and its functioning.
See the original post here:
AI and Machine Learning will help to Build Metaverse Claims Exec - The Coin Republic
Machine learning market size to grow by USD 56,493.47 million between 2022 and 2027; Alibaba Group Holding Ltd., Alphabet Inc., among others…
NEW YORK, May 11, 2023 /PRNewswire/ -- The machine learning market size is expected to grow by USD 56,493.47 million at a CAGR of 47.81% from 2022 to 2027, according to the latest research report from Technavio. The research report focuses on top companies and crucial drivers, current growth dynamics, futuristic opportunities, and new product/project launches. Discover the IT Consulting & Other Services industry's potential and make informed business decisions based on qualitative and quantitative evidence highlighted in Technavio reports. View Sample Report
Technavio has announced its latest market research report titled Global Machine Learning Market
Vendor Landscape
The machine learning market is fragmented. Many local and global vendors offer machine learning with limited product differentiation. However, due to the significant growth of the market, vendors are continuously adopting the latest innovations. Therefore, the threat of rivalry was low, and it is expected to remain the same during the forecast period. Some of the key vendors covered in the report include:
Alibaba Group Holding Ltd.- The company offers machine learning platforms for AI that rely on Alibaba Cloud distributed computing clusters.
Alphabet Inc. - The company offers innovative machine learning products and services that will help build, deploy, and scale more effective AI models.
Amazon.com Inc. - The company offers SageMaker which is a machine learning service enabling data scientists, data engineers, MLOps engineers, and business analysts to build, train, and deploy ML models for any use case, regardless of ML expertise.
BigML Inc. - The company offers machine learning which provides predictive applications across industries including aerospace, automotive, energy, entertainment, financial services, food, healthcare, IoT, pharmaceutical, transportation, and telecommunications.
Altair Engineering Inc.
Alteryx Inc.
Cisco Systems Inc.
Fair Isaac Corp.
H2O.ai Inc.
Hewlett Packard Enterprise Co.
Iflowsoft Solutions Inc.
Intel Corp.
International Business Machines Corp.
Microsoft Corp.
Netguru S.A
Salesforce.com Inc.
SAP SE
SAS Institute Inc.
TIBCO Software Inc.
Yottamine Analytics LLC
For the market's vendor landscape highlights with a comprehensive list of vendors and their offerings - View Sample Report
Key Market Segmentation
The market is segmented by end-user (BFSI, retail, telecommunications, healthcare, and automotive and others), deployment (cloud-based and on-premise), and geography (North America, Europe, APAC, Middle East and Africa, and South America).
By end-user, the market will observe significant growth in the BFSI segment during the forecast period. BFSI companies are increasingly adopting machine learning solutions to understand their customers and provide customized solutions. The adoption of machine learning solutions is helping BFSI companies achieve automated processing, data-driven insights about customers, and personalized customer outreach. In addition, the ongoing digital transformation initiatives in the BFSI sector have been driving the growth of the segment.
View Sample Report for more highlights on the market segments.
Regional Market Outlook
North America will account for 36% of the market growth during the forecast period. The growth of the regional market is driven by the increase in data generation from industries such as telecommunications, manufacturing, retail, and energy. Also, the increasing need to ensure data consistency and accuracy, improve data quality, identify data patterns, detect anomalies, and develop predictions among enterprises is another major factor driving the growth of the machine learning market in North America.
For more key highlights on the regional market share of the above-mentioned countries - View Sample Report
The machine learning market covers the following areas:
Market Dynamics
Driver: The market is driven by the increasing adoption of cloud-based offerings. Cloud-based solutions offer various benefits, such as minimal cost for computing, network, and storage infrastructure, scalability, reliability, and high resource availability. Cloud-based solutions also eliminate the need for dedicated IT support teams and hence reduce operating costs. Many such benefits are increasing the adoption of cloud-based offerings among enterprises. Machine learning solutions help enterprises scale up the production workload of their projects over the cloud as data grows. Thus, the increased adoption of cloud-based offerings will drive the growth of the market during the forecast period.
Trend: The growing number of acquisitions and partnerships is identified as the key trend in the market. Vendors operating in the market are focused on forming strategic alliances with other players to gain a competitive advantage. These growth strategies are allowing vendors to gain access to new clients. Strategic partnerships also provide access to a larger customer base and technologies to help them improve their product portfolio. Moreover, partnerships and acquisitions help vendors to expand their presence in new markets. This trend among vendors will have a positive influence on the market growth over the forecast period.
Challenge: The shortage of skilled professionals is identified as the major challenge hindering market growth. Most enterprises lack a proper mix of AI and machine learning skillset-based workforce. This has led enterprises to invest more time and money in retaining and training their existing employees. Also, companies working and investing in AI and machine learning face challenges in finding skilled workforces. Such challenges are restricting the growth of the market in focus.
Why Buy?
Add credibility to strategy
Analyze competitors' offerings
Get a holistic view of the market
Grow your profit margin with Technavio - Buy the Report
Related Reports:
The deep learning market is estimated to grow at a CAGR of 29.79% between 2022 and 2027. The size of the market is forecast to increase by USD 11,113.13 million. The market is segmented by application (image recognition, voice recognition, video surveillance and diagnostics, and data mining), type (software, services, and hardware), and geography (APAC, North America, Europe, Middle East and Africa, and South America).
The cloud analytics market is estimated to grow at a CAGR of 20.69% between 2022 and 2027. The size of the market is forecast to increase by USD 49,051.7 million. The market is segmented by solution (hosted data warehouse solutions, cloud BI tools, complex event processing, and others), deployment (public cloud, hybrid cloud, and private cloud), and geography (North America, Europe, APAC, Middle East and Africa, and South America).
Gain instant access to 17,000+ market research reports.Technavio's SUBSCRIPTION platform
Machine Learning Market Scope
Report Coverage: Details
Base year: 2022
Historic period: 2017-2021
Forecast period: 2023-2027
Growth momentum & CAGR: Accelerate at a CAGR of 47.81%
Market growth 2023-2027: USD 56,493.47 million
Market structure: Fragmented
YoY growth 2022-2023 (%): 42.74
Regional analysis: North America, Europe, APAC, Middle East and Africa, and South America
Performing market contribution: North America at 36%
Key countries: US, China, Japan, UK, and Germany
Competitive landscape: Leading Vendors, Market Positioning of Vendors, Competitive Strategies, and Industry Risks
Key companies profiled: Alibaba Group Holding Ltd., Alphabet Inc., Altair Engineering Inc., Alteryx Inc., Amazon.com Inc., BigML Inc., Cisco Systems Inc., Fair Isaac Corp., H2O.ai Inc., Hewlett Packard Enterprise Co., Iflowsoft Solutions Inc., Intel Corp., International Business Machines Corp., Microsoft Corp., Netguru S.A, Salesforce.com Inc., SAP SE, SAS Institute Inc., TIBCO Software Inc., and Yottamine Analytics LLC
Market dynamics: Parent market analysis, market growth inducers and obstacles, fast-growing and slow-growing segment analysis, COVID-19 impact and recovery analysis and future consumer dynamics, and market condition analysis for the forecast period
Customization purview: If our report has not included the data that you are looking for, you can reach out to our analysts and get segments customized.
Browse through Technavio's Information Technology Market Reports
Key Topics Covered:
1 Executive Summary
2 Market Landscape
3 Market Sizing
4 Historic Market Size
5 Five Forces Analysis
6 Market Segmentation by End-user
7 Market Segmentation by Deployment
8 Customer Landscape
9 Geographic Landscape
10 Drivers, Challenges, and Trends
11 Vendor Landscape
12 Vendor Analysis
13 Appendix
About Us
Technavio is a leading global technology research and advisory company. Their research and analysis focuses on emerging market trends and provides actionable insights to help businesses identify market opportunities and develop effective strategies to optimize their market positions. With over 500 specialized analysts, Technavio's report library consists of more than 17,000 reports and counting, covering 800 technologies, spanning across 50 countries. Their client base consists of enterprises of all sizes, including more than 100 Fortune 500 companies. This growing client base relies on Technavio's comprehensive coverage, extensive research, and actionable market insights to identify opportunities in existing and potential markets and assess their competitive positions within changing market scenarios.
Contact
Technavio Research
Jesse Maida
Media & Marketing Executive
US: +1 844 364 1100
UK: +44 203 893 3200
Email: media@technavio.com
Website: http://www.technavio.com/
Global Machine Learning Market
View original content to download multimedia:https://www.prnewswire.com/news-releases/machine-learning-market-size-to-grow-by-usd-56-493-47-million-between-2022-and-2027-alibaba-group-holding-ltd-alphabet-inc-among-others-identified-as-key-vendors---technavio-301820931.html
SOURCE Technavio
Read more here:
Machine learning market size to grow by USD 56,493.47 million between 2022 and 2027; Alibaba Group Holding Ltd., Alphabet Inc., among others...
Humans in the Loop: AI & Machine Learning in the Bloomberg … – AccessWire
Originally published on bloomberg.com
NORTHAMPTON, MA / ACCESSWIRE / May 12, 2023 / The Bloomberg Terminal provides access to more than 35 million financial instruments across all asset classes. That's a lot of data, and to make it useful, AI and machine learning (ML) are playing an increasingly central role in the Terminal's ongoing evolution.
Machine learning is about scouring data at speed and scale that is far beyond what human analysts can do. Then, the patterns or anomalies that are discovered can be used to derive powerful insights and guide the automation of all kinds of arduous or tedious tasks that humans used to have to perform manually.
While AI continues to fall short of human intelligence in many applications, there are areas where it vastly outshines the performance of human agents. Machines can identify trends and patterns hidden across millions of documents, and this ability improves over time. Machines also behave consistently, in an unbiased fashion, without committing the kinds of mistakes that humans inevitably make.
"Humans are good at doing things deliberately, but when we make a decision, we start from whole cloth," says Gideon Mann, Head of ML Product & Research in Bloomberg's CTO Office. "Machines execute the same way every time, so even if they make a mistake, they do so with the same error characteristic."
The Bloomberg Terminal currently employs AI and ML techniques in several exciting ways, and we can expect this practice to expand rapidly in the coming years. The story begins some 20 years ago.
Keeping Humans in the Loop
When we started in the '80s, data extraction was a manual process. Today, our engineers and data analysts build, train, and use AI to process unstructured data at massive speed and scale - so our customers are in the know faster.
The rise of the machines
Prior to the 2000s, all tasks related to data collection, analysis, and distribution at Bloomberg were performed manually, because the technology did not yet exist to automate them. The new millennium brought some low-level automation to the company's workflows, with the emergence of primitive models operating by a series of if-then rules coded by humans. As the decade came to a close, true ML took flight within the company. Under this new approach, humans annotate data in order to train a machine to make various associations based on their labels. The machine "learns" how to make decisions, guided by this training data, and produces ever more accurate results over time. This approach can scale dramatically beyond traditional rules-based programming.
In the last decade, there has been an explosive growth in the use of ML applications within Bloomberg. According to James Hook, Head of the company's Data department, there are a number of broad applications for AI/ML and data science within Bloomberg.
One is information extraction, where computer vision and/or natural language processing (NLP) algorithms are used to read unstructured documents - data that's arranged in a format that's typically difficult for machines to read - in order to extract semantic meaning from them. With these techniques, the Terminal can present insights to users that are drawn from video, audio, blog posts, tweets, and more.
Anju Kambadur, Head of Bloomberg's AI Engineering group, explains how this works:
"It typically starts by asking questions of every document. Let's say we have a press release. What are the entities mentioned in the document? Who are the executives involved? Who are the other companies they're doing business with? Are there any supply chain relationships exposed in the document? Then, once you've determined the entities, you need to measure the salience of the relationships between them and associate the content with specific topics. A document might be about electric vehicles, it might be about oil, it might be relevant to the U.S., it might be relevant to the APAC region - all of these are called topic codes' and they're assigned using machine learning."
All of this information, and much more, can be extracted from unstructured documents using natural language processing models.
Another area is quality control, where techniques like anomaly detection are used to spot problems with dataset accuracy, among other areas. Using anomaly detection methods, the Terminal can spot the potential for a hidden investment opportunity, or flag suspicious market activity. For example, if a financial analyst was to change their rating of a particular stock following the company's quarterly earnings announcement, anomaly detection would be able to provide context around whether this is considered a typical behavior, or whether this action is worthy of being presented to Bloomberg clients as a data point worth considering in an investment decision.
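Z-score thresholding is one simple way to implement the kind of anomaly check described above; this sketch is illustrative only and not Bloomberg's actual method (the volume figures below are made up):

```python
import math

def zscore_anomalies(values, threshold=3.0):
    """Return indices of points whose z-score exceeds the threshold."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    # A constant series has zero spread, so nothing can be anomalous.
    return [i for i, v in enumerate(values) if std > 0 and abs(v - mean) / std > threshold]

# Hypothetical daily trading volumes with one suspicious spike
volumes = [100, 102, 98, 101, 99, 100, 500, 103, 97]
spikes = zscore_anomalies(volumes, threshold=2.0)  # flags the 500-volume day
```

In practice, a flagged point like the spike above is what would be surfaced to a client as "atypical behavior" worth a closer look, while points within the threshold are treated as routine.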
And then there's insight generation, where AI/ML is used to analyze large datasets and unlock investment signals that might not otherwise be observed. One example of this is using highly correlated data like credit card transactions to gain visibility into recent company performance and consumer trends. Another is analyzing and summarizing the millions of news stories that are ingested into the Bloomberg Terminal each day to understand the key questions and themes that are driving specific markets or economic sectors or trading volume in a specific company's securities.
Humans in the loop
When we think of machine intelligence, we imagine an unfeeling autonomous machine, cold and impartial. In reality, however, the practice of ML is very much a team effort between humans and machines. Humans, for now at least, still define ontologies and methodologies, and perform annotations and quality assurance tasks. Bloomberg has moved quickly to increase staff capacity to perform these tasks at scale. In this scenario, the machines aren't replacing human workers; they are simply shifting their workflows away from more tedious, repetitive tasks toward higher level strategic oversight.
"It's really a transfer of human skill from manually extracting data points to thinking about defining and creating workflows," says Mann.
Ketevan Tsereteli, a Senior Researcher in Bloomberg Engineering's Artificial Intelligence (AI) group, explains how this transfer works in practice.
"Previously, in the manual workflow, you might have a team of data analysts that would be trained to find mergers and acquisition news in press releases and to extract the relevant information. They would have a lot of domain expertise on how this information is reported across different regions. Today, these same people are instrumental in collecting and labeling this information, and providing feedback on an ML model's performance, pointing out where it made correct and incorrect assumptions. In this way, that domain expertise is gradually transferred from human to machine."
Humans are required at every step to ensure the models are performing optimally and improving over time. It's a collaborative effort involving ML engineers who build the learning systems and underlying infrastructure, AI researchers and data scientists who design and implement workflows, and annotators - journalists and other subject matter experts - who collect and label training data and perform quality assurance.
"We have thousands of analysts in our Data department who have deep subject matter expertise in areas that matter most to our clients, like finance, law, and government," explains ML/AI Data Strategist Tina Tseng. "They not only understand the data in these areas, but also how the data is used by our customers. They work very closely with our engineers and data scientists to develop our automation solutions."
Annotation is critical, not just for training models, but also for evaluating their performance.
"We'll annotate data as a truth set - what they call a "golden" copy of the data," says Tseng. "The model's outputs can be automatically compared to that evaluation set so that we can calculate statistics to quantify how well the model is performing. Evaluation sets are used in both supervised and unsupervised learning."
Check out "Best Practices for Managing Data Annotation Projects," a practical guide published by Bloomberg's CTO Office and Data department about planning and implementing data annotation initiatives.
READ NOW
View additional multimedia and more ESG storytelling from Bloomberg on 3blmedia.com.
Contact Info:Spokesperson: BloombergWebsite: https://www.3blmedia.com/profiles/bloombergEmail: [emailprotected]
SOURCE: Bloomberg
Link:
Humans in the Loop: AI & Machine Learning in the Bloomberg ... - AccessWire
The Yin and Yang of A.I. and Machine Learning: A Force of Good … – Becoming Human: Artificial Intelligence Magazine
Photo by Andrea De Santis on Unsplash
As Artificial Intelligence (AI) and Machine Learning (ML) technologies have become more sophisticated, they've permeated almost every aspect of our lives. These advancements hold incredible potential to transform society for the better, but they also come with a dark side. So much hype for AI has kicked off this year, spurred by the introduction of OpenAI's ChatGPT. However, AI and ML have been around for a while, really kicking into full gear in the 2010s. We are just seeing the outcomes of these developments now.
In fact, the 2020s will be defined by advancements in AI and ML. We are just scratching the surface of these technologies' potential. At their core, though, stand human intention and intervention. AI and ML can serve as a force for good or a force for evil: they undoubtedly have the potential to revolutionize industries while also posing serious threats if misused.
The rise of AI and ML presents a double-edged sword. On one hand, these technologies have the potential to revolutionize industries, improve lives, and protect the environment. On the other hand, they can also lead to job displacement, loss of privacy, and perpetuation of biases.
It is up to us as a society to ensure that we harness the power of AI and ML for good while mitigating their potential for harm. By implementing thoughtful regulation, fostering ethical AI practices, and prioritizing transparency, we can harness the benefits of these technologies while minimizing the risks.
Financial Leaders Investing in Analytics, AI and Machine Learning … – CPAPracticeAdvisor.com
A new survey shows that continued inflation and economic disruptions are the top concerns for more than half of organizations in 2023. Despite this, most organizations expect their revenues to either increase or stay the same this year. As a result, three quarters of organizations plan to resume business travel in 2023 and half of organizations surveyed plan to invest in analytic technologies that can help navigate uncertain economic conditions.
The Enterprise Financial Decision-Makers Outlook April 2023 semi-annual survey was published by OneStream Software, a leader in corporate performance management (CPM) solutions for the world's leading enterprises. Conducted by Hanover Research, the survey targeted finance leaders across North America to identify trends and investment priorities in response to economic challenges and other forces in the upcoming year.
When asked about current business drivers and plans for 2023, financial leaders are focused on the following factors:
COVID is still prevalent, but the business impact is shrinking
As the world returns to some semblance of normal following the pandemic, organizations are planning to reintroduce business travel but are still wary of supply chains. More than half of financial leaders expect COVID-related supply chain disruptions to continue into 2024 (54%) or beyond, down 18% from the Spring 2022 survey. Business travel is poised for a comeback this year, as 75% of organizations plan to resume the practice in 2023. In the Spring 2022 survey, most organizations (80%) had planned to resume business travel, but the study shows very few actually implemented the plan (10%), citing the costs of flights, hotels, and food, and the lack of necessity.
Analytic technology is gaining focus to help navigate uncertainty
Trends in the survey foreshadow increased usage of analytic technology that improves productivity and supports more agile decision-making across the enterprise. Cloud-based planning and reporting solutions remain the most used data analysis tool (91%); however, most organizations also use predictive analytics (85%), business intelligence (84%), and ML/AI (75%) tools at least intermittently. About half of organizations are planning to invest more in each of these tools this year, compared to 2022.
Adoption momentum for these tools started during the pandemic and shows no sign of slowing down. According to the Spring 2021 survey, organizations said that, compared to pre-pandemic levels, they were increasing investments in artificial intelligence (59%), predictive analytics (58%), cloud-based planning and reporting solutions (57%), and machine learning (54%).
Organizations are realizing the value of AI
According to the survey, two-thirds of organizations (68%) have adopted an automated machine learning (AutoML) solution to supplement some of their workforce needs, a significant uptick compared to Spring 2022 (56%). In the Fall 2022 survey, 48% of respondents planned to investigate an AutoML solution in the future, which suggests respondents stayed true to their word and dove into the technology over the last six months.
Finance leaders see opportunities for improvement in many areas with the help of AI/ML technologies, including ChatGPT. The tasks and processes they believe these technologies will be most useful for include financial reporting, financial planning & analysis, sales/revenue forecasting, sales & marketing and customer service.
Along with investing in new technology, almost all organizations (91%) are investing or planning to invest in new solutions that specifically support finance functions. The most common solutions are cloud-based applications (52%), AI/ML (43%), advanced predictive analytics (42%) and budgeting/planning systems (42%).
"The current economic headwinds have finance leaders acutely aware of their investment decisions and weighing the benefits vs. the costs," said Bill Koefoed, Chief Financial Officer, OneStream. "With revenue growth through economic uncertainty in mind, financial leaders are looking to invest in solutions that can support more agile decision-making while delivering a fast return on investment. AutoAI and other AI innovations coming to light in the last couple of years have the potential to improve the speed and accuracy of forecasting and support more informed, confident decision-making. OneStream is a proud innovator in this space and partners with organizations around the globe to help them navigate these challenging times."
Novolyze EMP Adds Predictive Insight, Machine Learning to Boost … – Quality Digest
Published: Monday, May 15, 2023 - 12:01
(Novolyze: Washington, D.C.) -- Novolyze, a leader in food safety solutions and quality digitization technology, has upgraded its environmental monitoring program (EMP) to include advanced predictive analytics and machine learning. This latest upgrade will enable Novolyze's technology to automatically generate trend charts and digital heat maps from digital data. This, in turn, will lead to better forecasting and prediction of outbreaks of pathogens such as Listeria and Salmonella, which have surged in recent months.
An EMP is a crucial tool for food and beverage manufacturers to maintain food safety and quality, especially for ready-to-eat (RTE) foods. Those are foods that require no further cooking or processing before consumption. As such, they are at higher risk of contamination by foodborne pathogens.
Novolyze's EMP has always been a critical tool for ensuring food safety and compliance. By testing the environment, including surfaces and equipment, the EMP helps manufacturers identify potential contamination risks and take appropriate corrective action to remove the risk. With the new upgrade, Novolyze's EMP is even more robust, providing manufacturers with real-time predictive analytics that enable them to stay one step ahead of potential foodborne illness outbreaks.
"We are committed to providing the latest technology and solutions to help the food industry reduce risk and maintain the highest levels of food safety," says Novolyze CEO Karim-Franck Khinouche. "With this latest upgrade, our EMP is more powerful than ever, and we are excited to continue helping our customers keep their food safe."
The use of predictive insight in an EMP can help food and beverage manufacturers identify potential areas of contamination before they become a problem. By collecting and analyzing data on environmental conditions, such as temperature, humidity, and sanitation practices, manufacturers can develop models to predict where and when potential contamination events may occur. This allows them to take proactive measures to prevent contamination, rather than waiting for a problem to arise and then reacting to it.
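As a rough illustration of the idea, here is a minimal risk-scoring sketch over environmental readings. The zones, thresholds, and point weights are invented for illustration; they are not Novolyze's actual model:

```python
# Hypothetical risk scoring over environmental readings; the thresholds and
# point weights are illustrative, not Novolyze's actual model.
def contamination_risk(temp_c, humidity_pct, hours_since_sanitation):
    """Return an integer risk score (0-4) for a production zone."""
    score = 0
    if temp_c > 10:                  # above typical refrigeration temperatures
        score += 2
    if humidity_pct > 70:            # moisture supports microbial growth
        score += 1
    if hours_since_sanitation > 8:   # sanitation is overdue
        score += 1
    return score

# Flag zones whose score crosses a review threshold, so sanitation can be
# increased proactively rather than after a positive test.
readings = {"filler room": (12, 80, 10), "packaging line": (6, 50, 2)}
flagged = [zone for zone, r in readings.items() if contamination_risk(*r) >= 2]
print(flagged)  # -> ['filler room']
```

A production system would replace the hand-set thresholds with a model trained on historical test results, but the proactive-flagging pattern is the same.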
For example, if manufacturers use predictive models to identify specific areas in the facility that are at a higher contamination risk, they can take steps to increase sanitation measures in that area or adjust production processes to reduce that risk. By doing so, they can prevent potential food safety issues and ensure that the RTE foods they produce are safe for consumption. Novolyze's EMP can support these efforts.
Additionally, the use of predictive insight in an EMP can also help improve product quality. By monitoring and analyzing data on environmental conditions, manufacturers can identify and address issues that may affect the quality of the product, such as changes in temperature or humidity. This can help ensure that the products are of consistent quality and meet the expectations of consumers.
"Novolyze's EMP is particularly relevant in the current climate, with foodborne illness outbreaks becoming far too common," says Khinouche. "By utilizing the latest technology, including predictive analytics and machine learning, Novolyze's EMP is helping to cut down on food safety and quality control issues, enabling manufacturers to maintain the highest levels of food safety and ensuring that consumers can trust the foods they eat."
For more information, visit novolyze.com.
Decoding the Quant Market: A Guide to Machine Learning in Trading – Rebellion Research
In the ever-changing world of finance and trading, the search for a competitive edge has been a constant driver of innovation. Over the last few decades, the field of quantitative trading has emerged as a powerful force, pushing the boundaries of what is possible and reshaping the way we approach the market. At the heart of this transformation lies the fusion of cutting-edge technology, data-driven insights, and the unwavering curiosity of the human mind. It is this intersection of disciplines that forms the foundation for Decoding the Quant Market: A Guide to Machine Learning in Trading.
In this book, I aim to share my experiences and insights, offering a comprehensive guide to navigating the world of machine learning in quantitative trading. The journey begins with a foundational understanding of the core principles, theories, and algorithms that have shaped the field. From there, we delve into the practical applications of these techniques, exploring real-world examples and case studies that illustrate the power of machine learning in trading.
Decoding the Quant Market is designed to be accessible to readers from diverse backgrounds, whether they are seasoned professionals or newcomers to the field of finance and technology. By combining theoretical knowledge with practical insights and examples, this book aims to provide a well-rounded understanding of the complex world of machine learning in trading.
Amazon.com: Decoding the Quant Market: A Guide to Machine Learning in Trading eBook : Marti, Gautier: Kindle Store
DDN is the Leading Data Storage Company Behind the Machine … – PR Newswire
With More AI Data Capacity Deployed Than any Other Storage Company in the World, DDN is Fueling Breakthrough Applications in Medicine, Finance, Research, Autonomous Driving, and Space Exploration
CHATSWORTH, Calif., May 11, 2023 /PRNewswire/ -- DDN, the global leader in artificial intelligence (AI) and multi-cloud data management solutions, today announced that it has sold more AI storage appliances in the first four months of 2023 than it did in all of 2022. Broad enthusiasm for the business opportunities presented by generative AI has resulted in a steady increase in investment in AI and AI infrastructure.
Language-based AI applications, such as ChatGPT, have garnered much attention, though there are a number of other AI use cases with significant breakthroughs on the horizon. Ultra-realistic 3D and immersive universes in gaming, sophisticated new protein and molecule creation for drug discovery, autonomous driving, and space exploration are just a few examples where massive data processing is required to fuel innovations.
"The trillions of data objects and parameters required by generative AI cannot be fulfilled without an extremely scalable and high-performance data storage system," explains Dr. James Coomer, SVP of Products, DDN. "DDN has been the solution of choice for thousands of deployments at organizations such as NASA, the University of Florida, and Naver."
Whether in gaming, autonomous driving or natural language processing, massive data sets curated from various data sources are at the heart of generative AI and machine learning applications. However, to operate properly, they require very high levels of write and read data performance, extreme scale in cost-effective data capacity, a flexible delivery on-premises and in the cloud, as well as the highest efficiency and a minimal power footprint.
"Teams are moving rapidly to harness generative AI and need highly performant and scalable data platforms to be successful," said David Hall, Head of HPC at Lambda, DDN's customer and solution partner. "Lambda's deep expertise in large-scale GPU clusters, combined with cutting-edge DDN data platforms, is helping customers do just that - with flexible AI deployments in our cloud, on-premises or in colocation data centers."
DDN built the AI400X2 appliance specifically for these enterprise AI applications, on-premises in data centers and in the cloud. It delivers up to exabytes of data at terabytes per second of sustained write and read performance, providing 33x higher efficiency than traditional storage systems at a fraction of the power requirements.
While generative AI applications that produce human-like text or digital image generators garner a lot of public attention, drug design, material science, chip design, and industrial manufacturing all stand to benefit from this technology. With any novel technology, reducing risk is paramount for enterprises on their path toward realizing value. With leading AI technology and expertise, DDN delivers proven solutions that accelerate customers' strategic AI journey.
About DDN
DDN is the world's largest private data storage company and the leading provider of intelligent technology and infrastructure solutions for enterprise at scale, AI and analytics, HPC, government, and academia customers. Through its DDN and Tintri divisions, the company delivers AI, data management software and hardware solutions, and unified analytics frameworks to solve complex business challenges for data-intensive, global organizations. DDN provides its enterprise customers with the most flexible, efficient and reliable data storage solutions for on-premises and multi-cloud environments at any scale. Over the last two decades, DDN has established itself as the data management provider of choice for over 11,000 enterprises, government, and public-sector customers, including many of the world's leading financial services firms, life science organizations, manufacturing and energy companies, research facilities, and web and cloud service providers.
SOURCE DataDirect Networks (DDN)
Predicting ED Workload with Machine Learning: Patient-Level … – Physician’s Weekly
The following is a summary of the Machine Learning Methods for Predicting Patient-Level Emergency Department Workload, published in the January 2023 issue of Emergency Medicine by Joseph et al.
Work Relative Value Units (wRVUs) are incorporated into various salary structures as a measure of the time and effort put into patient care. Accurately predicting at triage how many wRVUs a patient will generate would therefore have many operational and clinical benefits, including easing the burden on individual doctors by spreading their workload more evenly. This study used data typically collected during triage to test whether deep-learning methods could accurately predict a patient's wRVUs. Participants were adults who visited an urban, academic ER between July 1, 2016, and March 1, 2020.
Structured data (age, sex, vital signs, Emergency Severity Index score, language, race, standardized chief complaint) and unstructured data (free-text chief complaint) from the de-identified triage information were used, with wRVUs serving as the outcome measure. The researchers evaluated five models: the mean wRVUs per chief complaint, linear regression, gradient-boosted trees on structured data, neural networks on structured data, and neural networks on unstructured textual data. Mean absolute error was used to rank model quality. Between January 1, 2016, and February 28, 2020, they analyzed 204,064 visits. Age, gender, and race significantly affected wRVUs, with the median wRVUs being 3.80 (interquartile range 2.56-4.21).
Model errors decreased with increasing model complexity. Predictions based on chief complaint alone showed a mean error of 2.17 wRVUs per visit (95% CI 2.07-2.27). The linear regression model showed an error of 1.00 wRVUs (95% CI 0.97-1.04), the gradient-boosted tree 0.85 wRVUs (95% CI 0.84-0.86), the neural network with structured data 0.86 wRVUs (95% CI 0.85-0.87), and the neural network with unstructured data 0.78 wRVUs (95% CI 0.76-0.80). Deep learning techniques thus show promise in overcoming the limitations of chief complaints as a predictor of the time required to evaluate a patient. These algorithms may have numerous useful applications, such as reducing bias in the triage process, quantifying crowding and mobilizing resources, and balancing workload and compensation among emergency physicians.
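Mean absolute error, the metric the study used to rank its models, is simple to compute. The following sketch uses invented wRVU values (not the study's data) to show how a baseline and a model are compared:

```python
# Mean absolute error (MAE): the average absolute gap between actual and
# predicted values; lower is better. The wRVU numbers here are invented.
def mae(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

actual   = [3.80, 2.56, 4.21, 1.60]   # observed wRVUs for four visits
baseline = [3.80, 3.80, 3.80, 3.80]   # e.g. predict the median wRVU for everyone
model    = [3.60, 2.80, 4.00, 1.90]   # outputs of a (hypothetical) trained model

print(mae(actual, baseline))
print(mae(actual, model))
```

A trained model should beat the constant baseline on this metric, which mirrors how the study's neural networks improved on the chief-complaint-only predictor.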
Source: sciencedirect.com/science/article/abs/pii/S0736467922005686