Category Archives: Data Mining

The Full Picture, the Right Picture: High-Resolution Mass Spectrometry for Metabolomic Profiling – Technology Networks

Metabolomics is an important discovery tool in numerous research areas, including drug discovery, disease research and crop engineering, as well as in many industrial applications such as biomanufacturing. The acquisition of confident measurements depends on accurate metabolite identification and reliable, properly controlled data generation, so an effective methodology is key.

High-throughput, large-scale setups are particularly susceptible to variation, so workflows must be designed with accuracy and reproducibility in mind to ensure that detected changes are attributable to biology. Recorded spectra are compared to authentic standards for valid compound identification, and multiple quality checks eliminate variation introduced by sample preparation, operator and instrument.

Although the focused analysis of a targeted set of metabolites is informative and reliable, sometimes a wider scope is needed to explain a phenotype or explore beyond what is known. Comprehensive metabolome coverage, the detection of all metabolites in a sample, provides that scope but places high demands on the instrument and setup. First, discriminating the hundreds of metabolites potentially present in a sample requires a mass spectrometry (MS) system with high resolving power to tease apart compounds of closely related mass. Second, the instrument must acquire data across multiple ionization modes without bias to fully capture the structural diversity of the metabolome. Third, every run must include calibrations and controls to accurately analyze target metabolites and to facilitate retro-mining the data for unknown molecules. The first two demands are met by modern mass spectrometers that offer exceptional measurement resolution and analytical versatility, while the third is met by a well-planned experimental design.

The choice of a control is dictated by the objective of a run. In the case of metabolomic profiling, the internal controls should be a mixture of reference compounds that supports identification of target metabolites and represents a spectrum of chemical types, molecular weights, concentrations and ion charges. These controls are available commercially as isotope-labeled metabolite mixes; Figure 2 illustrates the coverage of a commonly used solution.

Figure 2. Spanning the metabolome: internal controls made for metabolomic profiling. Credit: Thermo Fisher Scientific

Built into a run sequence, internal controls serve two purposes. First, the mix is serially diluted and run as a calibrant to obtain absolute concentrations of the sample metabolites. Second, the labeled metabolite mix is spiked into samples at a constant concentration. The retention time, mass accuracy and signal response of each internal standard are then measured throughout the run sequence to assess instrument and method performance.

The latter performance evaluation tests for losses or errors in the analytical workflow, ensuring that the instrument and method produce valid data. A steady retention time of the internal control across injections attests to consistent sample processing through separation and measurement, while reproducible mass accuracy and a stable signal response verify instrument reliability. As an additional quality check, a pooled sample consisting of aliquots from all test samples is analyzed intermittently throughout the run sequence (typically every 10 to 20 injections). This provides a repeatable measurement of the same sample over the entire sequence or study and confirms a stable signal response for endogenous metabolites.
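The serial-dilution calibration described above amounts to a linear fit of signal against concentration. The sketch below illustrates the idea; all concentration and signal values are hypothetical, and a real workflow would verify linearity over the working range and consider weighted regression.

```python
import numpy as np

# Hypothetical serial dilution of a labeled internal-standard mix.
calib_conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])        # uM
calib_signal = np.array([2e3, 1e4, 2e4, 1e5, 2e5, 1e6])        # detector counts

# Fit a linear calibration curve: signal = slope * concentration + intercept.
slope, intercept = np.polyfit(calib_conc, calib_signal, 1)

def quantify(signal):
    """Map a measured signal back to an absolute concentration (uM)."""
    return (signal - intercept) / slope

# A sample metabolite measured at 6e4 counts falls at 3 uM on this curve.
print(f"{quantify(6e4):.2f} uM")
```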

Based on the cross-run and cross-injection analysis of the internal control summarized in Figure 4, the method and HRAM system used performed exceedingly well. Controls from all injections and runs showed minimal deviation in retention time (A), mass accuracy (B) and signal response (C).
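Deviations like those summarized in the figure are commonly reported as coefficients of variation across injections. A minimal sketch, using made-up per-injection readings for one internal standard:

```python
import numpy as np

# Hypothetical per-injection readings of one internal standard across a sequence.
retention_time = np.array([5.02, 5.01, 5.03, 5.02, 5.01, 5.02])     # minutes
signal = np.array([1.02e5, 0.99e5, 1.01e5, 1.00e5, 0.98e5, 1.00e5])

def cv_percent(x):
    """Coefficient of variation (%): relative spread of repeated measurements."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

print(f"Retention-time CV: {cv_percent(retention_time):.2f}%")
print(f"Signal CV:         {cv_percent(signal):.2f}%")
```

Low CVs across the whole sequence are what indicate the kind of stability the article describes.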

By retro-mining the acquired data, urea was found to be at lower concentration in dialyzed FBS (Figure 5B). Uridine, also identified retrospectively by matching data to a spectral library, was more concentrated in heat-inactivated FBS from the US, compared to other US-sourced batches (data not shown).

See the original post here:

The Full Picture, the Right Picture: High-Resolution Mass Spectrometry for Metabolomic Profiling - Technology Networks

Anacortes Mining acquires Tres Cruces surface rights and drill data as option agreements with MBM terminate – Proactive Investors USA

CEO Jim Currie says the company is moving forward with the planning of a drill program to provide further information for the feasibility study on the oxide resource while further exploring high-grade gold discovered at depth

Anacortes Mining Corp said it has resolved all outstanding issues with Minera Boroo Misquichilca SA (MBM).

Furthermore, Anacortes said it has completed the transfer of ownership of surface rights, drill core, and all related data associated with its 100%-owned Tres Cruces project in north-central Peru, which is about 10 kilometres (km) south of Barrick's past-producing Lagunas Norte operation.

Subsequently, MBM (formerly Minera Barrick Misquichilca) has no further rights related to Tres Cruces.

"This is an important step in advancing the Tres Cruces project as now all mineral claims, surface rights, core, reverse circulation pulps and rejects, as well as the entire exploration database, are completely under the ownership of Anacortes," said Anacortes CEO Jim Currie in a statement.

"This will allow us to move forward with the planning of a drill program aimed at providing further information for the Feasibility Study on the oxide resource at Tres Cruces, while further exploring the high-grade gold discovered at depth, not included in initial resource estimates."

Going all the way back to May 2002, Anacortes subsidiary New Oroperu Resources Inc. granted MBM an option to earn up to 70% of the equity in Aurifera Tres Cruces S.A. (ATC), the New Oroperu (now Anacortes) subsidiary that owns the Tres Cruces project and conducted exploration there. In 2006, to enable MBM to conduct exploration activities on Tres Cruces, ATC assigned the project to MBM.

Under the terms, Anacortes said MBM was required to make a production decision in respect of the Tres Cruces project on or before December 31, 2020, and if it did not do so, New Oroperu had the right to terminate. Similarly, the Mining Assignment Agreement expired by its terms on December 31, 2020.

The Mining Assignment Agreement further provided that promptly following expiration, MBM would assign to ATC any surface/access rights to the Tres Cruces project. MBM did not make a production decision on or before December 31, 2020.

On October 18, 2021, following the acquisition of New Oroperu by Anacortes Mining Corp., MBM, Anacortes and its subsidiaries (including ATC) confirmed that all agreements had been terminated and that MBM had transferred to ATC the surface/access rights to the Tres Cruces project in exchange for a payment to MBM of US$1,620,709.

Tres Cruces is one of the highest-grade undeveloped gold oxide deposits globally, with a leachable gold resource of 630,000 ounces grading 1.3 grams per tonne (g/t) contained within a larger indicated gold resource of 2,474,000 ounces grading 1.65 g/t gold, and an inferred resource of 104,000 ounces grading 1.26 g/t gold.

Tres Cruces is strategically located in a highly prospective geological belt that hosts such significant gold deposits as Yanacocha, Lagunas Norte, and Pierina.

Contact the author: patrick@proactiveinvestors.com

Follow him on Twitter @PatrickMGraham

Read the rest here:

Anacortes Mining acquires Tres Cruces surface rights and drill data as option agreements with MBM terminate - Proactive Investors USA

Hackers mining third-party apps to steal your health data – Greater Kashmir

The researchers at app security company Approov were able to access over 4 million patient and clinician records from over 25,000 providers through third-party apps that link up with hospital health records to pull out data.

"Cybersecurity analyst Alissa Knight got access to more than 4 million patient and clinician records by exploiting vulnerabilities in data aggregators' application programming interfaces, along with associated apps that track medications and share patient records," reports STAT News.

The records included demographics, lab results, medications, procedures, allergies, and more.

"Collectively, the tested tools can read and write data to the major EHR systems," the report said on Monday.

Knight checked for vulnerabilities in apps built using the Fast Healthcare Interoperability Resources (FHIR) standard.

"She didn't need to use advanced cybersecurity hacking. She just used basic stuff that your freshman year of cybersecurity would have stressed," said John Moehrke, member of the FHIR management group.

The electronic health records housed at hospitals and health centres are well protected.

"But as soon as a patient gives permission for their data to leave the health record and head toward a third-party app - like programmes that track people's medications, for example - it's easy for hackers to access," The Verge reported.

The hacking attempts on the healthcare industry began to rise last year during the pandemic.

In 2020, 1 million people were affected almost every month by data breaches at healthcare organisations, according to U.S. Department of Health and Human Services (HHS) data.

Nation-state-backed hackers are also trying to infiltrate healthcare systems and steal vaccine-related research and other information, according to warnings from intelligence agencies in the US, Europe and Canada.

Four years ago, the UK's National Health Service (NHS) suddenly found itself one of the most high-profile victims of a global WannaCry ransomware attack.

Go here to see the original:

Hackers mining third-party apps to steal your health data - Greater Kashmir

Top 10 Business Intelligence Books to Read 2021 – Analytics Insight

It's tough to keep up with all the trends and developments in your business when you're trapped at home. Although you should be able to do your essential tasks, skill-building activities like seminars and unplanned in-person interactions have been delayed or made more difficult. This may lead you to search for alternative strategies to improve your work abilities. Reading about issues that can help you grow your business and thrive in your position is one alternative. This article includes a list of the top business intelligence (BI) books to help you brush up on the procedures, roles, and technologies involved.

While this book concentrates mostly on BI knowledge for total novices, it takes a novel perspective on the subject and provides nuggets of useful information to readers at all levels.

The insights provided inside this fantastic resource not only dive into the necessity of adopting digital data for long-term commercial success but also give a well-balanced combination of suggestions and ideas to help you maximize your ROI via all of your BI activities and projects.

This product by Wayne Eckerson outlines everything connected to business dashboards and is one of the best BI publications to help you build a data-driven corporate culture. Wayne gives an overview of how to set up and manage a real-world company dashboard's performance. This book will be especially beneficial to technical operators or business managers who are responsible for the performance of their team. It's for data-driven individuals, and by reading it, you'll be allowing Eckerson to assist you in fine-tuning your communication approach. He is aware of the technical and sociological implications of implementing BI software throughout an organization. If you're a datapine user, you're probably relieved to be free of these aches and pains.

This thorough business analytics book guides you through the analytical procedure for deriving valuable knowledge and business value from the statistics you collect, as well as the fundamental concepts of data science. It's chock-full of real-world examples, so brush up on your high-school arithmetic abilities and be ready to follow a constant stream of data-driven reasoning. The book covers success stories of prominent organizations using data science to solve significant business problems, as well as how data science technologies helped them do so.

Today, less than 0.5 percent of all data gets evaluated and used, according to this book. That's incredible. While many company executives remain skeptical about the use of big data in their operations, this book has the potential to alter their views.

This is one of the definitive business intelligence and analytics books of the modern day, delving deep into how to develop a long-term BI plan while gathering and analyzing large datasets in the most efficient way possible.

Big data is a big part of business intelligence, and David Stephenson, Ph.D.'s 2018 edition is a results-driven handbook with equal parts curiosity, inspiration, and practical ideas adapted to today's information-rich world; it is one of the best business analytics books of the present.

You'll not only understand how to manage big data and utilize it to improve your daily operations, but you'll also have access to a number of case studies that will put all of the tips, approaches, and concepts into context.

This great book on business analytics will teach you the foundations of how to make professional visuals that highlight vital data-driven insights in a simple, succinct, and easy-to-read manner. Simply said, it's fantastic, especially if you're an internet marketer or somebody who often works with huge datasets for marketing reports. Instead of focusing on how to make charts seem dazzling, the book focuses on how to make your visualizations clear, useful, and relevant.

Too Big to Ignore provides a fantastic introduction to the subject. Rather than laying out analytic methodologies, this business intelligence book illustrates how big data works in a straightforward, understandable, and practical manner.

You'll discover why big data is such a big deal and why it's critical for businesses to understand how to use it in their day-to-day operations.

This best-selling BI book does exactly what it says it will: it provides a reliable reference for people new to the area who are searching for invaluable practical information.

This great best-selling book on BI can help you pick the correct technology to meet your BI-based objectives, aspirations, and wants by taking a step-by-step approach to learning. Furthermore, key skills such as comprehending core concepts, planning and establishing your BI strategy, generating effective roadmaps, and more are taught here. There's no jargon, just sensible guidance from one of the greatest books on BI in the world.

The research developed by its writer, Cindi Howson, a Research Vice President at Gartner and an expert BI analyst, identified the analytical techniques employed by the industry's largest players, which gives this most informative of business intelligence books its uniqueness. It is not a theoretical treatise, but rather a narrative based on real-world BI methods. Because each organization requires a unique BI deployment, and there is no one-size-fits-all approach, this useful BI book demonstrates a variety of successful approaches to business intelligence.

This book, unlike the other BI publications on our list, does not explain big data to its audience or provide solutions to design-based challenges. The writers, on the other hand, present a beneficial action plan in this book that will assist you in evolving your company processes by using the power of data.

The book's format invites readers to apply its findings right away (nearly in real time) and create their own business intelligence plan. Each chapter delves into a distinct aspect of a business intelligence project, providing practical elements such as project flow charts, project responsibilities, and the risks associated with each phase.

Since business intelligence is a topic that applies to a wide range of sectors, these books can help you improve your knowledge of the data your company gathers and the technologies used to handle it, regardless of your industry. It should come as no surprise that big data and business intelligence are becoming more important resources for businesses. To thrive in virtually every industry, businesses must gather, understand, and convey a story using data. Organizations may utilize BI technologies to acquire data from both internal and external operations and to store, analyze, and act on it.

See the rest here:

Top 10 Business Intelligence Books to Read 2021 - Analytics Insight

FAU Engineering One of the Top Three Fastest Improving Colleges in the U.S. – Newswise

Newswise Florida Atlantic University's College of Engineering and Computer Science (COECS) is rapidly rising in U.S. News & World Report rankings, and is now one of the top three fastest improving engineering colleges in the nation (2020 to 2022). The college also is ranked No. 111 by U.S. News & World Report's Best Colleges Rankings 2022, among public engineering colleges whose highest degree offered is a Ph.D.

This rise in rankings is due in large part to numerous achievements during the past four years, which include a 2.6-fold (164 percent) increase in external research funding, a 5.8-fold (480 percent) increase in student internships, and a 40 percent increase in M.S. and Ph.D. degrees awarded.

"Our College of Engineering and Computer Science is soaring to new heights, and our latest U.S. News & World Report rankings are evidence that we are getting noticed by peers for our research and development output and for our robust engineering programs, which are continuously progressing, evolving and ascending to align with national priorities for workforce development," said Stella Batalama, Ph.D., dean of the COECS. "Our graduates are sought by leading engineering firms in industry and government, and our innovative programs are preparing both undergraduate and graduate students to be leaders in engineering disciplines."

Research efforts in the COECS are supported by the National Science Foundation (NSF), the National Institutes of Health (NIH), the United States Department of Defense, the U.S. Department of Transportation, the U.S. Department of Education, the state of Florida as well as industry. Eight faculty members have received the NSF Early CAREER award, and two have received NIH MIRA awards, all in support of early-career faculty.

The COECS is trailblazing a path in cutting-edge research and technologies focused on artificial intelligence (AI) and autonomy, data science and machine learning. This year, the college launched the Center for Connected Autonomy and Artificial Intelligence (ca-ai.fau.edu), a revolutionary effort on networked AI and connected robotics that combines expertise in AI, computing, sensing solutions, big data analytics and autonomous technologies. Supported by the Schmidt Family Foundation, this one-of-a-kind center already has garnered more than $9 million in basic research federal funding. Center projects cover underwater, surface, air and space applications that are supported by autonomous resilient machine-to-machine wireless networking.

The college also is home to several state-of-the-art institutes and centers including the Freight Mobility Research Institute, a Tier 1 University Transportation Center funded by the U.S. Department of Transportation, established in 2017. The institute has received more than $15 million in funding to date to help strengthen the nation's economic competitiveness. The college's Industry/University Research Collaboration Center for Advanced Knowledge Enablement (CAKE), supported by the NSF and industry and established in 2009, enables close and sustained engagement between industry innovators, academic research teams, and government agencies in the areas of information technology, multimedia and data mining, and cloud computing. CAKE members include companies such as JM Family Enterprises; Motorola Mobility (powered by Google); LexisNexis; Tecore Wireless Systems and Florida Solar Energy. FAU CAKE has received more than $9 million in federal, state and industry funding.

At the forefront of research in sensing technologies is the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE), established in 2016 and housed in the COECS. One of the university's four research pillars, I-SENSE is focused on infrastructure systems and smart cities, marine and environment, and smart health and behavior.

Located on FAU's Dania Beach campus in Broward County, FAU SeaTech, the Institute for Ocean and Systems Engineering, was established in 1997 as a state-funded Type II Research Center. Focused on acoustics, marine vehicles, marine materials and ocean energy technologies, among others, SeaTech has established a long-term partnership with the U.S. Naval Surface Warfare Center Carderock Division and its South Florida Testing Facility. SeaTech is consistently funded by the U.S. Office of Naval Research.

The COECS offers 24 graduate degree programs (M.S. and Ph.D.), 13 undergraduate degree programs and 12 undergraduate and graduate certificates. In just four years, the college has launched several future-focused programs, including Florida's first Master of Science program in Artificial Intelligence (MSAI). The college is designated as a Hispanic-Serving Institution by the U.S. Department of Education, and has recently received $1 million from the NSF to provide scholarships to high-achieving low-income engineering undergraduate students to pursue an M.S. degree in AI.

The college also has established Florida's first Professional Master of Science and Ph.D. degree programs in computer science for working professionals with flexible schedules, and offers M.S. and B.S. degree programs in data science and analytics. New programs in the college also include a Ph.D. degree program in transportation and environmental engineering and a co-op B.S. degree program in computer science.

The COECS recently secured $3.4 million in graduate student training grants in data science and AI, as well as $4.3 million in grants focusing on population genomics with advanced machine learning and AI tools. In addition, the college hosts a number of undergraduate research scholarship initiatives to enable students to participate in federally-funded research projects, which creates a pipeline for its graduate degree programs.

The most recent point of pride for the COECS is a state-of-the-art Fab Lab (rapid prototyping and small production facility) that officially opens its doors next month to the entire University and broader community.

- FAU -

About FAUs College of Engineering and Computer Science:

The FAU College of Engineering and Computer Science is internationally recognized for cutting-edge research and education in the areas of computer science and artificial intelligence (AI), computer engineering, electrical engineering, biomedical engineering, civil, environmental and geomatics engineering, mechanical engineering, and ocean engineering. Research conducted by the faculty and their teams exposes students to technology innovations that push the current state of the art in the disciplines. The college's research efforts are supported by the National Science Foundation (NSF), the National Institutes of Health (NIH), the Department of Defense (DOD), the Department of Transportation (DOT), the Department of Education (DOEd), the State of Florida, and industry. The FAU College of Engineering and Computer Science offers degrees with a modern twist that bear specializations in areas of national priority such as AI, cybersecurity, internet-of-things, transportation and supply chain management, and data science. New degree programs include a Master of Science in AI (the first in Florida), a Master of Science and Bachelor of Science in Data Science and Analytics, and the new Professional Master of Science and Ph.D. in computer science for working professionals. For more information about the College, please visit eng.fau.edu.

About Florida Atlantic University: Florida Atlantic University, established in 1961, officially opened its doors in 1964 as the fifth public university in Florida. Today, the University serves more than 30,000 undergraduate and graduate students across six campuses located along the southeast Florida coast. In recent years, the University has doubled its research expenditures and outpaced its peers in student achievement rates. Through the coexistence of access and excellence, FAU embodies an innovative model where traditional achievement gaps vanish. FAU is designated a Hispanic-serving institution, ranked as a top public university by U.S. News & World Report and a High Research Activity institution by the Carnegie Foundation for the Advancement of Teaching. For more information, visit www.fau.edu.

More here:

FAU Engineering One of the Top Three Fastest Improving Colleges in the U.S. - Newswise

University At Buffalo – News Center: Data Mining The Past – Patch.com

New algorithm searches historic documents to discover noteworthy people

BUFFALO, N.Y. Old newspapers provide a window into our past, and a new algorithm co-developed by a University at Buffalo School of Management researcher is helping turn those historic documents into useful, searchable data.

Published in Decision Support Systems, the algorithm can find and rank people's names in order of importance from the results produced by optical character recognition (OCR), the computerized method of converting scanned documents into text that is often messy.

"It's a known fact that when OCR software is run, very often the text gets garbled," says Haimonti Dutta, PhD, assistant professor of management science and systems in the UB School of Management. "With old newspapers, books and magazines, problems can arise from poor ink quality, crumpled or torn paper, or even unusual page layouts the software isn't expecting."

To develop the algorithm, the researchers partnered with the New York Public Library (NYPL) and analyzed more than 14,000 articles from the New York City newspaper The Sun published during November and December of 1894. The NYPL has scanned more than 200,000 newspaper pages as part of Chronicling America, an initiative of the National Endowment for the Humanities and the Library of Congress that is working to develop an online, searchable database of historical newspapers from 1777 to 1963.

Their algorithm ranks people's names by importance based on a number of attributes, including the context of the name, title before the name, article length and how frequently the name was mentioned in an article.

The algorithm learns these attributes only from the text; it does not rely on external sources of information such as Wikipedia or other knowledge bases. But since the OCR text is garbled, it cannot directly determine how effective these attributes are for ranking people's names. So the researchers used statistical measures to model the many data attributes, which helped provide the desired ranking of names.
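As a rough illustration of the attribute-based idea (not the published algorithm, whose features, weights and statistical modeling are described in the paper), a toy scorer might combine mention frequency with a check for an honorific before the name; the honorific list and weights below are invented:

```python
from collections import Counter

# Hypothetical honorifics that might precede a name in an 1890s newspaper.
HONORIFICS = {"mr", "mrs", "dr", "gov", "mayor", "rev", "judge"}

def rank_names(text, names):
    """Rank full names by a toy score: surname frequency + honorific bonus."""
    words = text.lower().split()
    counts = Counter(words)  # crude frequency proxy over whitespace tokens
    scores = {}
    for name in names:
        surname = name.split()[-1].lower()
        freq = counts[surname]
        # Does an honorific ever directly precede the surname?
        titled = any(words[i - 1].strip(".") in HONORIFICS
                     for i, w in enumerate(words)
                     if w.strip(".,") == surname and i > 0)
        scores[name] = freq + (2 if titled else 0)
    return sorted(names, key=lambda n: scores[n], reverse=True)

article = "Mayor Strong spoke at city hall. Strong praised the plan. Jones attended."
print(rank_names(article, ["William Strong", "Henry Jones"]))
```

A real system would, as the article notes, also weight article length and name context, and would need to tolerate OCR-garbled tokens.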

The researchers used two sets of the historic articles to test their algorithm: One set was the raw text produced from the OCR software, the other set had been cleaned up manually by New York City schoolchildren, who are using the articles to write biographies of local, notable people of the time.

When compared to the cleaned-up versions of the stories, the ranking algorithm is able to sort people's names with a high degree of precision even from the noisy OCR text.

Dutta says their process has wide reaching implications for discovering important people throughout history.

"We recently used this technique on African American literature from the Civil War to learn more about the important people during the era of slavery," says Dutta. "Going forward, we'll be expanding the technique to examine relationships between people and build out the social networks of the past."

Dutta collaborated on the study with Aayushee Gupta, PhD, research scholar at the International Institute of Information Technology Bangalore Department of Computer Science.

This press release was produced by the University at Buffalo - News Center. The views expressed here are the author's own.

Read more from the original source:

University At Buffalo - News Center: Data Mining The Past - Patch.com

Integrated Predictive Safety Systems: Highlighting the Importance of Safety in Mining – AZoMining

Often involving harsh conditions and work with heavy machinery, mining is one of the most dangerous modern industries, and while safety has been a major focus for as long as mines have been around, the use of cutting-edge artificial intelligence and machine learning technologies are opening a new realm of possibility.

Image Credit: King Ropes Access/Shutterstock.com

More specifically, cutting-edge predictive safety systems are being developed to anticipate accident rates based on a set of working conditions. Machine learning algorithms trained on safety data accurately predict injury statistics. These systems are still in their infancy, but mine operators are becoming increasingly invested in the technology.

The adoption of machine learning and automation in mining would likely not be where it is today without the COVID-19 pandemic. The crisis acted as an accelerant for the adoption of technology that takes humans out of the equation. Mining companies used the technology to remove many workers from potentially hazardous situations, and the success they saw validated the use of remote operations. Meanwhile, many companies that did not adopt these technologies fell behind the rest of the pack.

The capability to collect data that can power sophisticated analytics is facilitating a shift from generalized analysis of historical safety data to solutions based on predictive models.

With the appropriate data, analytics can help mining operators recognize specific scenarios related to a greater risk of injury and relevant solutions. Results from the latest research are showing a high level of accuracy from machine learning models.

In a September 2020 study, researchers used models trained on historical datasets to predict the number of days away from work caused by accidents.

Interestingly, the study team revealed that detailed accident narratives fed into the models led to better predictions than models trained on context-free tabular data. The study team said the superior narrative-based predictions were likely due to additional details provided by the person recording the incident.

For instance, an accident narrative might explain how a worker was injured when a tiny bit of metal flew under their safety shield and safety glasses. Tabular data on such an incident typically would not make any mention of the worker wearing a safety shield or safety glasses.
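To illustrate the general narrative-based approach (this is a sketch, not the study's actual models), a minimal predictor could match a new accident narrative against past narratives by word overlap and borrow the outcome of the most similar one; all narratives and day counts below are invented:

```python
from collections import Counter
import math

# Invented past incidents: (free-text narrative, days away from work).
past = [
    ("metal fragment flew under safety shield causing eye injury", 14),
    ("slipped on wet walkway and bruised knee", 3),
    ("hand caught in conveyor during maintenance", 30),
    ("minor cut while handling material", 1),
]

def bow(text):
    """Bag-of-words representation of a narrative."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bags of words."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def predict_days(narrative):
    """Return the outcome of the most similar past narrative."""
    target = bow(narrative)
    best = max(past, key=lambda item: cosine(bow(item[0]), target))
    return best[1]

print(predict_days("worker's hand injured at conveyor belt maintenance"))  # → 30
```

The point of the sketch is the one the study makes: the free-text narrative carries detail (safety shield, conveyor, maintenance) that context-free tabular fields would drop.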

In other results of the study, maintenance workers were found to have the highest risk of injury of all types of workers at open-pit mines.

Researchers also found that total mining experience played a significant role when it came to the risk of severe injury. However, job-related experience was not found to have a significant relationship to injury severity.

Shift start time and time of day were also found to be major factors related to injury severity. Furthermore, material handling was found to be the activity most associated with significant injury among mining workers.

The study team said the results show the potential benefits of machine learning when it comes to predictive safety. With their ability to handle large amounts of multidimensional data and improve over time, machine learning models will be invaluable when it comes to making better decisions around mining safety.

One of the biggest hurdles for predictive safety systems is the capacity to collect optimal data. Unfortunately, simply gathering massive amounts of tabular data is not enough to provide useful insights. Furthermore, current mining technologies exist in separate silos, making data gathering difficult.

Accelerated by COVID-19, mining operations are shifting to a more centralized model, and this shift will be beneficial for aggregating useful data. Centralization will help to break down silos and unify data sources. As this happens, it will be important to maintain data that is available, consistent and reliable.

Standards must be kept high by executives who oversee the collection and processing of data. Furthermore, internal data may not be enough to generate significant insights, and executives who oversee data may have to coordinate with their counterparts across the industry.

The emergence of wearable technology offers significant promise when it comes to enhancing the safety of mining operations and integrated predictive safety. However, the many different wearable devices out there exist in silos. Safety-relevant data collected from watches, hard hats and other wearables has limited value if it cannot be unified.

Computer vision, video analysis, and collision prevention systems on vehicles are other areas that hold significant promise for integrated predictive safety systems. Data collected from these systems could be used in areas such as preventative maintenance and malfunction prediction.

Furthermore, the continued rollout of automated and robotic assets will not only fuel predictive safety measures through data collection, but also offer the bonus of taking humans out of harm's way.

Once again, the data provided by all these emerging technologies will have limited use for predictive safety if there is a significant lack of integration. In addition to developing these technologies, there also needs to be significant effort put towards integrating the various pieces of this complex jigsaw puzzle.

Prinsloo, G. et al. Trend 9: On the road to zero harm. Deloitte Insights. [Online] Available at: https://www2.deloitte.com/us/en/insights/industry/mining-and-metals/tracking-the-trends/2021/next-generation-mining-safety-technology.html

Carr, J. AI predictive maintenance is the holy grail for mining. Axora. [Online] Available at: https://www.axora.com/insights/ai-predictive-maintenance-holy-grail-mining/

Yedla, A. et al. (2020) Predictive Modeling for Occupational Safety Outcomes and Days Away from Work Analysis in Mining Operations. International Journal of Environmental Research and Public Health. 17(19):7054 http://dx.doi.org/10.3390/ijerph17197054

Deloitte. Workplace Safety Analytics. [Online] Available at: https://www2.deloitte.com/content/dam/Deloitte/ca/Documents/Analytics/ca-en-analytics-workplace-safety-analytics.pdf


Excerpt from:

Integrated Predictive Safety Systems: Highlighting the Importance of Safety in Mining - AZoMining

Peter Thiel bets on the far right: Tech tycoon spending millions to bankroll "Trump wing" of GOP – Salon

Billionaire Republican donor Peter Thiel is bankrolling election conspiracists and primary challengers against Republicans who backed Donald Trump's impeachment after the Jan. 6 Capitol riot, including an Arizona Senate candidate who is literally on his payroll.

Thiel, the Facebook board member who co-founded PayPal and later the controversial data-mining company Palantir, has long been a top Republican benefactor, donating millions to GOP candidates and political action committees. But in the wake of Trump's 2020 defeat, Thiel has grown more aggressive in his political investments, dropping more than $20 million to support two far-right Senate candidates and helping to fund primary challengers against Rep. Liz Cheney, R-Wyo., and other Republicans who called for Trump's removal after the deadly riot.

"He wants to be the patron of the Trump wing of the Republican Party," said Max Chafkin, a Bloomberg reporter and author of "The Contrarian: Peter Thiel and Silicon Valley's Pursuit of Power." Thiel is focused on building out "Trumpism after Trump," Chafkin said in an interview with Salon, describing the tech billionaire as "in many ways further to the right than Trump."

Thiel, who donated $1.25 million to back Trump in 2016, has made an even bigger splash this election cycle with a $10 million donation to back his protégé Blake Masters, who plans to run for the Republican nomination in next year's Arizona Senate election. Masters is uniquely connected to Thiel, serving as the chief operating officer of Thiel Capital, the billionaire's venture capital fund, and co-writing Thiel's book "Zero to One."

While candidates like Virginia's Glenn Youngkin have stepped away from their corporate careers to run for office, Masters appears to still be on Thiel's payroll. He earned $775,000 from Thiel Capital last year and received more than $340,000 in royalty payments from the sales of "Zero to One," according to a personal finance disclosure that was first reported by Insider. Masters did not respond to questions from Salon about whether he still collects a salary from Thiel's company, but still lists himself as the firm's COO on his LinkedIn page.

Thiel last month hosted a fundraiser for Masters' campaign at his Los Angeles home that cost up to $5,800 to attend.

Saving Arizona PAC, the Thiel-funded effort that has already spent nearly $1.7 million in Arizona, has launched ads attacking state Attorney General Mark Brnovich, Masters' principal GOP opponent, for rejecting Trump's lie that voter fraud cost him the election.

"Mark Brnovich says President Trump is wrong on voter fraud. Really? Brnovich failed to convene a grand jury, certified Biden as president. Now he's nowhere to be found, making excuses instead of standing with our president," the ad says.

Brnovich was one of multiple state officials, including Republican Gov. Doug Ducey, who certified the election results. The ad does not explain why Brnovich should have convened a grand jury. There has been no evidence of widespread fraud in Arizona or any other state, and courts have repeatedly rejected challenges by Trump allies seeking to overturn Biden's win.

The PAC also blasted Brnovich on Twitter, arguing that he is "nowhere to be found in the fight against voter fraud."

Masters is the "only candidate who will demand fair and transparent elections," the PAC said.

The Saving Arizona PAC recently made yet another six-figure ad buy attacking Brnovich for not being Trumpy enough.

Masters, who was endorsed by Trump's former national security adviser Robert O'Brien this week, has walked a fine line when discussing the presidential election. He has stopped short of claiming that the election was stolen outright, but has boosted conspiracy theories on Twitter about "dead people voting," Dominion voting machines and fears about election "integrity," echoing a trope employed by numerous other Republicans who have tried to distance themselves from the voter fraud lie while still trying to appease Trump loyalists.

After the dubious so-called audit in Arizona's Maricopa County actually showed Biden gaining a handful of votes compared to the official total, Trump and other Republicans began to claim that the audit had turned up serious questions about the election administration. In fact, Republican audit officials testified to Congress last week that the county held a "free, fair and accurate election." Masters, however, sided with TrumpWorld throughout the process, teasing the release of the audit report, echoing Trump's claims about "fake" polls and "anti-Trump disinformation," and making the evidence-proof argument that "no matter what the audit finds, we know this election wasn't fair."

Masters later demanded action from Brnovich in response to the audit, though he did not say exactly what he wanted the state attorney general to do.

"The AZ audit findings have been referred to the Attorney General," Masters tweeted. "The ball is now in Brnovich's court. He has a track record of doing the bare minimum, so let's pay attention, and we'll see if the Republican establishment is serious about election integrity."

Masters also demanded that Ducey immediately "call a special session" to impose new voting restrictions, even though Ducey had already signed a bill to restrict mail ballots and purge the state's popular early voting list in the spring.

"Get the legislature back to work so they can tighten up our election laws," Masters tweeted. "Starting with universal voter ID for every kind of ballot. Nothing less is acceptable."

Masters did not respond to questions from Salon about whether he believes Biden legitimately won Arizona, or what he would like Brnovich to do in response to the "audit" results.

The attacks on Brnovich come as Masters seeks to close a massive early polling deficit against the attorney general. A Republican poll conducted last month showed Brnovich leading Masters by 41% to 6%. Another September poll from OH Predictive Insights also showed Masters polling at just 6% and performing the worst of any candidate against incumbent Sen. Mark Kelly, a Democrat.

"It's clear Blake Masters is threatened by AG Brnovich," Joanna Duka, a spokeswoman for Brnovich, said in a statement to Salon.

Thiel has also dropped another $10 million to back J.D. Vance, another longtime business associate and the author of "Hillbilly Elegy," in Ohio's Senate race. As with Masters, Thiel has a long business relationship with Vance, who got his start in venture capital working at Thiel's Mithril Capital Management, which is named after a fictional metal in "The Lord of the Rings." Vance later got an investment from Thiel to help start his own venture fund, Narya Capital, which is named after one of the Elven rings in J.R.R. Tolkien's fantasy classic. Both men recently invested in the right-wing video platform Rumble.

Masters and Vance are "kind of extensions of Peter Thiel," Chafkin said, describing them as "hardcore ideologues" who are more disciplined and coherent than Trump, but largely focused on the same issues.

Vance has tried to stay away from election conspiracies but defended rioters at the Jan. 6 Capitol attack as mostly "super peaceful." Thiel's allies have generally avoided directly claiming that the 2020 election was rigged, but have continued to raise irrelevant or baseless questions about the result.

"They're trying to walk a line and come up with some kind of intellectually respectable version of The Big Lie," Chafkin said, adding that the Thiel-backed candidates have tried to "cozy up" to hardcore Trump backers and "be perceived as friendly to them."

Thiel himself has also cultivated relationships in TrumpWorld. He developed close ties to former Trump campaign chief and White House strategist Steve Bannon, whom Chafkin described as Thiel's "ideological" ally who shares his views on the "deep state." Thiel routed his big 2016 donation to back Trump through the super PAC controlled by Rebekah Mercer, also a major donor to far-right Republicans. Mercer was a longtime patron of Bannon and his projects and has joined Thiel in funding Vance's Ohio campaign. Mercer has spent millions to support some of the leading proponents of Trump's election lies, as well as election objectors who fueled the Capitol riot.

More recently, Thiel met with Trump at the ex-president's Bedminster, New Jersey, resort and began funding candidates in support of the former president's revenge tour against pro-impeachment Republicans, according to Politico.

Thiel donated the maximum $5,800 to Harriet Hageman, the Trump-backed primary challenger to Rep. Liz Cheney, R-Wyo. Hageman has continued to claim that there are "legitimate questions about what happened during the 2020 election" and supported the Arizona "audit."

Thiel has also donated to Joe Kent, the Trump-backed primary challenger to Rep. Jaime Herrera Beutler, R-Wash., who also voted to impeach Trump after the riot. Kent spoke at the recent "Justice for J6" rally in Washington in support of the Capitol rioters charged in the attack. He vowed to lead a "full congressional inquiry" into the 2020 election if elected. Kent was among a group of Trump supporters who filed lawsuits last month in Washington state accusing multiple counties of "flipping votes" and calling for a "full forensic audit" of the election.

Thiel's funding for candidates pushing election lies is "totally consistent" with his embrace of Trumpism, Chafkin said. Though Thiel ultimately decided not to donate to Trump in 2020 out of frustration about the former president's "perceived competence," the billionaire has sought candidates who will pursue Trump's hardline policies on immigration, relations with China, regulation of tech companies, "political correctness" and globalization.

On all those issues, Thiel "basically agrees" with Trump, Chafkin said. "He wants to be involved in this movement and what you're seeing now is he's making that play. He's trying to be the main patron to his part of the Republican Party."

View post:

Peter Thiel bets on the far right: Tech tycoon spending millions to bankroll "Trump wing" of GOP - Salon

Eni S.p.A. selects the best startups to develop innovative solutions in Communications Data Mining – Marketscreener.com

San Donato Milanese (Milan), 15 October 2021 - Eni, in collaboration with Cariplo Factory, has announced the winners of the call for startups in "Communications Data Mining", an open innovation initiative in data analytics to monitor and optimise communication processes and interactions with stakeholders. Of the 8 startups involved in the Selection Day, 3 companies were selected: BLACKBIRD.AI, DATRIX and ASC27, for having developed the proposals with the greatest innovative potential. These startups will begin a course of study with Eni to identify possible areas of cooperation, with the option to engage in a future collaboration. The teams of innovators will have the opportunity to get feedback from Eni and, with the support of Cariplo Factory, test the innovative potential of their products and services in the hugely important and ever-changing communications sector.

The Selection Day is the result of a design project that began in July with the involvement of Eni's External Communication and Digital & Information Technology units, which defined the innovation areas and their objectives. Seventy-four innovative startups and SMEs were screened for the final selection, with the search focused on three areas of interest.

The project marks another step forward in Eni's digital transformation by contributing to the development of a mutually beneficial innovative ecosystem with startups - essential to achieving the company's strategic objectives.

Cariplo Factory is a leading player in open innovation. It has been supporting Eni since 2018 in developing the Italian ecosystem and the role of startups in the country's digital transformation.



Go here to see the original:

Eni S.p.A. selects the best startups to develop innovative solutions in Communications Data Mining - Marketscreener.com

R&D Solutions for Pharma and Medical Technology – Knovel

For Pharma and Medical Technology

Innovate and prioritize faster and safer with actionable data and AI

Break through data barriers

Pharma and medical technology organizations need facts to make research and business decisions: facts extracted from a vast information landscape. With Elsevier as your partner, you can trust the quality of the data that answers your research and business questions. Applying our semantic technology, analytical services and tools, you can:

You are competing in a rapidly evolving environment that encompasses science, global regulatory patterns, market trends, supply chain dynamics and healthcare management. Elsevier can help you develop innovative, targeted drugs and ensure compliance and safety with:

Access the most comprehensive datasets in the areas of chemistry, disease biology, clinical pharmacology, and more.

Integrate datasets into your existing systems or use intuitive browser-based tools. Get the data you need the way you need it.

Analyze your internal data or information from third-party sources. Make your unstructured content machine readable.

Partner with domain and data experts who solve complex data challenges, and deliver scientific analysis and interpretation.

Predict synthesis routes much faster than with other computer-aided retrosynthesis solutions. Reaxys Predictive Retrosynthesis can generate routes for novel compounds in 10 minutes.

How? We trained deep learning technology on over 15 million reactions to auto-derive more than 400,000 reaction rules.

Structure and integrate your data for better discovery and applied analytics

Extracting valuable insights from unstructured data can be like a treasure hunt without a map. SciBite semantic technology helps you prepare both internal and external data for semantic discovery, analysis and machine learning. Integrating and semantically enriching data from multiple sources deepens insights and accelerates breakthroughs.

Manage your data at scale by working with our data experts to:

Elsevier Professional Services can work with you to overcome complex data challenges. This team of highly experienced life science consultants specializes in:

Life science data, ready to integrate and analyze

Tap into an extensive and growing body of life science and related data that is ready to be integrated into your systems and workflow.

This data can be used for machine learning, text mining and other data science processes, including predictive models to drive innovation and prioritize business activities.

Acute pancreatitis is the most common GI-related cause for hospitalization in the US. Treatment is largely supportive, with no disease-modifying drugs currently available. Professional Services conducted a trend analysis using Elsevier text and data mining to gain insight into emerging areas of research for pancreatitis.

The tools to accelerate discovery and mitigate risk

Get the scientific content and tools you need to make data-driven decisions throughout the drug and medical device development life cycle.

Improve chemical R&D productivity with expertly curated structure, property and reaction data, experimental procedures and chemistry literature. Reaxys empowers hit identification and lead optimization with normalized substance-target affinity data and pharmacokinetic, efficacy, toxicity, safety and metabolic profiles.

Transform your life science data and accelerate its use in R&D with a suite of semantic solutions. SciBite enables data sharing, discovery, analysis and interrogation through ontology management, text analytics and enrichment, semantic search, deep learning and semantic algorithms to build powerful models.

Make more informed drug development decisions with critical data for comprehensive drug safety and efficacy risk assessment. With PharmaPendium, preclinical and clinical development researchers find comparative regulatory-based evidence on precedent drugs' compliance.

Get the scientific, technical and medical research you need from the world's premier STEM knowledge base. Decrease time to market with the peer-reviewed literature on ScienceDirect, including research in disease biology, drug discovery, pharmacology, and preclinical and clinical studies.

Drug and medical device manufacturers use Embase to establish uniqueness, provide evidence of safety and comply with pre- and post-market regulatory requirements. Embase is the world's largest database of biomedical evidence from published, peer-reviewed literature, in-press publications and conference abstracts.

Save time in R&D with smart tools to stay on top of scientific developments, find more relevant results, identify key opinion leaders and gain competitive intelligence. Updated daily, Scopus is a research database that indexes and connects the work of millions of researchers, institutions and companies around the world.

Find, organize and share scientific literature to increase efficiency, save costs and streamline regulatory workflows. QUOSA is an enterprise solution that supports individual or organization-wide literature and knowledge management needs that are critical to functions like R&D, Pharmacovigilance and Medical Affairs.

Get the deep evidence required to understand disease biology faster, improve target and/or biomarker identification and prioritization, and decide what drug targets to pursue and how to measure drug targets. The Biology Knowledge Graph contains 1.4M+ entities connected by 13.5M+ relationships and 65.8M+ referenced and viewable facts.

Pharma and medical technology solution stories

A scientist in metabolism, drug interaction and genetics found comparative extracted data on approved drugs with PharmaPendium that provided valuable insights into drug development programs and decisions.

What's new in pharma and medical technology

Learn more about advances in pharma and medical technology from our industry partnerships and internal experts.


Originally posted here:

R&D Solutions for Pharma and Medical Technology - Knovel