
Insects Get ‘Caught on Camera’ to Help Farmers in Latest Short Takes – University of Arkansas Newswire

Photo by Russell Cothren

Khoa Luu and Ashley Dowling monitor insect activity at the U of A Farm.

Researchers at the U of A and the U of A System Division of Agriculture have developed a prototype of an insect trap that can help farmers monitor and identify potential pests more efficiently.

Ashley Dowling, a professor of entomology and plant pathology conducting research for the Arkansas Agricultural Experiment Station, partnered with Khoa Luu, an assistant professor of computer science and computer engineering, to create a trap that captures footage of insects, uses artificial intelligence to identify them and sends real-time data back to the farmers.

Once farmers become aware that certain pests are present, they can take the appropriate measures to counteract their potential damage.

"The trap itself was created by a local Arkansas business [SolaRid AR], and they came to the university with lots of ideas about what they'd like their trap to be able to do," Dowling said. "One of the things on their list was this smart trap approach to things. We jumped at the chance to work with them and create this for them."

Prior to the development of this prototype, farmers typically set traps manually and then sent someone out to evaluate them every few days, a process that takes much longer and can sometimes result in crops being ruined in the meantime.

"Oftentimes, by the time they get the data, it's almost too late," Dowling says.

The new device, however, eliminates the need for manual monitoring, so farmers can make decisions on the fly.

"The trap itself is attracting insects using lights of certain wavelengths that are attractive to insects," Dowling explains. "It also has the ability to put chemical odors into it that you can target very specific insects with."

As insects beeline their way into the trap, they pass through a sensor with an infrared plane, which then activates the camera. From here, the computer uses artificial intelligence perfected by Luu to identify the insect and transmit the results to the user.
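The researchers' software is not published in this article, but the pipeline described above (an infrared beam break triggering the camera, an AI model identifying the insect, and a record sent to the grower in real time) can be sketched roughly as follows. Every name, value, and threshold below is a hypothetical illustration, not SolaRid's or the university's actual code.

```python
# Hypothetical sketch of the smart-trap detection flow described above.
# The capture, classify, and notify functions are stand-ins for the camera,
# the AI model, and the telemetry layer, none of which are detailed here.
import time
from typing import Callable, Tuple

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff before a sighting is reported

def handle_trigger(capture: Callable[[], bytes],
                   classify: Callable[[bytes], Tuple[str, float]],
                   notify: Callable[[dict], None]) -> None:
    """Run once each time the infrared plane is broken."""
    frame = capture()                       # camera activated by the IR sensor
    species, confidence = classify(frame)   # AI identification of the insect
    if confidence >= CONFIDENCE_THRESHOLD:
        notify({"species": species,         # real-time record for the farmer
                "confidence": confidence,
                "timestamp": time.time()})

# Minimal demo with dummy stand-ins for the hardware and model:
if __name__ == "__main__":
    handle_trigger(
        capture=lambda: b"raw-image-bytes",
        classify=lambda frame: ("corn earworm moth", 0.93),
        notify=print,
    )
```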

Dowling explains that when certain thresholds of insects are reached in a field, that tells the farmer it's time to take action, particularly to avoid substantial economic losses. And knowing if the pests are isolated to a certain area allows farmers to target only the areas that are potentially affected.

Beyond the fields, the technology has the capability of being applied in other areas of entomology, like biodiversity or museum collections, where samples containing thousands of specimens are regularly collected and need to be analyzed in a timely manner.

Learn more about Dowling and Luu's smart insect monitoring system in Caught on Camera: Insects Edition and find additional Short Takes videos here.

About the University of Arkansas: As Arkansas' flagship institution, the U of A provides an internationally competitive education in more than 200 academic programs. Founded in 1871, the U of A contributes more than $2.2 billion to Arkansas' economy through the teaching of new knowledge and skills, entrepreneurship and job development, discovery through research and creative activity while also providing training for professional disciplines. The Carnegie Foundation classifies the U of A among the few U.S. colleges and universities with the highest level of research activity. U.S. News & World Report ranks the U of A among the top public universities in the nation. See how the U of A works to build a better world at Arkansas Research News.


Ridgefield teen one of 369 students in the world to achieve perfect score on college level test – The Ridgefield Press

RIDGEFIELD - A local high school senior's plans to study computer science in college are off to a strong start after he achieved a perfect score on the Computer Science Advanced Placement (AP) test.

Ryan Williams, 17, a Ridgefield High School senior, was one of 369 students in the world to get a perfect score on the test.

Out of 77,434 students worldwide who took the exam, 369 students, less than 0.5 percent, earned a perfect score, said Steven Anton, a math, computer science and engineering teacher at Ridgefield High School.

A lot of the Computer Science AP test, Ryan said, is about coding. A multiple choice question might ask students to look at a section of code and identify what it does, while a free response question might ask students to handwrite code to complete a task, like organizing a group of animals in alphabetical order.
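The AP exam itself is administered in Java; purely to illustrate the kind of free-response task described above (alphabetizing a list of animals), a minimal sketch might look like this.

```python
# Illustrative only: the sort-the-animals task mentioned above.
# The actual AP Computer Science A exam uses Java, not Python.
def sort_animals(animals):
    """Return the animal names in alphabetical order, ignoring case."""
    return sorted(animals, key=str.lower)

print(sort_animals(["Zebra", "aardvark", "Moose", "bat"]))
# ['aardvark', 'bat', 'Moose', 'Zebra']
```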

"You can study for it," Ryan said of the test. "You can study for the material, there's 10 units in the computer science curriculum, but there's also an aptitude part of the test. Generally, I find the people that enjoy class end up doing better on the exam just because a lot of computer science is application based. The more projects you do in the class and the more interested you are in the projects, the more you'll get out of it."

Ryan said he took the Computer Science AP test in early May and found out about his perfect score a few weeks ago. In terms of what constitutes a perfect score, Ryan said he got every point on the exam correct.

Anton said Ryan received a score of 5 on the AP exam, which is scored on a 1-to-5 scale. However, he said a 5 encompasses a range of scores since the test has 80 possible points to earn. As an example, Anton said a 5 might be earned by any score between 62 and 80. Ryan's perfect score, Anton said, means Ryan earned all the points possible and that he scored 80/80.

"It's so exciting, especially because I want to go into computer science and software," Ryan said. "I love the course, I love the teachers as well and it feels kind of surreal to do something like this as well as making my teacher and school and community proud as well."

Having studied computer science with him for two years, Ryan credited Anton as a mentor who helped nurture his interest in computer science.

"As a teacher in general, he makes you feel comfortable in the classroom," Ryan said of Anton. "He's always willing to take questions, he jokes around with the students. That makes it a nice and joyful learning environment rather than a strict one, but at the same time, he's also very good at teaching the course and provides the necessary assessments and encourages students to do their best. He's been a mentor and kind of grown my interest and love for this."

"Ryan is the student that, you know, was always excited to learn more about computer science," Anton said of Ryan. "You could tell he was very passionate about the subject and interested in learning more, and he kind of just always dove into the projects and was excited to learn more and understand the material he's covering."

"I am extremely proud of Ryan for his hard work," Anton said. "He worked hard through both of my classes and I think that the perfect score really reflects that; it's hard work but it's also really meticulous effort on the exam, where you really have to pay attention and focus. There's a lot of little things that could go wrong and to score absolutely perfectly is incredible."

After graduating high school, Ryan plans to get his bachelor's degree in college and study computer science and software engineering. While he's not sure where he'll attend school next fall, he's a dual citizen of the United States and Canada and has applied to different schools in both countries, including Stanford University in Stanford, Calif.; MIT in Cambridge, Mass.; the University of Toronto in Toronto, Ontario; McGill University in Montreal, Quebec; and the University of Waterloo in Waterloo, Ontario.

Taking Ryan's plans to study software engineering into account, Anton said the perfect score itself might not have any direct impact in terms of college, unless schools consider it while looking at admissions.

"The score of 5 will be accepted by some schools for college credit, but that is the same with any score of 5 on the exam, even if it is not a perfect score," Anton said. "There could be other impacts that I am not aware of, however. Ryan is my first student to earn a perfect score in the years I've taught the AP course, so I am unsure of the full impact."


MIT system sees the inner structure of the body during physical rehab – MIT News

A growing number of people are living with conditions that could benefit from physical rehabilitation, but there aren't enough physical therapists (PTs) to go around. The need for PTs is racing alongside population growth, while aging and higher rates of severe ailments add to the problem.

An upsurge in sensor-based techniques, such as on-body motion sensors, has provided some autonomy and precision for patients who could benefit from robotic systems to supplement human therapists. Still, the minimalist watches and rings currently available largely rely on motion data, which lacks the more holistic picture a physical therapist pieces together, including muscle engagement and tension in addition to movement.

This muscle-motion language barrier recently prompted the creation of an unsupervised physical rehabilitation system, MuscleRehab, by researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Massachusetts General Hospital. There are three ingredients: motion tracking that captures motion activity, an imaging technique called electrical impedance tomography (EIT) that measures what the muscles are up to, and a virtual reality (VR) headset and tracking suit that lets a patient watch themselves perform alongside a physical therapist.

Patients put on the sleek, ninja-esque, all-black tracking suit and then perform various exercises, such as lunges, knee bends, deadlifts, leg raises, knee extensions, squats, fire hydrants, and bridges, during which the system measures activity of the quadriceps, sartorius, hamstrings, and abductors. VR captures 3D movement data.

In the virtual environment, patients are given two conditions. In both cases, their avatar performs alongside a physical therapist. In the first situation, just the motion tracking data is overlaid onto their patient avatar. In the second situation, the patient puts on the EIT sensing straps, and then they have all the information of the motion and muscle engagement.

With these two conditions, the team compared the exercise accuracy and handed the results to a professional therapist, who explained which muscle groups were supposed to be engaged during each of the exercises. By visualizing both muscle engagement and motion data during these unsupervised exercises instead of just motion alone, the overall accuracy of exercises improved by 15 percent.

The team then did a cross-comparison of how much time during the exercises the correct muscle group was triggered between the two conditions. In the condition where the muscle engagement data is shown in real time, that display is the feedback. By monitoring and reviewing the recorded engagement data, the PTs reported a much better understanding of the quality of the patient's exercise, and said it helped them better evaluate the current regimen and exercises based on those stats.

"We wanted our sensing scenario to not be limited to a clinical setting, to better enable data-driven unsupervised rehabilitation for athletes in injury recovery, patients currently in physical therapy, or those with physically limiting ailments, to ultimately see if we can assist with not only recovery, but perhaps prevention," says Junyi Zhu, MIT PhD student in electrical engineering and computer science, CSAIL affiliate, and lead author on a new paper about MuscleRehab. "By actively measuring deep muscle engagement, we can observe if the data is abnormal compared to a patient's baseline, to provide insight into the potential muscle trajectory."

Current sensing technologies focus mostly on tracking behaviors and heart rates, but Zhu was interested in finding a better way than electromyography (EMG) to sense the engagement (blood flow, stretching, contracting) of different layers of the muscles. EMG only captures muscle activity right beneath the skin, unless it's done invasively.

Zhu has been digging into the realm of personal health-sensing devices for some time now. He'd been inspired to use EIT, which measures the electrical conductivity of muscles, by his 2021 project that used the noninvasive imaging technique to create a toolkit for designing and fabricating health and motion sensing devices. To his knowledge, EIT, which is usually used for monitoring lung function, detecting chest tumors, and diagnosing pulmonary embolism, hadn't been applied to muscle engagement in this way before.

With MuscleRehab, the EIT sensing board serves as the brains of the system. It's accompanied by two straps filled with electrodes that slip onto a user's upper thigh to capture 3D volumetric data. The motion-capture process uses 39 markers and a number of cameras that record at very high frame rates. The EIT sensing data highlights actively triggered muscles on the display, and a given muscle is rendered darker with more engagement.
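The paper's rendering code is not included in this article; as a rough sketch of the mapping just described, a per-muscle engagement value can be normalized against an assumed range and converted into a display shade, with higher engagement producing a darker muscle. The muscle names and numbers below are placeholders.

```python
# Hypothetical sketch: map per-muscle engagement values (arbitrary units from
# an EIT reconstruction) to a display shade, where more engagement means a
# darker rendering of that muscle on the avatar. Names and ranges are assumed.
def engagement_to_shade(value: float, lo: float, hi: float) -> int:
    """Return a grayscale level from 255 (idle, light) down to 0 (fully engaged, dark)."""
    frac = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    return round(255 * (1.0 - frac))

readings = {"quadriceps": 0.72, "sartorius": 0.18,
            "hamstrings": 0.41, "abductors": 0.05}

for muscle, value in readings.items():
    print(muscle, engagement_to_shade(value, lo=0.0, hi=1.0))
```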

Currently, MuscleRehab focuses on the upper thigh and the major muscle groups inside it, but down the line they'd like to expand to the glutes. The team is also exploring potential avenues for using EIT in radiotherapy in collaboration with Piotr Zygmanski, medical physicist at Brigham and Women's Hospital and the Dana-Farber Cancer Institute and Associate Professor of Radiation at Harvard Medical School.

"We are exploring the utilization of electrical fields and currents for detection of radiation as well as for imaging of the dielectric properties of patient anatomy during radiotherapy treatment, or as a result of the treatment," says Zygmanski. "Radiation induces currents inside tissues and cells and other media, for instance detectors, in addition to making direct damage at the molecular level (DNA damage). We have found the EIT instrumentation developed by the MIT team to be particularly suitable for exploring such novel applications of EIT in radiotherapy. We are hoping that with the customization of the electronic parameters of the EIT system we can achieve these goals."

"This work advances EIT, a sensing approach conventionally used in clinical settings, with an ingenious and unique combination with virtual reality," says Yang Zhang, assistant professor in electrical and computer engineering at the UCLA Samueli School of Engineering, who was not involved in the paper. "The enabled application that facilitates rehabilitation potentially has a wide impact across society to help patients conduct physical rehabilitation safely and effectively at home. Such tools for eliminating the need for clinical resources and personnel have long been needed, given the workforce shortage in health care."

The paper's MIT co-authors are graduate students Yuxuan Lei and Gila Schein, MIT undergraduate student Aashini Shah, and MIT Professor Stefanie Mueller, all CSAIL affiliates. Other authors are Hamid Ghaednia, instructor at the Department of Orthopaedic Surgery of Harvard Medical School and co-director of the Center for Physical Artificial Intelligence at Mass General Hospital; Joseph Schwab, chief of the Orthopaedic Spine Center, director of spine oncology, co-director of the Stephan L. Harris Chordoma Center, and associate professor of orthopedic surgery at Harvard Medical School; as well as Casper Harteveld, associate dean and professor at Northeastern University. They will present the paper at the ACM Symposium on User Interface Software and Technology later this month.


Most preprint studies of COVID-19 hold up through peer-review – University of Wisconsin-Madison

Research findings posted online as preprints, studies made public before undergoing the review and approval by a panel of peer scientists required by most scholarly journals, often hold up quite well to that scrutiny, according to a new report on COVID-19 studies.

While preprint manuscripts have become popular in many scientific fields since physicists made their arXiv (pronounced archive) repository accessible online in 1991, the COVID pandemic pushed new groups of researchers into the habit of posting and consulting fresh experimental results and analyses ahead of peer review.

"Preprints have been broadly accepted in the social sciences, computer sciences, mathematics for quite a long time," says B. Ian Hutchins, a professor in the University of Wisconsin-Madison's Information School and leader of the new study of preprints published today in The Lancet Global Health. "Biomedical research has been more cautious, I think, precisely because people use that information for making health-altering decisions."

The appearance and speedy global spread of a new virus, as well as the quick response by scientists around the world, forced many to reconsider that caution, weighing it against the cost of a typical delay of many months or longer for newly completed studies to clear the hurdles of a careful journal peer review.

A group of journal publishers decided during the pandemic to require preprint availability of COVID-19-related manuscripts submitted for their consideration, according to Hutchins, whose own work, because it focused on COVID-19 studies, was also required to be made available as a (deeply meta) preprint.

The UW-Madison researchers chose at random 100 COVID-19 studies that had been posted as preprints and then subjected to peer review and successfully published by journals. They examined how peer review affected 1,606 data points in the manuscripts, representing four types of data common to the COVID study genre: the closely related infection fatality rates and case fatality rates, basic viral reproduction rates (how many people an infected person is expected to infect) and disease incidence (the number of new people infected in a given time period).

"That was a strength of using infectious-disease research for this study," Hutchins says. "Because when you talk about case fatality rate, there's an agreed-upon definition of what that is, broadly speaking, and so we could make better comparisons of that data across different labs."

Comparing preprint manuscripts to the eventual published versions of the individual studies, about 90 percent of those 1,606 data points were still in the text after peer review. More than 170 were edited out and more than 300 new data points were added across the 100-study sample.

And while the researchers found the confidence intervals associated with estimates, which Hutchins says are like the margins of error you hear about in polling, had tightened about 7% after peer review, changes in the actual estimates were minor and statistically insignificant.
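The study's analysis code is not reproduced here, but the kind of comparison described, pairing each preprint estimate and confidence interval with its published counterpart and summarizing how much they moved, can be sketched as below. The data points are placeholders, not values from the study.

```python
# Hypothetical sketch of comparing paired preprint vs. published estimates,
# in the spirit of the analysis described above (not the study's actual code).
# Each record: (preprint_estimate, preprint_ci, published_estimate, published_ci)
pairs = [
    (1.2, (0.9, 1.6), 1.2, (0.95, 1.55)),            # placeholder data points
    (0.034, (0.020, 0.050), 0.033, (0.022, 0.046)),
    (2.8, (2.1, 3.7), 2.9, (2.2, 3.6)),
]

def width(ci):
    lo, hi = ci
    return hi - lo

estimate_changes = [abs(pub - pre) / pre for pre, _, pub, _ in pairs]
width_changes = [(width(pub_ci) - width(pre_ci)) / width(pre_ci)
                 for _, pre_ci, _, pub_ci in pairs]

print(f"mean relative change in estimates: {sum(estimate_changes)/len(pairs):.1%}")
print(f"mean relative change in CI width:  {sum(width_changes)/len(pairs):.1%}")
# A negative mean change in CI width corresponds to the tightening reported above.
```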

"Wild swings between preprint and published versions would be hard to explain," Hutchins says. "But that's not what we see. There's not a whole lot of change in the data reported and the estimates based on that data."

Quantifying the differences typically seen after studies cross the peer-review finish line can help consumers of the freshest science consider how much weight they give preprint results as they report on discoveries or issue public health guidance.

"Journalists and policymakers should look at the fact that 90% of the data points make it through peer review, should get a sense for how much they usually change, and ask themselves, am I comfortable accepting that degree of change?" Hutchins says. "The answer to that may depend on the stakes of the decision. If all you're worried about is your reputation, you might be open to a different amount of risk than if you're making life-or-death decisions."

The National Institutes of Health has promoted preprint manuscripts as a way to accelerate the pace of scientific discovery, according to Hutchins, who developed iCite, a curated search tool for COVID-19 research, while working at NIH.

Hutchins co-authored the new study with statistician Honghan Ye, who completed his doctorate at UW-Madison in 2021, and several UW-Madison undergraduate students. He hopes to expand his preprint studies to include a broader range of scientific fields and how preprint quality has changed over time.


International team publishes perspective paper on role of the clinician-data-scientist in healthcare – EurekAlert

Healthcare is constantly changing, as big data analytics and advanced technologies such as artificial intelligence are now being applied in the healthcare field. These high-tech changes hold the potential to transform patient care. In response to these changes, an international team of scientists has published a perspective paper, describing the competencies of clinician-data-scientists and addressing the challenges in training these health care professionals. Their perspective paper was published in the journal Health Data Science on August 8, 2022.

The clinician-data-scientist combines in-depth clinical knowledge with skills in data science. These healthcare professionals are well prepared to identify the challenges in healthcare that accompany the growing digital transformation. They are able to lead meaningful scientific studies, communicate well across disciplines, and provide general critical interpretations. With their interdisciplinary knowledge, these clinician-data-scientists will play crucial roles in guiding the approvals for innovative technologies and creating digital health policy.

"In a digital health era, clinician-data-scientists are critically important as they possess a deep understanding of both data science and the humanistic nature of medicine, and are ready to identify clinically important questions," said Luxia Zhang, a professor at the National Institute of Health Data Science, Peking University.

In their perspective paper, the researchers explore the core competencies a clinician-data-scientist needs to have, stressing that these individuals must be able to closely link medicine and data science to work efficiently and to enable data-driven discoveries in healthcare.

"The core competencies of a clinician-data-scientist should include a fundamental understanding of health data, training in epidemiology, statistics, bioinformatics, and computer science, combined with an understanding of continuous healthcare improvement frameworks, socio-technical system challenges, and advanced skills in inter-disciplinary communication and collaboration," said Dr. Mai Wang, from the National Institute of Health Data Science, Peking University. The researchers believe that with a strong understanding of healthcare and the ability to identify knowledge gaps in medical practice, clinician-data-scientists will play critical roles in data science research projects.

Besides exploring the core competencies a clinician-data-scientist needs, the researchers also examined the training these health care professionals require. This training is challenging because of the increased complexity of data and the rapid advancement of analytic techniques. As an added challenge, data science is not part of conventional medical education training. While some medical schools are starting to modify their curriculum, overall, integrated formal training programs for clinician-data-scientists are scarce worldwide.

The researchers view teamwork as key to this process. "To conquer training challenges, senior clinical faculty and data scientists should form a close partnership and work together to design training frameworks with flexibility and frequent updating for adaptation to various application scenarios," said Wang.

Looking to the future, the researchers stress that clinician-data-scientists are key team members in patient health care. Clinicians need training in data science skills. At the same time, data scientists with deep technical skills are needed, so that key clinical questions are formed and prioritized.

"Clinician-data-scientists are critically important as they possess a deep understanding of both science and the humanistic nature of medicine and are ready to identify clinically important questions that, if addressed, can make medical advances and assure excellence in patient care," said Zhang.

"The next step is to form a training framework and design curricula that focus on the core competencies required by clinician-data-scientists. And this must be continuously updated to adapt to new developments," said Zhang.

The research team includes Fulin Wang, Lin Ma, Mai Wang from Peking University Health Science Center; Georgina Moulton from The University of Manchester; and Luxia Zhang, from Peking University and Peking University First Hospital.

Health Data Science

Clinician Data Scientists: Preparing for the Future of Medicine in the Digital World

3-Oct-2022

The authors declare that they have no conflicts of interest.



Energy Transitions: We Need To Redefine the Problem and Reframe the Narrative – Data Science Central

Climate change, and as an extension or corollary, energy transitions, is undoubtedly one of the most critical issues that merit urgent and serious attention from policymakers, scientists, and governments across the globe. However, before looking for solutions, it is equally important to define and frame the problem in the most realistic and unbiased way to ensure the holistic nature of the solution(s). There are many issues with how the problem of energy transitions is being portrayed lately. We are already racing against the clock. False narratives will only exacerbate the climate crisis through the delays and inconsistencies inherent in the proposed solutions.

The starting point is in defining the word itself. As Vaclav Smil, the energy doyen of the 21st century, puts it (in his book titled Energy Transitions), using "transition" (singular) is not correct, as it is almost equivalent to proposing a linear modernization narrative in terms of energy. Replacing it with "transitions" shows that we understand the path dependency of energy transitions and realize that different countries and regions will have to define their own ways to Net Zero. We should problematize the linear modernization narrative. This also relates to the idea of the Anthropocene, the proposed period when human activity started to affect our ecosystem adversely. The problem with using this term is that it assumes humanity is a collective, and the subsequent carbon emissions absolve those whose contribution weighs much more than others'. However, in reality, there exists an asymmetry. For example, energy usage in the British Empire or the U.S. cannot be compared to that in South Asia. The asymmetry persists even to this date.

The United States and China account for about 44 percent of CO2 emissions, with China emitting more than 27 percent of the global total. However, the U.S. leads the way on a per capita basis, which matters because the U.S. population is 4.35 times smaller than China's. On an aggregate basis, the U.S. has emitted more than China over the past 300 years and has the largest historical emissions (20 percent of the global total). The emission discrepancy between the richest and poorest, developed and undeveloped, is shocking. The graph below illustrates this, showing energy usage per person.
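A quick back-of-the-envelope check of that per capita point, using only the rounded figures quoted in this paragraph (a combined 44 percent share, a China share just above 27 percent, and a population ratio of 4.35), is sketched below; the result is indicative only.

```python
# Back-of-the-envelope check of the per-capita claim above, using only the
# approximate figures quoted in the text. Indicative arithmetic, not a dataset.
combined_share = 0.44      # US + China share of global CO2 emissions
china_share = 0.27         # China's share (stated as "more than 27 percent")
us_share = combined_share - china_share   # roughly 0.17
population_ratio = 4.35    # China's population / U.S. population

# Per-capita emissions are proportional to share / population, so:
us_vs_china_per_capita = us_share / (china_share / population_ratio)
print(f"U.S. per-capita emissions are roughly {us_vs_china_per_capita:.1f}x China's")
# About 2.7x with these rounded inputs, consistent with the article's point.
```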

A report by Oxfam, Carbon Inequality in 2030, mentions that the emissions of the top 1 percent are expected to be 30 percent higher than what is required to attain 1.5°C by 2030. Moreover, by 2030, the emissions of the top 1 percent, roughly 80 million people, will be 25 percent higher than in 1990, showing no sign of abatement. Another recent report, by the U.N., stresses that the richest need to cut down their carbon footprint by 97 percent and that the top 1 percent, almost 70 million people, are responsible for 15 percent of total emissions, more than the bottom 50 percent, some 3.5 billion people.

The Oxfam report highlights a startling observation. The top 1 percent need to cut their emissions by 97 percent between 2015 and 2030, whereas they are set to cut only 5 percent in reality. The poorest 50 percent will register an increase of only 17 percent, when even a 200 percent increase would still render their emissions compatible with the 1.5°C goal.

These facts and figures speak volumes about our current definition of the problem and its narrative. First of all, techno-determinism (reliance on technology) will not help us achieve green growth (see graph below). We need decades (or a century) for these technologies to achieve the scale to have an impact. Also, how will the issue of financing be resolved for the developing world? Furthermore, how can the developing world wean itself off coal when energy and lives depend on its usage?

The current energy crisis in Europe has the developed world scrambling to secure coal. In developing countries, where about 2.7 billion poor people still rely on biomass for cooking and 1 billion people still live without electricity, the expectation that they will shift to renewable energies (with their inherent limitations) is wishful thinking. Boilerplate solutions will not work for a significant part of the world, and this is crucial because, according to the EIA, energy demand will increase by 50 percent by 2050, with most of that demand coming from non-OECD countries.

Instead of homogenizing the problem, we need to redefine it by focusing on those 70 million people or those 100 companies responsible for 71 percent of emissions. We need a new narrative that doesn't pitch humanity as a collective but identifies certain regions, countries, groups, and entities and holds them accountable. We need, therefore, tailored solutions. The matter is urgent. At our current rate of emissions, only 22 years are left before we, the world, reach 1.75 degrees.

The sooner we redefine the problem and reframe the narrative, the better we move toward practical, viable, holistic solutions.


University joins national effort to make health care data more inclusive – University of Miami: News@theU

Leaders from the University's Institute for Data Science and Computing are working to build a larger, more diverse database for more accurate research on health care disparities.

While kidney disease affects one in seven adults in the United States, it impacts Black patients much more. They are four times more likely to suffer from kidney failure than white U.S. residents, and a 2021 study indicates that computer algorithms that determine eligibility for a kidney transplant often put Black people at a disadvantage, widening the gap for a successful recovery.

This is just one example of how artificial intelligence solutions and algorithms for data collection need to improve, so that other health disparities impacting minority populationsin diabetes, heart disease, and cancer caredo not continue to increase in the United States.

Recognizing this problem, in July the National Institutes of Health (NIH) began the Artificial Intelligence/Machine Learning Consortium to Advance Health Equity and Researcher Diversity program, or AIM-AHEAD, with a goal to reduce these health disparities by creating new algorithms and health care databases that more accurately reflect the diverse U.S. population. The NIH selected the University of Miami's Institute for Data Science and Computing (IDSC) as one of the institutions to spearhead the program's infrastructure core, which is one of four pillars of the program. The others are partnerships, research, and data science training.

"Most of our current data is biased, and often the people collecting this data are not representative of all minorities and cultural differences," said Nick Tsinoremas, the University's vice provost of data science and computing, professor of biochemistry and molecular biology, and the founding director of IDSC and lead investigator on the AIM-AHEAD grant. "We want these biases eliminated, but we also want to create an infrastructure that encourages minority serving institutions to do this research, because these are the people who understand biases in data and algorithms and know how to create more equitable or unbiased approaches."

As a result, the University received a $1.3 million grant to work with historically Black colleges and universities, tribal colleges, and other minority serving institutions to create a computing structure for these institutions to share patient data that is void of personal information and that will improve the quality and breadth of health care research.

Joining Tsinoremas to lead the AIM-AHEAD project are Azizi Seixas, associate professor of psychiatry at the Miller School of Medicine and director of IDSC's population health informatics program, as well as computer science professor Yelena Yesha, who also serves as the Knight Foundation chair of data science and artificial intelligence and IDSC's chief innovation officer. The three are now working with minority serving institutions across Florida to improve their access to artificial intelligence tools and help them do more efficient health equity research. Seixas said he and his researchers at the Miller School's Media and Innovation Lab have reached out to approximately 15 institutions across the state.

"Many of these institutions have fragmented electronic health record systems, which can lead to waste, inefficiencies, and poor communication in health care delivery, which is critical. We'll focus on trying to bring these fragmented systems together," said Seixas, who is also the associate director of the Center for Translational Sleep and Circadian Sciences. "If we are really serious about tackling the health disparities in Florida that make us unique, like the high prevalence of cardiovascular disease and dementia, we need to build a larger network to really unravel those issues."

On the AIM-AHEAD infrastructure team, Tsinoremas, Seixas, and Yesha are also working with the National Alliance for Disparities in Public Health, Harvard University, and Vanderbilt University. Currently, University leaders are working closely with four institutions in Florida to create pilot programs, and recently received additional funding to support two of these pilot programs.

The team is now part of an additional $500,000 grant to work with Florida Atlantic University's (FAU) Schmidt College of Medicine and the Caridad Center, Inc., the largest free clinic in Florida for uninsured and underserved children and families of Palm Beach County, to improve their electronic health records so that they can be used more often for research.

"In terms of technology infrastructure, security, cloud computing, and supercomputing, we have some of the best faculty and staff working at the University of Miami. So, we are able to do more as far as research productivity and output. But if we look just up the road to many of our minority serving institutions, they don't have those infrastructures," said Seixas. "We are trying to spread our resources around, so these institutions that are often under-resourced can also do cutting-edge research and provide top-of-the-line care, too."

As a first step, leaders from IDSC are sharing a tool to collect and de-identify patient electronic health records for research that they developed a few years ago for the Miller School. The tool, called University Research Informatics Data Environment, or URIDE, is a web-based platform that collects, sorts, and helps researchers visualize this type of patient data from multiple clinical health systems. This could help health care professionals explore demographics, diagnoses, procedures, vital signs, medications, labs, allergies, co-morbidities, and other information for certain patient populations to pinpoint trends or optimal treatment practices.
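URIDE's actual schema and interfaces are not described in this article; as an illustrative sketch only, de-identifying a record before it enters such a shared research environment typically means dropping direct identifiers and coarsening quasi-identifiers such as birth dates, along the lines below. The field names are hypothetical.

```python
# Hypothetical sketch of de-identifying a patient record before it is shared
# for research, in the spirit of what the article describes. Field names are
# illustrative; this is not URIDE's actual schema or code.
DIRECT_IDENTIFIERS = {"name", "address", "phone", "mrn", "ssn"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers like birth date."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "birth_date" in clean:                     # keep only the birth year
        clean["birth_year"] = clean.pop("birth_date")[:4]
    return clean

record = {"name": "Jane Doe", "mrn": "000123", "birth_date": "1984-07-02",
          "diagnosis": "chronic kidney disease", "systolic_bp": 142}
print(deidentify(record))
# {'diagnosis': 'chronic kidney disease', 'systolic_bp': 142, 'birth_year': '1984'}
```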

A second award for $362,000 will help IDSC leaders create another pilot program with Florida Memorial University (FMU) and Miami-Dade College to train 40 existing faculty members and students to use artificial intelligence and machine-learning techniques in their clinical practice, research, and curriculum. In this program, IDSC leaders will also build upon another program they created at the University to attract and foster the careers of underrepresented minorities in science and especially in the burgeoning field of data science.

"We are very excited to collaborate with FAU, Caridad, FMU, and Miami-Dade College to expand the URIDE platform and to become an important piece of the infrastructure that supports the entire AIM-AHEAD consortium," Tsinoremas said. "We also want to establish an open-source community around this platform to engage data scientists, data engineers, and developers for continuous improvement of this open-source effort, which we call Hi-RiDE, for Health Informatics Research Integrated Data Environment."


Aboitiz Data Innovation appoints new GM to lead data science and AI adoption for smart cities – ETCIO South East Asia

Alvin Ng, COO & General Manager - Smart Cities, APAC, Aboitiz Data Innovation

Aboitiz Data Innovation (ADI) officially announced the appointment of Alvin Ng to the role of General Manager for Smart Cities, playing a key role in advancing ADI's mission to drive the adoption of Data Science and Artificial Intelligence (DSAI) tools and solutions for smart city development in the region.

Formerly the APAC Vice President of Digital Solutions at Johnson Controls, a global company providing integrated smart green building systems, services and solutions, Ng oversaw the company's advancement towards digital and sustainable growth in the region. Prior to joining Johnson Controls, he was the General Manager of GE Digital, where he was responsible for leading GE Digital's automated industrial and data-driven software solutions to drive real-time connectivity and data intelligence in Asia.

Commenting on his new position, Ng said, "I am thrilled to join ADI, a company that is pushing out innovative solutions for a better future, while advancing communities throughout Southeast Asia. With ADI's goals to strengthen its footing in the development of smart cities, I look forward to bringing my own experience and passion for digital transformation and advanced technologies such as IoT and AI to the table, while working alongside a talented team."

Ng joins ADI with over 25 years of experience in sales management, global business and market development. He has held various global leadership roles in multinational companies such as Wincor Nixdorf AG, Cisco Systems and Rosenbluth International. He is also a founding member of the World Economic Forum's Digital ASEAN Skills Task Force.

Today, he serves as an Adjunct Associate Professor at Nanyang Business School (NBS), part of Singapore's Nanyang Technological University (NTU), teaching Digital Transformation, AI for Business, Internet of Things and Sustainability Leadership programs. He is also closely involved with the Nanyang MBA as an Industry-Faculty Lead for the school's Business Consulting and Leader as a Coach programmes and plays the role of Senior Career Fellow for the Global Executive and Full-time MBA students.

Commenting on Alvin's new position, ADI Managing Director Dr. David R. Hardoon said, "Through our work at ADI over the past year, we have proven that optimising traditional methods with Data Science and AI has brought pioneering innovations and can contribute to the wider community. Alvin's experience in data-driven digital solutions in the building sector, combined with his commitment to sustainability, is definitely going to be an asset for ADI in promoting the adoption of DSAI for our smart cities practices."


University Librarian and Vice Provost Smith Announces Retirement – University of California, Davis

The Office of the Provost released this announcement today (Oct. 11):

MacKenzie Smith has announced she will retire as University Librarian and Vice Provost of Digital Scholarship at the end of June 2023, after more than a decade at UC Davis and nearly 40 years in academic research libraries.

Smith joined UC Davis in 2012, following a long career with the libraries of MIT, Harvard, and the University of Chicago.

During her tenure as University Librarian, Smith has overseen all aspects of the UC Davis Library, including its buildings, collections, personnel, and programs. She has led initiatives to modernize the library and advance the university's strategic priorities by enhancing the library's support for research, teaching and learning.

Smith credits an outstanding team of librarians and staff for the advancement of the library over the past decade, particularly their effective response to supporting the campus during the COVID-19 pandemic. In addition to expanding online access to library materials and services, UC Davis was the first UC campus to reopen its main library.

"MacKenzie has led our library during a period of transformative change in how scholars create, access and share research," Provost and Executive Vice Chancellor Mary Croughan said. "She has made substantial contributions to the campus's research enterprise at every level, from data science and informatics to the establishment of an undergraduate library research prize.

"MacKenzie has also elevated our library's leadership role, within UC and far beyond, in advancing free and open access to research. We will miss her leadership and collaboration, but wish her all the best in her retirement."

As Vice Provost for Digital Scholarship, Smith has also led campuswide initiatives involving information technology, research computing and knowledge management. She helped strengthen the university's data governance as the lead architect and inaugural co-chair of the UC Davis Institutional Data Council.

Throughout her academic career, Smith has developed and led entrepreneurial programs that apply technology innovation to libraries and their parent institutions and create new models for publishing and data-driven research in the digital age. Her work with open-source software platforms, such as MIT's DSpace for archiving digital research, has had a lasting impact on research libraries and many other cultural heritage and knowledge-generating institutions.

Smith feels the timing is right for a new leader to steer the next phase of the library's evolution and implement its new strategic plan.

"I am immensely proud of how our library team has responded to the opportunities and challenges of the past 10 years, from the rapid growth of digital technology that has reshaped libraries, to the many changes we all faced during the pandemic," she said. "We've evolved and grown so much, and I can't wait to see what the library and UC Davis will achieve together in the decade to come."

The Provost, who will launch a search for a new University Librarian this fall, said she is immensely grateful for Smith's service and commitment to UC Davis.


Elemental Machines and Kanomax USA announce a strategic partnership to offer an integrated digital solution for clean room technology – PR Newswire

Elemental Machines partners with Kanomax USA to deliver a seamlessly integrated clean room monitoring solution. Together, the technologies will empower researchers and manufacturers with environmental and operational data directly via the cloud.

CAMBRIDGE, Mass., Oct. 12, 2022 /PRNewswire/ -- Elemental Machines ("EM"), a leading provider of IoT sensors powered by data science and delivering actionable insights based on operational data, has partnered with Kanomax USA, a leading manufacturer delivering the best measurements for particle detection and airflow measurement in the business. As a laboratory operations ("LabOps") intelligence pioneer, EM is dedicated to supporting fully connected lab and manufacturing facilities.

The new partnership brings together EM's lab monitoring software with Kanomax' precision measurement technology to incorporate real-time and historical operational data with Kanomax' Clean Room Monitoring System.

The Kanomax Cleanroom Monitoring System offers turnkey solutions for the monitoring needs of various industries, such as pharmaceutical, medical device, aerospace, semiconductor, and automotive. With more than 80 years of experience in the business, Kanomax USA has always been pushing forward. Kanomax' precision particle counters are capable of real-time monitoring of temperature, humidity, airborne particles, differential pressure, energy consumption, and gases. Via the Kanomax system, alarms can be initiated for designated events, with dependable remote access.

"We are excited to be teaming up with Elemental Machines and starting a new journey in cleanroom solutions and technology," said Koji Miyasaka, General Manager at Kanomax USA. "We are most excited about working with the team over at EM and collaborating on future projects with their team."

The partnership will allow data collected by Kanomax' technology to be stored and accessed via EM's cloud-based dashboard for ease of viewing.

Elemental Machines' IoT-enabled sensors and platform connect virtually any piece of equipment to the cloud, bringing together important data from across the lab and manufacturing floor. EM's real-time environmental monitoring and alerting solution tracks critical data such as CO2/O2 percentage, airborne particles, temperature, and humidity, notifying users of out-of-range events.
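Neither company's software interface is documented in this release; as a minimal sketch of the out-of-range alerting pattern described above, each reading can be compared against a configured acceptable band and a notification raised when it falls outside. The metric names and limits below are assumptions, not either company's actual configuration.

```python
# Minimal sketch of out-of-range alerting for environmental readings, in the
# spirit of the monitoring described above. Limits and names are assumed,
# not Elemental Machines' or Kanomax' actual configuration.
LIMITS = {                       # acceptable (low, high) bands per metric
    "temperature_c": (18.0, 24.0),
    "relative_humidity_pct": (30.0, 60.0),
    "co2_pct": (0.0, 0.1),
    "particles_per_m3_0_5um": (0, 3_520_000),  # e.g. an ISO Class 8-style limit
}

def out_of_range_events(reading: dict) -> list[str]:
    """Return alert messages for every metric outside its configured band."""
    alerts = []
    for metric, value in reading.items():
        lo, hi = LIMITS.get(metric, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            alerts.append(f"{metric}={value} outside [{lo}, {hi}]")
    return alerts

print(out_of_range_events({"temperature_c": 26.4, "relative_humidity_pct": 45.0}))
# ['temperature_c=26.4 outside [18.0, 24.0]']
```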

"Partnering with Kanomax will add to EM's growing list of hardware integrations that eliminate data silos and digitally connect critical environmental conditions with real-time and historical data helping with reproducibility, compliance, and production integrity," said Dan Petkanas, National Channel Sales Manager at Elemental Machines.

About Kanomax USA

Kanomax delivers the best measurement solutions with its products and services that adapt precision measurement technology for fluids and particles. We contribute to technological innovation and quality improvements for the processes of quality and environment management. Sustaining human well-being in the areas of environment, health, and energy have always been a primary focus of Kanomax. We develop leading technology with the goal of maintaining health and safety in industries including automotive, aerospace, semiconductor, electronics manufacturing, heavy industry, steel, shipbuilding, pharmaceutical, biotechnology, food processing, medical, construction and civil engineering.


About Elemental Machines

Elemental Machines is the trusted data collection, analysis, and reporting technology supplier to researchers, clinicians, and LabOps professionals around the world. The Cambridge-based company equips labs with universal cloud-based dashboards and turnkey sensors that unite data from every asset, every metric, and every location, enabling rapid collection, seamless sharing, and effortless reporting.

http://www.elementalmachines.com/

SOURCE Elemental Machines
