The Cape Town Statement on fairness, equity and diversity in research – Nature.com

As awareness has grown about fraud and misconduct in science, the World Conferences on Research Integrity have become a leading forum for the discussion and study of ways to promote responsible behaviour in research. Since the first meeting in 2007, which was held in Lisbon, the events have helped to establish an academic field focused on research integrity.

Meetings have typically concentrated on issues such as research misconduct, responsible behaviour around data collection, analysis, authorship and publication, and the importance of reproducibility. But last May, attendees at the 7th World Conference on Research Integrity, held in Cape Town, South Africa, took a significant step. They added the myriad ways in which research programmes and practices disadvantage those living in low- and middle-income countries (LMICs) to the suite of issues that threaten the integrity of science.

Three of the six previous world conferences have led to the publication of guidelines or principles. We are part of a working group (including bioethicists, researchers, institutional leaders and journal editors) that now presents the Cape Town Statement on Fostering Research Integrity Through the Promotion of Fairness, Equity, and Diversity. (See Supplementary information for a list of working-group members.)

The statement contains 20 recommendations, drawn from discussions involving around 300 people from an estimated 50 countries, including 16 African nations and 5 South American ones. The discussions were held over 18 months before, during and after the Cape Town conference, the theme of which was "fostering research integrity in an unequal world".

The Cape Town Statement is essentially a call to action that we hope will help to turn the global conversation on inequity and unfairness in research into changes in practice by all stakeholders. Here, we lay out the motivation for the statement, and its broad goals.

The reasons for the statement are clear. Much too often, researchers and institutions from high-income countries reap greater benefits from global collaborations than do their LMIC collaborators, whether in relation to numbers of papers published, authorship, career progression, the setting of research priorities or the ownership of samples and data [1].

As an indication of this, a look at the authorship of papers about COVID-19 from the 10 top medical and global-health journals (according to impact ratings), containing content related to Africa or any African country, and published during the first 9 months of 2020, reveals that 66% of the authors were not from Africa [2]. One in five articles had no author from Africa at all. What's more, of those papers with African authorship, 59% of first authors and 81% of last authors were not from Africa, and only 14% of papers had both an African first and last author.

Often, what happens is that after securing a grant for a project, a research team from a high-income country looks for local researchers in the low- or middle-income country of interest to collaborate with. Local researchers might be offered some grant money and co-authorship on a paper (usually with their name appearing in the middle of the list). Invariably, the lead research team conducts the analyses, with the local researchers only reviewing manuscripts, often to ensure that they are culturally and politically acceptable [3].

Even the push towards openness and transparency in science publishing, which many have argued is a way to foster greater integrity in research, has created more barriers for investigators in low-resource environments.

Sharing data, for example, requires having enough institutional infrastructure and resources to first curate, manage, store and (in the case of data relating to people) encrypt the data, and to deal with requests to access them. Also, the pressure placed on researchers in LMICs by high-income-country funders to share their data as quickly as possible frequently relegates them to the role of data collectors for better-resourced teams. With enough time, all sorts of locally relevant questions that were not part of the original project could be investigated by local researchers. But well-resourced investigators in high-income countries who were not part of the original project are often better placed to conduct secondary analyses.

Unforeseen difficulties are arising around publishing, too. Currently, the costs to publish an article in gold open-access journals (which typically range from US$500 to $3,000) are prohibitive for most researchers and institutions in LMICs. The University of Cape Town, for example, which produces around 3,300 articles each year, has an annual budget of $180,000 for article-processing costs. This covers only about 120 articles per year.
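A quick back-of-the-envelope check makes the scale of the problem concrete; the figures below are taken straight from the paragraph above:

```python
# Implied article-processing cost (APC) and coverage at the
# University of Cape Town, using the numbers cited in the text.
annual_budget = 180_000      # USD, annual APC budget
articles_covered = 120       # articles the budget covers per year
articles_produced = 3_300    # articles produced per year

avg_apc = annual_budget / articles_covered
coverage = articles_covered / articles_produced

print(f"Implied average APC: ${avg_apc:,.0f}")      # $1,500, within the $500-$3,000 range
print(f"Share of output covered: {coverage:.1%}")   # about 3.6%
```

In other words, fewer than 4 in 100 of the university's articles can be published gold open access within budget.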

Because of this, researchers in these countries frequently publish their papers in subscription-based journals. But scientists working in similar contexts can't access such journals because the libraries in their institutions are unable to finance subscriptions to a wide range of journals. All this makes it even harder for researchers to build on locally relevant science.

Such imbalances in global research collaborations, which stem from a complex mix of racial discrimination, systemic bias and major disparities in funding and resources, affect the integrity of research in numerous ways.

Scientists in Kenya work on a malaria vaccine at the KEMRI-Wellcome Trust laboratories in Kilifi. Credit: Luis Tato/Guardian/eyevine

As recently as 2019, a group of students and their professor at a university in South Africa drew racist conclusions from their study of the cognitive abilities of Coloured South African women (Coloured is a recognized racial classification in South Africa). Their findings have since been debunked, and an investigation concluded that there was no deliberate intention to cause harm. A lack of diversity among the researchers, and possibly among the reviewers and editors, might have contributed to the publication of this research, which has now been retracted by the journal involved.

Power imbalances also skew research priorities, with investigators pursuing goals that frequently overlook the needs of local people.

The Cape Town Statement is not the first guideline on research fairness and equity, particularly in collaborations. Indeed, various documents informed our discussions, including the guiding principles of the Commission for Research Partnerships with Developing Countries (KFPE), which focuses on collaborations involving Swiss institutions; the Global Code of Conduct for Research in Resource-Poor Settings, a resource for those striving to ensure that science is carried out ethically in lower-income settings; and the BRIDGE guidelines, which aim to foster fairness and integrity in global-health epidemiology.

Furthermore, a tool for evaluating practices, called the Research Fairness Initiative, has already been developed by the Council on Health Research for Development, an international non-governmental organization that aims to support health research, particularly in LMICs. By providing questionnaires and guidance, the Research Fairness Initiative enables institutions, individual researchers and funders to evaluate their current practices, and if necessary, improve them.

The Cape Town Statement differs from these other guidelines and tools, however, in that it recognizes that unfair practices can harm the integrity of all research, no matter the discipline or context. Specifically, it focuses on the following four broad actions.

More funders from high-income countries must include diversity stipulations in their calls for grant applicants. In 2020, for example, the second European and Developing Countries Clinical Trials Partnership programme (EDCTP2), in conjunction with the UK Department of Health and Social Care, asked applicants to apply for funding for projects specifically aimed at addressing gender and diversity gaps in clinical research capacity in sub-Saharan Africa. (The EDCTP is a partnership between countries in Europe and sub-Saharan Africa, supported by the European Union.)

Researchers in high-income countries must also work harder to collaborate in more meaningful ways with people from different disciplinary, geographical and cultural backgrounds. One way to do this is to establish long-term relationships with people that extend beyond the life of a single project.

Research institutions, be they universities, non-governmental organizations, or national or transnational science councils, should develop and implement policies, structures and processes that support and promote diversity and inclusivity in research.

The London School of Hygiene and Tropical Medicine has been a major player in global-health research for many decades. Since 2019, a volunteer group of staff and students at the university has been trying, through its Decolonising Global Health initiative, to address the fact that the organization's members have often held unacknowledged positions of advantage. So far, this has involved various undertakings, including a series of educational lectures.

All stakeholders, from researchers, institutions and funders to journal editors and publishers, must take steps to ensure that they are not exacerbating power imbalances in research collaborations, but instead helping to remove them.

Funders from high-income countries should discourage "parachute" or "helicopter" research, in which well-resourced researchers conduct studies in lower-income settings or with groups who have been marginalized historically, but fail to involve local researchers or communities in all stages of the research. Funders can do this by including diversity stipulations in their calls for grant applicants, but also by funding local researchers directly.

Although many need to follow suit, some funders are making progress on this front. Currently, about 60% of the grants awarded by the EDCTP go directly to institutions in sub-Saharan Africa. Similarly, in 2021 the US National Institutes of Health (NIH) launched a $74.5-million five-year project called Data Science for Health Discovery and Innovation in Africa (DS-I Africa). This is being led by African scientists, and is creating a pan-African data network designed to address African research priorities [4].

As well as requiring that researchers from LMICs lead collaborations, funders (and research institutions) should insist that projects involving multiple countries begin with a period of discussion involving all potential stakeholders. Before any research is conducted, the roles of each team member, and how they will receive recognition, should be defined and agreed on. Moreover, requiring that projects be fully and transparently budgeted, with the costs of maintaining already existing infrastructure included, would help to ensure that better-resourced institutions shoulder a fairer share of project costs.

Publishers and journal editors must question submissions from authors if data have been collected in a low- or middle-income country, but the lead and collaborating authors are from high-income countries.

Some are already taking steps in this direction. The Lancet has started rejecting papers that are submitted by researchers from outside Africa, with data collected from Africa, but with no mention or acknowledgement of a single African collaborator. Similarly, Nature journals now encourage authors to make various disclosures on inclusion and ethics when submitting manuscripts.

The obstacles that make it harder for researchers working in low-resource settings to participate in open science need to be identified and addressed by publishers, and other national and global stakeholders, such as science councils and funders.

Wherever possible, funders should allow data collected by researchers in LMICs to be embargoed for two years, for example, to give investigators time to conduct secondary analyses and to share their data with collaborators at their discretion. Meanwhile, journals and publishers should adjust article processing costs for authors in low-resourced regions.

Over the past 20 years, more than 200 publishers have partnered with Research4Life, a platform established in 2002 dedicated to making some peer-reviewed content available to students and researchers in LMICs. But many of the countries that are major contributors to research, such as South Africa, Brazil, Argentina and India, do not meet the criteria for accessing knowledge through this platform, or for fee waivers or reductions in cost for open-access publishing.

Researchers in LMICs are often disadvantaged because their institutions have underdeveloped research management and financial systems. With scant or no assistance from lawyers, administrative assistants, financial managers and project-management staff, they struggle to meet the due diligence requirements of some funders in high-income countries. (Frequently this involves answering hundreds of questions in multi-page documents about institutional and research governance processes and policies.)

Both funders and collaborating institutions must take steps to enable the development of research-support systems in LMICs. This could mean paying for computing infrastructure, mentorship programmes, open-access publishing, or the training and salaries of project and financial managers, for example.

A volunteer has blood withdrawn for testing as part of an HIV vaccine trial in Bangkok. Credit: Paula Bronstein/Getty

Several controversies in research ethics, which occurred as a result of HIV research projects undergoing ethical scrutiny during the height of the HIV epidemic in the 1980s and early 1990s, prompted the Fogarty International Center at the NIH to launch the International Bioethics Education and Career Development Award in 1999. The aim was to ensure that there were strong research ethics committees in LMIC institutions, with adequately trained members, to review studies and meet international ethics and regulatory standards required by US funders. Thanks to this and subsequent efforts, hundreds of people in Africa and Asia have been trained in research ethics and continue to serve on review boards.

Many more initiatives like this that invest in infrastructure and training over the long term are needed.

The governments of LMICs also need to recognize the value of funding research, both to address locally relevant priorities and to reduce their nations' reliance on funders from high-income countries. Matched funding schemes could help, whereby governments commit to give institutions the same amount of funds as those obtained from other sources for nationally identified high-priority research. Launched in 2015, the Science Granting Councils Initiative aims to strengthen the management of research grant funding in 17 countries throughout Africa. Achieving its goal of bringing more support for and control of scientific research into the continent will require governments of African countries to prioritize research funding.

During the 2000s, researchers from the United Kingdom and other high-income countries obtained blood samples from people of the San community in Namibia for genetic research without always adequately explaining what those samples would be used for, or reaching any benefit-sharing agreements with the community.

The San people have since developed their own code of ethics: a value-based set of principles that researchers must adhere to before trying to obtain samples or information from them (see go.nature.com/3yz7ash). In principle, this code could be used by researchers working with other Indigenous communities, if codes specific to a particular community don't yet exist.

Ensuring that community members or knowledge-holders, who might not have formal qualifications, are included in research teams, with their contributions being adequately valued, is another way in which local knowledge can be incorporated equitably [5].

Many of the greatest challenges facing humanity, climate change among them, are disproportionately affecting people living in LMICs. And many of these challenges have arisen largely because of a long history of colonial exploitation and inequitable use of Earth's resources. Yet last year there were reports of systemic bias, and of the contributions of researchers from LMICs being sidelined, even in the Intergovernmental Panel on Climate Change.

Unfairness, inequity and a lack of diversity must no longer prevent the global research enterprise from maximizing scientific integrity and from realizing the ultimate societal value and benefits of research.


While OpenAI has been working on text and images, iGenius has been working on GPT for numbers – VentureBeat

Within a week of being launched, ChatGPT, the AI-powered chatbot developed by OpenAI, had over 1 million users, growing to 100 million users in the first month. The flood of attention from the press and consumers alike comes in part because of the software's ability to offer human-like responses in everything from long-form content creation and in-depth conversations to document search and analysis.

Uljan Sharka, CEO of iGenius, believes that generative AI has world-changing potential in the business world, because for the first time, data can be truly democratized. GPT stands for "generative pretrained transformer", a family of language models trained with supervised and reinforcement learning techniques; in ChatGPT's case, 45 terabytes of text data power all that content creation.

But what if generative AI could be used to answer essential data-related queries in the business world, not just to generate content?

"Up till now, data, analytics and even data democratization have been data-centered, designed for data-skilled people," Sharka says. "Business users are being left out, facing barriers to the information they need to make data-driven decisions. People are not about data. They want business answers. We have an opportunity today to shift the user interface toward language interfaces, and humanize data to make it people-centric."

But the interface is only a small percentage of what a complex system needs to perform in order to make this kind of information integrated, certified, safe, equal, and accessible for business decisions. Composite AI means bringing together data science, machine learning, and conversational AI in one single system.

"I like to think of it as the iPhone of the category, which provides an integrated experience to make it safe and equal," Sharka says. "That's the only way we'll have generative AI delivering impact in the enterprise."

As the gap between B2C and B2B apps has grown, business users have been left behind. B2C companies put billions of dollars into creating exemplary apps that are very user friendly, operable with a few taps or a conversation. At home, users are writing research papers with the help of ChatGPT, while back at work, a wealth of data stays siloed when the complex dashboards that connect it go unused.

In organizations, generative AI can connect every data product anywhere in the world and index it in an organization's private brain. And with algorithms, natural language processing and user-created metadata, or what iGenius calls "advanced conversational AI", data quality can be improved and elevated. Gartner has dubbed this "conversational analytics".

Virtualizing complexity unlocks unlimited potential to clean, manipulate and serve data for every use case, whether thats cross-correlating information or just bringing it together as one single source of truth for an individual department.

On the back end, generative AI helps scale the integration between systems, using the power of natural language to create what Sharka calls an "AI brain", composed of private sources of information. With no-code interfaces, integration is optimized and data science is democratized even before business users start consuming that information. It's an innovation accelerator that will cut costs, as the time it takes to identify and develop use cases is slashed dramatically.

On the front end, business users are literally having a conversation with data and getting business answers in plain natural language. Making the front-end user experience even more consumerized is the next step. Instead of a reactive, single-task platform, asking text questions and getting text answers, it can become multimodal, offering charts and creative graphs to optimize the way people understand the data. It can become a Netflix- or Spotify-like experience, as the AI learns from how you consume information to proactively serve up the knowledge a user needs.

From an architectural perspective, this natural language layer is added to the applications and databases that already exist, becoming a virtual AI brain. Connecting across departments unlocks new opportunities.

"This is not about using data more; this is about using data at the right time of delivery," Sharka says. "If I can use data before or while I make a decision, whether I'm in marketing or sales or supply chain, HR, finance or operations, this is how we're going to make an impact."

For instance, connecting marketing data and sales data means not only monitoring campaigns in real time, but correlating results with transactions, conversions and sales cycles to offer clear performance KPIs and see the direct impact of the campaign in real time. A user can even ask the AI to adapt campaigns in real time. At the same time, the interface surfaces further questions and areas of inquiry that the user might want to pursue next, to deepen their understanding of a situation.

At Enel, Italy's leading energy company, now focused on sustainability, engineers consume real-time IoT information, mixing finance data with data coming from the production plants and having conversations with that data in real time. Whenever their teams need to perform preventive maintenance, plan activities in the plant, or measure actual results against budgets, asking the interface for the synthesized information they need unlocks powerful operational analytics that can be acted on immediately.

ChatGPT has sparked massive interest in generative AI, but iGenius and OpenAI (which both launched in 2015) long ago realized they were headed in different directions, Sharka says. OpenAI built the GPT for text, while iGenius has built the GPT for numbers, a product called Crystal. Its private AI brain connects proprietary information into its machine-learning model, allowing users to train it from scratch. It uses more sustainable small and wide language models, instead of large language models, to give organizations control over their IP.

It also enables large-scale collaboration, in which companies can leverage experts and knowledge workers to certify the data used to train models and the information generated, to reduce bias at scale and provide more localized and hyper-personalized experiences. It also means you don't need to be a prompt engineer to safely work with, or apply, the data these algorithms provide to produce high-quality, actionable information.

"I've always believed that this is going to be a human-machine collaboration," Sharka says. "If we can leverage the knowledge that we already have in people or in traditional IT systems, where you have lots of semantic layers and certified use cases, then you can reduce bias exponentially, because you're narrowing it down to quality. With generative AI, and a system that's certified on an ongoing basis, we can achieve large-scale automation and be able to reduce bias, make it safe, make it equal, and keep pushing this idea of virtual copilots in the world."


The smart life sciences manufacturing market is projected to reach US$ 78,974.61 million by 2033, registering a CAGR of 14.4% from 2023 to 2033 -…

ReportLinker

Technology has been playing a major role in the healthcare sector; within it, the biotechnology industry has benefited most from recent technological advancements in data analytics, compared with other domains such as oncology, neurology and immunology.

New York, March 24, 2023 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Smart Life Sciences Manufacturing Market Forecast to 2033 - COVID-19 Impact and Global Analysis by Component, Technology [AR/VR Systems, Internet of Things, Artificial Intelligence, Cybersecurity, Big Data, and Others], and Application" - https://www.reportlinker.com/p06433207/?utm_source=GNW
Emerging data science technologies assist in the growth of the biotechnology industry.

Analysis of living organisms, research for novel drugs, and similar work are major roles played by biotechnology laboratories. Modern data analytics tools have allowed biotechnology researchers to create predictive analytics models and understand the most effective ways to achieve desired goals and objectives. Big data, AI, virtual reality, data visualization, and data security are among the common technologies used in biotech laboratories. Novozymes, a leading global biotechnology company headquartered in Bagsværd, just outside of Copenhagen, Denmark, has adopted Tableau, a data visualization tool, which revolutionized its approach to data analysis and collaboration. AstraZeneca plc, a British-Swedish multinational pharmaceutical and biotechnology company, uses data and technology to minimize the time to discovery and delivery of potential new medicines. The company has data science and AI capabilities embedded in its R&D departments, which allows scientists to push the boundaries of science to deliver life-changing medicines. Such adoptions are expected to boost the smart life sciences manufacturing market growth during the forecast period.

China accounts for the largest share of the smart life sciences manufacturing market in Asia Pacific, followed by Japan and India. China has one of the fastest-growing smart life sciences manufacturing markets globally.

Chinese firms have been among the keenest adopters of technology in APAC, as they seek to gain a competitive edge through digitalization and innovation. Manufacturers have collaborated with each other for the successful deployment of advanced technologies.

In March 2022, Triastek and Siemens Ltd., China signed a strategic collaboration agreement to provide digital technologies for the global pharmaceutical industry. Triastek is actively implementing a continuous manufacturing approach with advanced digital tools to streamline and improve its manufacturing processes and products. Such factors promote the revenue generation of smart life sciences manufacturing market players operating in China. Moreover, the country has a flourishing biotechnology industry, which has led to increased investments in biotech firms over the period. China-based biotech firms have seen a 100-fold increase in total market value, to more than US$300 billion, from 2016 to 2021. Biopharma companies are exploring China and other emerging markets for commercial expansion and for acquisition and partnership targets to boost innovation. Therefore, all the above-discussed factors will drive the smart life sciences manufacturing market growth in China during the forecast period.

ABB Ltd, Bosch Rexroth AG, Emerson Electric Co, Fortinet Inc, General Electric Co, Honeywell International Inc, IBM Corporation, Rockwell Automation, Siemens AG, and Sophos Group plc are among the key smart life sciences manufacturing market players profiled during this market study. In addition to these players, several other essential market players were studied and analyzed to get a holistic view of the global smart life sciences manufacturing market and its ecosystem.

The overall smart life sciences manufacturing market size has been derived using both primary and secondary sources. To begin the research process, exhaustive secondary research was conducted using internal and external sources to obtain qualitative and quantitative information related to the market.

The process also serves the purpose of obtaining an overview and forecast of the smart life sciences manufacturing market size with respect to all market segments. Also, multiple primary interviews have been conducted with industry participants and commentators to validate the data and gain more analytical insights.

Participants in this process include VPs, business development managers, market intelligence managers and national sales managers, along with external consultants such as valuation experts, research analysts and key opinion leaders specializing in the smart life sciences manufacturing market. Read the full report: https://www.reportlinker.com/p06433207/?utm_source=GNW

About ReportLinker: ReportLinker is an award-winning market research solution. ReportLinker finds and organizes the latest industry data so you get all the market research you need, instantly, in one place.



What is ChatGPT? Malone University professor explains AI bot – Canton Repository


Artificial intelligence has been making headlines in recent weeks as major tech companies like Google and Microsoft have announced new tools powered by the technology.

Experts say the rapid growth of AI could affect manufacturing, health care and other industries.

The Canton Repository spoke to Shawn Campbell, an assistant professor of computer science and cybersecurity at Malone University, about the rise of AI technology and what it means for the future.

Artificial intelligence is the ability of computers to perform functions typically done by humans, including decision-making and speech or visual recognition. The field has been around since the 1950s, according to an article from Harvard University.

Campbell said one type of AI commonly used in the medical field is expert systems. This technology uses knowledge databases to offer advice or make decisions regarding medical diagnoses.
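A minimal sketch of the rule-based pattern Campbell describes: a knowledge base of if-then rules is matched against reported findings to suggest advice. The findings, rules and suggestions below are invented purely for illustration, not drawn from any real clinical system.

```python
# Toy "expert system": each rule pairs a set of required findings
# with a piece of advice; every rule whose conditions are all
# present fires. All rules here are hypothetical examples.
RULES = [
    ({"fever", "cough"}, "possible respiratory infection: recommend clinical review"),
    ({"fever", "rash"}, "possible viral exanthem: recommend clinical review"),
    ({"chest_pain"}, "urgent: refer for cardiac assessment"),
]

def advise(findings):
    """Return the advice of every rule whose conditions are all present."""
    findings = set(findings)
    return [advice for conditions, advice in RULES if conditions <= findings]

print(advise(["fever", "cough"]))
# -> ['possible respiratory infection: recommend clinical review']
```

Real medical expert systems are far larger and add explanation facilities and uncertainty handling, but the core idea, matching a knowledge base against facts, is the same.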

A developer called OpenAI launched an AI chatbot in November 2022 known as ChatGPT. Users can interact with the chatbot and receive conversational responses.

Campbell said the rise of this technology has created competition between Microsoft and Google. Microsoft plans to invest billions into ChatGPT, and recently announced AI upgrades to its search engine, Bing. Google, meanwhile, has introduced new AI features in Gmail and Google Docs that create text.

The major tech companies are in an "arms race," Campbell said, to see who can develop the best AI technology.

There is some concern that AI technology will replace jobs traditionally held by humans. In some cases, it's already happened. For example, fast-food chain White Castle started installing hamburger-flipping robots in some of its locations in 2020 to reduce human contact with food during the cooking process.

Campbell said it's possible that AI will result in fewer employees doing certain tasks.

"If you have a product line that had 100 people on it, and they get a new type of machine in and kind of redesign the process, then they do it with 80 people or 60 people, ... I do think you're going to find more of the same jobs being done by fewer people, and those people being freed up to do other tasks, essentially," Campbell said.

Some worry that trucking jobs will disappear if developers figure out self-driving technology, but Campbell doesn't expect that to happen anytime soon.

Campbell said he expects AI to become a tool that makes life easier, noting that technology like adaptive cruise control and camera tools that let users remove unwanted objects from the background of photos involve AI and are already used on a daily basis.

"I think that's really the progression it will follow. ... It's being used as a tool, and it's making people's jobs easier, or making a long drive more enjoyable and safer as well," he said.

One of the biggest changes Campbell expects is for data science and analytics to be emphasized more in education. Some employers are already paying for their employees to receive data analytics and literacy training, he said, and Malone University recently added a data analytics major.

Campbell predicted these skills will become important in the job market and that educational institutions may start incorporating data analytics into general curriculum, like they do with writing and public speaking.

Reach Paige at 330-580-8577, pmbennett@gannett.com or on Twitter @paigembenn.

See the article here:

What is ChatGPT? Malone University professor explains AI bot - Canton Repository

GNA University organised Workshop on Big Data Analytics Tools – Cityairnews

Jalandhar, March 25, 2023: The Department of Computer Science and Engineering in the School of Engineering, Design and Automation (SEDA) organised a two-day workshop on Big Data Analytics Tools for B.Tech. CSE students. Rocky Bhatia, Solutions Architect at Adobe, was the resource person for the workshop; he has expertise in data engineering, system design, architecture, cloud technology, data science and other technology areas. The objective of the workshop was to make students comfortable with the tools and techniques required to handle large datasets.

Dr. Vikrant Sharma, Professor and Dean of SEDA, welcomed all the participants and the workshop expert. The workshop covered: an introduction to big data, distributed computing, HDFS architecture, blocks, replication, MapReduce, Hadoop, Hive Query Language, YARN, NoSQL, HBase, Oozie, real-time data mining using Spark Streaming, Elasticsearch, Kibana and Kafka. It was an interactive session in which students actively participated in hands-on training and asked about career options in the data science domain.

Gurdeep Singh Sihra, the Pro-Chancellor of GNA University, congratulated the participants on successfully completing the course, as well as the organisers of the event. "This event is simply the beginning of providing solutions to make a better future," he said, advising the students to take pride in being part of this prestigious course and wishing all the participants great success.

Dr. V.K. Rattan, the Vice-Chancellor of GNA University, congratulated the participants and encouraged them to take part in such events, as big data analytics is an emerging and highly in-demand technology.

Dr. Monika Hanspal, Dean of Academics at GNA University, interacted with the participants, highlighted the significance of big data, and said these technologies and hands-on practice will make students industry-ready.

Dr. Hemant Sharma, the Pro Vice-Chancellor, thanked Dr. Anurag Sharma, Professor and Head of CSE, and the organisers for arranging such a great event, and assured the participants that similar events will be planned in the near future.

Link:

GNA University organised Workshop on Big Data Analytics Tools - Cityairnews

‘We do not do the end of life well’ in America: How hospice can help … – Morningstar

By Jessica Hall

Jimmy Carter's use of hospice is shining a light on end-of-life care

Former U.S. President Jimmy Carter's decision to seek hospice care and spend his remaining time in care at home has helped bring awareness to an industry that quietly serves people at the end of their lives.

Hospice, which provides care and support services for patients who are terminally ill with a life expectancy of six months or less, offers care at the end of life -- something that people often are uncomfortable talking about.

"People have been trying for years to raise awareness of hospice. But it's marketing something no one wants to buy -- or at least talk about," said Amy Tucci, president of the Hospice Foundation of America. "President Carter has done more for hospice care than anyone I can remember. He raised awareness more in a day than the industry has in 10 years."

Read: Former U.S. President Jimmy Carter in hospice care

'To live well at the end of life'

The hospice industry has been growing, fueled by an aging U.S. population, as well as an increased awareness of what hospice provides. Hospice admissions grew to 1.7 million, up 42%, from 2011 to 2021, according to MedPAC. Half of all Medicare beneficiaries who die each year do so in hospice, with dementia as the primary diagnosis, Tucci said.

"There has been an awakening in America that we do not do the end of life well," said Elliott Wood, president and CEO of Medalogix, a healthcare data-analytics company that serves the hospice industry. "Physicians are trained to heal. And there's a massive lack of understanding and accountability about who should be having these end-of-life conversations."

Medalogix's Muse technology can predict, with an accuracy greater than 90%, whether a hospice patient will die within 10 days of the last hospice visit. Muse compares individual patient data with that of more than 60,000 active hospice patients, as well as with millions of records for patients who have since died, in order to predict a person's final days.

That prediction can allow caretakers to strategize a plan for the final days of care -- and give them time to talk to the family and the patient about death and what any wishes may be for final arrangements.
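The article does not describe Muse's internals. Purely as a sketch of the general shape of such a risk model, a logistic score over recent-visit features could look like the following; the feature names, weights and bias here are invented for illustration and are not Medalogix's method:

```python
import math

# Hypothetical features summarizing a patient's most recent hospice visits.
# Weights and bias are invented; a real model would be fitted to historical
# records of tens of thousands of patients.
WEIGHTS = {
    "declining_oral_intake": 1.4,
    "increased_sleep_hours": 0.9,
    "vital_sign_instability": 1.6,
}
BIAS = -2.5

def risk_within_10_days(features):
    """Logistic score: estimated probability of death within 10 days of the last visit."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

stable = {"declining_oral_intake": 0, "increased_sleep_hours": 0, "vital_sign_instability": 0}
declining = {"declining_oral_intake": 1, "increased_sleep_hours": 1, "vital_sign_instability": 1}
print(risk_within_10_days(stable))     # low probability
print(risk_within_10_days(declining))  # high probability
```

The point of such a score is operational: patients above a threshold can be flagged so staff and families have time to plan the final days of care.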

"Conversations about dying are very uncomfortable. Jimmy Carter seeking hospice created space for people to have an end-of-life conversation, and an openness for the general public to learn more about hospice," Wood said.

"There's a huge misnomer of hospice. It's an end-of-life benefit -- to live well at the end of life. It's not the patient giving up," Wood said. "It's a chance to make sure there's a will in place, that spiritual needs are being met and end-of-life wishes are met."

The art and science of healthcare

There are four levels of hospice care. Routine care is the most common level of care, usually provided in the home for a stable patient with controllable symptoms like pain or nausea. General inpatient care provides crisis-level care for short-term management of acute symptoms and is usually provided in a hospital or nursing home. Continuous home care is crisis-level care provided on a short-term basis at home. Meanwhile, respite care provides temporary care so a family member can get a break from caregiving.

Medalogix's Muse product serves 50 hospice-care provider customers, who together care for one in five hospice patients in the U.S.

An additional benefit of predicting the final days of a patient's life is that it helps the hospice providers plan their staffing needs better at a time of a labor shortage in the medical field, Wood said.

"The purpose of Muse is to ensure that patients are having the right care at the right time," Wood said.

"It's transformational. It's a mix of the art and science of healthcare. It combines the sacred calling of care -- which is why I got into it -- with a partner on the data-science piece," said Charlotte Mather, vice president of nursing-hospice at care provider AccentCare Inc., a home-health and hospice company that uses the Muse technology.

"Before we had this tool, we didn't have as much confidence as we do now. This gives us a different lens to see the patient's care," Mather said. "We can do a better job with the quality of their life in those final days. Whether it's saying it's time to have the family say their goodbyes, to looking at the cultural beliefs or needs, this gives us the time to help them on their journey," she said.

"It allows you to have a conversation from a place of confidence that this is what we expect to see in the next week or the next few days. It helps us anticipate what they will need from us," Mather said.

Muse also helps hospice providers meet the Centers for Medicare & Medicaid Services (CMS) requirements for an additional payment called a service intensity adjustment, which requires a registered nurse or social worker to visit the patient on two of the last three days of the patient's life.
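That two-of-the-last-three-days visit condition is concrete enough to express in code. Below is a simplified sketch of the check as described in the article; actual CMS billing criteria involve more conditions than this:

```python
from datetime import date, timedelta

def qualifies_for_service_intensity_adjustment(death_date, rn_or_sw_visit_dates):
    """Simplified check: did a registered nurse or social worker visit on at
    least two of the last three days of the patient's life (counting the day
    of death)? Not a full implementation of CMS billing rules."""
    last_three = {death_date - timedelta(days=i) for i in range(3)}
    return len(last_three & set(rn_or_sw_visit_dates)) >= 2

visits = [date(2023, 3, 18), date(2023, 3, 20)]
print(qualifies_for_service_intensity_adjustment(date(2023, 3, 20), visits))  # True
```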

"CMS shaped the payment to drive behavior to provide the appropriate care," Wood said.

Without having the proper personnel visiting a hospice patient in the final days, families can be ill-prepared for the end. As a result, they might panic and call an ambulance, putting the patient in the emergency room or hospital to die in a facility instead of at home where they wanted, Wood said.

That's also a more costly outcome.

"The role of hospice is to avoid that very bad outcome and provide care and comfort in the final days," Wood said.

Do you have questions about retirement, Social Security, where to live or how to afford it at all? Write to HelpMeRetire@marketwatch.com and we may use your question in a future story.

-Jessica Hall

This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.

(END) Dow Jones Newswires

03-25-23 1440ET

Follow this link:

'We do not do the end of life well' in America: How hospice can help ... - Morningstar

The science of sailing: inside the race across the world's most remote ocean – The Guardian

After a long hiatus, the epic Ocean Race is back. But this year, as well as dodging icebergs, cracking masts and suffering the occasional hull sandwich failure, the teams are gathering crucial data from places even research vessels rarely reach

Yvonne Gordon

Sat 25 Mar 2023 09.00 EDT

The Southern Ocean is not somewhere most people choose to spend an hour, let alone a month. Circling the icy continent of Antarctica, it is the planet's wildest and most remote ocean. Point Nemo, just to the north in the South Pacific, is the farthest location from land on Earth, 1,670 miles (2,688km) away from the closest shore. The nearest humans are generally those on the International Space Station when it passes overhead.

But on 21 March, four sailing teams came through here as part of a marathon race round the bottom of the Earth, from Cape Town in South Africa to Itajaí in Brazil.

By the time these 18-metre (60ft) Imoca monohull sailing yachts neared Point Nemo, the five sailors on each boat had already been at sea for 23 days, with another two weeks to go before they reach port in early April. And this is just leg three, the longest portion of the even longer Ocean Race, a 32,000-nautical-mile dash around the world that started in January and finishes in July.

Competition is fierce and racing is close, even after three weeks at sea. Boat speeds on leg three so far have been up to 40.5 knots, the equivalent of gale-force winds, and the vessels have, subject to ratification, broken the 24-hour distance record multiple times. The crews survive on freeze-dried food (rehydrated with hot water from a kettle; there is no kitchen) and operate a four-hour alternating watch system. Nobody gets much sleep. The toilet is a bucket.

The dangers are unpredictable. Winds in the Southern Ocean can reach up to 70 knots, and hitting an iceberg at speed would be catastrophic, so the boats have to steer clear of an ice exclusion zone around Antarctica. On 1 March, the Team Malizia crew discovered a crack at the top of the mast, requiring one of them to climb up 28 metres in rough seas to patch it over in the middle of the night. Guyot Environnement Team Europe had to abandon leg three completely after suffering a hull sandwich failure, spending three nervous days sailing 600 nautical miles back to Cape Town for repairs. The last time Team Holcim-PRB's skipper Kevin Escoffier raced here, his boat broke in half and sank, and he was rescued from his life raft.

The Ocean Race is sometimes called the toughest, and is certainly the longest, professional sporting event in the world. It began in 1973 as the Whitbread Round the World Race, later became the Volvo Ocean Race, and attracts professional sailors of the highest level, who join mixed crews every few years on sponsored teams to vie for an overall trophy (there is no cash prize). But this year, scientists have smelled an opportunity to benefit as well.

Because the boats visit the most remote part of the ocean, which even scientific vessels struggle to access, this year the crews will seed scientific instruments all around Antarctica, aiming to measure 15 different types of environmental data from ocean temperature and atmospheric indicators to concentrations of microplastic.

Information from the devices will help with everything from weather forecasting to insights into the climate emergency. The Southern Ocean is one of the planet's largest carbon dioxide sinks, for example, but its inaccessibility has meant that there is relatively little CO2 data available.

"The Southern Ocean is a very important driver of climate on a global scale [but] there is very little data," says Toste Tanhua, a chemical oceanographer at the Geomar Helmholtz Centre for Ocean Research in Kiel, Germany. "Data from the sailing races in the Southern Ocean is very important for us to understand the uptake of carbon dioxide by the ocean."

Each boat is equipped with weather sensors on board that measure wind speed and direction, barometric pressure and air temperature. Each team will drop two surface drifter buoys provided by organisations such as Météo-France and the US National Oceanic and Atmospheric Administration, which capture data to help the World Meteorological Organization study ocean currents and forecast extreme weather events such as hurricanes.

A second type of buoy, the Argo profiler, deployed by Team Malizia in leg two, operates below the surface at depths of up to 2km, moving slowly with deep currents and transmitting information every 10 days. The data is used for climate analysis as well as for long-range weather forecasts.

Meanwhile, 11th Hour Racing Team and Team Malizia are using OceanPacks to take regular water samples to measure levels of carbon dioxide, oxygen, salinity and temperature, to be analysed in Germany and fed into SOCAT, the Surface Ocean CO2 Atlas. Live dashboards give snapshots of the data as it is collected out at sea.

Tanhua says the fresh data reveals new patterns. For example, it shows how carbon dioxide varies over a year: higher when the water warms up in summer, lower during a phytoplankton bloom. It also shows how the ocean takes carbon from the surface and transports it into the depths. "In the Southern Ocean, you have three major frontal systems where water is either going down vertically (sinking) or coming up," Tanhua says. "That has very different carbon levels. Eddies also transport carbon up and down." Scientists will now be able to observe these fronts and eddies up close, compare them with satellite data and fill in the gaps.

The boats are also sampling trace elements such as iron, zinc, copper, cadmium, nickel and manganese, which are essential for the growth of plankton. Not only is plankton the base of the food chain, but phytoplankton are responsible for most of the transfer of CO2 from the atmosphere to the ocean.

"This data is extremely important," says Dr Arne Bratkič, an environmental biogeochemist at the University of Lleida, Spain, who analyses the trace element results. "It is important to know how much food is available for animals that will feed on phytoplankton eventually, and how much CO2 the phytoplankton is going to absorb from the atmosphere."

Bratkič says that sampling such as this normally requires dedicated scientific voyages, on which places are limited and expensive. The Ocean Race is a way of testing investigations on non-scientific platforms at sea. "We are paying attention to the design of the samplers: what works and what does not," says Bratkič. "It's really exciting."

To add to the plankton study, Team Biotherm is working with the Tara Ocean Foundation to study ocean biodiversity, and the sailors have an automated onboard microscope to record images and provide insights into the diversity of phytoplankton species.

They are also studying oxygen. Dr Véronique Garçon, a senior scientist for France's National Centre for Scientific Research at the Observatoire Midi-Pyrénées at the University of Toulouse, France, wants to better understand ocean deoxygenation, which is being caused by global heating.

"Boats sail through remote parts of the world ocean where observations are really scarce," she says. "Getting more oxygen data is invaluable to yield a better estimate of the ocean oxygen inventory and thus of the oceanic oxygen loss."

"The more data we have, the more accurately we can understand the ocean's capacity to cope with climate change and predict what will happen to the climate in future."

The 2017-18 race made headlines after samples taken near Point Nemo showed that even these remote waters are polluted with microplastics. This year, two teams, Holcim-PRB and Guyot Environnement, are again taking water samples to test for microplastics but can now also analyse them to determine their product source (for example, a bottle or plastic bag).

"We still have a very poor understanding of the abundance and distribution of very small microplastics in the ocean, and it's quite difficult to collect them," according to Dr Katsiaryna Pabortsava, a biogeochemist at the National Oceanography Centre in Southampton who is helping to analyse the samples.

"The Ocean Race will be delivering samples from places that otherwise we'd have difficulty getting samples from," Pabortsava says. "The other thing is the ease of collection of those samples. You don't need trained personnel, as you would have with research vessels." The hope is that this type of sampling could eventually be employed on other non-scientific ships, such as cruise ships or ferries.

The sailors benefit, too: in such a close and dangerous race (Team Holcim-PRB had a lead of 600 nautical miles at one stage, but light winds brought the pack together again and by 20 March just 5.1 nautical miles separated the four boats), every piece of information is vital.

"It's a win-win situation, because six hours [after dropping the buoys], the sailors will download a new weather bulletin, using data from the buoys," says Martin Kramp, ship coordinator at OceanOPS, the monitoring, coordination and implementation support centre of the Global Ocean Observing System.

"In such data-sparse areas as the Southern Ocean, [that] can make a significant difference: the forecast will be much better."


See original here:

The science of sailing: inside the race across the world's most remote ocean - The Guardian

As tech company layoffs continue, Berkeley advisors share advice with students | CDSS at UC Berkeley – University of California, Berkeley

UC Berkeley advisors are assuring data and computer science students the technology job market isn't as bleak as it appears. (Photo: Garry Knight, CC BY 2.0)

Thousands of UC Berkeley data science and computing students are preparing to graduate in May amidst a slew of technology company layoffs. Some are worried about finding a job.

Berkeley advisors are urging expected graduates to take a beat and embrace a more nuanced view of the job market. They're reminding students of what they can control in their search and what they've already accomplished. And they're telling students to give themselves grace.

"This generation will have 12-plus jobs in their lifetime. I often emphasize to students that this is just the first one," said Amanda Dillon, a Berkeley Data Science Undergraduate Studies advisor. "Students tend to really overwhelm themselves with expectations, [but] it's just a stepping stone."

More than 500 technology companies have cut roughly 148,000 jobs in 2023 alone, according to the layoff-tracking website Layoffs.fyi. That's on top of about 161,000 last year. But, advisors say, the future isn't as bleak as it initially appears for technical workers if students can be flexible.

The technology industry is cutting jobs in reaction to rising interest rates from the Federal Reserve and adjustments to a post-pandemic economy, according to reports. But it matters who technology companies are laying off, advisors said, and only some of that information is known.

Experts tracking these layoffs have said many of the cuts affected non-technical workers such as recruiters. Bennett Agnew, director of external relations for the Department of Electrical Engineering and Computer Sciences, noted that it's unclear how concentrated the cuts have been in senior or managerial roles and whether they will carry into new-graduate or internship hiring.

Regardless of who is being cut, a CompTIA analysis of federal data shows the technology market is still healthy, said Rebecca Andersen, the School of Information's senior director of student and alumni career development. Demand remains high for workers with expertise in areas like artificial intelligence, and the technology unemployment rate remains low.

"Layoffs create uncertainty, and that can be scary," Andersen said. "The good news is, if we look at the overall context of the technology industry and especially technology jobs, this is still a healthy market and there are plenty of opportunities."

Advisors are encouraging students to lean into Berkeley advising opportunities, from one-on-one sessions to job fairs. Students should reach out to Berkeley alumni to learn about potential career paths and to network. They should also search for jobs on company websites, not just LinkedIn or other job boards.

One major opportunity: look for roles outside of the tech sector. Job opportunities in non-technology industries are increasing at a higher rate than at technology companies, Bain & Company found. To make that idea actionable, advisors ask students to think about what companies they like, where their skills can make a difference and what problems they can solve for society.

"Any company that gets large enough is going to have a need to use the data that they're collecting," said Dillon, urging students to think of industries like healthcare, government and retail. "It really does align well with our mission to equip students with these 21st-century skills."

Students should also be flexible with their expectations and consider compromise, Dillon and others said. Be ready to work in an office, not remotely. Be willing to move. Understand that moving or taking a job outside of large technology companies may affect your starting salary. Maybe even consider starting your own company.

This can sound intimidating. But advisors emphasized that Berkeley students are uniquely equipped through their highly ranked, interdisciplinary and innovative education to thrive in data science and computing jobs, wherever they land.

"Many students begin their education with a specific career path in mind, but they often arrive at very different destinations, intellectually and professionally, than what they previously thought was possible. This is, after all, the purpose of attending a university: to expand one's horizons," said Agnew. "Our students will have the tools to not just participate as leaders in well-developed industries, but to create new markets and novel paradigms of technology for productivity and social good."

See original here:

As tech company layoffs continue, Berkeley advisors share advice with students | CDSS at UC Berkeley - University of California, Berkeley

Data Science is formally approved as major The Elm – blog.washcoll.edu

By Emma Russell

Copy Editor

A new data science major and minor made its way to Washington College this semester, courtesy of the Department of Mathematics and Computer Science.

The program was officially put together in 2021 but was only approved last year, according to Assistant Professor of Mathematics Dr. Jordan Tirrell.

"This is our first semester that it's actually a major, and there's very little written about it so far. I would love to get more people aware of it," Dr. Tirrell said.

This new major and minor were in the works for several years. Dr. Tirrell said it was something he had hoped to get running at the college ever since he was hired.

"This is my fourth year at WC, and when I was hired, I was really interested in doing stuff with data science programs," Dr. Tirrell said. "I taught a lot of statistics for data science in the past and I'd worked in data science in the past, and the college was really interested in getting something like this running."

John W. Allender Associate Professor of Ethical Data Science Dr. Kyle Wilson, who oversees the program, received a grant from the Maryland E-Nnovation Initiative Fund, a matching fund with the goal of supporting endowed chairs at Maryland's higher education institutions, according to Maryland's Department of Commerce website.

"The endowed chair thing basically lets me focus my time on starting the student data science program, so my courses will mostly turn into teaching data science courses. All of the administrative nonsense that goes with starting a program becomes my responsibility. Also, I have responsibilities to promote the program and talk to people about it," Dr. Wilson said.

According to Dr. Wilson, there is no one way to define data science.

"I would see [data science] primarily as a tool for inquiry. As a discipline, it's not very useful, because it mostly provides tools for seeking answers to bigger questions that often come from elsewhere. I really encourage data science students to pursue a variety of interests," Dr. Wilson said. "I think that's just great as a human being, but it also makes them better data scientists; trying to apply technical tools to problems without domain knowledge tends to lead to us very confidently saying very stupid things."

Despite data science being a difficult subject to define, both Dr. Tirrell and Dr. Wilson agree that it has a myriad of educational and career opportunities useful to WC students.

Junior Parker Hayden took a data science course last semester that he said was fun, and inspired his decision to look into declaring a major or minor.

Hayden said that data science was appealing to him because "I think it's effective. It's similar to computer science, and I fell in love with computer science, but it has that feel of math, my actual major, and I think it just combines [them] in a way that seems interesting."

Dr. Tirrell said that data science is a fairly new field, and when he was working in the field in 2008, there was not even a name for it yet, but things have changed in the last few years.

"Data science is one of the hottest fields right now, so graduates with data science degrees are in extremely high demand," Dr. Tirrell said.

Dr. Tirrell stresses that the new major and minor are beneficial to those outside of the Department of Mathematics and Computer Science.

"The data science major goes really well with a lot of other disciplines ... We would really like to see people who might not consider a math or computer science major to consider a data science major, especially as a double major or as a minor with whatever they're doing, because data science is really involved in basically every field, even something you might not think of as quantitative, like English," Dr. Tirrell said.

According to sophomore Jack Poleto, Sports Editor for The Elm, who has already declared a data science major, "Data science is one of the most versatile degrees you can get. There's data science in everything and every industry."

Despite the versatility of the degree, there are currently only five declared majors and minors, according to SGA's state-of-the-major survey results.

"I just hope [the program] grows in general, because there's only a few of us right now. There's next to no one in the major. So as long as it grows, I'm happy. [The major is] very well outlined right now; it's not like a terribly difficult major to complete," Poleto said.

Dr. Wilson believes that all students can benefit from the new major and minor, even those who do not declare.

"We want [students] to have some exposure to one of our [data science] classes, either by taking it or by having a friend take it and say, 'Actually, you know, I didn't think I would like this. I never thought I'd be a computer person or spreadsheet person, but I do have interest and I do have questions, and this is relevant to those,'" Dr. Wilson said.

The program was created from the ground up, giving the Department of Mathematics and Computer Science a chance to experiment and add their own special touches to it.

"We created this data science major ourselves, and [data science] is new enough that there aren't really national standards on what should go into a data science major. We spent a lot of time looking at other programs and trying to come up with what it should look like here. I think what we came up with is really pretty cool and somewhat unique. We have a particularly large number of courses custom-built for the program, but we also were able to work an ethics emphasis in throughout," Dr. Wilson said. "I think our program has an interdisciplinary feel to it that a lot of other programs haven't been able to pull off. I'm proud of the work we did; I think it'll be a good experience for the students."

Elm Archive Photo

Photo Caption: Students looking to expand their areas of study or focus on data science specifically will now have fresh opportunities to do so.

Read more from the original source:

Data Science is formally approved as major The Elm - blog.washcoll.edu

Alteryx Announces Partnership with the Department of Defense to … – PR Newswire

SkillBridge Program provides training to prepare active-duty service members with critical data and analytics skills as they enter civilian life

IRVINE, Calif., March 23, 2023 /PRNewswire/ -- Alteryx, Inc. (NYSE: AYX), the Analytics Cloud Platform company, today announced it is now an authorized partner for the Department of Defense (DoD) SkillBridge program. Alteryx will help active-duty service members transition to civilian careers by providing them with real-world industry training and certifications in data analytics, as part of the Alteryx SparkED no-cost education program.

"We launched the SparkED program in 2021 to expand data literacy and analytics skills among all learners so that they can use this skill set to solve real-world problems," said Libby Duane Adams, co-founder and chief advocacy officer at Alteryx. "We couldn't be happier to partner with the Department of Defense SkillBridge program and support our transitioning service members and their spouses who sacrifice so much in service to our nation."

By 2025, the top three job roles that will see increasing demand across industries are data analysts and scientists, AI and machine learning specialists, and big data specialists.[1] With these roles growing in importance, the Alteryx SparkED and SkillBridge partnership will help service members jumpstart their civilian careers with relevant in-demand data and analytics skills. The curriculum provides a vendor-agnostic approach to data science education, as well as hands-on training with Alteryx's industry-leading analytics automation solution to enable service members to transform data into business insights. Through the program, participants receive a free Alteryx Designer license, interactive learning paths, and access to the Alteryx Community to guide their training.

"This partnership is a testament to our commitments to the public sector and supporting our service members," said David Colberg, Navy veteran and vice president, Global Government Affairs and Public Policy at Alteryx. "The DoD SkillBridge Program and partnership with Alteryx SparkED will build an incredible talent pipeline equipped with in-demand analytics skills while assisting our transitioning warriors and their families with the next chapter of their careers."

The DoD SkillBridge program enables service members to gain valuable civilian work experience through specific industry training, apprenticeships, or internships during the last 180 days of service. Each year, SkillBridge connects 200,000 service members[2] with industry partners in real-world job experiences.

Learn more about the Alteryx SparkED and SkillBridge partnership and apply today.

[1] (2020). The Future of Jobs Report 2020. World Economic Forum. https://www.weforum.org/reports/the-future-of-jobs-report-2020
[2] (2022, April 22). Military Discharge Data. DoD SkillBridge. https://skillbridge.osd.mil/separation-map.htm

About Alteryx
Alteryx (NYSE: AYX) powers analytics for all by providing our leading Analytics Automation Platform. Alteryx delivers easy end-to-end automation of data engineering, analytics, reporting, machine learning, and data science processes, enabling enterprises everywhere to democratize data analytics across their organizations for a broad range of use cases. More than 8,000 customers globally rely on Alteryx to deliver high-impact business outcomes. To learn more, visit http://www.alteryx.com.

Alteryx is a registered trademark of Alteryx, Inc. All other product and brand names may be trademarks or registered trademarks of their respective owners.

SOURCE Alteryx, Inc.

Continued here:

Alteryx Announces Partnership with the Department of Defense to ... - PR Newswire