
Aspiring Data Scientists Research the Rise in Gasoline Prices – University of Pennsylvania

The essence of data science is the study of data to extract useful insights for business.

During the 2023 Women in Data Science @Penn Conference, held at the University of Pennsylvania's Perry World House, Linda Zhao, a Wharton School professor of statistics and data science, put it like this: "You want to solve real-world problems? Get the data, make sure you have clean data, do the analysis, and make a story out of it. It has to make sense and you have to be able to present it beautifully to a group of people."

This is Professor Zhao's mantra as she introduces high school students to a data-centric mindset during the Data Science Academy, a summer program held on Wharton's Philadelphia campus each July.

Gas Price Trends

At the recent WiDS Conference, sponsored by the Wharton School and Penn Engineering, a team of high school students who attended Zhao's summer academy traveled to campus from around the world to show off their data-science dexterity, dazzling the crowd with their problem-solving around a timely economic issue: escalating gasoline prices.

The four students, Karen W. from New Jersey, Brian L. from California, Jennifer L. from Texas and Christine L. from Hong Kong, detailed their data-mining process and findings in the presentation "Hey, What's Up? Gas Prices: Analyzing the Influences of U.S. Gas Price Trends."

"When we started our investigation this past summer, gas prices in the U.S. were at an all-time high, with very high fluctuations as well," said Christine, a junior at Hong Kong International School. "Recent events such as the Russia-Ukraine war and Covid-19 were impacting U.S. retail gas prices dramatically. Many Americans were feeling the effects of these expensive costs and couldn't help but wonder: What factors are affecting these prices? We thought it might be interesting to look at what was causing this unprecedented rise."

During their study, the group made use of several data-science techniques, including multiple linear regression, LASSO, text mining and random forest, to help them drill down on two key questions:

The exploration began with research into gas supply chains to identify what categories of variables, or factors that could be measured, the team might need to predict gas prices. Within four broad categories (economics, energy, weather, and Google search trends for the words "oil" and "gas"), they assembled 38 different variables into a single dataset of longitudinal data spanning January 2000 to June 2022.
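The article doesn't show the team's code, but the assembly step it describes, aligning monthly series from several sources on a common date index, can be sketched with pandas. The column names and values below are illustrative stand-ins, not the team's actual data:

```python
import numpy as np
import pandas as pd

# Synthetic monthly series standing in for the team's variable categories;
# the column names here are hypothetical, not from the actual project.
dates = pd.date_range("2000-01-01", "2022-06-01", freq="MS")  # Jan 2000 to Jun 2022
rng = np.random.default_rng(0)

economics = pd.DataFrame({"date": dates, "median_income": rng.normal(55, 5, len(dates))})
energy = pd.DataFrame({"date": dates, "oil_imports": rng.normal(300, 30, len(dates))})
trends = pd.DataFrame({"date": dates, "search_gas": rng.uniform(0, 100, len(dates))})

# Align everything on the shared date column: one row per month,
# one column per candidate variable.
panel = economics.merge(energy, on="date").merge(trends, on="date")
print(panel.shape)  # (270, 4): 270 months, date plus 3 variables
```

The same merge-on-date pattern scales to all 38 variables; misaligned or missing months surface immediately as a shorter result, which makes this a useful sanity check before modeling.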

For a detailed discussion of the team's data mining and research, we encourage you to watch the students' presentation on YouTube.

Cold Weather, Canadian Imports and CO2

Ultimately, however, the team used a selection model known as LASSO to arrive at the 10 most significant variables for their data analysis of gas prices.
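LASSO (least absolute shrinkage and selection operator) adds an L1 penalty to least-squares regression that drives the coefficients of uninformative predictors exactly to zero, which is why it doubles as a variable-selection tool. A minimal sketch with scikit-learn, using synthetic data in place of the team's 38 predictors:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n, p = 270, 38                                    # months of data, candidate variables
X = rng.normal(size=(n, p))
true_coef = np.zeros(p)
true_coef[:10] = rng.uniform(0.5, 2.0, size=10)   # only 10 variables truly matter
y = X @ true_coef + rng.normal(scale=0.5, size=n)

# Standardize first: the L1 penalty is scale-sensitive.
X_std = StandardScaler().fit_transform(X)
model = LassoCV(cv=5, random_state=0).fit(X_std, y)  # penalty chosen by cross-validation

selected = np.flatnonzero(model.coef_ != 0)       # indices of surviving variables
print(f"kept {len(selected)} of {p} variables")
```

The variables whose coefficients survive the penalty are the "selected" ones; the team's 10 significant factors would correspond to the nonzero coefficients of a fit like this.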

"Our 10 factors gave us some interesting results," said Christine. "We found that when U.S. median income increased, gas prices would increase by $0.09 per gallon. This is because when people's income increases, they have more money to consume gas and gas-related products like cars, therefore increasing the demand for gas and raising gas prices."

"We also found that when oil imports to the U.S. increase, gas prices would also increase," added Brian, a senior from Lynwood High School in California. "That's because when oil imports to the U.S. increase, gas prices have to reflect the increased supply chain and transportation costs, leading to higher prices."

The group also studied the variable of operating oil rotary rigs, which is basically the hardware that drills for oil. They found that an increase in operating oil rotary rigs also led to an increase in gas price. "We hypothesized that this is because a higher number of rigs in operation signifies higher demand for gas, thus increasing gas prices," noted Brian.
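The article names random forest among the team's techniques but doesn't show how it was applied. One common use is ranking predictors by feature importance; here is a hedged sketch on synthetic data (the variables and coefficients below are illustrative, not the team's estimates):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 270
rigs = rng.normal(500, 50, n)      # operating rotary rigs (illustrative values)
income = rng.normal(55, 5, n)      # median income in $1,000s (illustrative values)
noise = rng.normal(size=n)         # a deliberately irrelevant variable

# Synthetic price depends on rigs and income, not on the noise column.
price = 0.004 * rigs + 0.09 * income + rng.normal(scale=0.1, size=n)

X = np.column_stack([rigs, income, noise])
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, price)

for name, imp in zip(["rigs", "income", "noise"], forest.feature_importances_):
    print(f"{name}: {imp:.3f}")    # the irrelevant column should rank last
```

Unlike a linear model, the forest captures nonlinear effects, so its importance ranking is a useful cross-check on the variables LASSO selects.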

With these, as well as other findings that they touched on during their presentation, the team gained insight into why gas prices were so high, and how to potentially lower gas prices in the future. Here are their key project takeaways:


San Pedro I opens its doors for eighth annual RowdyHacks on … – UTSA

A hackathon (the name is a portmanteau of "hacking" and "marathon") is a type of social event where coders, programmers and others interested in computer science fields come together to learn, collaborate and compete for prizes within a set period of time, frequently 24 hours. Historically, RowdyHacks has also served students by incorporating workshops and providing networking opportunities with local tech industries.

Yet RowdyHacks is for more than hardcore coders. The hackathon features multiple tracks and challenges to include hackers of all skill levels and areas of specialization. These include beginner and advanced tracks, as well as challenges for the best retro hack and best hardware hack, a dedicated cybersecurity challenge and more.

"That's the great thing about a hackathon," said RowdyHacks Director and junior computer science major Quynh (Quincy) Nguyen. "You don't have to know anything; you don't spend anything because it's free, and you can come and learn all these different things."

Nguyen has personally experienced the trepidation that first-time hackers face.

"I remember going into my first hackathon being so scared, not knowing anyone because I was so new and not sure of my technical skills yet," she said. "But I came out of it running to Google looking for the next hackathon to go to because it was just so fun. That's what kind of encouraged me to run to become RowdyHacks director, because I want to help others experience what I did."

While the students organizing RowdyHacks have made great strides to ensure the event is approachable for first-time hackers, they have also made sure there will be plenty to attract experienced hackers and repeat attendees. Regardless of level, they understand that the competition provides students with a hands-on learning experience to test the skills they've learned in the classroom.

"There's something new that happens every year within RowdyHacks, and that's the change of theme," said Vamshi Ponnala, RowdyHacks co-director and UTSA senior computer science major.

For example, Ponnala says last year's theme was "Retro," while this year's, "Into the Unknown," deals with exploration.

RowdyHacks has something new and exciting for everyone, says Ponnala.

"We've got new prizes, new swag, new workshops, new companies to talk to," he said. "There's just a lot of new things, especially the building; we've got a new, big building."

That new, big building is San Pedro I, and the directors of RowdyHacks are excited to commandeer the flagship of the School of Data Science, if only for 24 hours.

On a practical level, SPI can provide the hackers with more space than their previous, well-loved accommodations on the UTSA Main Campus. More philosophically, some view SPI as a symbol of the thriving technology industry located in Downtown San Antonio, part of the Silicon Hills that run north to Austin. Elijah Moya is one who shares this view.

"When we had the opportunity to be here at San Pedro I, it was like my dreams were coming true," he said. "We can show UTSA students and people interested in technology that there's technology in San Antonio, and we're going to use this data science building to show it off to everyone."

This is especially true this year, Moya adds, as the 2023 RowdyHacks will be the first to have a significant presence of participants from outside San Antonio. For the last year, Moya and other ACM members have been attending hackathons across Texas to spread the word about RowdyHacks. The results of their outreach efforts are evidenced by more than 500 students who signed up this year.

"This isn't just UTSA students. These are people from all across Texas that are going to come to UTSA and experience RowdyHacks and experience the fun," Moya said. "Our motto is 'Howdy, howdy, let's get Rowdy,' so we want to be rowdy. We want to have fun and we want to make sure this is a great experience for everyone."


The Cape Town Statement on fairness, equity and diversity in research – Nature.com

As awareness has grown about fraud and misconduct in science, the World Conferences on Research Integrity have become a leading forum for the discussion and study of ways to promote responsible behaviour in research. Since the first meeting in 2007, which was held in Lisbon, the events have helped to establish an academic field focused on research integrity.

Meetings have typically concentrated on issues such as research misconduct; responsible behaviour around data collection, analysis, authorship and publication; and the importance of reproducibility. But last May, attendees at the 7th World Conference on Research Integrity, held in Cape Town, South Africa, took a significant step. They added the myriad ways in which research programmes and practices disadvantage those living in low- and middle-income countries (LMICs) to the suite of issues that threaten the integrity of science.

Three of the six previous world conferences have led to the publication of guidelines or principles. We are part of a working group (including bioethicists, researchers, institutional leaders and journal editors) that now presents the Cape Town Statement on Fostering Research Integrity Through the Promotion of Fairness, Equity, and Diversity. (See Supplementary information for a list of working-group members.)

This statement entails 20 recommendations, drawn from discussions involving around 300 people from an estimated 50 countries, including 16 African nations and 5 South American ones. The discussions were held over 18 months before, during and after the Cape Town conference, the theme of which was "Fostering research integrity in an unequal world."

The Cape Town Statement is essentially a call to action that we hope will help to turn the global conversation on inequity and unfairness in research into changes in practice by all stakeholders. Here, we lay out the motivation for the statement, and its broad goals.

The reasons for the statement are clear. Much too often, researchers and institutions from high-income countries reap greater benefits from global collaborations than do LMIC collaborators, whether in relation to numbers of papers published, authorship, career progression, setting priorities for research or the ownership of samples and data [1].


As an indication of this, a look at the authorship of papers about COVID-19 from the 10 top medical and global-health journals (according to impact ratings), containing content related to Africa or any African country, and published during the first 9 months of 2020, reveals that 66% of the authors were not from Africa [2]. One in five articles had no author from Africa at all. What's more, of those papers with African authorship, 59% of first authors and 81% of last authors were not from Africa, and only 14% of papers had both an African first and last author.

Often, what happens is that after securing a grant for a project, a research team from a high-income country looks for local researchers in the low- or middle-income country of interest to collaborate with. Local researchers might be offered some grant money and co-authorship on a paper (usually with their name appearing in the middle of the list). Invariably, the lead research team conducts the analyses, with the local researchers only reviewing manuscripts, often to ensure that they are culturally and politically acceptable [3].

Even the push towards openness and transparency in science publishing, which many have argued is a way to foster greater integrity in research, has created more barriers for investigators in low-resource environments.

Sharing data, for example, requires having enough institutional infrastructure and resources to first curate, manage, store and (in the case of data relating to people) encrypt the data, and to deal with requests to access them. Also, the pressure placed on researchers in LMICs by high-income-country funders to share their data as quickly as possible frequently relegates them to the role of data collectors for better-resourced teams. With enough time, all sorts of locally relevant questions that were not part of the original project could be investigated by local researchers. But well-resourced investigators in high-income countries who were not part of the original project are often better placed to conduct secondary analyses.

Unforeseen difficulties are arising around publishing, too. Currently, the costs to publish an article in gold open-access journals (which typically range from US$500 to $3,000) are prohibitive for most researchers and institutions in LMICs. The University of Cape Town, for example, which produces around 3,300 articles each year, has an annual budget of $180,000 for article-processing costs. This covers only about 120 articles per year, roughly $1,500 each.

Because of this, researchers in these countries frequently publish their papers in subscription-based journals. But scientists working in similar contexts can't access such journals because the libraries in their institutions are unable to finance subscriptions to a wide range of journals. All this makes it even harder for researchers to build on locally relevant science.

Such imbalances in global research collaborations, which stem from a complex mix of racial discrimination, systemic bias and major disparities in funding and resources, impact the integrity of research in numerous ways.

Scientists in Kenya work on a malaria vaccine at the KEMRI-Wellcome Trust laboratories in Kilifi. Credit: Luis Tato/Guardian/eyevine

As recently as 2019, a group of students and their professor at a university in South Africa drew racist conclusions from their study of the cognitive abilities of Coloured South African women (Coloured is a recognized racial classification in South Africa). Their findings have since been debunked, and an investigation concluded that there was no deliberate intention to cause harm. A lack of diversity among the researchers, and possibly among the reviewers and editors, might have contributed to the publication of this research, which has now been retracted by the journal involved.

Power imbalances also skew research priorities, with investigators pursuing goals that frequently overlook the needs of local people.

The Cape Town Statement is not the first guideline on research fairness and equity, particularly in collaborations. Indeed, various documents informed our discussions, including the guiding principles of the Commission for Research Partnerships with Developing Countries (KFPE), which focuses on collaborations involving Swiss institutions; the Global Code of Conduct for Research in Resource-Poor Settings, a resource for those striving to ensure that science is carried out ethically in lower-income settings; and the BRIDGE guidelines, which aim to foster fairness and integrity in global-health epidemiology.

Furthermore, a tool for evaluating practices, called the Research Fairness Initiative, has already been developed by the Council on Health Research for Development, an international non-governmental organization that aims to support health research, particularly in LMICs. By providing questionnaires and guidance, the Research Fairness Initiative enables institutions, individual researchers and funders to evaluate their current practices, and if necessary, improve them.

The Cape Town Statement differs from these other guidelines and tools, however, in that it recognizes that unfair practices can harm the integrity of all research, no matter the discipline or context. Specifically, it focuses on the following four broad actions.

More funders from high-income countries must include diversity stipulations in their calls for grant applicants. In 2020, for example, the second European and Developing Countries Clinical Trials Partnership programme (EDCTP2), in conjunction with the UK Department of Health and Social Care, asked applicants to apply for funding for projects specifically aimed at addressing gender and diversity gaps in clinical research capacity in sub-Saharan Africa. (The EDCTP is a partnership between countries in Europe and sub-Saharan Africa, supported by the European Union.)

Researchers in high-income countries must also work harder to collaborate in more meaningful ways with people from different disciplinary, geographical and cultural backgrounds. One way to do this is to establish long-term relationships with people that extend beyond the life of a single project.


Research institutions, be they universities, non-governmental organizations, or national or transnational science councils, should develop and implement policies, structures and processes that support and promote diversity and inclusivity in research.

The London School of Hygiene and Tropical Medicine has been a major player in global-health research for many decades. Since 2019, a volunteer group of staff and students at the university has been trying to address the fact that the organization's members have often held unacknowledged positions of advantage, through its Decolonising Global Health initiative. So far, this has involved various undertakings, including a series of educational lectures.

All stakeholders, from researchers, institutions and funders, to journal editors and publishers, must take steps to ensure that they are not exacerbating power imbalances in research collaborations, but instead helping to remove them.

Funders from high-income countries should discourage "parachute" or "helicopter" research, in which well-resourced researchers conduct studies in lower-income settings or with groups who have been marginalized historically, but fail to involve local researchers or communities in all stages of the research. Funders can do this by including diversity stipulations in their calls for grant applicants, but also by funding local researchers directly.

Although many need to follow suit, some funders are making progress on this front. Currently, about 60% of the grants awarded by the EDCTP go directly to institutions in sub-Saharan Africa. Similarly, in 2021 the US National Institutes of Health (NIH) launched a $74.5-million five-year project called Data Science for Health Discovery and Innovation in Africa (DS-I Africa). This is being led by African scientists, and is creating a pan-African data network designed to address African research priorities [4].

As well as requiring that researchers from LMICs lead collaborations, funders (and research institutions) should insist that projects involving multiple countries begin with a period of discussion involving all potential stakeholders. Before any research is conducted, the roles of each team member, and how they will receive recognition, should be defined and agreed on. Moreover, requiring that projects be fully and transparently budgeted, with the costs of maintaining already existing infrastructure included, would help to ensure that better resourced institutions shoulder a fairer share of project costs.

Publishers and journal editors must question submissions from authors if data have been collected in a low- or middle-income country, but the lead and collaborating authors are from high-income countries.

Some are already taking steps in this direction. The Lancet has started rejecting papers that are submitted by researchers from outside Africa, with data collected from Africa, but with no mention or acknowledgement of a single African collaborator. Similarly, Nature journals now encourage authors to make various disclosures on inclusion and ethics when submitting manuscripts.

The obstacles that make it harder for researchers working in low-resource settings to participate in open science need to be identified and addressed by publishers, and other national and global stakeholders, such as science councils and funders.

Wherever possible, funders should allow data collected by researchers in LMICs to be embargoed for two years, for example, to give investigators time to conduct secondary analyses and to share their data with collaborators at their discretion. Meanwhile, journals and publishers should adjust article processing costs for authors in low-resourced regions.

Over the past 20 years, more than 200 publishers have partnered with Research4Life, a platform established in 2002 dedicated to making some peer-reviewed content available to students and researchers in LMICs. But many of the countries that are major contributors to research, such as South Africa, Brazil, Argentina and India, do not meet the criteria for accessing knowledge through this platform, or for fee waivers or reductions in cost for open-access publishing.

Researchers in LMICs are often disadvantaged because their institutions have underdeveloped research management and financial systems. With scant or no assistance from lawyers, administrative assistants, financial managers and project-management staff, they struggle to meet the due diligence requirements of some funders in high-income countries. (Frequently this involves answering hundreds of questions in multi-page documents about institutional and research governance processes and policies.)

Both funders and collaborating institutions must take steps to enable the development of research-support systems in LMICs. This could mean paying for computing infrastructure, mentorship programmes, open-access publishing, or the training and salaries of project and financial managers, for example.

A volunteer has blood withdrawn for testing as part of an HIV vaccine trial in Bangkok. Credit: Paula Bronstein/Getty

Several controversies in research ethics, which occurred as a result of HIV research projects undergoing ethical scrutiny during the height of the HIV epidemic in the 1980s and early 1990s, prompted the Fogarty International Center at the NIH to launch the International Bioethics Education and Career Development Award in 1999. The aim was to ensure that there were strong research ethics committees in LMIC institutions, with adequately trained members, to review studies and meet international ethics and regulatory standards required by US funders. Thanks to this and subsequent efforts, hundreds of people in Africa and Asia have been trained in research ethics and continue to serve on review boards.

Many more initiatives like this that invest in infrastructure and training over the long term are needed.

The governments of LMICs also need to recognize the value of funding research, both to address locally relevant priorities and to reduce their nations' reliance on funders from high-income countries. Matched funding schemes could help, whereby governments commit to give institutions the same amount of funds as those obtained from other sources for nationally identified high-priority research. Launched in 2015, the Science Granting Councils Initiative aims to strengthen the management of research grant funding in 17 countries throughout Africa. Achieving its goal of bringing more support for and control of scientific research into the continent will require the governments of African countries to prioritize research funding.

During the 2000s, researchers from the United Kingdom and other high-income countries obtained blood samples from people of the San community in Namibia for genetic research without always adequately explaining what those samples would be used for, or reaching any benefit-sharing agreements with the community.

The San people have since developed their own code of ethics, a value-based set of principles that researchers must adhere to before trying to obtain samples or information from them (see go.nature.com/3yz7ash). In principle, this code could be used by researchers working with other Indigenous communities, if codes specific to a particular community don't yet exist.

Ensuring that community members or knowledge-holders, who might not have formal qualifications, are included in research teams, with their contributions being adequately valued, is another way in which local knowledge can be incorporated equitably [5].

Many of the greatest challenges facing humanity, climate change among them, are disproportionately affecting people living in LMICs. And many of these challenges have arisen largely because of a long history of colonial exploitation and inequitable use of Earth's resources. Yet last year there were reports of systemic bias, and of the contributions of researchers from LMICs being sidelined, even in the Intergovernmental Panel on Climate Change.

Unfairness, inequity and a lack of diversity must no longer prevent the global research enterprise from maximizing scientific integrity and from realizing the ultimate societal value and benefits of research.


The smart life sciences manufacturing market is projected to reach US$ 78,974.61 million by 2033, registering a CAGR of 14.4% from 2023 to 2033 -…

ReportLinker

Technology has been playing a major role in the healthcare sector, and the biotechnology industry is the segment that has benefited most from recent technological advancements in data analytics, compared with other domains such as oncology, neurology, and immunology.

New York, March 24, 2023 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Smart Life Sciences Manufacturing Market Forecast to 2033 - COVID-19 Impact and Global Analysis by Component, Technology [AR/VR Systems, Internet of Things, Artificial Intelligence, Cybersecurity, Big Data, and Others], and Application" - https://www.reportlinker.com/p06433207/?utm_source=GNW

Emerging data science technologies assist in the growth of the biotechnology industry.

Biotechnology laboratories play major roles in analyzing living organisms, researching novel drugs, and more. Modern data analytics tools have allowed biotechnology researchers to create predictive analytics models and understand the most effective ways to achieve desired goals and objectives. Big data, AI, virtual reality, data visualization, and data security are among the common technologies used in biotech laboratories. Novozymes, a leading global biotechnology company headquartered in Bagsværd, just outside of Copenhagen, Denmark, has adopted Tableau, a data visualization tool, which revolutionized its approach to data analysis and collaboration. AstraZeneca plc, a British-Swedish multinational pharmaceutical and biotechnology company, uses data and technology to minimize the time to discovery and delivery of potential new medicines. The company has data science and AI capabilities embedded in its R&D departments, which allow scientists to push the boundaries of science to deliver life-changing medicines. Such adoptions are expected to boost the smart life sciences manufacturing market's growth during the forecast period.

China accounts for the largest share of the smart life sciences manufacturing market in Asia Pacific, followed by Japan and India. China has one of the fastest-growing smart life sciences manufacturing markets globally.

Chinese firms have been among the keenest adopters of technology in APAC as they seek to gain a competitive edge through digitalization and innovation. Manufacturers have collaborated with each other for the successful deployment of advanced technologies.

In March 2022, Triastek and Siemens Ltd., China signed a strategic collaboration agreement to provide digital technologies for the global pharmaceutical industry. Triastek is actively implementing a continuous manufacturing approach with advanced digital tools to streamline and improve its manufacturing processes and products. Such factors promote the revenue generation of smart life sciences manufacturing market players operating in China. Moreover, the country has a flourishing biotechnology industry, which has led to increased investments in biotech firms over the period. China-based biotech firms have seen a 100-fold increase in total market value, to more than US$300 billion, from 2016 to 2021. Biopharma companies are exploring China and other emerging markets for commercial expansion and for acquisition and partnership targets to boost innovation. Therefore, all the above-discussed factors will drive the smart life sciences manufacturing market's growth in China during the forecast period.

ABB Ltd, Bosch Rexroth AG, Emerson Electric Co, Fortinet Inc, General Electric Co, Honeywell International Inc, IBM Corporation, Rockwell Automation, Siemens AG, and Sophos Group plc are among the key smart life sciences manufacturing market players profiled during this market study. In addition to these players, several other essential market players were studied and analyzed to get a holistic view of the global smart life sciences manufacturing market and its ecosystem.

The overall smart life sciences manufacturing market size has been derived using both primary and secondary sources. To begin the research process, exhaustive secondary research has been conducted using internal and external sources to obtain qualitative and quantitative information related to the market.

The process also serves the purpose of obtaining an overview and forecast of the smart life sciences manufacturing market size with respect to all market segments. Also, multiple primary interviews have been conducted with industry participants and commentators to validate the data and gain more analytical insights.

Participants in this process include VPs, business development managers, market intelligence managers and national sales managers, along with external consultants such as valuation experts, research analysts, and key opinion leaders specializing in the smart life sciences manufacturing market. Read the full report: https://www.reportlinker.com/p06433207/?utm_source=GNW

About Reportlinker: ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.


While OpenAI has been working on text and images, iGenius has been working on GPT for numbers – VentureBeat

Within a week of being launched, ChatGPT, the AI-powered chatbot developed by OpenAI, had over 1 million users, growing to 100 million users in the first month. The flood of attention from the press and consumers alike comes in part because of the software's ability to offer human-like responses across long-form content creation, in-depth conversations, document search, analysis and more.

Uljan Sharka, CEO of iGenius, believes that generative AI has world-changing potential in the business world, because for the first time, data can be truly democratized. GPT stands for "generative pretrained transformer," a family of language models trained with supervised and reinforcement learning techniques; in ChatGPT's case, 45 terabytes of text data power all that content creation.

But what if generative AI could be used to answer essential data-related queries in the business world, not just generate content?

"Up till now, data, analytics and even data democratization have been data-centered, designed for data-skilled people," Sharka says. "Business users are being left out, facing barriers to the information they need to make data-driven decisions. People are not about data. They want business answers. We have an opportunity today to shift the user interface toward language interfaces, and humanize data to make it people-centric."

But the interface is only a small percentage of what a complex system needs to perform in order to make this kind of information integrated, certified, safe, equal, and accessible for business decisions. Composite AI means bringing together data science, machine learning, and conversational AI in one single system.

"I like to think of it as the iPhone of the category, which provides an integrated experience to make it safe and equal," Sharka says. "That's the only way we'll have generative AI delivering impact in the enterprise."

As the gap between B2C and B2B apps has grown, business users have been left behind. B2C companies put billions of dollars into creating exemplary apps that are very user-friendly, operable with a few taps or a conversation. At home, users are writing research papers with the help of ChatGPT, while back at work, a wealth of data stays siloed when the complex dashboards that connect it go unused.

In organizations, generative AI can connect every data product anywhere in the world and index it in an organization's private brain. And with algorithms, natural language processing and user-created metadata, or what iGenius calls advanced conversational AI, the complexity of data quality can be improved and elevated. Gartner has dubbed this conversational analytics.

Virtualizing complexity unlocks unlimited potential to clean, manipulate and serve data for every use case, whether thats cross-correlating information or just bringing it together as one single source of truth for an individual department.

On the back end, generative AI helps scale the integration between systems, using the power of natural language to create what Sharka calls an AI brain, composed of private sources of information. With no-code interfaces, integration is optimized and data science is democratized even before business users start consuming that information. It's an innovation accelerator, which will cut costs as the time it takes to identify and develop use cases is slashed dramatically.

On the front end, business users are literally having a conversation with data and getting business answers in plain natural language. Making the front-end user experience even more consumerized is the next step. Instead of a reactive and single task-based platform, asking text questions and getting text answers, it can become multi-modal, offering charts and creative graphs to optimize the way people understand the data. It can become a Netflix or Spotify-like experience, as the AI learns from how you consume that information to proactively serve up the knowledge a user needs.

From an architectural perspective, this natural language layer is added to the applications and databases that already exist, becoming a virtual AI brain. Connecting across departments unlocks new opportunities.

"This is not about using data more; this is about using data at the right time of delivery," Sharka says. "If I can use data before or while I make a decision, whether I'm in marketing or sales or supply chain, HR, finance, operations, this is how we're going to make an impact."

For instance, connecting marketing data and sales data means not only monitoring campaigns in real time, but correlating results with transactions, conversions and sales cycles to offer clear performance KPIs and see the direct impact of the campaign in real time. A user can even ask the AI to adapt campaigns in real time. At the same time, the interface surfaces further questions and areas of inquiry that the user might want to pursue next, to deepen their understanding of a situation.

At Enel, Italy's leading energy company, now focused on sustainability, engineers consume real-time IoT information, mixing finance data with data coming from the production plants and having conversations with that data in real time. Whenever their teams need to perform preventative maintenance, plan activities in the plant, or measure how actual results compare to budgets, asking the interface for the synthesized information they need unlocks powerful operational analytics that can be acted on immediately.

ChatGPT has sparked massive interest in generative AI, but iGenius and OpenAI (both launched in 2015) long ago realized they were headed in different directions, Sharka says. OpenAI built the GPT for text, while iGenius has built the GPT for numbers, a product called Crystal. Its private AI brain connects proprietary information into its machine learning model, allowing users to train it from scratch. It uses more sustainable small and wide language models, instead of large language models, to give organizations control over their IP.

It also enables large-scale collaboration, in which companies can leverage expertise and knowledge workers to certify the data used to train models and the information generated to reduce bias at scale, and provide more localized and hyper-personalized experiences. It also means you dont need to be a prompt engineer to safely work with or apply the data these algorithms provide to produce high-quality actionable information.

"I've always believed that this is going to be a human-machine collaboration," Sharka says. "If we can leverage the knowledge that we already have in people or in traditional IT systems, where you have lots of semantic layers and certified use cases, then you can reduce bias exponentially, because you're narrowing it down to quality. With generative AI, and a system that's certified on an ongoing basis, we can achieve large-scale automation and be able to reduce bias, make it safe, make it equal, and keep pushing this idea of virtual copilots in the world."

Read the original post:

While OpenAI has been working on text and images, iGenius has been working on GPT for numbers - VentureBeat

Read More..

What is ChatGPT? Malone University professor explains AI bot – Canton Repository


Artificial intelligence has been making headlines in recent weeks as major tech companies like Google and Microsoft have announced new tools powered by the technology.

Experts say the rapid growth of AI could affect manufacturing, health care and other industries.

The Canton Repository spoke to Shawn Campbell, an assistant professor of computer science and cybersecurity at Malone University, about the rise of AI technology and what it means for the future.

Artificial intelligence is the ability of computers to perform functions typically done by humans, including decision-making, speech and visual recognition. The field has been studied since the 1950s, according to an article from Harvard University.

Campbell said one type of AI commonly used in the medical field is expert systems. This technology uses knowledge databases to offer advice or make decisions regarding medical diagnoses.

A developer called OpenAI launched an AI chatbot in November 2022 known as ChatGPT. Users can interact with the chatbot and receive conversational responses.

Campbell said the rise of this technology has created competition between Microsoft and Google. Microsoft plans to invest billions into ChatGPT, and recently announced AI upgrades to its search engine, Bing. Google, meanwhile, has introduced new AI features in Gmail and Google Docs that create text.

The major tech companies are in an "arms race," Campbell said, to see who can develop the best AI technology.

There is some concern that AI technology will replace jobs traditionally held by humans. In some cases, it's already happened. For example, fast-food chain White Castle started installing hamburger-flipping robots in some of its locations in 2020 to reduce human contact with food during the cooking process.

Campbell said it's possible that AI will result in fewer employees doing certain tasks.

"If you have a product line that had 100 people on it, and they get a new type of machine in and kind of redesign the process, then they do it with 80 people or 60 people, ... I do think you're going to find more of the same jobs being done by fewer people, and those people being freed up to do other tasks, essentially," Campbell said.

Some worry that trucking jobs will disappear if developers figure out self-driving technology, but Campbell doesn't expect that to happen anytime soon.

Campbell said he expects AI to become a tool that makes life easier, noting that technology like adaptive cruise control and camera tools that let users remove unwanted objects from the background of photos involve AI and are already used on a daily basis.

"I think that's really the progression it will follow. ... It's being used as a tool, and it's making people's jobs easier, or making a long drive more enjoyable and safer as well," he said.

One of the biggest changes Campbell expects is for data science and analytics to be emphasized more in education. Some employers are already paying for their employees to receive data analytics and literacy training, he said, and Malone University recently added a data analytics major.

Campbell predicted these skills will become important in the job market and that educational institutions may start incorporating data analytics into general curriculum, like they do with writing and public speaking.

Reach Paige at 330-580-8577, pmbennett@gannett.com or on Twitter @paigembenn.

See the article here:

What is ChatGPT? Malone University professor explains AI bot - Canton Repository

Read More..

What to expect from BTech programme? What ties different engineering branches together? IIT Bombay professor decodes – The Indian Express

Abhijit Majumder

(A Lesson from IIT is a weekly column by an IIT faculty member on learning, science and technology on campus and beyond. The column appears every Friday)

We see a lot of turmoil in the teenage years. Not only does our body go through big changes, but so does our behaviour. Adding to this stress is the question of what one has to do after school. While some are quite focused and clear about their career goals, most are not.

Many select engineering as the next step because of pressure and the perception that it provides opportunities in terms of jobs.

While there is no harm in keeping job prospects in mind while selecting a branch, we should not forget that placement only follows the successful completion of four years of study. Hence, it is important to know what to expect in your undergraduate years. While the exact syllabus varies from college to college, some factors remain common.

As I am a chemical engineer, I wish to start with my own branch. When I chose to study it, I had no clue what the course entailed. From the name, I thought that it would have lots of chemistry. I am not alone; many people think the same. Even today, people mail me with chemistry problems. However, chemical engineering has very little to do with chemistry.

I often joke that, just as Apple does not put vitamin C in its phones, chemical engineering does not really have chemistry. Instead, we develop an understanding of how a chemical process plant functions. We study its constituent operations, such as how one component gets transferred from one material to another when they are brought into close contact in distillation columns.

We study how chemical reactions take place and what kind of reactor one might design to get the best quality product at the least cost. We learn about the calculations that go into determining safety factors in a plant design. In short, we do a lot of mathematics and calculations to make sure a chemical process plant, such as a petroleum refinery unit, runs seamlessly. We also design equipment such as pressure vessels and heat exchangers that are often used in chemical industries.

While different branches have their own characteristics, one common thread is mathematics.

Mathematics is the language of engineers. We express problems in the form of matrices, probabilities and differential equations. Whichever branch you choose, solving equations is common to every field of engineering.

My computer science colleague, Supratik Chakraborty, said, "Can we find limits of what can be computed? Are there things that can't be computed, regardless of how powerful our computers are? How much time, space and energy must it take to compute certain things? These are fundamental questions, not all of which have satisfactory answers even today. The theoretical part of computer science tries to understand and address these questions using mathematical principles."

So, even a field like computer science is not just about coding. The emphasis is on solving unanswered questions using maths. Similarly, civil engineering is not just about making bridges but also about solving the partial differential equations that govern the mechanical stress distribution in those bridges. Hence, having a knack for equations is a must in all fields of engineering.
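The knack for equations described above can be made concrete with a small numerical sketch. This is an illustrative example, not from the column: forward Euler, one of the simplest methods engineering students learn for solving differential equations, applied to a decay equation whose exact solution is known.

```python
import math

def euler(f, y0, t0, t1, n):
    """Integrate dy/dt = f(t, y) from t0 to t1 with n forward-Euler steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)  # step along the local slope
        t += h
    return y

# dy/dt = -k*y (a simple decay equation) has the exact solution y0 * exp(-k*t),
# so we can check the numerical answer against it.
k = 0.5
approx = euler(lambda t, y: -k * y, y0=1.0, t0=0.0, t1=2.0, n=10_000)
exact = math.exp(-k * 2.0)
print(abs(approx - exact) < 1e-3)  # True: Euler tracks the exact solution
```

Real engineering problems, such as stress distributions in bridges or reactor design, involve far harder equations, but the workflow is the same: model, discretise, solve, check.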

Another common aspect is the ability to create and imagine. The job of an engineer is to solve real-life problems. Many times, there is more than one solution and we may not have enough computational ability even today to get a definite answer.

In such situations, one may need to take various approaches to tackle the problem computationally, or to use common sense to choose one solution from many. Also, as engineers are creators too, one may need to imagine something that does not yet exist, or at least does not exist in front of us. Hence, the ability to imagine, visualise and plan is an integral part of engineering training. Unlike school, where every problem in the book had a definite answer, engineering teaches us that life is fuzzy and that creativity is integral to navigating the unknown. It may sound simple, but I have seen students who feel quite uncomfortable embracing this uncertainty.

Another interesting aspect common to all the branches is the continuous change, or evolution, they have undergone in recent years. For example, metallurgy is not only about metals. It lovingly embraces polymers, ceramics and many other materials. That's why the branch is now called Metallurgy and Material Science. Similarly, mechanical engineering is not limited to gears and cantilevers but may include nuclear reactor design, energy and climate.

Fading Boundaries

At IIT Bombay, we see faculty members in civil engineering working on biology, and someone from metallurgy and material science working on water treatment. As professors, we have realised that real-life problems are not limited by the boundaries of streams. For example, designing a diagnostic kit may need inputs from biologists, chemists, material scientists, chemical engineers, electrical engineers and so on.

To design biomedical prosthetics, one may need inputs from specialists in artificial intelligence and machine learning. Keeping this multi-disciplinary need in mind, here at IITB we are reducing the stream-specific course loads (which we call core courses) and giving students the opportunity to pick courses of their interest. They are free to create their own a la carte menu with the help of their faculty advisor.

As a result, students have the opportunity to get trained in subjects that interest them, whether management, design, bio-science, psychology or even the social sciences. Hence, even if one has a strong inclination for a particular branch, say mechanical engineering, but gets admission to chemical engineering, she should not be disheartened. If one is interested, one can train oneself in almost any topic in today's world of open knowledge flow.

The writer is associate professor at the department of chemical engineering at IIT Bombay.

(With inputs from Prof. Supratik Chakraborty, Computer Science and Engineering, and Prof. Deepankar Choudhury, Civil Engineering, at IIT Bombay.)

Originally posted here:

What to expect from BTech programme? What ties different engineering branches together? IIT Bombay professor decodes - The Indian Express

Read More..

GNA University organised Workshop on Big Data Analytics Tools – Cityairnews

Jalandhar, March 25, 2023: The Department of Computer Science and Engineering in the School of Engineering, Design and Automation (SEDA) organised a two-day workshop on Big Data Analytics Tools for the students of B.Tech. CSE. Rocky Bhatia, Solutions Architect at Adobe, was the resource person for the workshop, with expertise in data engineering, system design, architecture, cloud tech, data science and other technology areas. The objective of the workshop was to make students comfortable with the tools and techniques required to handle large datasets.

Dr. Vikrant Sharma, Professor and Dean-SEDA, welcomed all the participants and the workshop expert. The contents covered in the workshop included: introduction to Big Data, distributed computing, HDFS architecture, blocks, replication, MapReduce, Hadoop, Hive Query Language, YARN, NoSQL, HBase, Oozie, real-time data mining using Spark Streaming, Elasticsearch, Kibana and Kafka. It was an interactive session in which students actively participated in hands-on training and enquired curiously about career options in the data science domain.
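For readers unfamiliar with the tools listed above, the core idea behind Hadoop's MapReduce model can be sketched in plain Python. This is an illustrative toy, not the workshop's material: a map phase emits key/value pairs and a reduce phase combines the values per key, which is what lets the work spread across many machines.

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word occurrence."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: group pairs by key and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data needs big tools", "data tools scale"]
print(reduce_phase(map_phase(lines)))
# {'big': 2, 'data': 2, 'needs': 1, 'tools': 2, 'scale': 1}
```

In Hadoop or Spark, the same two phases run in parallel over data blocks on a cluster; this programming model is what the HDFS, MapReduce and Spark Streaming sessions of such a workshop build on.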

Gurdeep Singh Sihra, the Pro-Chancellor, GNA University, congratulated all the participants on successfully completing the course, as well as the organizers of the event. He said, "This event is simply the beginning for providing solutions to make a better future," advised the students to take pride in being part of this prestigious course and wished all the participants great success.

Dr. V.K. Rattan, the Vice-Chancellor, GNA University, appreciated all the participants and encouraged them to take part in such events, as this is one of the emerging and highly in-demand technologies.

Dr. Monika Hanspal, Dean Academics, GNA University, interacted with the participants, highlighted the significance of Big Data, and said that these technologies and hands-on practice will make students industry-ready.

Dr. Hemant Sharma, the Pro Vice-Chancellor, thanked Dr. Anurag Sharma, Professor and Head of CSE, and the organizers for arranging such a great event, and assured all the participants that similar events will be planned in the near future.

Link:

GNA University organised Workshop on Big Data Analytics Tools - Cityairnews

Read More..

5 Indigenous engineering feats you should know about – The Conversation

For many millennia, Indigenous Australians have engineered the landscape using sophisticated technological and philosophical knowledge systems in a deliberate response to changing social and environmental circumstances.

These knowledge systems integrate profound understanding of Country, bringing together an understanding of the topography and geology of the landscape, its natural cycles and ecological systems, its hydrological systems and its natural resources, including fauna and flora. This has enabled people to manage resources sustainably and reliably.

Engineering is about process, and the process of engineering was very different in Australia before the English colonised the land. However, when our Aboriginal or Torres Strait Islander students take the step into engineering, or other STEM subjects, there is little material provided that relates to their experience or their peoples' technical and management knowledge. This is a result of the historic denial of the First Nations of Australia as enduring scientific and technical civilisations.

The versatility and minimalist nature of Aboriginal technology designs are inspiring. The flexibility and artistry in tool manufacture, which can differ in neighbouring communities, is a salient lesson for engineers now. Some key aspects of this approach can be seen through five examples of ingenious Indigenous engineering.

The King Sound region of the Kimberleys in Western Australia is renowned for its strong tides, rips and whirlpools. Navigation can be difficult, though there are areas of calm water in the bays. The Bardi community, from One Arm Point, call their raft the kalwa.

The raft is made mostly of light mangrove wood, providing buoyancy. The two fan-shaped sections that make up the boat are wider and thicker at the outer ends to provide stability. These two sections, lapped over each other, are made on a base of mangrove trunks sharpened at the ends; hardwood is used to pin them together. A small basket, made with hardwood pegs on the back section, is used to secure belongings or any fish that are caught.

The design ensures the top of the raft stays above the water when loaded with the paddler, passengers and belongings. The size of the raft determines the load it can carry. Water that washes over the raft will flow out through the gaps between the wooden slats.

Ingeniously, the structure can be pulled apart. One half can be tied to a harpooned dugong, which will swim around and become exhausted, while the hunter floats on the other half.

Rafts were made in different styles all around the coast of Australia, from the different materials available in particular areas and for uses relevant to that landscape.

The Thuwarri Thaa (aka Wilgie Mia) Aboriginal ochre mine is located in central WA in the Weld Range, between Mount Magnet and Meekatharra. It has been in use for probably tens of thousands of years, including by non-Aboriginal miners from the 1940s to 1970s.

The ochre is still important in body and artefact painting for ceremony. It is also used as a skin coolant during summer and for warmth during winter; as a fly repellent; in curing hides; and in making glue.

The mine is a deep, sloping shaft cut into the mountain. Wood was carried into the cavern and made into scaffolding to reach seams of ochre out of reach above the cavern floor. Tunnels have been dug along seams in the walls. Heat, flaked pebbles and fire-hardened, sharpened wood were used to undercut the seams of ochre. Fire may have been used to crack the surrounding rock, as well as to provide light deep in the cavern. At times, large sections of ochre could be wedged off.

The ochre was mined from deep underground and then processed onsite. Some was transported by traders northwest to Carnarvon (450 km), south to Kellerberrin (525 km) and east to Wiluna (300 km). To transport, the ochre was dampened and rolled into balls.

Thuwarri Thaa was reserved as a mens only site and stories pass down knowledge of the site and the material. Its location, its mining and its uses are embedded in the creation story of the marlu or red kangaroo. The red ochre is his blood, the yellow ochre is his liver and green is his gall. The entire mining and distribution industry was regulated by these cultural constraints and influences and thus maintained sustainable practices.

When non-Aboriginal people mined there, the roof was blasted off a large cavern at a nearby site, little Wilgie Mia. Ochre from the site is still used in ceremony. People can visit with a permit if guided by Wajarri Yamaji Traditional Owner guides.

The Budj Bim area (also known as Lake Condah), a dormant volcano in south western Victoria, was continuously occupied for thousands of years. The Gunditjmara community farmed eels and harvested galaxia fish in a series of dams and water channels constructed out of the basalt lava flows, an amazing surveying feat.

More than 30,000 years ago, Budj Bim (called Mount Eccles by Europeans) spewed forth the Tyrendarra lava flow, a significant creation event in this country recorded in local oral history. The lava flow to the sea created large wetlands by changing the drainage pattern. This volcanic activity lasted until after the last ice age. Carbon dating shows aquaculture began as early as 6,700 years ago, soon after the lava flow stopped.

The people then continued to alter the water flow through the region with excavated channels. The channels are made in straight or curved paths, with sharp corners helping to reduce the speed of water. Dam walls were built to produce ponds.

These traps for eels and the fish traps in other locations were designed to allow animals to enter the trapping area, be retained in the cooling water and then captured when required for food. The eels remained in pools designed for collection for long periods, where they would breed. This provided a food supply all year round.

The rock was also used to construct dwellings or stone huts, along with 36 storage structures and 12 pits, which are associated with eel trapping. Most of the stone dwellings have a diameter of less than 1.6m. The rest are considered to be storage caches. The area has many scar trees with signs of burning; many of the Manna gums were used for baking and smoking and preserving the trapped eel. Smoked eel products were traded over a wide area.

The structures were exposed during heavy fires in the area, and the extent of all the engineering work is still not known. These traps are an Australian UNESCO World Heritage site, the only one listed exclusively for its Aboriginal cultural values. The Gunditjmara people now work with engineering students, designing projects that explore engineering approaches embedded in the landscape.

When Ben Lange, an Aboriginal man from Cairns who plays the Yidaki, came to the University of New South Wales to study electrical engineering, he worked with the physics department to look at how the Aboriginal people created sounds with this instrument. This work led to greater understanding of the use of the mouth and its components in speech production, providing inspiration for new approaches in speech therapy.

The Yidaki (European name: the didgeridoo) is a drone pipe played with circular breathing: the lungs are used as a form of air storage to maintain a continual flow through the pipe. The wood is selected from termite-hollowed trees. The bore is widened by hand, especially at the base of the pipe. Beeswax is used to smooth the mouthpiece.

The shape of the mouth across the pipe, the control of air through the mouth with the diaphragm, the position of the tongue in the mouth, and the shape of the player's voice box all affect the sound from the instrument.
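As a rough illustration of why pipe length matters, a pipe closed at the player's lips and open at the far end can be approximated by the standard closed-open tube model, whose fundamental frequency is f1 = c / (4L). This is a textbook-physics assumption, not a claim from the article.

```python
def fundamental_hz(length_m, speed_of_sound=343.0):
    """Fundamental frequency (Hz) of an idealised closed-open pipe."""
    return speed_of_sound / (4.0 * length_m)

# A hypothetical 1.3 m instrument: the idealised model predicts a low drone.
print(round(fundamental_hz(1.3), 1))  # 66.0
```

The real instrument's irregular termite-hollowed bore, the player's vocal tract and the lip interaction all shift and colour this idealised figure, which is part of what makes the acoustics worth studying.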

The Brewarrina fish traps, called Biame Ngunnhu by the local Ngemba people, were created by Biame in the Dreamtime; there is no oral record of other events that locate the period of construction. They are considered the oldest and longest-lasting dry-wall construction on Earth.

Dating of the traps would be hard, especially as many of the stones were recently moved to construct a stone weir across the river. Importantly, these fish traps provide an example of collaborative knowledge sharing and governance.

The fish in the river include Australian grayling, river blackfish, short-finned eel, Australian smelt, climbing galaxias, common galaxias, congoli, flathead gudgeon, mountain galaxias, pouch lamprey, smallmouth hardyhead, trout galaxias and southern pigmy perch. However the main fish there now are introduced carp, and the high level of irrigation upstream means the river is often dry.

There is great diversity of Aboriginal peoples across Australia. Aboriginal people have different languages and come from vastly different landscapes, each with their unique ecology. Yet technology is part of our everyday life: the houses we live in; the internet we learn with; the watercraft we use for fun or fishing.

Indigenous communities need students graduating with the skills to help maintain and build infrastructure or create software to support their enterprises and care for Country. In project management, the participatory democracy practised in Indigenous communities is a good example of flat management processes and a way to reinvigorate the Western approach to sustainability and democracy that is failing in our engineering projects as much as in the political space.

Indigenous Engineering for an Enduring Culture, edited by Cat Kutay, Elyssebeth Leigh, Juliana Kaya Prpic and Lyndon Ormond-Parker is published by Cambridge Scholars Publishing.


See original here:

5 Indigenous engineering feats you should know about - The Conversation

Read More..

White men still hold majority of US science and engineering jobs – Nature.com

A report from the US National Science Foundation finds that the majority of science and engineering jobs are held by white men.

Women, members of minority ethnic groups and those with disabilities continue to be under-represented in positions across the US science, technology, engineering and mathematics (STEM) workforce, according to a report by the US National Science Foundation (NSF).

The report finds that although the proportions of jobs held by these groups rose overall between 2011 and 2021, they remained lower than the groups representation in the nations population. Women, for example, comprise 51% of the US population but represent just 35% of employees in the US STEM workforce. The findings relate to the academic, industry, non-profit and government sectors, including roles such as managers and technicians, and those in related areas such as health care.

White men dominate science and engineering positions in the nation, the report finds; nearly three-quarters of people in these roles identify as male. Almost two-thirds identify as white.

Just one-fifth of the science and engineering workforce identifies as Asian, 8% as Hispanic or Latino, 8% as African American and 0.4% as American Indian or Alaskan Native. Although people with at least one disability represent about one-quarter of the US population, they accounted for only 3% of those in science and engineering positions. The proportion of these workers in the general STEM workforce has remained unchanged in the past decade.

Ableism, or discrimination against people with disabilities, along with the continuing inaccessibility of physical and virtual spaces, could bar scientists with disabilities from seeking or getting positions in their field, says Bonnielin Swenor, director of the Johns Hopkins University Disability Health Research Center in Baltimore, Maryland. She notes that barriers are exponentially greater for scientists with disabilities who are also members of other under-represented groups. These disparities are "both a cause and consequence of inaccurate, but common, views that people with disabilities don't belong in STEM, are incapable of being scientists and are overlooked as STEM leaders," she says.

Under-representation in STEM could result from a scarcity of role models, says Johnna Frierson, associate dean for equity, diversity and inclusion for the basic sciences at Duke University School of Medicine in Durham, North Carolina. She sees the problem as a cycle in which lack of representation leads to continued lack of representation. "If one's environment doesn't have robust diversity in representation, it can send an implicit message that individuals from under-represented groups do not belong and can't be successful in those spaces," she says.

The NSF report also finds that earnings disparities in science and engineering (S&E) positions persist between different groups. In 2020, the median annual wage for all employees in the sector was US$89,990. But men in these fields earned around $25,000 more than their female counterparts ($99,923 versus $75,562). The median salary for white S&E workers was $89,977. For Asian S&E workers, it was $107,150.

The report also finds that female scientists hold almost two-thirds of positions in the social sciences, including psychology, sociology and anthropology, but fewer than half in biological, agricultural and related disciplines. They continue to be under-represented in the physical and related sciences, including physics and chemistry (33%), computer and mathematical sciences (26%) and engineering (16%). White people also take up a majority of positions in each discipline. Among computer and mathematical scientists, 57% identify as white. The other three disciplines have similar numbers: 63% of biological, agricultural and other life scientists identify as white, as do 69% of engineers and 71% of those in physical sciences.

Some have criticized the report for its lack of data on people from sexual and gender minorities (LGBT+) in the STEM workforce. Ramón Barthelemy, a physicist at the University of Utah in Salt Lake City who has studied equity and inclusion in physics, says that his work on LGBT+ faculty members, and on LGBT+ physicists in particular, has shown concerning trends, including exclusionary behaviours and negative workplace environments1. Those trends, he adds, are particularly worrisome for members of the community who also identify as women or as people of colour. Without proper representation in data, he says, "we aren't being included in metrics on diversity, equity and inclusion," further marginalizing the community.

NSF representatives say that the agency is integrating questions about the LGBT+ community into its surveys, adding that it hopes to publish its analysis of the data in a few years.

Here is the original post:

White men still hold majority of US science and engineering jobs - Nature.com