Category Archives: Data Science

Gen-Z shunning alcohol for ‘dry’ parties as they prioritise – Euromonitor International

LONDON, UK – Gen Z are causing a shake-up in the alcoholic and non-alcoholic universes, ushering in a new era of mindful drinking, sober curiosity and dry venues and events, according to a Euromonitor International expert.

Spiros Malandrakis, Head of Alcoholic Drinks at leading market research firm Euromonitor International, said that Gen Z, the generation born between the mid-1990s and early 2010s, has displayed a noticeable trend of reduced alcohol consumption compared to previous generations.

"All major non-alcoholic segments were among the top performers in Euromonitor International's latest alcoholic drinks system update," said Malandrakis.

Non-alcoholic beer sales registered 6% total volume growth in 2022

Malandrakis said: "From non-alcoholic beer, already present and embedded in drinking culture for decades yet still witnessing an enviable 6% total volume growth for 2022, to the emerging non-alcoholic wine segment posting 9% total volume growth, and from the double-digit momentum of the relative newcomer non-alcoholic ready-to-drinks (RTDs) to non-alcoholic spirits spearheading innovation and a pivot to functionality with 10% total volume growth, this is all proving to be much more than just a passing fad."

Gen Z prioritise physical and mental wellbeing, often engaging in regular exercise and prioritising nutritious diets. They place significant value on authenticity and experiences. They are eager to engage in meaningful interactions, explore their passions and make a positive impact on society.

They often opt for socialising in environments that foster creativity, such as art exhibits, poetry slams or music festivals. They engage in outdoor activities, volunteer work or participate in clubs and organisations that cater to their diverse interests.

Younger people embodying 'less but better' mantra when it comes to alcohol consumption

Malandrakis commented: "Mindful drinking and sober curiosity, moderation initiatives, dry venues and events, no/lo-focused retailers and a cornucopia of launches and innovation are blurring the definitional lines between the alcoholic and non-alcoholic universes. There is little doubt that the once-niche trend is now established and evolving, yet still holds huge untapped future potential."

For younger cohorts who would historically be at the forefront of high-energy occasions and high-volume alcohol consumption, the theme of moderation is also a key part of the ubiquitous 'less but better' premium mantra.

Next-gen euphorics, alcohol proxies and virgin extensions have come of age. New formulations are targeting a plethora of consumption occasions; functionality cues are increasingly replacing simplistic dealcoholising approaches; and botanical alchemy and molecular experimentation with new ingredients are making the no/lo landscape one of the most innovative and exciting in the alcohol ecosystem. And it is here to stay.

Find out more on the alcoholic drinks sector here and how it is faring in the e-commerce world here.

ENDS

Euromonitor Press Office

Press@euromonitor.com

Euromonitor International is the world's leading provider of global business intelligence, market analysis and consumer insights. From local to global and tactical to strategic, our research solutions support decisions on how, where and when to grow your business. With offices around the world, analysts in over 100 countries, the latest data science techniques and market research on every key trend and driver, we help you make sense of global markets.

Campus Adds New Areas of Studies for Students to Choose From … – University of California, Merced

New students or those who have not yet chosen their majors will have an array of options before them.

Five new majors and several new emphases, ranging across all three schools, are all coming online in 2024 and are recruiting students now.

New Bachelor of Science degrees:

New Bachelor of Arts degrees:

New emphases:

In the mechanical engineering major:

In the political science major:

In the sociology major:

Students who enroll in the public health Bachelor of Science program can do so as part of the medical education pathway or as the basis for a multitude of other health care-related careers.

"The standard Bachelor of Science program has more biology, physiology and nutrition science than the Bachelor of Arts major," said Professor Nancy Burke, who led the development of the new major. "We also have a health professionals/pre-med track that incorporates all the preparation students would need to apply to medical school or for any other health professional degree."

Those who are interested in careers in aerospace engineering can now choose that subject as an emphasis within their mechanical engineering degrees. And those in cognitive science can now enroll in that major's new honors program.

Mechanical engineering Professor Ashlie Martini, who holds the Monya Lane and Robert Bryant Presidential Chair in Excellence in Engineering, said aerospace engineering has been a subject of interest among students and faculty for quite a while.

"Many of our mechanical engineering undergraduate alumni go into aerospace companies already, so it has been something we have wanted to bring to UC Merced," she said. "We are starting with an emphasis, but if we see a lot of students signing up for this, that would encourage the creation of a major."

Faculty have created four new classes for the aerospace engineering emphasis: aerospace structures and materials; flight dynamics and control; aeroelasticity; and aerospace propulsion. Technically, the classes are electives, so anyone in the mechanical engineering major can take them.

Students interested in the growing field of data science have two choices: data science and analytics, and data science and computing.

"Majors are really a prescribed set of courses of study, and data science is so big and so new, it's not possible to have a one-size-fits-all program," said Professor Suzanne Sindi, chair of the Department of Applied Math and co-author of the data science and computing major.

Data science, the availability, collection and analysis of data, has changed every field of study, Sindi said.

"It behooves us to make sure that our students understand not just a domain, or area of study, but that they understand the data that exists within the domain and what you can use it for," she said. "It also helps them understand the world we live in."

The data science and analytics major is also a choice for those interested in understanding this growing field of study.

"Data is at the core of modern-day systems-thinking and decision-making. The DSA major will not only provide an accessible bridge, but one that serves as a launchpad for clearing the digital divide, such that graduates will be well-equipped to think critically and tell stories with data," said management of complex systems Professor Alexander Petersen.

For students who lean more toward engineering, there is the new chemical engineering degree, offered by the Department of Materials Science and Engineering. Professor Kara McCloskey, who leads the chemical engineering program, said obtaining the degree opens a plethora of career paths for graduates.

"It is a popular engineering major, it's considered a traditional engineering major, most of the other UC campuses offer it, and industry, including the food and beverage industries in the Central Valley, has been asking us when we are going to offer it," she said. "They hire many of our students anyway, but chemical engineering is the right training for many other students they want to hire."

Chemical engineering includes a lot of mass separations, especially at a large scale, which are part of food- and wine-making processes, McCloskey explained.

"There has been a demand for chemical engineers, and especially those from the Valley, by Valley industry," department chair Professor Valerie Leppert said. "It helps them retain workers because people can remain closer to family and home."

Big Data and Data Science: The Perfect Synergy – CityLife

Big data and data science are two buzzwords that have been dominating the technology landscape in recent years. While they may seem interchangeable to some, they are, in fact, two distinct concepts that complement each other perfectly. In this article, we will explore the synergy between big data and data science and how their collaboration is revolutionizing various industries.

Big data refers to the massive volume of structured and unstructured data generated by businesses, individuals, and machines every day. This data comes from various sources such as social media, sensors, digital images, and videos, among others. The challenge with big data lies in its sheer volume, velocity, and variety, making it difficult for traditional data management tools to process and analyze. This is where data science comes into play.

Data science is an interdisciplinary field that uses scientific methods, algorithms, and systems to extract knowledge and insights from structured and unstructured data. It involves the application of advanced analytics techniques, such as machine learning, artificial intelligence, and statistical modeling, to make sense of the vast amounts of data available. Data scientists are skilled professionals who can transform raw data into valuable information, enabling organizations to make data-driven decisions and gain a competitive edge.

The synergy between big data and data science is evident in the way they work together to unlock the full potential of data. Big data provides the raw material, while data science offers the tools and techniques to process and analyze it. Together, they enable organizations to harness the power of data and turn it into actionable insights that drive business growth and innovation.

One of the most significant benefits of this synergy is the ability to make more informed decisions. With the help of data science, organizations can analyze big data to identify patterns, trends, and correlations that were previously hidden. This information can be used to make better decisions, optimize processes, and improve overall efficiency. For example, retailers can use big data analytics to understand customer preferences and tailor their offerings accordingly, while healthcare providers can leverage data to predict disease outbreaks and develop targeted treatment plans.

Another advantage of the big data and data science synergy is the ability to develop new products and services. By analyzing customer data, companies can identify gaps in the market and develop innovative solutions to address them. For instance, streaming services like Netflix and Spotify use big data analytics to recommend personalized content based on user preferences, while financial institutions use data science techniques to develop sophisticated risk models and fraud detection systems.
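To make the mechanics concrete, here is a toy sketch of the nearest-neighbor collaborative-filtering idea behind such recommendations. It is illustrative only, not how Netflix or Spotify actually build their systems; the ratings, titles and recommendation rule are all invented for this example.

```python
import numpy as np

# Toy collaborative filtering: recommend to a target user the title that
# the most similar other user rated highest. A rating of 0 means "unrated"
# (a deliberate simplification in this sketch).
ratings = np.array([
    [5.0, 4.0, 0.0, 1.0],   # user 0: likes the dramas, hasn't seen Sci-fi C
    [4.0, 5.0, 5.0, 0.0],   # user 1: similar tastes, loved Sci-fi C
    [1.0, 0.0, 5.0, 4.0],   # user 2: sci-fi fan
])
titles = ["Drama A", "Drama B", "Sci-fi C", "Sci-fi D"]

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

target = 0
others = [u for u in range(len(ratings)) if u != target]
neighbor = max(others, key=lambda u: cosine(ratings[target], ratings[u]))

# Recommend the neighbor's top-rated title the target hasn't rated yet.
unseen = [j for j in range(len(titles)) if ratings[target, j] == 0]
pick = max(unseen, key=lambda j: ratings[neighbor, j])
print(f"recommend {titles[pick]!r} to user {target}")   # -> 'Sci-fi C'
```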

Moreover, the combination of big data and data science has led to significant advancements in artificial intelligence and machine learning. These technologies rely on vast amounts of data to train algorithms and improve their accuracy over time. As a result, we have seen the development of self-driving cars, virtual assistants, and advanced robotics, among other innovations.

The synergy between big data and data science has also had a profound impact on various industries, including healthcare, finance, retail, and manufacturing. In healthcare, big data analytics has enabled the development of personalized medicine, where treatments are tailored to individual patients based on their genetic makeup and medical history. In finance, data science techniques have revolutionized risk management and fraud detection, while in retail, big data has transformed customer relationship management and supply chain optimization.

In conclusion, the perfect synergy between big data and data science has unlocked the true potential of data, enabling organizations to make more informed decisions, develop innovative products and services, and drive business growth. As technology continues to advance, the collaboration between these two fields will only become more critical, shaping the future of industries and the world as a whole.

A.I. is coming for the jobs you'd least expect, and it means … – Fortune

By Sydney Lake, June 14, 2023, 6:11 PM

Jeff Maggioncalda, CEO of Coursera. Photo courtesy Coursera

ChatGPT may seem like a more savvy and interactive version of Google or another search engine, but this and other artificial intelligence tools could replace and displace workers in nearly every industry. During the next five years, 75% of organizations are expected to adopt A.I. practices, a recent study by the World Economic Forum shows, and companies are anticipating a job loss rate related to A.I. adoption of 25%.

What's potentially more alarming is that 49% of workers could have half or more of their tasks exposed to large language models like ChatGPT, according to a recent study from the University of Pennsylvania. A potential job loss rate that high, coupled with a major shift in job requirements, means that people whose jobs were taken over by A.I. will need to be reskilled.

That's where online education and upskilling companies like Coursera step in. Coursera offers nearly 6,000 online courses, professional certificates, and degree programs for the most in-demand industries, including data science and analytics, digital marketing, project management, and much more.

A.I. and other advanced technologies have already infiltrated jobs in the service industry, including waiters and freight movers, because they're more "repetitive, repeatable, predictable" jobs, as Jeff Maggioncalda, Coursera's CEO, puts it. But language-focused jobs, including teachers and lawyers, will become more vulnerable to advancements in ChatGPT and other generative A.I. tools, and credentials earned via Coursera can help workers upskill for the jobs most susceptible to A.I. development.

"If all these jobs become a lot more vulnerable, then everybody's in the reskilling world," Maggioncalda says. "If you don't know how to use A.I. for your job, you're in trouble. All employers want you to be able to use this if you've graduated."

Courses and certificates vs. degree programs

Advanced degree programs can serve as a launchpad to more job opportunities and higher pay. MBA programs, as well as master's degree programs in data science, can help students double their salary post-graduation and make major career moves.

But everything is at a price, as Maggioncalda notes, and while there will almost certainly be value in degree programs, they come at a higher cost than other options. Through online learning, people can earn certificates and other credentials by completing individual courses to learn the skills they need, often at a lower cost and with more flexibility.

"I do think that there is a much higher level of competition and alternatives available," Maggioncalda adds. "For the learner, it looks like more options than alternatives. For the university, it looks like competition."

More universities could start offering different pathways to degree programs, meaning that there could be more lenient admissions thresholds. One example of this is University of Colorado Boulder's new online master's degree program in computer science on Coursera, which requires prospective students to take three preliminary courses. Ball State University also recently launched computer science and data science master's programs with similar requirements.

Admissions for CU Boulder's and Ball State's respective programs aren't based on an applicant's prior academic history, and a bachelor's degree isn't required. Rather, prospective students choose a pathway in either algorithms or software architecture and must earn at least a B in the three courses associated with that pathway before they can qualify to enroll in for-credit courses. Plus, the CU Boulder program costs just $15,750 and has a pay-as-you-go model that only charges students for those classes in which they're currently enrolled. Some computer science master's programs can cost six figures to complete.

"Degrees are going to become more affordable and more flexible," Maggioncalda predicts.

Other advanced tech usage in online education

To give students a more realistic and immersive online learning experience, Coursera plans to release more virtual reality (VR) components to its online classes, from public speaking to Chinese language courses to physiology.

In an online public speaking course from the University of Washington on Coursera, students can practice giving speeches in front of a virtual audience and receive feedback in real time from their professor. Students can even adjust the venue type in which they want to practice, whether it be an auditorium or a conference room, and choose a calm or restless audience, Maggioncalda adds.

To practice more realistic interactions using a new language, a beginner's Chinese language course offered by Peking University places students in a VR setting in a Chinese market, which can help students practice their listening, speaking, pronunciation, and vocabulary. What's more, students enrolled in a physiology course from Duke University can even hop inside blood vessels via VR to learn more about measuring blood pressure and other physiological systems.

Why online education can be an accessible option for workers looking to reskill

Online programs can be a more suitable option for students who want lower costs and more flexibility in completing a program. More than one-third of online learners say that their top motivation for considering further education was a stalled career or stalled career search, according to a recent McKinsey & Co. report. But among the top concerns for online learners is a lack of access to one-on-one mentoring and coaching for post-grad opportunities.

To combat these concerns, Coursera this spring launched its coach function, which is powered by ChatGPT. Online students can use this function to ask specific and targeted questions about course material, Maggioncalda says.

Fortune sat down with Maggioncalda last month to see a demo of the new feature, and the bot will only answer questions related to the course, which can make it a more targeted and reliable source than turning to larger search engines like Google. "The coach uses the course to answer questions rather than searching the entire internet," he explains. The coach can also provide summaries of lectures and even point learners to recommended video clips to answer their questions.
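The pattern described, restricting a chatbot's answers to course material rather than the open web, can be sketched with plain retrieval. This is a toy illustration under stated assumptions, not Coursera's implementation: the passages, the word-overlap scoring and the refusal threshold are all invented.

```python
# Toy "answer only from the course" retrieval: score each course passage by
# word overlap with the question and decline questions the material doesn't
# cover. Real systems use embeddings plus an LLM; this only shows the idea.
course_passages = [
    "Precision is the share of flagged items that are truly positive.",
    "Recall is the share of true positives the model actually flags.",
    "Gradient descent updates parameters in the direction of lower loss.",
]

def answer(question: str, min_overlap: int = 2) -> str:
    q_words = set(question.lower().split())
    scored = [(len(q_words & set(p.lower().split())), p) for p in course_passages]
    score, best = max(scored)
    # Below the overlap threshold, refuse rather than guess.
    return best if score >= min_overlap else "That isn't covered in this course."

print(answer("what is precision in a model?"))
print(answer("who won the 1998 world cup?"))
```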

"The main change in the way people learn, because of A.I., is that it's going to be more personalized. It's going to be more interactive," Maggioncalda says. "The easiest way to think about it is if every single student had a personal [teaching assistant]. You'll have someone to help you with your career coaching and help you with your studies."

Check out all of Fortune's rankings of degree programs, and learn more about specific career paths.

Salesforce to Expand Data Cloud Connectivity with New Connectors … – Datanami

CHAPEL HILL, N.C., June 15, 2023 – CData Software, a leading provider of real-time data connectivity solutions, has announced that Salesforce Data Cloud customers will have access to select connectors from CData that can be leveraged to expand the Connectors Catalog and bring more data into Data Cloud from their SaaS, database and file-based sources.

In today's market, customers expect proactive, personalized, and connected experiences across digital channels. With CData Connectors, Salesforce customers will be able to streamline access to customer data across a wide collection of data sources and touchpoints, enabling organizations to better serve their customers and gain a competitive advantage.

"CData has been a vendor for Tableau for several years and has enhanced Tableau's ability to integrate with the data sources customers care about, so integrating CData connectors with Salesforce Data Cloud just made sense," said Chandrika Shankarnarayan, VP, Product, Salesforce Data Cloud. "CData Connectors will increase our data connectivity options for customers using Data Cloud."

From the Fortune 500 and the Global 2000 to SMEs worldwide, thousands of organizations rely on CData connectivity to overcome data fragmentation and unlock value from diverse, dispersed data assets. Leading vendors across SaaS, data management, integration, analytics & BI, data science, data testing, AI & machine learning, and data governance embed CData connectivity to solve data access, data management, and data integration challenges.

"Leading vendors across almost every facet of data management embed our high-performance connectivity to solve their data access and integration challenges," said Amit Sharma, CData co-founder and CEO. "We continue to expand that connectivity, bringing our services to one of the most popular platforms for businesses worldwide to improve our customers' ability to access and action their disparate data to produce truly exceptional customer experiences."

About CData Software

CData Software is the real-time data connectivity company. Our self-service data products and connectivity solutions provide universal access to live data from hundreds of popular on-premises and cloud applications. Millions of users worldwide rely on CData to enable advanced analytics, boost cloud adoption, streamline operations, and create a more connected business. Consumable by any user, accessible within any application, and built for all enterprises, CData is redefining data-driven business.

Source: CData

Leveraging Big Data And AI For Disaster Resilience And Recovery – Texas A&M University Today

Researchers in the Urban Resilience.AI Lab are taking the lead in harnessing community-scale big data to develop artificial intelligence-based models with the potential to impact communities before, during and after a natural disaster or crisis.

Getty Images

In a world where natural hazards can strike at any time with devastating consequences, reducing their impacts in advance may seem impossible.

The unpredictability of these events and their effects makes it challenging to anticipate and respond appropriately, leaving individuals and communities vulnerable to their destructive effects.

In 2017, Hurricane Harvey made landfall along the Texas coast as a Category 4 hurricane and slowed to nearly 5 mph as it moved inland. Palacios experienced a storm surge exceeding 8 feet. The storm dumped 56 inches of rain in the Friendswood area and more than 60 inches near Nederland and Groves. The National Hurricane Center reported $125 billion in damage from Hurricane Harvey.

Now, researchers from the Zachry Department of Civil and Environmental Engineering at Texas A&M University have created models using big data and artificial intelligence (AI) to help communities prepare for future natural disasters, assess the impacts and monitor the recovery in near real time. They used data from Harvey to test these AI-centric solutions.

Led by Dr. Ali Mostafavi, Zachry Career Development Associate Professor, the Urban Resilience.AI Lab is leveraging AI and big data for predictive risk analysis, predictive infrastructure failure assessment, situational awareness during the event, monitoring of recovery and rapid assessment of the impacts of disasters.

"When facing hazards, there are certain capabilities that AI and data science approaches provide or enhance that can improve the resiliency of communities to these disasters," Mostafavi said. "Our vision over the past four or five years has been to improve disaster resilience by developing different classes of AI-based models that could provide foresights and insights critical for mitigation, preparedness, response and recovery."

The growth of data from different types of sensors from physical, social and other sensing technologies has given researchers tremendous information to work with in creating these models.

"These days, cities and communities are essentially data factories. You can evaluate sensor data related to the condition of the community facing hazards from the traffic network cameras, people's cell phone activities, power outage data, flood gauge data, satellite images and many other sources of technology that harness the heartbeat of the community," Mostafavi said. "As our cities and communities become smarter with information and communication technologies, the amount of data generated grows."

Mostafavi and his team in the Urban Resilience.AI Lab are taking the lead in harnessing community-scale big data to develop AI-based models with the potential to impact communities before, during and after a natural disaster or crisis.

Roads are essential in urban cities, allowing goods, information and people to move from place to place. But during times of disaster, such as floods, road networks can be damaged or blocked, which impacts access to services such as hospitals, shelters and grocery stores. During floods in urban areas, vehicle accidents resulting from driving on flooded roads have been identified as a leading cause of fatalities, highlighting the failures of road networks.

Researchers have developed a deep-learning framework to effectively predict near-future flooding of roads. They tested the framework using three models, and the results showed that it can accurately predict the flooding status of roads with 98% precision and 96% recall. Researchers validated the models using the 2017 Hurricane Harvey flooding.
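For readers unfamiliar with the two figures quoted, precision and recall count different kinds of mistakes. A minimal sketch, using invented labels rather than the team's data, shows how both are computed for a binary "flooded / not flooded" road classifier:

```python
# 1 = road flooded, 0 = road clear; predictions are invented for illustration.
y_true = [1, 1, 1, 0, 0, 1, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1, 0, 1]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # correctly flagged
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false alarms
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # missed floods

precision = tp / (tp + fp)  # of roads flagged as flooded, how many really were
recall = tp / (tp + fn)     # of roads that flooded, how many were flagged
print(f"precision={precision:.2f}, recall={recall:.2f}")     # 1.00 and 0.83 here
```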

Knowing the flooding status of roads can help affected communities avoid flooded streets and aid emergency management agencies in planning evacuations and delivery of resources.

An aerial view of homes under water after Hurricane Harvey. The National Hurricane Center reported $125 billion in damage from the storm.

Getty Images

The National Oceanic and Atmospheric Administration reports that from 1980 to 2020, the damage caused by hurricanes in the United States reached $945.9 billion, with 6,593 deaths. Knowing what to do before a hurricane hits is essential to decreasing the adverse effects and disruptions it can bring. This means having enough food, water and other essentials and making necessary home repairs.

Historically, surveys have measured how well households are prepared during hurricane season. Researchers used location-based big data from smartphone devices to proactively monitor hurricane preparation. They looked at three critical dimensions of hurricane preparedness: extent (how widespread is it?), timing (how early do people start preparing?) and spatial distribution (where are the hotspots?).

"We have developed metrics and models so that we can proactively monitor community hurricane preparedness. Which areas are preparing earlier? Which areas are preparing more, based on how many trips they make to grocery stores, pharmacies and gas stations?" Mostafavi said. "We can identify areas that are underprepared. If an underprepared area hasn't evacuated, it's a recipe for disaster."

Researchers focused on smartphone data on visits to businesses, such as grocery stores and pharmacies, to indicate how prepared the local population was for Hurricane Harvey in 2017. The results labeled regions with a decrease in visits as underprepared and those with an increase in visits as highly prepared. They saw that low-income households were more likely to prepare for the hurricane than those with higher incomes, likely because they lacked the means to protect themselves in other ways. The study outcomes allow emergency response managers and public officials to identify underprepared areas and better allocate resources promptly.
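A hedged sketch of the labeling rule just described: compare each region's pre-storm visits to essential businesses against its normal baseline and flag decreases as underprepared. The column names and figures below are invented; the study used anonymized smartphone location data, not this toy table.

```python
import pandas as pd

# Hypothetical visit counts per region: baseline week vs. the pre-storm week.
visits = pd.DataFrame({
    "region": ["A", "B", "C", "D"],
    "baseline_visits": [1000, 800, 1200, 500],
    "prestorm_visits": [1400, 760, 1900, 420],
})

# Relative change in trips to grocery stores, pharmacies and gas stations.
change = (visits["prestorm_visits"] - visits["baseline_visits"]) / visits["baseline_visits"]
visits["preparedness"] = ["highly prepared" if c > 0 else "underprepared" for c in change]
print(visits[["region", "preparedness"]])
```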

"If areas are impacted (and unprepared), they won't have power. They won't have water. They won't have food. They won't have medications," Mostafavi said. "But by using our models, we can identify the hotspots of various underprepared areas proactively before a hurricane lands."

Researchers proposed and tested an adaptive AI model that can learn how people typically move from place to place and adapt when an emergency, like flooding, a wildfire or a hurricane, happens. This model is helpful because there are not a lot of data available about how people travel during an emergency.

"Using reinforcement learning techniques we have developed, we can identify flood impacts on traffic patterns and disrupted access to critical facilities," Mostafavi said.

To test the model, millions of past travel trajectories were used. The researchers tested the model to see how flooding during Hurricane Harvey impacted traffic and congestion in Houston. Results showed the model could make accurate predictions with a mean percentage error of 4.26% and precision and recall at the learning stage of 76%.

Instead of merely choosing the shortest route, the model leverages the vehicle's surroundings to anticipate its movement. Researchers say the model can identify which roads are affected by flooding and simulate mobility during different flood scenarios.

Researchers have created a novel deep-learning model that utilizes high-resolution satellite images to accurately categorize varying degrees of destruction following natural disasters.

Getty Images

After a natural disaster hits, the damage assessment of homes and buildings can take months.

To address this challenge, researchers (in collaboration with Simon Fraser University) developed a new deep-learning model called DAHiTrA, which uses high-resolution satellite images to classify different levels of destruction after natural disasters. This can be helpful after large-scale events like Hurricane Harvey.

The model recognizes the geographic features in different locations and captures the changes over time. It then compares images of a building taken before and after a disaster to determine the level of damage. It can also be applied to other types of civil infrastructure, such as roads and bridges.
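The published model is a deep network trained on paired imagery, which is beyond a short example, but the core idea of differencing pre- and post-disaster images and bucketing each building footprint by how much of it changed can be sketched on synthetic arrays. This is a toy illustration, not the DAHiTrA architecture, and the thresholds and footprint are invented.

```python
import numpy as np

# Toy change detection on synthetic tiles -- not the DAHiTrA model.
rng = np.random.default_rng(0)
pre = rng.random((64, 64))            # pre-disaster satellite tile
post = pre.copy()
post[20:40, 20:40] += 0.8             # simulate a damaged building's pixels

changed = np.abs(post - pre) > 0.5    # per-pixel change mask
footprint = np.zeros((64, 64), bool)
footprint[18:42, 18:42] = True        # building boundary (assumed known)

frac = changed[footprint].mean()      # share of the footprint that changed
levels = [(0.05, "no damage"), (0.3, "partial damage"), (1.01, "major damage")]
label = next(name for cut, name in levels if frac <= cut)
print(f"changed fraction={frac:.2f} -> {label}")
```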

"The satellite images are available within 24 hours, and our models are fast," Mostafavi said. "So, the day after an event, you can know how many buildings have been damaged, the extent of the damage and how many buildings have major damage."

One of the fundamental strengths of DAHiTrA is its ability to accurately determine the boundaries of a building, which allows for more precise damage assessment. It also detects multiple types of damage, including collapse, partial damage and water damage.

The model analyzes large volumes of satellite images in a short time, which is essential for rapid response and recovery efforts after a disaster. This allows for faster and more accurate damage assessments, which can help communities and governments allocate resources more effectively.

Mostafavi said the team is actively working with government agencies, emergency management offices and even international organizations to use their model to assess building damage. One recent example is a use-case project with the Texas Department of Emergency Management.

Researchers also examined how people recover after a disaster, specifically during the short-term period after a hurricane. The study focuses on when people return home after evacuating and how long they can stay in their homes without having to move out again. The study uses location-based data to see how quickly people in different areas can do these things.

The study found that more than half of the census tracts in Harris County returned from evacuation within five days after the landfall of Hurricane Harvey and stopped moving out after six weeks. Some areas take longer than others to recover, and there are differences among different groups of people.

Researchers examined how people of different socio-demographic statuses (like income or rental status) responded to flooding. By looking at how quickly they evacuated or relocated, the team could understand different recovery patterns in various subpopulations. While the study shows it took longer to return in flooded areas than in nonflooded areas, there wasn't a significant difference between the two areas regarding evacuation and relocation for low-income populations.

The return times of high-income census tracts were longer than those of low-income census tracts when flooded, indicating the inability of low-income populations to evacuate and relocate. The study also found that areas with shorter return durations may not be more resilient to disaster, as they may instead reflect challenges faced by low-income and minority populations that require additional assistance and resources.

"Our vision is to create a future where data science and AI technologies are leveraged to predict, prepare for and equitably respond to natural hazards, enabling us to mitigate their impact on communities," Mostafavi said. "Our work so far shows the promising potential of big data and AI for augmenting different capabilities needed for disaster resilience. We are collaborating with various public and private sector organizations to develop, scale and deploy the AI technologies created in the lab."

The Power of Data Visualization in Data Science – CityLife

Unlocking Insights: The Power of Data Visualization in Data Science

In today's data-driven world, data science has emerged as a powerful tool for organizations to extract valuable insights from vast amounts of information. As the volume of data continues to grow exponentially, it becomes increasingly challenging for data scientists and analysts to make sense of it all. This is where data visualization comes into play, as it enables professionals to unlock insights from complex data sets and communicate them effectively to stakeholders.

Data visualization is the graphical representation of data, which allows users to understand trends, patterns, and correlations that might go unnoticed in traditional text-based data presentation. By presenting data in a visual format, it becomes easier for the human brain to process and interpret the information, ultimately leading to more informed decision-making. The power of data visualization in data science lies in its ability to make data more accessible and understandable to a wider audience.

One of the key benefits of data visualization is that it helps data scientists and analysts identify patterns and trends in the data quickly. By visually representing data, it becomes easier to spot outliers, clusters, and trends that might be hidden in rows and columns of numbers. This can be particularly useful in the early stages of data analysis, as it allows professionals to generate hypotheses and identify areas that require further investigation.
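As a small illustration of that point, the sketch below plants one anomalous record among 200 otherwise well-behaved ones; in a table it is easy to overlook, but a scatter plot exposes it at a glance. The data and variable names are synthetic.

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(50, 10, 200)           # e.g., units sold
y = 2 * x + rng.normal(0, 5, 200)     # revenue tracks units closely...
x[100], y[100] = 55, 250              # ...except for one anomalous record

plt.scatter(x, y, s=12)
plt.xlabel("units sold")
plt.ylabel("revenue")
plt.title("One point breaks the trend")
plt.show()
```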

Data visualization also plays a crucial role in the communication of insights derived from data analysis. Often, the results of data science projects need to be presented to stakeholders who may not have a deep understanding of the underlying data or the technical aspects of the analysis. In such cases, data visualization can be an effective way to convey complex information in a simple and easily digestible format. By presenting data visually, it becomes easier for stakeholders to grasp the key insights and make informed decisions based on the findings.

Moreover, data visualization can help facilitate collaboration between data scientists, analysts, and other stakeholders. When working with large and complex data sets, it can be challenging for team members to stay on the same page and understand the various aspects of the data. By using visualizations, team members can easily share their findings and collaborate on the analysis, ensuring that everyone has a clear understanding of the data and the insights derived from it.

In addition to its benefits in data analysis and communication, data visualization can also be a powerful tool for storytelling. By combining data with visual elements, data scientists and analysts can create compelling narratives that help convey the significance of their findings to a broader audience. This can be particularly useful when presenting the results of data science projects to non-technical stakeholders, as it allows them to understand the impact of the insights on their organization or industry.

Despite its many advantages, data visualization is not without its challenges. One of the key challenges in data visualization is selecting the appropriate type of visualization for the data at hand. With a wide array of visualization techniques available, it can be difficult to determine which one will best convey the insights derived from the data. To overcome this challenge, data scientists and analysts must have a deep understanding of the various visualization techniques and their applications, as well as the ability to select the most appropriate one based on the data and the insights they wish to communicate.

In conclusion, data visualization is a powerful tool in the field of data science, as it enables professionals to unlock insights from complex data sets and communicate them effectively to stakeholders. By presenting data in a visual format, data scientists and analysts can identify patterns and trends, facilitate collaboration, and create compelling narratives that help convey the significance of their findings. As the volume of data continues to grow, the importance of data visualization in data science will only continue to increase, making it an essential skill for professionals in the field.

Bees360 and OneClick Code Revolutionize Property Inspections – GlobeNewswire

HOUSTON, June 15, 2023 (GLOBE NEWSWIRE) -- Bees360, provider of drone-enabled property inspection services powered by artificial intelligence (AI), and OneClick Data, a trusted data partner dedicated to streamlining the building code-sourcing process for all parties in the roofing industry, are leveraging each other's technologies to revolutionize property inspections and improve the insurer and homeowner experience.

Insurers and homeowners rely on the opinions and expertise of individual inspectors when evaluating property damage. Traditional inspection processes and identifying proper building codes and taxation rates are time-consuming, potentially impacting the claim adjudication timeline and customer experience. Bees360 and OneClick Data each solve individual pieces of this problem.

"Bees360 provides a new solution to an old problem for insurance carriers: helping policyholders become whole after a loss as quickly and painlessly as possible," said Courtney Cooke, vice president of sales at Bees360. "Through the use of AI, we provide increased accuracy and consistency in a manner not achievable through the traditional inspection process. The added benefit for our carrier clients is the cost savings this automation allows, helping reduce loss adjustment expenses."

Bees360 uses drones and artificial intelligence (AI) technologies to streamline and modernize the traditionally manual and labor-intensive property inspection process. Certified drone pilot inspectors navigate the property, collecting multiple data points through high-resolution imagery. The property data is analyzed by an AI engine that identifies any potential issues with the property, such as wind- or hail-related damage to the roof and exterior elevations. Leveraging the municipality building code and taxation data from OneClick Data's flagship product, OneClick Code, enables Bees360 to deliver a comprehensive profile of the property and integrate it into the claims process.

"For decades, contractors, property inspectors, and insurers have had the challenge of accurately identifying building codes for property inspections," said Jessi West Lundeen, director of marketing and communications at OneClick Data. "The combined capabilities of Bees360 and the OneClick Code product remove the guesswork and provide a reliable source of truth for any property."

The OneClick Code product delivers accuracy, transparency, and reliability to contractors, inspectors, and insurers by providing accurate building codes at the click of a button. Accessing building code data is complicated and can lead to misinformed property inspections and repair estimates. The data provided in the OneClick Code product is always up to date and available on demand, mitigating any guesswork.

Insurers, property inspectors, roofing contractors, and certified drone pilots can learn more about how you can enhance your property inspection experience by contacting Bees360 and OneClick Data respectively.

About Bees360
Bees360 is a leading innovator of deep learning and computer vision technology, focused on bringing AI and drone solutions to property underwriting and claim inspections. It was founded by data scientists and insurance experts whose collective background in data science, mathematics, and property claims and underwriting provides an unparalleled level of knowledge to build AI-powered workflows and a disruptive business model for the conventional insurance industry. Learn more at http://www.bees360.com.

About OneClick Code
OneClick Data is a trusted data partner revolutionizing the roofing industry by streamlining the building code-sourcing process and enhancing efficiency and transparency for all stakeholders. With their flagship product, OneClick Code, they have pioneered data automation of jurisdictional authority for building codes, permit fees, taxes, and manufacturer specifications nationwide. Empowering contractors, insurance professionals, and claim adjusters, OneClick Code provides instant access to trusted roofing codes with a simple click. With comprehensive tools and integrations, they enable rapid, precise, and scalable claims settlement, reshaping claims management with unmatched efficiency and accuracy. Welcome to the era of automated restoration intelligence. To learn more, visit http://www.oneclickdata.com.

Media Contact:
Jennifer Overhulse
St. Nick Media Services
jen@stnickmedia.com
859-803-6597

New Geospatial Strategy to boost UK’s standing as location … – GOV.UK

Fresh commitments to turbocharge the UK's location data and technologies can revolutionise the planning of public services and boost business, the Science and Technology Secretary unveiled yesterday (Wednesday 14 June) in a keynote speech at London Tech Week.

The UK Geospatial Strategy 2030 will unlock billions of pounds in economic benefits by harnessing technologies such as AI, satellite imaging and real-time data to boost location-powered innovation and drive the use of location data in property, transport, utilities and beyond.

Geospatial applications and services play a vital role in everyday life, from online maps used by billions across the world to speeding up delivery for online shoppers, in turn boosting businesses and the economy.

They can also transform public sector delivery, from planning the country's transport infrastructure to improving emergency responses, such as during the pandemic, ensuring taxpayers get the best service for their money.

Viscount Camrose, Minister for AI and Intellectual Property, Department for Science, Innovation and Technology, said:

"Location data and technologies power our economy, and our Geospatial Strategy 2030 will ensure the UK remains an innovating world leader, by building on our successes and harnessing new opportunities from rapidly evolving technologies.

"Our new strategy will grow our economy, embed the application of geospatial data in key decisions that affect our day-to-day lives, and drive the private and public sector creativity needed to cement the UK's status as a global science and technology superpower."

The renewed strategy sets out the Geospatial Commission's priorities for the rest of this decade, building on its landmark 2020 paper and ensuring the UK stays at the forefront of rapidly evolving technologies key to driving innovation and growing our economy.

The strategy sets out three renewed missions:

The strategy builds on the work to roll out the National Underground Asset Register (NUAR), a government-led programme creating a single, standardised data-sharing platform on the location and condition of underground pipes and cables, to the first cohort of users in North East England, Wales and London; the Commission's report on how location data can support the rollout of electric vehicle (EV) charge points; and the renewal and modernisation of key multimillion-pound contracts that give the public sector access to world-leading geospatial data and services.

Sir Bernard Silverman, Chair, Geospatial Commission said:

"Location-based insights have proven their enduring value again and again. During the coronavirus (COVID-19) pandemic, tracking outbreaks across the country was critical to inform public health decisions. The power of location will continue to underpin solutions to our biggest challenges, including climate change, energy security and economic growth.

"Our strategy supports the drive towards the adoption of critical technologies and continued investment in UK research and development."

Growing the UK geospatial sector

The strategy sets an ambition for the National Underground Asset Register (NUAR) to be fully operational across England, Wales and Northern Ireland by the end of 2025.

The Geospatial Commission will also conduct a review of the Public Sector Geospatial Agreement (PSGA) to ensure it remains fit for purpose over the remaining seven years of the agreement. The PSGA sets out how Ordnance Survey provides world-leading location data and expertise to the public sector across England, Scotland and Wales.

Further announcements include:

Notes to editors

Further stakeholder comments can be found below:

Sir Adrian Smith, Institute Director, The Alan Turing Institute said:

"The new UK Geospatial Strategy 2030 is important to ensure that technologies developed using AI, machine learning and augmented reality continue to help researchers to better analyse and visualise data in new ways. Embracing these technologies to accelerate geospatial research will require effective partnerships across academia, industry and the public sector, and active public participation. We look forward to continuing our work with the Geospatial Commission to improve land use planning, resilience to climate change and infrastructure investment decisions."

Alison Lowndes, Senior Scientist, Global AI, NVIDIA said:

"The updated UK Geospatial Strategy 2030 offers a clear vision for how the UK will realise the economic, social and environmental opportunities offered by location data. Building the UK's skills and capacity in this sector, as well as applying artificial intelligence to unlock the capabilities of satellite imagery, will be key to maintaining global leadership in geospatial applications and services."

Professor Mike Batty, Chair, Centre for Advanced Spatial Analysis said:

"This century is increasingly one in which our society and economy will be underpinned by digital data. As part of this, locational data is becoming focal to a multitude of activities that have hitherto ignored geography. In this report, the Geospatial Commission sets out a roadmap to enable this digital transformation to embrace and apply the latest technologies, such as AI, remote sensing and cloud computing, to many private and public activities. The case they make is absolutely critical to the way business and government need to evolve in the next decade."

Rachel Tidmarsh, CEO, Bluesky International Ltd said:

"The UK Geospatial Strategy 2030 encapsulates the unlimited potential and opportunities of geospatial data and its applications. I was particularly pleased to read the continuing theme of innovation throughout the strategy, as this is what will undoubtedly maintain the UK's position as a world leader in geospatial and enable the public and private sector to unlock its true value. It is key to this success that the steps already taken to increase geospatial skills in the workforce, and enthuse the next generation about geospatial, are continued and amplified."

Tom Knights, European Lead, Strava said:

"The way Brits are living, working and moving around their towns and cities has changed. Communities across the UK are experiencing a boom in active mobility as people look to more efficient ways to get from A to B. We are proud to provide geospatial data from the Strava community to empower transport planners to build safe, efficient and enjoyable ways to move around the UK by bike or on foot."

Professor Anahid Basiri, Director, Centre for Data Science and AI said:

"We welcome this timely update to the UK's geospatial strategy, which draws on the opportunities of emerging technologies, such as AI and autonomous vehicles, as well as the challenges and concerns associated with them, to calibrate the UK's plans as a world-leading nation in providing solutions to the grand challenges of our time, such as climate crises, equitable and sustainable societies, active travel, affordable housing and many more."

Adam Burke, Chair, Association for Geographic Information (AGI) said:

"The AGI welcomes the publication of the refreshed UK Geospatial Strategy 2030. The strategy offers refreshed thinking for the UK's geospatial capabilities, allowing us to embrace change and innovation and drive greater use of location data opportunities to unlock the full potential of the power of location."

Professor Jon Mills, Director, Newcastle University / Centre for Doctoral Training said:

"Specialist geospatial education and training in the UK currently stands on a precipice, so it is good to read that the refreshed UK Geospatial Strategy 2030 considers skills as central to building confidence in the future geospatial ecosystem. To secure an appropriate geospatial talent pipeline for the future necessitates endeavours at all levels of education, from awareness in schools to high-level doctoral training in universities and lifelong professional development in industry. We look forward to working with the Geospatial Commission to develop best practice in embedding geospatial skills into university programmes to help ensure a sustainable future and societal prosperity for the UK."

Sue Daley, Director of Technology and Innovation at techUK said:

"We welcome the Government's continued focus on nurturing a thriving geospatial ecosystem. Investing in and scaling our capabilities in geospatial technologies will help the UK in its journey to becoming a science and technology superpower. We look forward to seeing how this strategy is put into action to unlock opportunities across our economy."

UC Berkeley spreads the gospel of data science with new college, free curriculum – Los Angeles Times

They comb through troves of legal records and video evidence to challenge wrongful convictions. They organize medical data to help personalize health treatments for better care. They scrutinize school test scores to investigate inequities. Finding safe drinking water is easier thanks to an analysis tool they created.

UC Berkeley's faculty and students are marshaling the vast power of data science across myriad fields to address tough problems. And now the university is set to accelerate those efforts with a new college, its first in more than 50 years, and is providing free curriculum to help spread the gospel of data science to California community colleges, California State University and institutions across the nation and world.

As data floods society faster than ever before, demand has surged for specialists who can organize and analyze it with coding skills, computing prowess and creative thinking. To meet the "insatiable" demand, as university officials put it, UC Berkeley will open a College of Computing, Data Science and Society after the University of California Board of Regents approved the plan Thursday.

A new college building is scheduled to open during the 2025-26 academic year and will house the data science major, first offered five years ago, with other degree programs in computer science, statistics, computational biology and computational precision health. Some of the programs will be run jointly with the Berkeley College of Engineering and UC San Francisco. UC Berkeley says no new state funds will be required; the campus has raised private funds for 14 new faculty positions and about $330 million so far in gifts for the new building.

"Infusing the power of data science across multiple disciplines, from basic and applied sciences to the arts and humanities, will help us to fully realize its potential to benefit society, help address our world's most intractable problems, and achieve our most visionary goals," said UC Berkeley Chancellor Carol Christ.

Christ told regents Wednesday that huge faculty and student demand, not a top-down decision, led to the data science program. In just five years, data science has become the university's fourth most popular major among more than 100 offered, with the number of students choosing it nearly doubling to 1,232 in fall 2022 from fall 2019. The number of students who took the introductory data science course was even larger, 4,291 this academic year, and many were majoring in other disciplines, including economics, psychology, sociology, political science and public health.

UC Berkeley's new college comes as the University of Southern California plans to expand its own footprint in the field with its new School of Advanced Computing. USC aims to bring computing instruction to all students as well as dramatically expand the number of degrees it confers in technology-related fields. It is part of a $1-billion plan to advance student understanding of the digital world across industries.

At UC Berkeley, the top-ranked public university also plans a broad reach for its mission. The campus is seeding data science into community colleges and other institutions to make the field more accessible to a diversity of students, offering a path to high-paying careers. UC Berkeley students majoring in computer science, for instance, earn an average annual income of $179,000 four years after graduation, according to federal education data. Graduates in data science earn an average annual income of about $130,000, according to Burning Glass, a nonprofit organization that researches employment trends.

The university has posted its curriculum online, complete with assignments, slides and readings, and shared it with more than 89 other campuses. Classes have launched or are set to begin this fall at six California community colleges, four Cal State campuses and other universities including Howard, Tuskegee, Cornell, Barnard and the United States Naval Academy.

"We want to expand access to computing and the possibilities of how people can learn these skills that will get them a better job," said Eric Van Dusen, a data sciences faculty member who is leading efforts to share Berkeley's curriculum with community colleges. "The UC is the biggest driver of middle-class advancement we have."

El Camino College in Torrance began offering a data science class based on Berkeleys curriculum in 2021. The professor, Solomon Russell, said the inaugural class included a large number of Black students who were drawn to the subject to investigate issues meaningful to them.

The class, for instance, used census files to examine the racial demographics of Alabama and determined that the probability of seating an all-white jury in a 1965 case involving a Black defendant's rape conviction and death sentence was extremely low, but the U.S. Supreme Court upheld the jury selection.
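The article does not give the figures the class used, but the 1965 case it describes matches Swain v. Alabama, which is commonly taught with a jury-eligible population that was roughly 26% Black. Under that assumption, a short simulation in the spirit of such coursework shows why repeatedly seating all-white juries points to non-random selection:

```python
import numpy as np

# If 12 jurors were drawn at random from a pool that is 26% Black (an assumed
# figure for illustration), how often would the jury end up all white?
rng = np.random.default_rng(1965)
p_black, jury_size, trials = 0.26, 12, 100_000

draws = rng.random((trials, jury_size)) < p_black   # True = Black juror drawn
p_all_white = (~draws).all(axis=1).mean()
print(f"P(all-white jury under random selection) ~ {p_all_white:.3f}")  # ~0.027
```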

"It enables you to be able to follow your curiosities, answer questions you have and look at the world in a new way," Russell said.

He invokes Netflix to pitch his class. Data science, he says, is a cool new field behind the streaming service's recommendations on what a subscriber would probably want to watch based on previous viewing habits.

Rebecca Gloyer, a second-year El Camino student, said she shied away from computing due to a lack of self-confidence but decided to rid herself of that "toxic energy" to get ahead. She took Russell's course, learned she could do the work without being a "crazy coder" and found parallels with her love of theater and drama.

"It's storytelling with numbers," said Gloyer, who is weighing offers to transfer to data science programs at UC Berkeley and UCLA.

She said she hopes to use her data science skills to promote environmental sustainability, a passion of hers. One potential project, she said, would use trash collection data to investigate how waste recycling policies are working.

Russell said that El Camino College would not have been able to start the data science class without Berkeley's curriculum, along with an online course offered by the UC campus that he and two colleagues took.

But finding faculty members willing to train themselves in the new field isn't always easy, Van Dusen said, and ensuring that students have the required math training is another challenge. So far, six of 116 California community colleges are offering the class to about 500 students: El Camino, Santa Barbara City College, City College of San Francisco, San Jose City College, Skyline College and Laney College. Cal State Fresno is using the curriculum, and CSU campuses at Humboldt, Pomona and Channel Islands plan to do so this fall.

"I'm trying to find the people who are going to lean in to teach this new curriculum," Van Dusen said. "And it's not everybody who's ready to take on learning a hard new thing."

The curriculum includes computing, statistics, ethics and about 25 different areas students can choose from, including social justice, biology and environmental sustainability. Ethical questions are front and center.

In one project, students are helping build a public database of California police records that could be used to investigate potential misconduct. But they must also assess the reliability of the information.

Jennifer Chayes, associate provost of the computing and data science division, developed an algorithm that took the bias out of computer screening of resumes after she found that women were less likely to get interviews for tech jobs than men. Students will likewise learn to examine the platforms they build or use to ensure fairness and learn skills to identify and fight misinformation, she said.

In another project, Berkeley's data scientists are working with public defenders nationally to create platforms that can search through reams of data that could help their clients, an effort to level the playing field with better-resourced prosecutors, Chayes said.

"We want to advance equity. We want to advance justice," Chayes said. "We want resources to be allocated equitably across society. These are values that the University of California and Berkeley hold near and dear."
