Category Archives: Data Science

Scientists bid to unlock the darkest secret of whisky's unique taste – Yahoo News UK

Scientists bid to unlock the darkest secret of whisky's unique taste (Image: Diageo)

FOR hundreds of years, it has been the process which gives whiskies their unique aroma, body and taste, although nobody is exactly sure how.

But now scientists from Heriot-Watt University in Edinburgh are working with whisky giant Diageo to quantify exactly how whisky gets its flavour from the cask during maturation.

Heriot-Watt University has assembled a team of experts for the project, which includes researchers from the International Centre for Brewing and Distilling (ICBD), but also scientists who specialise in chemistry, physics, machine learning and data science.


The three-year Knowledge Transfer Partnership (KTP) will investigate various new analytical methods to get to the bottom of what goes on in a whisky barrel.

The scientists will use spectroscopic methods to identify the natural compounds and understand the chemistry of the process that imparts such distinctive and complex flavours to the maturing spirit.

This information will be coupled with sensory input such as visual inspection of newly-manufactured barrels and the evolving flavour and nose of the whisky.

Together it will be used to develop a data science platform that will demystify the process of maturation.


Professor Annie Hill from Heriot-Watt's ICBD said: "Producing a quality Scotch whisky is an art.

"The new KTP with Diageo is particularly exciting because it combines traditional and novel methods to generate big data that may be used to further understand whisky maturation.

"The ability to more accurately predict the outcome of maturation based on the characteristics of the cask and new make spirit will enable significant improvements in inventory management and reduction of losses, leading to overall efficiencies in Scotch whisky production."


Until now, no one has been able to pin down scientifically what exactly happens inside the cask.


The maturation of a whisky takes place in three essential stages, through complex chemical reactions described as additive, subtractive and interactive maturation.

During additive maturation, the distillate begins to draw flavours and colours from the wood which are then distributed throughout the liquid, and a large number of chemical compounds are formed.

In addition, during additive maturation, the whisky absorbs compounds from the liquid that was previously stored in the barrel, such as sherry.

After the whisky has built up a broad spectrum of aromas, subtractive maturation is about breaking down unwanted flavours.

The third stage, interactive maturation, is the least well understood. This is where the peculiarity of the oak wood and the influence of the climate come into play.

During interactive maturation, gases enter and escape the cask and there is an exchange between the contents of the barrel and the environment through the barrel wall.

What enters or leaves is determined by temperature fluctuations and humidity, as well as the peculiarities of the warehouses and their location.

Matthew Crow, research partnerships manager with Diageo's Global Technical team, said: "Scotch is matured for at least three years and often much longer, a process that enriches and refines its flavour.


"However, a barrel's potential for imparting flavour, and how the whisky will mature in that barrel, involves many complex factors.

"The industry, and Diageo in particular, has a long history of research across whisky production, and Heriot-Watt's scientists will help us to take our understanding to a new scientific level."

Professor Martin McCoustra is an expert on the interaction of chemical substances with complex surfaces and will be coordinating the cross-disciplinary team from Heriot-Watt University. He said: "We'll start with simple optical and ultraviolet imaging of freshly prepared barrels and then use infrared and optical spectroscopies to give us their chemical fingerprints.

"We'll also use nuclear magnetic resonance, which is the laboratory equivalent of the MRI scan you would get in hospital, and mass spectrometry to trace how the chemical fingerprint of the spirit changes.

"These chemical fingerprints will include information on the natural compounds that contribute to the flavour of the maturing spirit. The skills of the coopers, distillers and blenders will give us a sensory evaluation of the barrel and evolving spirit.

"All this data will be used to train a machine learning system that will predict what the flavour quality of the whisky could be. This could significantly enhance whisky production, giving better data upon which to base fundamental decisions, such as how long a whisky should stay in a barrel."
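
Neither Heriot-Watt nor Diageo has published its modelling approach, but a minimal sketch of the kind of supervised learning described above might look like the following. The data shapes, the synthetic flavour scores and the choice of a random-forest regressor are illustrative assumptions, not project details.

```python
# Hypothetical sketch: predict a sensory flavour score from a cask's
# spectroscopic "fingerprint". All data here is synthetic; the real
# project's features, targets and model choice are not public.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# 200 casks, each with a 500-point spectral fingerprint (e.g. infrared
# absorbance values) and a tasting-panel flavour score on a 0-10 scale.
X = rng.random((200, 500))
y = 10 * X[:, :25].mean(axis=1) + rng.normal(0, 0.3, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Held-out error gives a rough sense of how predictable maturation
# outcomes are from the fingerprint alone.
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```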

Originally posted here:

Scientists bid to unlock the darkest secret of whisky's unique taste - Yahoo News UK

Ottawa to host world-leading event on statistics and data science – Canada NewsWire

OTTAWA, ON, March 20, 2023 /CNW/ - Statistics Canada is pleased to announce that the 64th World Statistics Congress (WSC) will be held in Ottawa, Canada, from July 16 to 20, 2023, at the Shaw Centre.

The WSC is the leading international statistics and data science event, held every two years by the International Statistical Institute (ISI) since 1887. This marks the second time Canada has welcomed the WSC, having first hosted in 1963.

Statistics Canada is proud to support and participate in this year's event in the National Capital Region, which presents an opportunity to discuss the concrete statistical and data issues of our time, network and collaborate with experts, showcase data and statistical practices in Canada, and learn from practices and insights from other countries.

The four-day congress will provide a scientific program featuring hundreds of experts, including distinguished international statisticians, data scientists, industry leaders and renowned speakers from over 120 countries.

"The WSC is an incredible opportunity to discuss statistics that are crucial to decision making, share insights and learn from many other countries," says Anil Arora, Chief Statistician of Canada. "Statisticians have never been more relevant to helping solve global challenges than they are today."

"The congress encourages collaboration, growth, discovery and advancement in the field of data science," says ISI President Stephen Penneck. "I am excited to have the 64th World Statistics Congress visit Canada and look forward to the impact it will have on the industry. We are delighted to announce that one of the most influential statisticians, the former Director of the United States Census Bureau, Professor Robert M. Groves, will be joining as a keynote speaker."

Statistics Canada looks forward to welcoming experts in this field from around the world and taking part in presentations, panel discussions and more.

Associated links:

Registration information – World Statistics Congress 2023 website

SOURCE Statistics Canada

For further information: Media Relations, Statistics Canada, [email protected]

Read the original here:

Ottawa to host world-leading event on statistics and data science - Canada NewsWire

How AI is shaping the future of higher ed (opinion) – Inside Higher Ed

Artificial intelligence is emerging as one of the most powerful agents of change in higher education, presenting the sector with unprecedented academic, ethical and legal challenges. Through its algorithmic ability to adapt, self-correct and learn, AI is pushing the boundaries of human intelligence, making the future of higher education inextricably intertwined with AI.

To disentangle the intertwined relationship between AI and higher education, I will briefly discuss the opportunities and the challenges of AI, review some of the emerging applications of AI in higher education, and offer some recommendations for the way forward.

As an umbrella term that includes machine learning, deep learning and natural language processing, AI relies on extensive computing power and massive amounts of data processed by algorithms. As it continues to seep into the fabric of our society, AI is being used to solve problems in cybersecurity, health care, agriculture, climate change, manufacturing, banking and fraud detection, among other areas. By using computing power to process and to learn from data, AI is augmenting our ability to automate routine tasks; to streamline processes; to recognize and classify images, patterns and speech; to make predictions; and even to make decisions.


Not surprisingly, some AI enthusiasts are predicting the emergence of a super AI intelligence capable of rivaling, if not surpassing, human intelligence. As AI capabilities grow, debates about the moral and ethical implications of its use are likely to accelerate. For example, AI-powered facial and emotion recognition technology often exhibits bias against nonwhite people and women and can be weaponized for mass surveillance. Currently, however, as part of an emerging trend of responsible AI practices, researchers are attempting to address issues of bias and discrimination by improving both the algorithmic transparency and the data that feed AI models.

AI is quietly disrupting higher education's administrative, teaching, learning and research activities. Here are a few examples:

Together, these transformative processes have the potential to redefine and reduce the number of positions in areas such as admissions, administrative support, instructional design, teaching and information technology support. With the continued advancement of AI-generated multimodal content, as demonstrated by the recent release of GPT-4, the efficiency and productivity of these areas will continue to improve.

Although it's still prone to inaccuracies and sometimes fabrications, ChatGPT, developed by the OpenAI research lab, is now being used to write articles, novels, poems, stories, dialogues, reviews and news reports; to develop web applications; to write programming code; and even to author an academic paper about itself. Educators use ChatGPT to draft course syllabi, lecture content, assignments and grading rubrics. With this ability to produce academic writing, many fear that ChatGPT is poised to disrupt academic scholarship, even though AI is still far from replicating the complex metacognitive activities involved in the scholarly writing process.

Paving the road ahead for the future of AI in higher education requires a strategic and holistic approach that integrates education, planning and research.

First, institutions need to engage in campuswide discussions about the impact of AI on administrative, teaching and research practices. It is crucial to develop administrators' and faculty members' understanding of the promises and limitations of AI. These discussions must be transparent and address issues like data collection and ownership, intellectual property, data storage, security, and the rights and privacy of various stakeholders. Additionally, institutions must create a framework for the ethical governance of AI, drawing on models such as the Rome Call for AI Ethics and the Data Ethics Decision Aid. These frameworks will help institutions use AI to advance teaching, learning and research, while preventing the misuse and unintended consequences of AI.

Furthermore, higher education institutions must carefully assess how AI will affect the labor market in the future. This analysis should lead to a rethinking of educational pathways to prepare students for a hybrid labor market in which AI will play a significant role.

Second, stakeholders ought to explore an AI-across-the-curriculum approach by engaging departments and their faculty development centers to identify ways to integrate current AI applications and competencies into the curriculum. AI competencies should be transdisciplinary, reflecting the various areas of expertise involved in AI development: mathematics, machine learning, deep learning, programming, data science, writing, ethics, business management, etc. This transdisciplinary approach would allow universities to lay the groundwork for a holistic and integrated approach to AI education, while fostering collaboration and partnership between faculty from different disciplines.

Third, universities should consider establishing an interdisciplinary longitudinal research agenda to examine the social, ethical and pedagogical challenges associated with AI, making sure to include experts in the humanities and social sciences in these discussions. It is imperative that universities take the lead in identifying and understanding the complexities and challenges that AI will bring to the academic landscape. Moreover, universities should collaborate with industry and the public sector to create integrated, transparent and impartial AI programs while equipping students with lifelong learning skills to make our soon-to-be AI-driven society both better and more just.

See the article here:

How AI is shaping the future of higher ed (opinion) - Inside Higher Ed

How AI and Machine Learning Are Impacting the Litigation Landscape – Cornerstone Research

Mike DeCesaris and Sachin Sancheti detail how expert witnesses are incorporating artificial intelligence and machine learning into their testimony in a variety of civil cases.

Artificial intelligence has long been present in our everyday activities, from a simple Google search to keeping your car centered in its lane on the highway. The public unveiling of ChatGPT in late 2022, however, brought the power of AI closer to home, making it accessible to anyone with a web browser. And in the legal industry, we are seeing the use of AI and machine learning ramp up in litigation, especially when it comes to expert witness preparation and testimony.

The support of expert witnesses has always required leading-edge analytical tools and data science techniques, and AI and machine learning are increasingly important tools in experts' arsenals. The concept of technology being able to think and make decisions, accomplishing tasks more quickly and with better results than humans, conjures thoughts of a Jetsons-like world run by robots. But where the old Jetsons cartoons of the 1960s, with flying cars as the de facto mode of transport and robot attendants addressing every need, remained fiction, the futuristic ideas around the impact of AI are not far off from a rapidly approaching reality. In fact, as older, rules-based AI has evolved into machine learning (ML), where computers are programmed to accurately predict outcomes by learning from patterns found in massive data sets, the legal industry has found that AI can do far more than many imagined.

In the world of litigation, the power of AI and ML has been understood for years by law firms and economic and financial consulting firms. AI is ideally suited to support, qualify, and substantiate expert work in litigation matters, improving the efficiency and quality of data presented in testimony that formerly relied on heavily manual processes. Moreover, over the last several years, AI and ML have been used directly in expert testimony by both plaintiff- and defense-side experts.

Somewhat ironically, humans are at least partially responsible for driving the increased use of AI and ML in expert work as we produce ever-growing volumes of user-generated content. Consumer reviews and social media posts, for example, are becoming increasingly relevant in regulatory and litigation matters, including consumer fraud and product liability cases. The volume of this content can be overwhelming, so one familiar approach involves leveraging keywords to identify a more manageable subset of data for review. This is limiting, however, as it often produces results that are irrelevant to the case while omitting relevant results containing novel language. By contrast, ML-based approaches can consider the entire text, using context and syntax to identify the linguistic elements that most accurately indicate relevance.
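
As a rough illustration of that contrast, the sketch below trains a tiny relevance classifier on invented review text; TF-IDF features with logistic regression stand in for whatever proprietary models an expert might actually use.

```python
# Hypothetical sketch: ML relevance screening instead of keyword search.
# Texts and labels are invented; a real matter would use thousands of
# labelled documents and a stronger model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "stopped holding a charge and overheated on my desk",  # relevant
    "battery died after two days, total junk",             # relevant
    "the box arrived with a cute sticker",                 # irrelevant
    "great price compared to the other store",             # irrelevant
]
labels = [1, 1, 0, 0]  # 1 = relevant to the alleged defect

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Score a new review that uses its own phrasing rather than exact keywords.
print(clf.predict_proba(["it will not hold a charge anymore"])[0, 1])
```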

To see this approach in action, consider litigation involving alleged marketing misrepresentations or defamatory statements, which requires an examination of the at-issue content. The most robust analyses are systematic and objective, making them ideal for outsourcing to the noncontroversial training data and impartial models that are hallmarks of state-of-the-art AI and ML approaches.

AI and ML have also proven to be valuable tools for experts across a broad spectrum of consumer fraud and product liability matters. While some use cases may be obvious, experts have the creativity to adapt these techniques to others. These novel uses include the following, several of which are illustrated below with brief, hypothetical sketches:

Domain-specific sentiment analysis: Publicly available sentiment models perform well on many problems but often fail on tasks that feature domain-specific linguistic structures. Such failure might arise when tasked with measuring the sentiment surrounding an entity in an industry whose discussion features novel or counterintuitive language. Consider a defamation suit filed by a fitness influencer. Terms like "confusion", "resistance" and "to failure" generally have negative connotations, but in the fitness space they are often used to describe a successful workout. Likewise, slang terms like "guns" and "shredded" mean something entirely different in the fitness context than in conventional use. In these cases, a general-purpose sentiment model may mischaracterize or overlook such language, while training a domain-specific sentiment model will provide a more accurate assessment of the sentiment contained in allegedly defamatory statements. This training process could involve gathering hundreds of thousands of user-generated reviews for industry products, and then directing a context-aware language model to predict the review score from the text. This custom model will quantify the polarity of the discussion surrounding the influencer, which can then be tracked through time and around certain critical events.
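
A miniature, hypothetical version of that training setup might look like the sketch below; a TF-IDF linear model stands in for the context-aware language model the authors describe, and all reviews and scores are invented.

```python
# Hypothetical sketch: learn domain-specific sentiment by predicting
# review scores from review text, so fitness jargon acquires the
# polarity it carries in this domain. All data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

reviews = [
    "trained to failure every set, arms shredded, best session ever",
    "great pump, my guns were burning the whole workout",
    "program was boring and I saw no progress at all",
    "waste of money, the coaching was useless",
]
scores = [5, 5, 1, 1]  # star ratings attached to each review

model = make_pipeline(TfidfVectorizer(), Ridge())
model.fit(reviews, scores)

# "to failure" overlaps the positive training reviews, so the domain
# model should rate this new review positively, where a general-purpose
# lexicon would read "failure" as negative.
print(model.predict(["went to failure on every set, loved it"]))
```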

Assessing marketing influence on social media: To assess allegations that a company steered an online discussion through social media marketing, AI and ML can compare the company's posts to those generated by unaffiliated users (earned media). This can be done using language models and text similarity metrics that quantitatively and objectively assess whether earned media immediately following the company's posts were more like the company's posts than either earned media preceding the posts or selected at random.
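
A minimal sketch of that before/after comparison, assuming TF-IDF cosine similarity as the similarity metric (the article does not name one) and using invented posts:

```python
# Hypothetical sketch: is earned media posted *after* a company post
# textually closer to the company's language than earlier earned media?
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

company_post = ["our new cold brew is smooth, bold and low in acid"]
earned_before = ["tried a cold brew today, it was fine I guess"]
earned_after = ["this cold brew is so smooth and bold, barely any acid"]

vec = TfidfVectorizer().fit(company_post + earned_before + earned_after)
company_vec = vec.transform(company_post)

print("before:", cosine_similarity(company_vec, vec.transform(earned_before))[0, 0])
print("after: ", cosine_similarity(company_vec, vec.transform(earned_after))[0, 0])
```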

Image object detection: To assess the incidence of client logos and products appearing across images posted to social media, a custom object detection model can be trained and applied to a random sample of millions of social media images.
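
As a rough sketch of the inference step, the snippet below runs a pretrained torchvision detector over a dummy image; an actual engagement would fine-tune a custom model on labelled images of the client's logos and products, which is beyond a short example.

```python
# Hypothetical sketch: an off-the-shelf object detector stands in for a
# custom logo/product model. The input is a random tensor standing in
# for one sampled social media photo.
import torch
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_Weights,
    fasterrcnn_resnet50_fpn,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

image = torch.rand(3, 480, 640)  # stand-in for a social media image
with torch.no_grad():
    detections = model([image])[0]

# Each detection carries a bounding box, a class label and a confidence
# score; a custom model would swap COCO classes for client logos/products.
for label, score in zip(detections["labels"], detections["scores"]):
    if score > 0.5:
        print(weights.meta["categories"][label.item()], float(score))
```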

Public press topic modeling: To quantify the extent and timing of the public awareness of a marketing claim at issue, AI and ML can be applied to articles published in media outlets. This approach helps isolate the at-issue topic from other closely related but distinct topics. Such distinctions can then facilitate an analysis that is more narrowly focused on the claim at hand.
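
A small, hypothetical sketch of that isolation step, using latent Dirichlet allocation on invented headlines (the article does not specify a topic-modeling method):

```python
# Hypothetical sketch: separate the at-issue topic (a product defect)
# from closely related press coverage (the company's earnings).
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

articles = [
    "company recalls blender over blade defect injuries",
    "blender blade defect prompts lawsuits and wider recall",
    "quarterly earnings beat estimates on strong kitchen sales",
    "kitchen appliance sales drive record quarterly earnings",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(articles)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}:", top_terms)
```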

Multimedia characterization: Where there are allegations of product misrepresentation or improper marketing, AI and ML can characterize the nature of a company's social media presence. A model trained on text and image content from unaffiliated but topically relevant brands can learn to distinguish content along the lines of broad brand identities (e.g., healthy vs. unhealthy, eco-friendly vs. climate-damaging). Applying such a model to at-issue social media content can quantify whether it conveys each of these brand features.

The nature of allegedly defamatory statements: Even in the presence of clearly negative statements, defamation is notoriously difficult to prove. Defendants may claim that statements were expressed not as fact but as opinion, possibility, entertainment or satire. By leveraging datasets and models that identify the degree of certainty present in natural language examples, experts can objectively measure the degree to which reasonable consumers may interpret the information as fact.

Product liability: One growing area of research concerns the quantification and isolation of specific entities referenced in a broader text. Product liability cases, for instance, may examine user-generated product reviews to identify the importance and sentiment surrounding at-issue product features. Rather than assess the review as a whole, aspect-based sentiment analysis focuses on at-issue features only, allowing for the extraction of strong indicators from nuanced or mixed reviews.
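
A toy sketch of the aspect-focused idea, scoring only the sentences that mention the at-issue feature; VADER is used here purely as a convenient stand-in for a purpose-built aspect-based model.

```python
# Hypothetical sketch: aspect-level sentiment for an at-issue feature
# ("battery") inside a mixed review, rather than whole-review sentiment.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

review = ("Love the screen and the camera is fantastic. "
          "The battery, though, dies within two hours and runs hot.")

sentences = [s.strip() for s in review.split(".") if s.strip()]
for sentence in sentences:
    if "battery" in sentence.lower():
        # The battery sentence scores on its own, even though the
        # review as a whole reads mostly positive.
        print(sentence, "->", sia.polarity_scores(sentence)["compound"])
```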

Class certification: A successful class certification challenge will demonstrate that the circumstances of putative class members were sufficiently varied to require individual treatment. Any of the methods discussed above can be taken together to quantify the heterogeneity of the at-issue materials. For example, a case concerning marketing misrepresentations may train a classifier to distinguish at-issue marketing content from content not at issue, model the topics targeted throughout multiple distinct marketing campaigns, and summarize images to demonstrate differing appeal to different consumers.

For centuries, the ability of humans to mold available resources to serve their needs has separated them from less-evolved species. We see it in all walks of life, and the above examples demonstrate it in our small corner of the world. And we will continue to see it as the availability of voluminous social media and other user-generated data continues to expand and become more complex. In its simplest terms, AI and ML are critical in helping us efficiently search through the haystack to find the needle. Those who try to find the needle by hand will inevitably be left behind.

This article was originally published by Law.com in March 2023.

The views expressed herein do not necessarily represent the views of Cornerstone Research.

Read more here:

How AI and Machine Learning Are Impacting the Litigation Landscape - Cornerstone Research

KX and Microsoft Expand Partnership, Offer a Data Timehouse – Database Trends and Applications

KX is expanding its partnership with Microsoft, now offering kdb Insights Enterprise on Microsoft Azure.

According to KX, this represents an industry-first Data Timehouse, a new class of data and AI management platform designed for temporal data generated by digital transformation.

It provides data scientists and application developers with precision access to temporal data on real-time and massive historical datasets with the Azure native tools they use today.

"In partnership with KX, we're excited to launch one of the industry's first data timehouse on the Azure platform. While in preview, we have already seen impressive results for customers in capital markets, healthcare, manufacturing, and energy. We look forward to working with KX to help businesses achieve transformative growth with kdb Insights Enterprise on Azure, said Corey Sanders, corporate vice president of Microsoft Cloud for Industry.

Engineered by KX, kdb Insights Enterprise on Microsoft Azure enables businesses to transform into real-time intelligent enterprises, with preview customers reporting up to 100x the performance at one tenth of the cost of competing solutions, according to the vendors.

"The launch of kdb Insights Enterprise on Microsoft Azure as a first party service is a watershed moment for KX, said Ashok Reddy, KX CEO. Representing the industry's first Data Timehouse, it enriches data warehouse and lakehouse technologies and gives customers access to the power and performance of kdb with all the benefits of the Azure platform. It enables companies in all sectors to accelerate their AI and ML analytics workloads, putting data driven decision science at the very heart of their business for enhanced operational and commercial outcomes. We continue to see enormous opportunity for our strategic partnership with Microsoft, helping to underpin our growth expectations."

For more information about this news, visit http://www.kx.com.

Link:

KX and Microsoft Expand Partnership, Offer a Data Timehouse - Database Trends and Applications

University of Leeds announces MSc in Data Science (Statistics) on … – FE News

Online learning platform Coursera is proud to announce a new Master of Science (MSc) in Data Science (Statistics) from the University of Leeds, one of the largest higher education institutions in the UK and recognised among the leading 100 universities worldwide.

The degree program, developed in collaboration with the School of Mathematics and the Leeds Institute for Data Analytics (LIDA), will equip learners with in-demand data skills and prepare them for careers in healthcare, environmental science, and other data-driven STEM fields.

Data science-related professions are already in high demand and likely to increase further over the next few years. In fact, it's estimated that the global data analytics industry will grow more than 500% by 2030. Those with technical skills and qualifications, such as the MSc in Data Science from the University of Leeds, will be well positioned for the future labour market.

The MSc offers both foundational data science courses and specialized statistics courses. Students will develop a range of skills, including proficiency in key programming languages, the ability to analyse large datasets, and an understanding of data handling and governance. The course includes practical projects, allowing students to demonstrate their skills to potential employers and prepare for senior roles in business, government, and nonprofit sectors.

Prof. Jeff Grabill, Deputy Vice-Chancellor: Student Education, said:

"This is a fantastic example of our investment in digital technologies and commitment to grow our fully online education portfolio to benefit students around the world. We have an ambitious strategy to transform learning for students, and by offering fully online courses like this MSc in Data Science, we can give people who would never be able to step foot on campus the opportunity to experience our high-level teaching and research. Our aim is to remove boundaries to learning, and digital education allows us to engage with the world in a different way and adds to the outstanding learning opportunities already available at the University of Leeds."

Marni Baker Stein, Chief Content Officer at Coursera, said:

"This degree program offers a great opportunity for learners to acquire the highly sought-after skills that are indispensable in today's data-centric workforce. By bringing this degree online we have opened it up to all our learners worldwide and enabled more people to gain a world-class education. This program offers a transformative educational experience that is not only accessible but also adaptable to the evolving demands of the industry."

The online MSc allows students to participate remotely in an interactive and engaging learning experience. Students can connect with faculty and peers through live and asynchronous online content, such as discussions, activities, and tutoring. The program also allows students to try open courses related to the degree before applying, with any progress made counting toward the full degree.

The University of Leeds is recognised as one of the world's top 100 universities, and this partnership with Coursera will allow learners from around the world to access its world-class expertise and gain valuable in-demand skills that better prepare them for STEM careers.

More here:

University of Leeds announces MSc in Data Science (Statistics) on ... - FE News

The cloud backlash has begun: Why big data is pulling compute … – TechCrunch

Thomas Robinson, Contributor

The great cloud migration has revolutionized IT, but after a decade of cloud transformations, the most sophisticated enterprises are now taking the next generational leap: developing true hybrid strategies to support increasingly business-critical data science initiatives and repatriating workloads from the cloud back to on-premises systems. Enterprises that haven't begun this process are already behind.

Ten years ago, the cloud was mostly used by small startups that didn't have the resources to build and operate a physical infrastructure, and by businesses that wanted to move their collaboration services to a managed infrastructure. Public cloud services (and cheap capital in a low interest-rate economy) meant such customers could serve a growing number of users relatively inexpensively. This environment enabled cloud-native startups such as Uber and Airbnb to scale and thrive.

Over the next decade, companies flocked en masse to the cloud because it lowered costs and expedited innovation. This was truly a paradigm shift and company after company announced cloud-first strategies and moved infrastructures wholesale to cloud service providers.

However, cloud-first strategies may be hitting the limits of their efficacy, and in many cases, ROIs are diminishing, triggering a major cloud backlash. Ubiquitous cloud adoption has given rise to new challenges, namely out-of-control costs, deepening complexity, and restrictive vendor lock-in. We call this cloud sprawl.

The sheer quantity of workloads in the cloud is causing cloud expenses to skyrocket. Enterprises are now running core compute workloads and massive storage volumes in the cloud, not to mention ML, AI, and deep learning programs that require dozens or even hundreds of GPUs and terabytes or even petabytes of data.

The costs keep climbing with no end in sight. In fact, some companies are now spending up to twice as much on cloud services as they were before they migrated their workloads from on-prem systems. Nvidia estimates that moving large, specialized AI and ML workloads back on premises can yield a 30% savings.

See the original post here:

The cloud backlash has begun: Why big data is pulling compute ... - TechCrunch

Analytics and Data Science News for the Week of March 17 … – Solutions Review

The editors at Solutions Review have curated this list of the most noteworthy analytics and data science news items for the week of March 17, 2023.

Keeping tabs on all the most relevant analytics and data science news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last week, in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy analytics and data science news items.


We're pleased to provide an update to the following announcement: Analysis Services server properties in Power BI Premium are now in public preview. At this time, we've transitioned this capability to general availability with full support in Power BI Premium and Power BI Embedded. As mentioned in the public preview announcement, the Analysis Services (AS) server properties give administrators granular control to optimize and alter query behavior in their workspaces.

Read on for more.


CelerData, a unified analytics platform for the modern, real-time enterprise, today announced the latest version of its enterprise analytics platform, CelerData Version 3. CelerData is built on top of the open source project StarRocks, the fastest MPP SQL database recently donated to the Linux Foundation.

Read on for more.


DataRobot today announced the release of DataRobot AI Platform 9.0, along with deeper partner integrations, AI Accelerators, and redesigned service offerings, all centered on helping organizations derive measurable value from their AI investments.

Read on for more.


KNIME announced that it has achieved Premier tier partner status from Snowflake, the Data Cloud company. As a Premier partner, KNIME can help accelerate the digital transformation of its joint customers, who can fully leverage the Snowflake Data Cloud.

Read on for more.

In recent years, natural language processing (NLP) has made remarkable progress, and with the development of large language models such as ChatGPT, the application possibilities seem endless. One such possibility is PlotGPT, a tool that can turn any text prompt..

Read on for more.


Tableau has launched new features that help customers make data-driven decisions more quickly and efficiently. The new release will further enable companies to put their data at the center of every business interaction, use natural language to add business context..

Read on for more.


Tecton, the leading machine learning (ML) feature platform company, today announced version 0.6 of its flagship feature platform. The release introduces new capabilities that accelerate the process of building production-ready features and expands support for streaming data.

Read on for more.


TIBCO, a global leader in data connectivity, management, and analytics, empowers the world's most innovative enterprises to solve their complex data challenges and make mission-critical business decisions with greater confidence and impact. TIBCO announced a series of enhancements to its analytics suite, delivering immersive, smart, and real-time analytics that empower..

Read on for more.


TigerGraph has released a new benchmark report for its graph analytics software. The company used the Linked Data Benchmark Council (LDBC) Social Network Benchmark (SNB) Scale Factor 30k dataset. The results show that TigerGraph can run deep-link OLAP style..

Read on for more.


With the next Solutions Spotlight event, the team at Solutions Review has partnered with leading data science and analytics automation vendor Alteryx to provide viewers with a unique webinar called "Unlock Cloud Use Cases with the Alteryx Analytics Cloud Platform + AWS."

Read on for more.

This summit is produced by the world's largest and most influential think tank and research firm in information technology (IT), namely Gartner, Inc. Because of its size and long history, Gartner has the unique resources required to develop thought leadership and practical advice for every discipline in IT.

Read on for more.

For consideration in future analytics and data science news roundups, send your announcements to the editor: tking@solutionsreview.com.

Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.

Continued here:

Analytics and Data Science News for the Week of March 17 ... - Solutions Review

LOCUS ROBOTICS INTRODUCES LocusONE, THE WAREHOUSE ORCHESTRATION PLATFORM POWERING CENTRALLY MANAGED MULTI-BOT AMR AUTOMATION – Yahoo Finance

Data science-driven platform seamlessly orchestrates large fleets of multiple robot form factors in very large warehouses to deliver predictable, efficient, and scalable productivity results.

WILMINGTON, Mass., March 20, 2023 /PRNewswire/ -- Locus Robotics, the leader in autonomous mobile robots (AMRs) for fulfillment warehouses, announces LocusONE, the industry's first data science-driven warehouse automation platform to enable seamless operation and management of large quantities of multiple AMR form factors as a single, coordinated fleet in all sizes of warehouses. LocusONE uses proprietary data science to support the full breadth of material movement needs found in today's fulfillment and distribution warehouses.


"LocusONE makes it easy to deploy and manage large numbers of AMRs and multiple form factors within very large warehouses to work together as a single, orchestrated fleet," said Rick Faulk, CEO of Locus Robotics. "Based on Locus's proprietary data science engine, LocusONE enables operators to gain the flexibility and critical business intelligence needed to efficiently handle material payloads from 3 ounces to 3,000 pounds."

The LocusONE platform supports a thousand or more robots, operating in sites as large as one million square feet or more, executing multiple use cases simultaneously in a single, intelligent, and orchestrated solution. With LocusONE, LocusBots engage in a diverse array of tasks including each picking and putaway, case picking and putaway, replenishment, pallet building, routine routes, point-to-point transport, counting, and more within a single warehouse. The data science behind the scenes optimizes the mix of tasks throughout the day to achieve optimal warehouse productivity.

LocusONE integrates with any WMS to provide flexible and dynamic fleet management and enables easy deployment of a mix of Locus Origin, Vector, and Max AMRs tailored to meet each warehouse's specific needs. Regular enhancements are released as over-the-air updates to Locus's installed base of more than 250 customer sites.


"Locus lets DHL Supply Chain deploy the right bot for our customers' varied needs, all centrally coordinated and managed on one platform. With Locus's powerful data science strategy, delivered through the LocusView dashboards, LocusONE plays an important role in providing the key business intelligence insight we need to optimize operational efficiency and improve service quality for our customers," said Sally Miller, CIO North America & DSC Digital Transformation Officer of DHL Supply Chain North America. "Partnering with Locus has helped us deliver on our commitment to continuous innovation and digital transformation across the entire supply chain".

Locus's powerful data science foundation makes it possible to smoothly navigate hundreds or even a thousand or more bots in a single footprint as well as deliver predictive and actionable management guidance in real time to dramatically improve throughput across a multitude of use cases. LocusONE includes Locus's award-winning LocusView package, which delivers data-driven, actionable insights across more than two dozen insightful reports and real-time dashboards, including labor guidance, predictive insights for work completion, operational comparisons against targets or time periods, order pool tracking and guidance, mission analysis and optimization, key performance visualization, and more.

LocusONE further extends Locus's position as the industry's AMR leader for automation and digitalization of warehouses, distribution, and fulfillment centers to efficiently meet increasing order volumes, labor shortages, and rising consumer expectations.

"As warehouses become increasingly complex, the ability to rely on a dependable data analytics strategy is essential to their smooth operation. Having the ability to deploy a flexible, seamless and proven AMR management platform to meet a broad range of use cases has become a must-have in the warehouse fulfillment industry," said Ash Sharma, Senior Research Director at Interact Analysis. "This innovative platform demonstrates how critical multi-form factor interoperability is in today's fulfillment warehouse and reflects Locus Robotics' commitment to innovation, ease of use, and broad industry vision over the past few years."

"LocusONE's ability to integrate rapidly and efficiently with other automation technology such as sortation or packaging systems ensures that a nimble, scalable robotics solution can be easily deployed into both brownfield and greenfield environments," noted Faulk.

"The LocusONE platform enables Kenco to create a seamless fulfillment experience that delivers enhanced productivity, while improving employee morale and visibility. We rely on the LocusView dashboards to monitor progress and inform our labor management throughout the day," said Kristi Montgomery, Vice President, Innovation, Research & Development Kenco Logistics. "LocusONE's real-time insights allow us to delight our customers, and that's what matters most to Kenco."

LocusONE is available through the company's all-inclusive, Robots-as-a-Service (RaaS) business model. Locus Origin, Vector, and Max can be easily added to existing and new workflows, enabling operations to dynamically scale and adapt to changing market demands. In a study done by Peerless Research Group, nearly half of respondents said they would prefer to buy their robotics solution as an entire integrated system that includes hardware, software, support, and maintenance.

Locus's Recycle, Refurbish, and Repurpose initiative is the first in the AMR industry to actively drive sustainability across all aspects of the business from manufacturing and deployment to support and maintenance at sites around the world. Locus is continually working to identify and implement best-practice strategies and tactics designed to reduce overall waste across all areas of the organization.

Locus will be showcasing LocusONE at ProMat, Booth #S2303, the material handling industry's premier event, running March 20-23 in Chicago, IL, where the Locus theme will be "Vision, Intelligence, Results," emphasizing Locus's commitment to delivering actionable intelligence in an innovative solution to drive results for customers. Locus will also host guests at the Locus Theater, featuring a lineup of informative industry speakers and presentations, discussions with Locus's customers and partners, as well as showcasing live picking and putaway demos.

In addition, unwind with Locus for The Happiest Hour, every day from 4:00 pm to 5:00 pm, and be sure to join Locus's ProMat 2023 scavenger hunt, "Passport to Bots." Learn how LocusBots work with other industry-leading warehouse solutions and be entered to win one of five prizes, including an iPad and Beats wireless headphones.

For more information and details, visithttps://locusrobotics.com/about-us/events/locus-promat-2023/

About Locus Robotics

Locus Robotics is the world leader in revolutionary, enterprise-level warehouse automation solutions, incorporating powerful and intelligent autonomous mobile robots (AMRs) that operate collaboratively with human workers in more than two dozen languages to dramatically improve product movement and productivity 2-3x. Named to the Inc. 500 two years in a row, and winning more than 17 industry and technology awards, the Locus solution dramatically increases order fulfillment productivity, lowers operational costs, and improves workplace quality, safety, and ergonomics for workers.

With 100+ of the world's top brands and deployments at 250+ sites around the world, Locus Robotics enables retailers, 3PLs and specialty warehouses to efficiently meet and exceed the increasingly complex and demanding requirements of today's fulfillment environments. For more information, visit http://www.locusrobotics.com.


View original content to download multimedia: https://www.prnewswire.com/news-releases/locus-robotics-introduces-locusone-the-warehouse-orchestration-platform-powering-centrally-managed-multi-bot-amr-automation-301776014.html

SOURCE Locus Robotics

Follow this link:

LOCUS ROBOTICS INTRODUCES LocusONE, THE WAREHOUSE ORCHESTRATION PLATFORM POWERING CENTRALLY MANAGED MULTI-BOT AMR AUTOMATION - Yahoo Finance

David Hunter announced as new director of the Penn State AI Hub – Pennsylvania State University

UNIVERSITY PARK, Pa. – The Office of the Senior Vice President for Research has announced that David Hunter has been appointed director of the cross-functional Penn State AI Hub. Hosted by the Institute for Computational and Data Sciences (ICDS), the hub is designed with the vision of creating a collaborative and creative ecosystem across our university enterprise, to embrace and elevate Penn State's AI research.

"We are thrilled that David is transitioning into the role as leader for this key initiative within our research enterprise," said Lora Weiss, senior vice president for research. "David's diversified experience brings the breadth and depth that will drive the vision of our AI Hub."

In his role as director, Hunter will foster an environment that networks AI expertise across the University, with the goal of driving research around the computational and data sciences. The AI Hub is designed to bring together researchers and facilitate interdisciplinary collaborations, deepening connections among research thrusts. By broadening awareness of AI research at Penn State, the initiative seeks to increase the competitiveness of Penn State in AI research, whether using machine learning, deep learning or other AI-related tools or instruments. The hub will also identify areas of future AI research and strengthen grant proposals for major external opportunities.

"The AI Hub provides a nexus for all Penn State scholars incorporating AI in some aspect of their research," said Jenni Evans, director of ICDS. "As director for the AI Hub, Dave will collaborate with University leaders in AI to identify synergies and develop strategies to further empower this research community."

Deb Ehrenthal, Director of the Social Sciences Research Institute, or SSRI, reflected that SSRI's role in co-sponsoring the Hub signifies the broad interdisciplinary nature of the AI Hub's mission.

Hunter is an accomplished leader, serving as head of Penn State's Department of Statistics from 2012 to 2018, and has a reputation in the international statistics community for his many leadership contributions. During his time as a faculty fellow with Penn State's Teaching and Learning with Technology (TLT) program, Hunter worked to mobilize the Penn State data science community "writ large." This community brought together students and faculty across all disciplines. The diversity and inclusiveness of this group has provided a forum for the creation of new, and sometimes unexpected, collaborations in the data sciences.

Hunter has published widely on statistical models for networks and is a co-creator of the "statnet" suite of packages for network analysis in R. He co-coined the term "MM algorithms" (majorization-minimization / minorization-maximization) and has written extensively on this and other "EM-like" (expectation-maximization) algorithms. He has also extended the theory and computational practice of unsupervised clustering using nonparametric finite mixture models.

Hunter earned his doctorate in statistics from the University of Michigan in 1999, following a math degree from Princeton University in 1992, and a prior career teaching high school mathematics. He has been at Penn State since 1999, where he is professor of statistics. He is a fellow of the Institute of Mathematical Statistics, the American Statistical Association and the International Statistical Institute.

Go here to see the original:

David Hunter announced as new director of the Penn State AI Hub - Pennsylvania State University