
Internet of Things Security: Safeguarding Connected Devices and Networks in IoT Era – Data Science Central


The IoT security product industry promises to become a safe road for digital commercialization. IoT security products safeguard networks and interconnected devices, and they cater to various business needs such as data encryption, authentication, and regulatory compliance. Currently, the concerns driving industry growth include data loss prevention (DLP), identity and access management (IAM), and identity governance and administration (IGA).

IoT security products are predicted to become essential for developing smart homes, buildings, and cities. Smart city projects are becoming a cornerstone of public infrastructure development for many economies around the world. These grand projects and policy initiatives are expected to drive market growth during the forecast period.

Another IoT security driver on the horizon is biometric security, owing to its rising use in smart devices.

The growing volume of digital data and the emergence of artificial intelligence (AI) also make it essential to record large amounts of data and analyze it further. IoT security products can play a key role here through data encryption and through physical storage in dedicated data centers, further driving the market.

The global IoT security product market has registered significant growth in recent years alongside various IoT technologies. However, uncertainties arising from obscure legal frameworks around the world and the lack of a clear path of innovation for IoT devices remain barriers to growth.


How to choose the right NLP solution – VentureBeat


For decades, enterprises have jury-rigged software designed for structured data when trying to solve unstructured, text-based data problems. Although these solutions performed poorly, there was nothing else. Recently, though, machine learning (ML) has improved significantly at understanding natural language.

Unsurprisingly, Silicon Valley is in a mad dash to build market-leading offerings for this new opportunity. Khosla Ventures thinks natural language processing (NLP) is the most important technology trend of the next five years. If the 2000s were about becoming a big data-enabled enterprise, and the 2010s were about becoming a data science-enabled enterprise, then the 2020s are about becoming a natural language-enabled enterprise.

To fast-track its transformation to such an enterprise, an organization must establish a viable strategy that aligns with its business objectives and generates business impact. While it may sound like a complex decision that requires an expensive management consulting firm, it's not. It starts with how you answer two questions: First, who employs the data scientists and machine learning engineers (MLEs)? Second, who builds and operates the underlying ML stack that houses the relevant models and tools?

A build-your-own strategy allows companies to construct custom ML models on their data. It also minimizes security risks because companies don't have to share data with external vendors for labeling or processing. If you can pull it off and afford it, building your own leads to substantial competitive advantages because you now have a world-class artificial intelligence (AI) team, amplifying productivity in every aspect of the business.


However, this strategy is by far the most expensive. Building and operating an ML stack is complicated and requires specialized expertise. KPMG estimates that to build mature AI capabilities, a company needs to employ at least 500 to 600 full-time AI employees, including a majority who build and operate the ML stack, and pay them a cumulative $100 million to $120 million per year. On top of that, there is no guarantee of success, since productionizing AI is challenging for even the best teams.

The low-code ML platform and pre-trained models strategy reduces the cost of building mature AI capabilities because the vendor handles the majority of the development and operation of the ML stack. Instead of spending more than $100 million per year, organizations can likely reduce that to $25 million to $50 million annually. This strategy also still allows companies to build custom ML and NLP models.

Like the previous strategy, though, there is no guarantee of success, because it does not eliminate one of the most complex parts of the full AI process: the handoff of models from the AI team to the business team, which must actually implement them in production and derive business value.

An application programming interface (API) strategy minimizes the handoff problem, increasing the probability of success in productionizing AI. ML models can be seamlessly integrated into applications because the vendor abstracts away the complexity of creating and training these models and guides users toward the best way of using them. It also reduces the cost of achieving the benefits of NLP, since the vendor employs the data scientists and MLEs and builds and operates the ML stack.

Models that are accessible via APIs are built on public datasets and must still be trained and tuned to work on domain and company-specific data. However, if the vendor has implemented the tool properly, this work can be done directly by domain experts without technical skills.
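To make the API strategy concrete, here is a minimal sketch of how an application team might call a vendor-hosted NLP model and let domain experts steer it with their own labels. The endpoint URL, token, payload fields, and response shape are hypothetical placeholders rather than any particular vendor's API.

```python
import requests

# Hypothetical values -- substitute the endpoint and token your vendor actually issues.
API_URL = "https://api.example-nlp-vendor.com/v1/classify"
API_TOKEN = "YOUR_API_TOKEN"

def classify(text: str, labels: list[str]) -> dict:
    """Send text to a hosted NLP model and return its label scores as JSON."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"inputs": text, "candidate_labels": labels},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Domain experts choose the labels; no ML engineering is needed on the calling side.
result = classify(
    "The invoice was charged twice and needs to be reconciled.",
    labels=["billing", "shipping", "product defect"],
)
print(result)
```

The point of the sketch is the division of labor described above: the vendor owns the model and the ML stack, while the enterprise supplies domain data and labels.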

Unfortunately, most vendors have not solved this problem, so there is limited feasibility of retraining their large language models to work on customer data without hiring a full staff of MLEs and data scientists to train and maintain them over time; the model either works or it doesn't.

For most enterprises, the best approach to leveraging NLP and becoming a natural language-enabled enterprise is a strategy that includes APIs, provided that the vendor has made it possible for customers to easily tune and optimize its general-purpose model to work on their data. This would save enterprises tens of millions of dollars every year and accelerate time to value.

To the extent that the use case calls for a model that can't be accessed via API and easily tuned, the next best strategy for most enterprises is the low-code ML platform and pre-trained models strategy. While the build-your-own strategy is the least practical for most enterprises, there are, of course, a few companies for which it is the best path forward.

After all, according to Gartner: "Enterprises sit on unexploited unstructured data, with opportunities to extract differentiating insights. Data and analytics technical professionals must uncover such insights by applying natural language technology solutions: intelligent document processing, conversational AI and insight engines."

Ryan Welsh is the founder and CEO of Kyndi.


OHIO students place first, second and third in international Wikipedia editing competition – Ohio University

"The Wikipedia project is a way to have students reflect back on everything they've done in their coursework and then go to the deepest levels of Bloom's Taxonomy of Learning to edit the articles," Lonnie Welch, Charles R. Jr. and Marilyn Stuckey Professor in the School of Electrical Engineering and Computer Science, said.

In fall 2021 and spring 2022, computer science students in Data Mining and Data Science worked toward the same goal to complete their course: improving a Wikipedia page with more robust descriptions, key visuals, and reliable sources. This common goal was part of a final project, allowing students the opportunity to revise and edit their page drafts before the final submission.

"A Wikipedia-based writing activity offers a more authentic learning experience than a traditional term paper and provides students with the opportunity to practice disseminating domain-specific knowledge to a broad audience while navigating the complexity and ambiguity of working through a real-life problem," Welch wrote in the project's description.

Hunter Burden, BSCS '21, participated in this project in both the Data Mining and the Data Science course. In the final semester of his undergraduate degree in computer science, Burden worked collaboratively with his teammates to improve an article on radar charts, a type of graph that presents multivariate data, like performance metrics, with at least three variables represented as spokes on a wheel. The culmination of his team's hard work was a first-place win in the International Society for Computational Biology's Wikipedia Competition. The competition supports the ISCB's mission by promoting the improvement of topics relevant to computational biology on Wikipedia, which is widely accessed around the world as a free-to-use educational resource.

"Since Wikipedia is often the first port of call for someone learning about a new topic, ensuring that computational biology is well-represented on Wikipedia helps maximize the visibility and impact of the field on society," Alastair Kilpatrick, competition organizer and bioinformatician at the University of Edinburgh, said.
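For readers unfamiliar with the chart type the winning team documented, the short sketch below draws a radar chart with matplotlib; the metric names and values are invented purely for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical performance metrics for a single entity across five variables.
labels = ["Speed", "Accuracy", "Recall", "Latency", "Coverage"]
values = [4, 3, 5, 2, 4]

# One angle per variable; repeat the first point to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
values += values[:1]
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.set_title("Radar chart of five performance metrics")
plt.show()
```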

While winning the competition was a benefit of this project, perhaps more importantly, the project helped assess learning outcomes, indicating that students not only learned key topics in computer science, but were also able to communicate about those topics in an informative and reliable way.

"Before we won, I felt like we did great work. Dr. Welch asked us honest questions throughout the semester about our process and the contents of the page. When we presented everything in the end, we felt really good about the result," Burden said.

This assessment strategy was not accidental. Through collaborative work with instructional designer Audra Anjum, Welch decided to plan his class around learning outcomes and objectives, and then devised assessment strategies informed by those intended outcomes. He used the example of tying one's shoes to communicate his vision for the course structure.

"If someone is learning how to tie their shoes, you can either ask them a series of questions about tying shoes or you can simply ask them to tie their shoes to show you that they have learned that skill," Welch explained.

By authoring Wikipedia pages, students were demonstrating that they learned valuable topics in computer science by writing about those topics for the layperson, creating visuals, incorporating data and identifying reliable sources that bolster their edits to the page.

"We specifically developed the Wikipedia project to not only create a unique and interesting experience [for] students, but also to demonstrate that they met the learning outcomes," Anjum said.

Additionally, this project allowed students to practice critical written communication, a skill that is often underemphasized in both engineering and computer science coursework.

"There's an art to writing things that are concise. You don't want to overload people with jargon. This is an important skill in computer science when writing comments or updates on your work, so another party knows what's going on and what their task is," Burden said.

"You need to be able to communicate your ideas to get people excited about what you are doing. Much of the written communication we do is to sell things, ideas and ourselves. In computer science, you need to be effective at communicating requirements, designs and test plans," Welch said.

Using their written communication skills, students collaboratively enhanced articles to improve the general landscape of Wikipedia, a resource that is often stigmatized because of the user-editable nature of each page. In this circumstance, however, students demonstrated that thoughtful, informed editing can improve that landscape.

"Wikipedia has a reputation for being unreliable since so many individuals can contribute to the content. Wikipedia is a widely accessible open education resource for students and faculty, however. Experts have a responsibility to their professions to make sure that their fields are represented accurately and broadly on Wikipedia to discourage the spread of misinformation," Anjum said.

Editing a Wikipedia page well can be a significant undertaking, but the final product results in a more informative and accessible resource for people around the globe.

"It's important for university students (and academics in general) to edit Wikipedia pages, as they possess important domain knowledge that can vastly improve Wikipedia's coverage of their subject of interest. They also should have some vested interest in ensuring that Wikipedia articles describing their subject of interest are accurate and up to date," Kilpatrick said.

Explore the winning teams' Wikipedia articles on Radar Charts (first place), Biological Networks (second place) and Cosegregation (third place).

To learn more about how to incorporate Wikipedia editing into curriculum design, read this article by Anjum, Kilpatrick and Welch.


Top 5 stories of the week – VentureBeat


We know you're busy and may not have time to check out VentureBeat every day (though we think you should), so each week we offer this look at the most popular stories of the week.

If you're looking to follow the money, contributor Louis Columbus examines which IT certifications offer the biggest payday for tech pros. Staff writer Sri Krishna reports on Hugging Face's Inference Endpoints and how the AI-as-a-service offering is designed to take on the biggest enterprise workloads. Krishna also covers how Colossal Biosciences, the company that breathed new life into the woolly mammoth, is spinning off a new company to offer its computational life sciences platform, which bridges the gap between data and discovery, to other companies.

Senior editor Sharon Goldman digs into deepfakes and how artificial intelligence (AI) and machine learning (ML) technology are spreading misinformation. Goldman also chats with LinkedIn's VP of engineering and head of data and AI about how she lands top data science talent.

Here's more from our top five tech stories of the week:


Of the many certification types and focuses, security certifications are the most lucrative, with an average annual salary of $153,154. For cloud certifications, the average salary is $150,961. AWS certifications, despite including the top-paying certification mentioned above, are just a tad lower on average overall, with the annual salary sitting around $149,371.

Tech professionals with one or more certifications continue to see strong demand in the job market. For instance, LinkedIn's job listings show a combined 117,000 open cybersecurity jobs today seeking candidates who hold at least one of the four security certifications on the top 15 list. There are also 88,200 open jobs listed on the platform for candidates who hold at least one cloud architect certification.

New York-based Hugging Face, which aims to democratize AI and ML via open source and open science, has launched Inference Endpoints. The AI-as-a-service offering is designed to take on the large workloads of enterprises, including those in regulated industries that are heavy users of transformer models, such as financial services (e.g., air-gapped environments), healthcare (e.g., HIPAA compliance) and consumer tech (e.g., GDPR compliance).

The company claims that Inference Endpoints will enable more than 100,000 Hugging Face Hub users to go from experimentation to production in just a couple of minutes.

Colossal Biosciences, the self-described de-extinction company behind the woolly mammoth and thylacine, just announced that it is spinning off Form Bio as an independent software company offering a computational life sciences platform that bridges the gap between data and discovery.

Driven by deep learning artificial intelligence (AI) algorithms, Form Bio empowers life scientists with a software platform for managing large datasets, executing verified workflows, visualizing results and collaborating with their peers. The platform brings these capabilities together in one cohesive unit with a user experience designed to simplify computational work and bolster life science breakthroughs across companies, labs and universities.

Deepfakes, or high-fidelity, synthetic, fictional depictions of people and events leveraging artificial intelligence (AI) and machine learning (ML), have become a common tool of misinformation over the past five years. But according to Eric Horvitz, Microsoft's chief science officer, new deepfake threats are lurking on the horizon.

A new research paper from Horvitz says that interactive and compositional deepfakes are two growing classes of threats. In a Twitter thread, MosaicML research scientist Davis Blaloch described interactive deepfakes as "the illusion of talking to a real person. Imagine a scammer calling your grandmom, who looks and sounds exactly like you."

Compositional deepfakes, he continued, go further, with a bad actor creating many deepfakes to compile a synthetic history.

In a new interview with VentureBeat, Ya Xu, VP of engineering and head of data and artificial intelligence (AI) at LinkedIn, is more than happy to share her thoughts on everything from her passion for bringing science and engineering together to the top traits she looks for when interviewing data science talent.

Xu said there are three important things that she looks for in candidates. First, is the individual mission-driven and impact-driven? Next, Xu wants to hire people who are, not surprisingly, collaborative. On top of that, she said it is a priority to hire people who are willing to learn, adapt and stay curious.


SF Supervisors Challenge Mayor Breed and the Navy to Bolster Protections at Bayview-Hunters Point – KQED

In June, the San Francisco civil grand jury found the city, the Navy and the regulators overseeing the site had not adequately accounted for how rising groundwater could mix with toxics and expose residents to contamination. With the pace and scale of climate change, Bay Area climate scientists are increasingly worried the worst-case scenarios will become a reality, which could mean inundation of toxic sites from both above and below.

The report seemed to confirm what Bayview-Hunters Point residents have long been saying: that the city is not acting fast enough on the issue. San Francisco's supervisors said they agree with most of the jury's findings and have expressed frustration with the city's lack of action. Walton would like to secure resources for an independent commission and a fast-tracked, third-party study of how groundwater rise could impact the Superfund site and the community. He would also like the city and all federal agencies involved to increase oversight of the cleanup to protect the health of residents.

"Groundwater and sea level rise has not been afforded the level of review and research necessary to protect residents of the shipyard, and understanding the additional science is important to keeping people safe," Walton said during Thursday's hearing.

Tom Paulino, the mayor's liaison to the Board of Supervisors, reiterated the mayor's objections to the report when pressed at Thursday's hearing. Breed has said she mostly disagrees with its findings and argues that the city is working with regulators, the Navy and other experts on a response to the climate threat that is robust and appropriate. A five-year Navy review of the Superfund site beginning in March 2023 could include updated climate science.

Paulino said additional elements of oversight aren't needed and would be duplicative of the existing structures in place. He noted the mayor's team is willing to work collaboratively with the Board on the issue.

The jury recommended the city pay for an independent study, using multiple sea-level-rise scenarios, to determine how groundwater rise could affect toxic contamination in the soil at the Superfund site. Its report also recommended convening a permanent oversight committee to examine and question decisions about the cleanup, and communicate requests from residents and the city to the Navy and regulators.

Sara Miles, a member of the jury, said she's happy the Board of Supervisors is taking the report seriously.

"There's no way to erase or make good all the harm that has been done," she said, noting that local doctors have found contamination in residents' bodies. "We're getting somewhere. President Walton wants to take some responsibility. I think that's good."

Bayview community members have also pressed city leaders to take action. Arieann Harrison, an organizer with the Marie Harrison Community Foundation, said that the board is taking a step in the right direction to protect residents, but that more work is needed. "It's time to take it to big wigs," she said. "We need our Nancy Pelosis to come and speak to the issue too. We need them to stop skipping past our community like we are invisible."

The Hunters Point Biomonitoring Foundation tested the urine of Harrison and other residents in the past three years and found high levels of contaminants such as uranium, although those tests were not independently confirmed by health officials. "If I tested positive for that stuff, I'm pretty sure that a lot of other residents will test positive as well," she said.


Research Associate in Geospatial Data Science for Urban Applications job with NATIONAL UNIVERSITY OF SINGAPORE | 310927 – Times Higher Education

Job Description

Job Requirements

You should demonstrate that you possess:

Covid-19 Message

At NUS, the health and safety of our staff and students are among our utmost priorities, and COVID-19 vaccination supports our commitment to ensure the safety of our community and to make NUS as safe and welcoming as possible. Many of our roles require a significant amount of physical interaction with students, staff and members of the public. Even for job roles that may be performed remotely, there will be instances where on-campus presence is required.

Taking into consideration the health and well-being of our staff and students and to better protect everyone in the campus, applicants are strongly encouraged to have themselves fully COVID-19 vaccinated to secure successful employment with NUS.

More Information

Location: Kent Ridge Campus
Organization: College of Design and Engineering
Department: Architecture
Employee Referral Eligible: No
Job requisition ID: 17426


Top Companies in the Financial Analytics Space – CXOToday.com

Modern organisations are massively transforming their finance function by leveraging data and applying advanced statistical and ML techniques. Hence, finance analytics is slowly becoming an integral part of businesses across industries as enterprises strive to improve decision-making within the finance function of their organisations.

Although the number of vendors offering data analytics services has significantly increased over the last few years to meet demand at the enterprise level, no major consulting firm offers an explicit finance analytics service.

The whole big data and analytics software and services market is fragmented, and we can broadly divide it into the following categories: KPO firms (Genpact, TD Synnex, and Telus International), consulting firms (Deloitte, McKinsey & Company, PwC, EY, and KPMG), software companies (Celonis, UiPath, and Tellius), and boutique firms (Mu Sigma, Fractal, and Tiger Analytics).

Top consultancies that can help you change the fortunes of your organization by providing finance analytics services.

Choosing the right data analytics solutions and services provider is a crucial decision for any organization. Since only a handful of players specialize in the financial analytics domain, the decision becomes all the more difficult. To help you choose the best consultancy, we have put together a list of the top contenders in this space, all of which belong to the boutique firms category:

1) Mu Sigma: Powered by an entirely new art of problem-solving, Mu Sigma can help transform your business by utilizing the potential of financial analytics. The company offers a range of scalable revenue models and financial algorithms to help its clients analyse multiple financial variables in a complete and holistic manner. Further, the company helps clients throughout the cycle of data engineering, data science, and decision science, which makes it a credible name in the financial analytics sector.

2) Lingaro: Lingaro has found a prominent place in Gartner Peer Insights, which speaks volumes about the credibility of its data analytics services. The firm is a preferred innovation partner of Fortune 500 companies and, thanks to its excellence in financial analytics, it continues to remain one of the dominant forces in this category. Along with financial analysis, Lingaro offers breakthrough solutions in the domains of supply chain analytics, consumer analytics, digital marketing, and data strategy consulting, among others.

3) Fractal: Fractal offers AI-driven data analytics solutions to companies across product categories and service domains. Specifically in the financial analytics space, the company helps clients achieve agility and revenue growth by improving forecasting and applying intelligent automation. The company excels in cognitive research and offers credible solutions in the domains of ratio analysis, operating profitability, and inventory turnover.

4) Latentview Analytics: Latentview Analytics can easily be considered a one-stop solution for all your needs related to financial analytics and allied solutions. The company offers a wide range of services that include unified data partnering solutions, ecosystem development, business outcomes analysis, and analysis of working capital models. Further, the platform offers expert solutions across a wide range of categories, including business analytics, consulting services, data engineering, and digital solutions, among others.

5) Altair: To help companies excel and see a clear picture on the financial front, Altair offers a constellation of analytics services based on AI and ML technologies. The clients of the company can choose from a range of services that can help them to better manage the overall financials, fiscal forecasting, credit risk, and ROI. Further, customized solutions such as predictive sales, client profitability, and cash flow analytics help clients enhance shareholder value while delivering superior experiences to customers through cost optimisation and reduction in inventory turnover models.

6) Aays Analytics: Aays is a boutique AI and data consulting firm which specialises in democratising data science and ML in the corporate finance space. The company partners with global conglomerates that have large, complex data footprints and helps them contextualise their data science journey with a deep functional focus. Since its inception in 2018, it has solved some of the most complex business problems faced by Fortune 1000 and fast-growing companies in the world using big data, cloud computing, and AI/ML. Backed by a strong management team with an IIT/IIM background and over a decade of industry experience with Fortune 100 companies, Aays Analytics has been featured in the 2022 list of top data science providers in India by Analytics India Magazine (AIM) and has been rated among the top five fastest-growing companies in this space.

7) Polestar: The experienced team at Polestar makes excellent use of data integration, visual modelling, and attractive dashboards to help FP&A teams make insightful decisions on many critical parameters such as asset management, liabilities handling, capital gains, EBITDA, and earnings per share (EPS), among others. The company offers a completely guided approach to optimizing cash flow, maximizing revenue, and detecting fraud to enhance the overall profitability and revenue of its clients.

8) Tiger Analytics: Known for its global client base, Tiger Analytics has its wings spread across countries and continents. The AI-based solutions offered by the company can help financial leaders make more realistic, data-based decisions for their organizations. Tiger Analytics offers financial consultancy services for risk management, discrepancy identification and elimination, and fraud detection and mitigation. Prospective clients can also avail themselves of services such as cost optimisation, ratio analysis, and evaluation of investment potential for achieving better results on critical parameters of financial performance.


P&G turns to AI to create digital manufacturing of the future – CIO

Over the past 184 years, The Procter & Gamble Co. (P&G) has grown to become one of the worlds largest consumer goods manufacturers, with worldwide revenue of more than $76 billion in 2021 and more than 100,000 employees. Its brands are household names, including Charmin, Crest, Dawn, Febreze, Gillette, Olay, Pampers, and Tide.

In summer 2022, P&G sealed a multiyear partnership with Microsoft to transform P&Gs digital manufacturing platform. The partners say they will create the future of digital manufacturing by leveraging the industrial internet of things (IIoT), digital twin, data, and AI to bring products to consumers faster and increase customer satisfaction, all while improving productivity and reducing costs.

"The main purpose of our digital transformation is to help create superior solutions for daily problems of millions of consumers around the world, while generating growth and value for all stakeholders," says Vittorio Cretella, CIO of P&G. "We do that by leveraging data, AI, and automation with agility and scale across all dimensions of our business, accelerating innovation and increasing productivity in everything we do."

The digital transformation of P&G's manufacturing platform will enable the company to check product quality in real time directly on the production line, maximize the resiliency of equipment while avoiding waste, and optimize the use of energy and water in manufacturing plants. Cretella says P&G will make manufacturing smarter by enabling scalable predictive quality, predictive maintenance, controlled release, touchless operations, and manufacturing sustainability optimization. "These things have not been done at this scale in the manufacturing space to date," he says.

The company has already undertaken pilot projects in Egypt, India, Japan, and the US that use Azure IoT Hub and IoT Edge to help manufacturing technicians analyze insights to create improvements in the production of baby care and paper products.

For instance, the production of diapers involves assembling many layers of material at high speed with great precision to ensure optimal absorbency, leak protection, and comfort. The new IIoT platform uses machine telemetry and high-speed analytics to continuously monitor production lines to provide early detection and prevention of potential issues in the material flow. This, in turn, improves cycle time, reduces network losses, and ensures quality, all while improving operator productivity.

P&G is also piloting the use of IIoT, advanced algorithms, machine learning (ML), and predictive analytics to improve manufacturing efficiencies in the production of paper towels. P&G can now better predict finished paper towel sheet lengths.

Smart manufacturing at scale is a challenge. It requires taking data from equipment sensors, applying advanced analytics to derive descriptive and predictive insights, and automating corrective actions. The end-to-end process requires several steps, including data integration and algorithm development, training, and deployment. It also involves large amounts of data and near real-time processing.
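As a rough sketch of the kind of edge-side analytics step this describes, the example below flags anomalous sensor readings with a rolling z-score so that a corrective action could be triggered. The telemetry values, window size, and threshold are invented for illustration and are not P&G's or Microsoft's actual implementation.

```python
from collections import deque
import statistics

WINDOW = 50          # number of recent readings to keep
THRESHOLD = 3.0      # z-score above which a reading is treated as anomalous

readings = deque(maxlen=WINDOW)

def check_reading(value: float) -> bool:
    """Return True if the new sensor value deviates sharply from the recent window."""
    anomalous = False
    if len(readings) >= 10:  # wait for a minimal history before scoring
        mean = statistics.fmean(readings)
        stdev = statistics.pstdev(readings)
        if stdev > 0 and abs(value - mean) / stdev > THRESHOLD:
            anomalous = True
    readings.append(value)
    return anomalous

# Simulated telemetry: stable values with one spike that should be flagged.
for v in [1.01, 0.99, 1.02, 1.00, 0.98] * 4 + [1.75]:
    if check_reading(v):
        print(f"Anomaly detected at value {v}: trigger corrective action")
```

In a real deployment, the detection logic would run close to the equipment and the flagged events would feed the cloud-side analytics and corrective workflows the article describes.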

"The secret to scale is to lessen complexity by providing common components at the edge and in the Microsoft cloud that engineers can work with to deploy diverse use cases into a specific manufacturing environment without having to create everything from scratch," Cretella says.

Using Microsoft Azure as the foundation, Cretella says P&G will now be able to digitize and integrate data from more than 100 manufacturing sites around the world and enhance AI, ML, and edge computing services for real-time visibility. In turn, this will enable P&G employees to analyze production data and leverage AI to support decisions that drive improvement and exponential impact.

"Accessing this level of data, at scale, is rare within the consumer goods industry," Cretella says.

P&G took the first steps in its AI journey more than five years ago. It has moved past what Cretella calls the experimentation phase, with scaled solutions and increasingly sophisticated AI applications. Data and AI have since become central to the company's digital strategy.

"We leverage AI across all dimensions of our business to predict outcomes and increasingly to prescribe actions through automation," Cretella says. "We have applications in our product innovation space, where thanks to modelling and simulation we can shorten the lead time to develop a new formula from months to weeks; in the way we engage and communicate with our consumers, using AI to deliver to each of them brand messages at the right time, right channel, and with the right content."

P&G also uses predictive analytics to help ensure the company's products are available at retail partners where, when, and how consumers shop for them, Cretella says, adding that P&G engineers also use Azure AI to ensure quality control and equipment resilience on the production line.

While P&G's recipe for scale relies on technology, including investment in a scalable data and AI environment centered on cross-functional data lakes, Cretella says P&G's secret sauce is the skills of hundreds of talented data scientists and engineers who understand the company's business inside and out. To that end, P&G's future is about embracing automation of AI, which will allow its data engineers, data scientists, and ML engineers to spend less time on manual, labor-intensive tasks so they can focus on the areas where they add value.

"Automation of AI also allows us to deliver with consistent quality and to manage bias and risk," he says, adding that automating AI will also make these capabilities accessible to an increasingly larger number of employees, thus making the benefits of AI pervasive across the company.

Another element in achieving agility at scale is P&G's composite approach to building teams in the IT organization. P&G balances the organization between central teams and teams embedded in its categories and markets. The central teams create enterprise platforms and technology foundations, while the embedded teams use those platforms and foundations to build digital solutions that address their units' specific business opportunities. Cretella also notes that the company prioritizes insourcing talent, especially in areas such as data science, cloud management, cybersecurity, software engineering, and DevOps.

To accelerate P&G's transformation, Microsoft and P&G have created a Digital Enablement Office (DEO) staffed by experts from both organizations. The DEO will serve as an incubator to create high-priority business scenarios in the areas of product manufacturing and packaging processes that P&G can implement across the company. Cretella considers it more of a project management office than a center of excellence.

"It coordinates all the efforts of the different innovation teams that work on business use cases and ensures an efficient scaled deployment of proven solutions that develop," he says.

Cretella has some advice for CIOs trying to drive digital transformation in their organizations: First, be driven and find your energy in the passion for the business and for how to apply technology to create value. Second, be equipped with tons of learning agility and genuine curiosity to learn. Last, invest in people: your teams, your peers, your bosses, because technology alone does not change things; people do.


Health Data Research UK’s Better Care program announces recommendations for developing learning health systems in the UK – EurekAlert

Researchers are calling on policy makers to support the use of data-driven learning health systems to deliver a step change in the NHS's ability to improve patient care.

The call comes as a new report, authored by the Better Care programme from Health Data Research UK (HDR UK) and the Health Foundation, warns that tackling the huge pressures currently faced by health and care services as a result of the COVID-19 pandemic and a period of significant underfunding will require action on multiple fronts, with learning health systems a key part of the mix.

Learning health systems are approaches that enable iterative, data-driven improvements to health and care systems, drawing on the increasing availability of both clinical and patient-generated data. They aim to give health and care services the tools to identify and solve problems from within, turning them into engines of innovation and improvement.

The report suggests the learning health system approach offers a more powerful and sustainable route to improving NHS efficiency than relying on short-term, high-cost external consultancy support or nationwide improvement programmes that are not tailored to meet local and regional challenges and needs.

Given the challenges faced by health and care services, the authors warn that learning health systems are essential to the NHS's sustainability and should not be seen as merely a "nice to have".

Alice Turnbull, National Programme Director, HDR UK, said:

The UK health and care system is facing some of its most significant resource challenges in its history.

The long-term impact of the COVID pandemic combined with the current economic crisis makes it even more urgent to embed approaches which deliver sustainable and effective improvements in care and position our NHS to meet future health needs.

In many ways, healthcare has always been learning from practice. But recent advances in data and technology, coupled with the move towards increased collaboration and integration between services, present new opportunities to improve care in a more systematic way.

HDR UK is working to harness this opportunity by enabling a more integrated health data ecosystem that prioritises benefit to patients and the public. This system is already enabling data-driven insights and technology to rapidly inform decision making across the health and care system but there is still a long way to go to ensure everyone, everywhere benefits from this approach.

We hope that the recommendations presented in this report support the implementation of learning health systems across the UK and provide a route to help the national health service to negotiate the current challenges faced.

Despite their potential, the report reveals most providers and systems have not yet been able to capitalise on learning health systems, creating a large gap between their promise and practice.

Based on interviews and a survey of over 100 stakeholders, the report provides evidence that a lack of data analytics expertise and the challenges of linking data across different sources are key barriers to implementing learning health systems.

In response, the authors highlight actions needed to increase data and analytical expertise through better training opportunities and career pathways, and to provide long-term funding to support the development of mature digital capabilities.

It also presents 16 examples of how learning health systems have improved patient care, from Bluetooth-enabled nebulisers that have helped people with cystic fibrosis track their medication intake and shape their treatment plans, to data on the impact of taking magnesium sulphate as a pregnancy supplement, which is believed to have prevented 48 cases of cerebral palsy between 2018 and 2021.

Tim Horton, Assistant Director (Insight & Analysis) at the Health Foundation said:

The health and care system is under unprecedented pressure. A step change in providers' capability to learn and improve is needed in order to find a sustainable route through COVID-19 recovery and to meet future health needs.

Learning health systems can play a vital role here, capitalising on the increasing availability and potential of data and technology to understand how to improve patient care and translate this into practice.

Policymakers, and those leading health and care services, can help create the conditions for learning health systems to flourish.

Progress will be needed on a range of related issues, including data, digital maturity, improvement capability and culture, and in developing a clear vision for learning health systems and how they can improve health and care.

Our report highlights eight areas where targeted action could support the development of learning health systems and help many more patients benefit from improved care.

ENDS

About the Better Care programme

HDR UKs Better Care Programme aims to equip people with large-scale data and advanced analytics so that they can identify and make informed decisions about the best care for them.

Over the last two years, as part of this wider programme, the Health Foundation and HDR UK have been working in partnership to deliver the Better Care Catalyst Programme. This partnership has funded three projects to develop data-driven tools that aim to improve healthcare decision making. It has also supported three workstreams to set out the training, knowledge mobilisation, and policy actions required to support data-driven learning and improvement in healthcare.

About Health Data Research UK

Health Data Research UK (HDR UK) is the UK's national institute for health data science. HDR UK's mission is to bring together the UK's health and care data to enable discoveries that improve people's lives by uniting, improving, and using data as one national institute.

About The Health Foundation

The Health Foundation is an independent charity committed to bringing about better health and health care for people in the UK. This report is part of wider work to enable faster improvement in health and care, from how emerging new technologies and data can support the adoption and spread of improvement and service model changes, to sustainably meeting people's current and future needs.


PLOS Board Appointments – The Official PLOS Blog – PLOS

After a careful search, I am excited to share with our community four new appointments to the PLOS Board of Directors. This is a critical time for us as we expand our journal offerings and our global reach, and challenge the landscape of Open Access publishing with regard to sustainable business models. Each new member brings a depth and breadth of knowledge in their fields, which will enable us to continue to drive our mission forward while serving our scientific communities. The Board plays a key role as a strategic thought partner to PLOS leadership, as well as in oversight of organizational performance (business, strategic and financial), compliance and risk management.

Dr. Arlene Espinal, who joined the Board on September 1, currently serves as the Head of Microsoft Cloud Data Science and Analytics for Microsoft Corp. She is a leader in global strategy, Quantum-AI and next-generation digital technologies. She is also passionate about talent development and leads teams with diversity, inclusion, equitability, belonging and acceptance in mind, essential to community and business. Recognized for her seminal role in driving awareness of and change to social disparities that impact our communities, the Hispanic IT Executive Council named Dr. Espinal a 2020 Top 100 Global Technology Leader. She was again recognized this year for her executive contributions. The National Diversity and Leadership Council recognized Dr. Espinal as one of the 2021 Top 50 Most Powerful Women in Technology.

Dr. Israel Borokini, who joined the Board on September 1, is a postdoctoral research fellow in the Department of Integrative Biology, University of California, Berkeley. His research focuses on combining ecological, geospatial, genomic, cytological, and phylogenetic data to identify patterns of community assemblages and biodiversity, and the eco-evolutionary mechanisms that generate and maintain them. Dr. Borokini completed his Ph.D. in the Ecology, Evolution, and Conservation Biology graduate program at the University of Nevada, Reno. He completed his undergraduate and Master's degrees in his home country of Nigeria before spending a decade as Principal Scientific Officer at the National Center for Genetic Resources and Biotechnology in Ibadan, Nigeria. Dr. Borokini not only expands the scientific expertise on the Board but also brings a passion for PLOS's mission. He has personally experienced the challenges of access to research in a low-resource environment and will bring valuable perspectives to the Board's discussions as PLOS grows globally and prioritizes equity.

Richard Wilder's deep experience in global public health law has a recurring theme: ensuring access. Prior to private practice, he was the General Counsel and Director of Business Development at the Coalition for Epidemic Preparedness Innovations (CEPI). At CEPI, he directed the legal and business development affairs during its initial start-up phase and through the first two years of the response to the COVID-19 pandemic. Prior to CEPI, he was the Associate General Counsel in the Global Health Program at the Bill & Melinda Gates Foundation. He provided legal expertise to ensure access to drugs, vaccines and diagnostics, with a particular focus on access by affected populations in low- and middle-income countries. His work also addressed how to ensure access to the artifacts of scientific research, including published materials, data, software code and biological materials. His Open Access policy work at Gates won the SPARC Innovator Award in 2015. Richard has also served as a committee member of the Roundtable on Aligning Incentives for Open Science convened by the National Academies of Sciences, Engineering, and Medicine. He joined the Board in June 2022.

Fernan Federici joined the Board in October 2021. As we expand globally, Dr. Federici's perspective from a different research culture will prove invaluable. He is currently an Associate Professor and molecular geneticist at the Pontificia Universidad Catolica in Santiago, Chile. He has been a champion of Open Science in a number of areas, including protocols and reagents, where he contributes to Reclone (the Reagent Collaboration Network). Fernan's research group also works on the promotion and development of Free/Libre Open Source technologies for research and education in molecular biology and bioengineering. The group is part of ReClone, the Gathering for Open Science Hardware community (GOSH) and the CYTED-reGOSH network for open technologies in Latin America.

I would be remiss if I did not take the opportunity to express my heartfelt thanks to Robin Lovell-Badge, Mike Carroll and Meredith Niles for their outstanding years of service to the PLOS Board. Their wisdom and counsel have been enormously beneficial to me and our organization as we collectively charted a new path for PLOS, one focused on sustainability, inclusivity and expanding our roots globally. While it's hard to say goodbye, we are excited to bring on board so many exceptional individuals with fresh perspectives. Please join me in welcoming our new Board members!
