
Pace receives NSF grant to expand data science instruction nationwide – Westfair Online

Instructors from Pace University's Dyson College of Arts and Sciences recently received a $499,354 grant from the National Science Foundation that will allow them to expand the teaching of data science skills into introductory biology and environmental science courses.

Specifically, the grant's purpose is to allow the group to continue expanding the Biological and Environmental Data Education Network, which will create a system of instructors addressing the need for data science skills and offer training, resources, and professional development opportunities for biology and environmental science instructors to enable them to teach the skill set in their regular coursework.

Aiello-Lammens' and Crispo's roles in the program are to help other instructors learn to incorporate data education into curricula, an approach that the two instructors already practice in all of their courses taught at Pace.

Aiello-Lammens has been involved with the network before, during its original founding through a National Science Foundation incubator grant, of which he was a recipient along with colleagues from Kenyon College and Denison University. Crispo was among the first attendees of the network's activities.

"We think it's vital that our students understand how to make sense of these data and use them to make decisions about what they should be doing, from whether to eat certain foods, consider certain medicines, or accept a particular job," Aiello-Lammens said. "If they have these data science skills in general, then they can apply them both in their work and in their lives."

"Today we're able to collect more data more rapidly, collect it on computers, and analyze it on supercomputers," Crispo said. "It's becoming more challenging to handle data and analyze it, and it's becoming increasingly important to give students the skills to be able to do so."

According to the National Science Foundation, data science skills are increasingly necessary in the fields of science, technology, engineering, mathematics and beyond, but instructors often do not have the training necessary to impart this skill set to their students. Other barriers it cited include curricula that are already perceived as overcrowded, confusion over what the key skills are and a lack of confidence in teaching them.

Data management, data analysis, data visualization, programming, modeling and reproducibility are the main subject areas and strategies the network will focus on imparting to students.

The Biological and Environmental Data Education Network will hold annual meetings, the first of which will be held at Pace's Manhattan campus in 2022 and will focus on diversity and inclusion in data science education.

Its other education expansion efforts will include developing training and education workshops, publishing a curriculum guide and adding members to create a more active and diverse community of educators. Aiello-Lammens and Crispo aim to expand it nationally and internationally.

According to Pace, the program aligns with its new university-wide strategic plan, Pace Forward.

"This grant epitomizes what we believe in at Pace and helps to put Pace at the forefront of educational innovation," said Tresmaine Grimes, dean of Dyson College of Arts and Sciences and School of Education. "The work of Professors Aiello-Lammens and Crispo is inspiring in its aim to be cutting-edge, far-reaching, and cross-disciplinary, and will serve instructors and students not only at Pace, but across the country."


Snowflake Shapes the Future of Data Science with Python Support – Business Wire

No-Headquarters/BOZEMAN, Mont.--(BUSINESS WIRE)--Snowflake (NYSE: SNOW), the Data Cloud company, today announced at its Snowday event that data scientists, data engineers, and application developers can now use Python, the fastest-growing programming language,1 natively within Snowflake as part of Snowpark, Snowflake's developer framework. With Snowpark for Python, developers will be able to collaborate easily on data in their preferred language. At the same time, they can leverage the security, governance, and elastic performance of Snowflake's platform to build scalable, optimized pipelines, applications, and machine learning workflows. Snowpark for Python is currently in private preview.

Developers want flexibility when working with data, simpler environments that require less administrative work and maintenance, and immediate access to the data they need. Snowpark brings the programming languages of choice for data to Snowflake. With Snowpark, developers can unlock the scale and performance of Snowflake's engine and leverage the native governance and security controls built into Snowflake's easy-to-use platform. In addition to Java and Scala, Snowpark now supports Python, allowing different users working in different languages to collaborate on the same data with one processing engine, without needing to copy or move the data.
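The "one processing engine, no data movement" model can be sketched with a toy query builder. The `PushdownFrame` class below is a hypothetical stand-in, not the real Snowpark API: like Snowpark, it evaluates lazily, recording dataframe-style method calls and compiling the chain into a single SQL statement that would run inside the engine, where the data lives.

```python
# Illustrative sketch only: a toy builder showing how Snowpark-style
# dataframe APIs work in principle. Method calls record operations
# instead of fetching rows, and the chain compiles to one SQL statement
# executed where the data lives. Class and method names are hypothetical.

class PushdownFrame:
    def __init__(self, table):
        self.table = table
        self.filters = []      # accumulated WHERE predicates
        self.columns = ["*"]   # projected columns

    def filter(self, predicate):
        """Record a filter; nothing is executed yet (lazy evaluation)."""
        frame = PushdownFrame(self.table)
        frame.filters = self.filters + [predicate]
        frame.columns = list(self.columns)
        return frame

    def select(self, *cols):
        """Record a projection; still no data movement."""
        frame = PushdownFrame(self.table)
        frame.filters = list(self.filters)
        frame.columns = list(cols)
        return frame

    def to_sql(self):
        """Compile the recorded operations into a single SQL statement."""
        sql = f"SELECT {', '.join(self.columns)} FROM {self.table}"
        if self.filters:
            sql += " WHERE " + " AND ".join(self.filters)
        return sql

# A Python user ends up driving the same SQL engine as a SQL user:
readings = PushdownFrame("sensor_readings")
query = readings.filter("temperature > 90").select("sensor_id", "temperature")
print(query.to_sql())
# SELECT sensor_id, temperature FROM sensor_readings WHERE temperature > 90
```

The real Snowpark API applies the same idea: dataframe operations are translated into SQL and pushed down to Snowflake's engine rather than executed on the client.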

As a result of the recently announced partnership with Anaconda, Snowflake users can now seamlessly access one of the most popular ecosystems of Python open source libraries, without the need for manual installs and package dependency management. The integration can fuel a productivity boost for Python developers. Snowflake's recently launched Snowpark Accelerated Program also supports customers with access to numerous pre-built partner capabilities and integrations, from directly within their Snowflake account.

With Snowpark for Python, data teams gain a broad set of new capabilities.

Novartis, the multi-national healthcare company that provides solutions to address the evolving needs of patients worldwide, needed a way to empower their global team of analysts and data scientists with a powerful data platform that would reduce data preparation time and provide self-service capabilities for building models and running analytics.

"Novartis' mission is to reimagine medicine to improve and extend people's lives, and to do so successfully today we need to leverage digital technologies that continue to put data and data science at the center of our transformation," said Loic Giraud, Global Head of Digital Platform & Product Delivery at Novartis. "As a progressive, data-driven life-science organization, the flexibility and scale of Snowflake's Data Cloud allows us to accelerate our pace of knowledge through data interpretation and insight generation, bringing more focus and speed to our business. Bringing together all available data ultimately unlocks more value for our employees, patients, and health care providers, and data science innovations help us realise this goal."

"Snowflake has long provided the building blocks for pipeline development and machine learning workflows, and the introduction of Snowpark has dramatically expanded the scope of what's possible in the Data Cloud," said Christian Kleinerman, SVP of Product at Snowflake. "As with Snowpark for Java and Scala, Snowpark for Python is natively integrated into Snowflake's engine so users can enjoy the same security, governance, and manageability benefits they've come to expect when working with Snowflake. As we continue to focus on mobilizing the world's data, Python broadens even further the choices for programming data in Snowflake, while streamlining data architectures."

Learn More:

Forward-Looking Statements

This press release contains express and implied forward-looking statements, including statements regarding the availability of Snowpark for Python. These forward-looking statements are subject to a number of risks, uncertainties and assumptions, including those described under the heading "Risk Factors" and elsewhere in the Quarterly Report on Form 10-Q for the fiscal quarter ended July 31, 2021 that Snowflake has filed with the Securities and Exchange Commission. In light of these risks, uncertainties, and assumptions, actual results could differ materially and adversely from those anticipated or implied in the forward-looking statements. As a result, you should not rely on any forward-looking statements as predictions of future events.

About Snowflake

Snowflake enables every organization to mobilize its data with Snowflake's Data Cloud. Customers use the Data Cloud to unite siloed data, discover and securely share data, and execute diverse analytic workloads. Wherever data or users live, Snowflake delivers a single data experience that spans multiple clouds and geographies. Thousands of customers across many industries, including 212 of the 2021 Fortune 500 as of July 31, 2021, use the Snowflake Data Cloud to power their businesses. Learn more at snowflake.com.

1 According to SlashData, Developer Economics: State of the Developer Nation, 20th Edition.


UK Railroads Invest in Data Science, AI, and Machine Learning – Cities of the Future

The oldest rail network in the world, with over 32,000 km of track, is investing in artificial intelligence and machine learning to streamline maintenance and deal with weather-related challenges.

Two months ago, during the AI and Big Data Expo, I had the opportunity to meet Nikolaos (Nick) Kotsis, Chief Data Scientist at Network Rail. At the conference, he was one of the speakers, talking about their digital transformation and how Network Rail was implementing new ways to inspect and manage the network's assets, shifting work from traditional planning and maintenance schedules to a proactive "predict and prevent" approach.

At the time, Kotsis mentioned the data collection happening in real time, simultaneously from the trains using the network, the people inspecting the tracks, drones, helicopters, and over 30,000 IoT sensors deployed all over the country.

All this data collection allows Network Rail to know what is happening and take immediate action when something goes wrong or needs fixing. But the real magic, which helps predict and prevent incidents and enables predictive maintenance, happens when AI and machine learning are applied to that massive amount of data.

To learn more about how Network Rail's Data Science department works and how it impacts the organization, we reached out to Nick Kotsis again. He answered our questions by email.

PV: As discussed in our previous conversation, Network Rail is undergoing a substantial digital transformation in the field and the data center. Can you tell us a bit about your data science department and its role within the organization?

Nick Kotsis: Our initial plan for the data science function leaned towards a primarily guidance and advisory role; however, once we began engaging with customers, we realised that taking on delivery responsibilities would not only return cost benefits to the taxpayer but would also help our partner network operate more effectively.

Inspiring confidence and gaining the trust of our customers and suppliers was the focus of my role in the first six months. Since then, we have evolved to become an actual delivery function for data analytics, advanced machine learning, and AI technology packaged into fully integrated digital products that customers can use with minimal training. The end result is a trusted service supported by a skilled, confident team focused on the customer and responding to the most complex problems in Network Rail.

When customers from across the organisation ask for help, it means we are doing something right.

PV: You said before that NR is a complex machine, managing the network and some of the largest train stations and maintaining the freight trains. How is data analytics (and AI) affecting the different services?

Nick Kotsis: Our organisation is responsible for maintaining a complex infrastructure that is vulnerable to environmental and weather conditions and the constant pressure of hundreds of daily train journeys.

Network Rail maintains 20,000 miles of track. Our job is to maximise the use of data to make maintaining the infrastructure safer and more efficient for both our passengers and workforce.

To give you an example, performing remote inspections on track assets with the assistance of AI technology instead of visiting the track is a genuine safety benefit for our maintenance teams. Of course, we are not planning to pause physical inspections, but if we could confidently limit them to absolutely necessary ones, that would be a true benefit.

Data analytics and the more sophisticated machine learning techniques consistently demonstrate high quantitative and qualitative benefits. Examples of these benefits can be seen in: the prediction of incidents; the automation of tasks that would be repetitive and mundane for a human; the complex risk assessment on thousands of assets in a split second; and complexity management using optimisation algorithms which also bring speed to complex decision making.

In our case, we have successfully developed automated risk assessments using computer vision techniques that identify assets for immediate attention on the track and surrounding areas. We also established a preventive maintenance process driven by predictive algorithms which calculate the likelihood of an asset failing days or weeks before it actually happens, allowing us to resolve an issue before it becomes a problem. Both of these systems offer significant improvements to safety along with reduced delays and disruption for passengers.
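The "predict and prevent" workflow Kotsis describes can be sketched in miniature: train a classifier on historical sensor readings labelled by whether the asset later failed, then score current readings to flag high-risk assets well before a fault shows. Everything below, from the two synthetic features to the from-scratch logistic regression, is illustrative only and not Network Rail's actual pipeline.

```python
# Minimal sketch of predictive maintenance: estimate each asset's
# probability of failing from recent sensor readings, and flag high-risk
# assets for early intervention. Synthetic data, illustrative model.
import math
import random

def sigmoid(z):
    z = max(min(z, 30.0), -30.0)  # clamp for numerical stability
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=1000):
    """Fit logistic regression with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def failure_probability(w, b, x):
    """Score one asset's current readings as a failure likelihood."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Synthetic history: [vibration level, temperature anomaly];
# label 1 means the asset failed soon after these readings.
random.seed(0)
healthy = [[random.gauss(0.2, 0.1), random.gauss(0.1, 0.1)] for _ in range(50)]
failing = [[random.gauss(0.8, 0.1), random.gauss(0.7, 0.1)] for _ in range(50)]
X = healthy + failing
y = [0] * 50 + [1] * 50

w, b = train_logistic(X, y)

# Score a new asset weeks ahead of any visible fault:
risk = failure_probability(w, b, [0.75, 0.65])
print(f"failure risk: {risk:.2f}")  # high risk -> schedule maintenance early
```

In a real deployment the features would come from the IoT sensors, drones, and inspection data described earlier, and the threshold for intervening would be tuned against the cost of unnecessary track visits.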

The advanced big data engineering and algorithmic (AI) logic we use behind the scenes should, and will, continue to expand across every part of our infrastructure. I am confident that we will eventually reach the levels of deployment needed to scale up our operations and prevent incidents.

PV: The UK has the oldest rail service in the world. Nevertheless, Europe and many other places also operate old rail services, each with complex and specific challenges. Based on your experience, what would you recommend to organizations starting or undergoing digital transformation?

Nick Kotsis: Finding the correct answer to this is not easy, as every organisation starts at a different point and has a different maturity trajectory. Also, financial investment comes more easily to some organisations than to others, and thus the digital journey will be very different.

As with most complex initiatives, digital, data, and AI success depends on leadership and commitment to completing the journey. In our case, we are fortunate to have solid technical leadership from our CIO and CTO. They saw our destination at the early stages of development and supported us in making the data science vision a reality.

My advice to colleagues in other organisations? Be clear about the vision and data strategy, engage with customers who need your help, create the right team and set them up for delivery, and focus on projects with clear benefits.



Top Upcoming Data Science Webinars to Attend in 2021 – Analytics Insight

Data science training empowers professionals with data technologies such as Hadoop, Flume, and machine learning. If a candidate has knowledge and proficiency in these significant data skills, it is an added advantage for a stronger, more competitive career. Here are the top data science webinars for you to attend in 2021.

Nov 17 2021, 11:30pm

Join Adam Mansour, Head of Sales Engineering, and Daniel West, Head of Mid-Market Sales at ActZero, for this fireside chat as they discuss: techniques to detect compromises before a threat is detonated; recommendations to achieve greater threat coverage and visibility into attacks; how to improve security effectiveness by merging data science with cybersecurity; and how security thought leaders and change agents are preparing for the future by leveraging AI/ML.


Nov 18 2021, 9:30pm

As pharmaceutical companies collect more unstructured Voice of Customer data, manually reading and analyzing each entry becomes more costly and inefficient. The advances of natural language processing and other machine learning techniques make it possible to create an insights generation program where subject matter experts work with technology to analyze more data and accelerate time to insights. Setting yourself up for success and maximizing the value of your VoC data requires changes in strategies and the way you handle data. In this webinar, you'll hear from Seth Tyree, VP-Pharma Insights at Stratifyd, and Tolga Akiner, Data Scientist, as they share their experience and best practices from building an insights generation program: learn about the importance of critical data prep activities like data labeling; gain an understanding of key AI models and their roles in generating insights from unstructured VoC data; and see how AI and machine learning can work with your SMEs to accelerate insights.


Nov 23 2021, 6:30am

This webinar, presented by Fangjin Yang, Co-Founder and Chief Executive Officer of Imply, will cover data analytics, big data, data science, and real-time analytics.


Nov 24 2021, 2:30pm

This data science webinar on "How to Become a Data Scientist" covers all the skills required of a modern-day data scientist.



Nov 17 2021, 9:30pm

At any given time, organizations are attempting to transform their business (think business process, digital, management, organizational, and cultural transformations) with the common end goals of operational change, business model innovation, and domain expansion. Now is the time to use AI-enabled solutions to drive business transformation, but how is that done in practice? Join Jerry Hartanto, AI Strategist at Dataiku, for an overview of how AI mitigates business transformation risks, accelerates the time to value, and drives tangible outcomes.



The Iconic CEO Erica Berchtold wants to use the science of data to convert you to online shopping – The Australian Financial Review

"I really do believe most of the journey that you have in a physical store can be done in a digital world," she says.

The Iconic leverages data to lure consumers, and keep them. Already, the site will suggest sizes based on other brands you wear, and offers virtual sneaker try-ons. It curates products based on your preferences, sending you a personalised edit of clothing, shoes and accessories you might like, the way a sales assistant might. But Berchtold sees a future when the technology of the metaverse, for example, could be used as a virtual fitting room.

Although The Iconic employs just over 1000 people, half of these are stationed at its fulfilment centre in Yennora, in western Sydney. Of the rest, 120 are in technology roles. Berchtold is recruiting 70 more staff in the tech team, across data, engineering, product and information technology.

"COVID hasn't helped, with the borders being shut," she says. "There is also a lack of investment in technology education here in Australia. We haven't been grooming our own workforce as much as we could have."


Berchtold has spent her career in retail, and revels in the shop floor. The closest thing to that at The Iconic is the raft of customer emails that arrive each week; Berchtold reads them all.

"When I was having a bad day at other jobs, I'd go to the shop floor to remind myself of what we're actually doing," she says. "This is the same thing."

Some feedback will resonate with Berchtold so much she is compelled to send the writer a gift voucher. "Like, we had one customer," says Berchtold. "She wrote, 'I really love this brand. But I hate the models. They're just such skinny things.' I was like, 'Oh, she's cool. I hear that.' And so, we're going to make sure that we're a bit more diverse in the models selected for that particular brand. And I sent her a $50 voucher."

The company has grown significantly in 10 years. In 2011, The Iconic launched with 1000 products from 125 brands. Today, it boasts 165,000 products from 1500 brands, with 500 new arrivals to the site daily.

The Yennora site is Australia's largest fashion fulfilment centre, with a capacity for 3.75 million units, and can fulfil orders in as little as eight minutes. At its Alexandria Hub, a purpose-built production studio in inner Sydney, more than 60 staff pump out curated editorial for the site's 19.5 million monthly users. Its app is Australia's most downloaded fashion app, with 5 million downloads. In the 2020-21 financial year, it sent out 6.2 million parcels, an increase of 1.2 million on fiscal 2019.

Numbers aside, Berchtold is most proud that the company has cemented itself as a permanent part of the retail landscape.

"Expanding from clothing to categories such as children's wear, homewares and beauty has been a significant shift for the business. And I'm proud that we are playing a leadership role in things like sustainability, diversity and inclusion."

Last year, the company launched Giving Made Easy, enabling customers to donate pre-loved clothing by downloading a pre-paid postage label and dropping off their donation at any Australia Post location. The company says this has saved more than 25,000 kilograms of textiles from landfill.

It also enacted a Reconciliation Action Plan, endorsed by Reconciliation Australia, and is committed to body diversity, urging vendors to offer extended sizing. This year, it launched a modest dressing capsule.

Not, Berchtold says, that she has a lot of choice. "The average team member is 27," she says. "I do not have to push for this stuff. It is business as usual for young people. I don't need to explain why sustainability matters to anybody in our team."

As for the future of retail and technology, Berchtold is bullish.

"I look at COVID, and the way some retailers were saying, 'We got a chance to experience what it's like to be pure-play online.' Like, you think? We've had a 10-year head start on that stuff. The data and technology that is baked into our DNA, I think some retailers would be astounded by it."

"I don't love trying stuff on. So could we use virtual clothing on the site to show people what it looks like on an avatar of themselves? That would be enormously helpful."

The idea behind all The Iconic's technology is to make every transaction more seamless.

The team is constantly testing and learning. Recently, it announced a partnership with AirRobe, an online resale platform. Customers can automatically add their just-bought clothing to AirRobe, so that if they want to sell it later, the information and imagery is ready for them to use.

Berchtold is waiting on early results of the AirRobe experiment, but is confident the re-economy is the future. "Reuse, repair, recycle," she says. "This is the way ahead." A repair program is in the works.

One of the big lessons of the past decade has been learning how to use data.

"You've got the data right there. But there's a real art to knowing what to do with it, the data to actually pay attention to, what you can just not ignore," Berchtold says. "That's actually a real skill, and it is very important and will only continue to grow in importance. And I don't think a lot of retailers would do that easily."

"It's a muscle you have to work, over and over. And we've had 10 years of trial and error in that space. So, we've got a pretty strong muscle."


Immunosuppressants linked to severe reactions in people with common genetic profile – Stanford Medical Center Report

Meanwhile, Mellins extracted human leucocyte antigen (HLA) profiles from the genetic data of 20 lung disease patients and found that more than half shared the same genetic signature. HLAs are the proteins on cell surfaces that distinguish self from nonself tissues.

"I picked up the phone and called Jill Hollenbach, an immunogeneticist at UCSF, and she said, 'This is remarkable,'" Mellins said. Hollenbach thought that the results indicated a severe drug reaction rather than a genetically linked feature of Still's disease.

The researchers realized that patients with the lung problems met criteria for drug reaction with eosinophilia and systemic symptoms, or DRESS, a type of severe, delayed medication reaction. Although features of DRESS, easily missed with Still's disease, began soon after the drug was started, on average it took 14 months on the drugs for patients' severe lung disease to become apparent.

The researchers compared 66 Still's disease patients who had DRESS with 65 patients who did well on the drugs. Among other problems, three-quarters of the patients with DRESS, with or without lung disease, had liver enzyme levels indicating serious liver dysfunction, and 64% developed a cytokine storm.

The reactions were not always recognized by the patients physicians. Patients who were taken off and kept off the drugs did well. Tragically, of 33 patients who kept taking the medications after their reactions began, nine died.

"This [drug reaction] is a very, very complicated signal, and it's hard for clinicians to realize that stopping the drug is what you do, especially if there is organ involvement, such as lung or liver dysfunction," Saper said.

The genetic signature that confers higher risk is found in 20% of the population at large and in 80% of patients in the study who had DRESS. Blood testing for HLA markers is available in clinical labs. Since the gene test did not predict all who reacted, this study indicates that physicians should watch carefully for DRESS reactions to inhibitors of IL-1 and IL-6.

Two of the immune-blocking medications, tocilizumab and anakinra, have recently been used in patients experiencing cytokine storms due to severe COVID-19.

This worries the research team because the risky HLA markers are quite common. A recent scientific report on 24 very ill COVID-19 patients treated with tocilizumab noted that six patients died. The Stanford researchers suggest caution in using this drug for COVID-19.

"There's all this circumstantial evidence in COVID patients, but it requires more investigation," Saper said.

Meanwhile, the researchers hope the findings will quickly prompt HLA testing of Still's patients.

"One imperative we have is, 'The right drug, for the right person, at the right time,'" Saper said. "In Still's disease, for most people these drugs are exactly right. But we've been able to identify a simple genetic test that could tell if this is not the right drug for you."

The paper's other Stanford authors are research associate Gonzalo Montero-Martin, PhD; Serena Tan, MD, assistant professor of pathology; postdoctoral scholars Vamsee Mallajosyula, PhD, Debopam Ghosh, PhD, and Jianpeng Xu, PhD; Lu Tian, PhD, professor of biomedical data science; and Marcelo Fernandez-Vina, PhD, professor of pathology.

Tian and Mellins are members of the Stanford Maternal and Child Health Research Institute, Mellins is a member of the Interdisciplinary Program in Immunology and the Wu Tsai Neurosciences Institute at Stanford, and Tian is a member of the Stanford Cardiovascular Institute.

Other contributors include scientists from UCSF and the National Institute of Arthritis and Musculoskeletal and Skin Diseases, as well as scientists from UC-San Diego, Emory University School of Medicine and Children's Healthcare of Atlanta, the University of Pittsburgh, the University of Washington, the National Human Genome Research Institute, the University of Pennsylvania, Pennsylvania State University College of Medicine, Yale University Medical School and the University of Hong Kong.

The research was funded by the Lucile Packard Foundation for Children's Health, the Stanford Maternal and Child Health Research Institute, the National Institute of Arthritis and Musculoskeletal and Skin Diseases (grant Z01-AR041198), the National Human Genome Research Institute (grant Z01-HG200370), the Gordon and Marilyn Macklin Foundation, the RK Mellon Institute for Pediatric Research and the Marcus Foundation Inc.

Some of the authors received personal fees, grants or both from Novartis, which makes canakinumab.


Why Genpact ‘Dare in Reality’ Is A Hackathon Not To Be Missed – Analytics India Magazine

In Formula E, the first all-electric, single-seater world championship, even the most subtle performance improvements can lead to victory. These could be changes not only to the car, but also to how the team uses data and analytics to improve energy management, race strategy, or competitive advantage to shape a race's outcome.

That is why Envision Racing, a founding Formula E team, uses advanced data analytics and artificial intelligence (AI) to stay ahead of the pack. Business transformation firm Genpact is partnering with the team to help bring this extra edge to Envision Virgin's performance.

And now, Genpact is giving data scientists outside the partnership the opportunity to help the Formula E team win on the racetrack. Together with Envision Racing and MachineHack, Genpact launched Dare in Reality, a data science and machine learning hackathon that promotes the use of advanced data analytics and AI to predict lap times in qualifying races.
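As a rough sketch of where a lap-time prediction entry might begin, the snippet below fits an ordinary-least-squares line to made-up (energy deployment, lap time) pairs and predicts a new lap time. The feature, the numbers, and the single-variable model are all illustrative assumptions; the competition's real data and feature set are far richer.

```python
# Illustrative only: simple linear regression for lap-time prediction,
# computed in closed form (no libraries needed).

def fit_ols(xs, ys):
    """Closed-form simple linear regression: y ~ slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Toy training data: (energy deployment %, qualifying lap time in seconds)
deployment = [88, 90, 92, 94, 96]
lap_time = [72.4, 71.9, 71.5, 71.1, 70.8]

slope, intercept = fit_ols(deployment, lap_time)

# Predict a lap time for a planned 95% energy deployment:
predicted = slope * 95 + intercept
print(f"predicted lap time at 95% deployment: {predicted:.2f}s")
```

A competitive entry would of course move to richer models and many more features (track, weather, tyre state, driver), but the fit-then-predict loop is the same.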

Formula E racing has 12 teams and 24 drivers, all driving battery-powered cars. In every race, drivers and their engineers need to make rapid decisions on how to use their energy, when to overtake or defend, how to respond to competitors, when to take attack mode (an extra boost of energy available to all drivers), and much more.

Thanks to Genpact's data science and machine learning capabilities, Envision Racing can combine the talent of its drivers and engineers with fast access to insights from sophisticated algorithms to give the team an extra edge to win.

The Genpact team provides insights that help drivers prepare for each race by analysing their performance in previous contests and practice runs in the simulator. They can also make predictions on how many laps a race will be and help the team understand what other drivers may be planning by tapping into race-day radio communications.

Similarly, technology enables the team to easily report its carbon emissions to The Carbon Trust to remain the greenest team on the greenest grid. And finally, analytics also helps identify fan personas, enabling the team to build its base of socially conscious supporters.

And now the Dare in Reality hackathon gives you a unique opportunity to help the team make even better use of data and analytics.

What are you waiting for? Unleash your data analytics skills today, participate in this prestigious hackathon, and collaborate with the most innovative minds in the industry. But hurry, it closes in 8 days!

This hackathon is a two-week event where data science and machine learning professionals get a chance to win exciting merchandise and cash prizes totalling INR 12 lakhs (about $16,000).


Amit Raja Naik is a senior writer at Analytics India Magazine, where he dives deep into the latest technology innovations. He is also a professional bass player.


TigerGraph expands its graph data library with 20 new algorithms – VentureBeat


TigerGraph, a company that provides a graph database and analytics software, has expanded its data science library with 20 new algorithms, bringing its total to more than 50 algorithms.

Graph databases like TigerGraph have become increasingly popular. They are particularly effective at letting data scientists analyze relationships among millions or billions of entities, and they outperform other types of databases for many deep learning applications.

Of course, this promising market sees plenty of competitors: startups like Neo4j, MongoDB, and DataStax, as well as giants like Oracle and Amazon. What sets TigerGraph's product apart is that it is open source, in-database, scalable, and uniquely centered around graph data science.

TigerGraph was also the first to offer a distributed native graph database, and it has gained traction with enterprises. Graph technologies are predicted to be used in 80% of data and analytics innovations by 2025, according to research firm Gartner, up from just 10% this year.

TigerGraph says graphs will become accessible to more organizations: not only data scientists but also business users will be able to dive deeper into their real-time data, with benefits like enterprise-grade scalability, management, and security.

The new group of algorithms spans industries such as advertising, financial services, and health care. Within these verticals, TigerGraph has adapted numerous data analytics, data science, and machine learning use cases, including fraud detection, cybersecurity threat detection, energy management, risk assessment and monitoring, and time series analysis.

The new release also organizes the expanded set of algorithms into groupings by category. Some will be able to run graph embedding processes faster or more accurately, for example, which could free up time for data scientists and machine learning engineers. That freed-up time could be spent with other algorithms, such as the enhanced group of community algorithms used to offer personalized recommendations or detect social groups.
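TigerGraph's algorithms run in-database via its own query language, but the idea behind a community algorithm can be sketched in a few lines of plain Python. The example below (with a made-up follower list; none of these names come from TigerGraph's library) groups users into communities using connected components via union-find, one of the simplest community-detection approaches:

```python
# A minimal, library-free sketch of a community-style graph algorithm:
# connected components via union-find. This only illustrates the concept;
# TigerGraph's production algorithms run in-database at much larger scale.

def find(parent, x):
    # Path-compressing find: walk to the root, flattening as we go.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def connected_components(nodes, edges):
    parent = {n: n for n in nodes}
    for a, b in edges:
        parent[find(parent, a)] = find(parent, b)   # union the two groups
    groups = {}
    for n in nodes:
        groups.setdefault(find(parent, n), []).append(n)
    return list(groups.values())

users = ["ann", "bob", "cam", "dee", "eli"]
follows = [("ann", "bob"), ("bob", "cam"), ("dee", "eli")]
print(connected_components(users, follows))
# Two communities emerge: {ann, bob, cam} and {dee, eli}
```

Members of the same component are candidates for the kind of social-group detection and recommendation use cases the article mentions.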

Next up for the Graph Data Science Library will be in-database neural networks to complement its already long list of algorithms. TigerGraph says this future update will simplify pipelines while saving time and costs.


Smarter Production and Data-Driven Insight – Advanced Television

November 18, 2021

The pandemic has forced the pace on what is possible with creative content production in the cloud, just in time to help content providers and broadcasters meet the spiralling demand for more, and ever more creative, content.

In a free-to-attend online event (see below), AWS hosts Sky, which will reveal the ground-breaking work it has done pushing the boundaries of what is possible in live event production.

Also hear from Deutsche Telekom and AWS partner ThinkAnalytics on the data that can be surfaced, and how to deploy it to create those vital better content discovery experiences for subscribers.

AWS describes how it is helping media and entertainment companies create content, optimize media supply chains, and compete for audience attention across streaming, broadcast, and direct-to-consumer platforms. Industry leaders such as Netflix, Formula 1, Discovery, and Disney use the cloud to pioneer new ways to entertain, launching new streaming services, expanding their content catalogues, and setting new standards for viewing experiences.

You can join this fascinating live thought-leadership event free of charge on the afternoon of November 23rd.

AWS brings 15 years of experience supporting transformation for leading industry customers such as Comcast, Discovery Communications, Disney, F1, FOX, HBO Max, Hulu, Method Studios, MGM, The NFL, Netflix, Nikkei, Peacock, Sky, TF1, Untold Studios, ViacomCBS, and Weta Digital.

AWS recently launched AWS for M&E, aligning the most purpose-built media and entertainment capabilities of any cloud against five solution areas (spanning Content Production, Media Supply Chain & Archive, Broadcast, Direct-to-Consumer & Streaming, and Data Science & Analytics) to help customers transform the industry. We saw the industry faced with unprecedented audience demand for new content and a global pandemic accelerating the transition to remote production. For example:


Improving Your Odds of ML Success with MLOps – insideBIGDATA

In this special guest feature, Harish Doddi, CEO, Datatron, discusses what CEOs need to understand about using MLOps. He also shares insights on how to use MLOps to gain competitive advantage and offers tips on how to implement it. Over the past decade, Harish has focused on AI and data science. Before Datatron, he worked on the surge pricing model for Lyft, the backend for Snapchat Stories, the photo storage platform for Twitter, and designing and developing human workflow components for Oracle. Harish completed his master's degree in computer science at Stanford, where he focused on systems and databases.

Machine learning is a lofty, unreachable goal in some executives' minds; they see it as a nice-to-have that's just too complex to execute. It doesn't have to be that way, however, thanks to Machine Learning Operations (MLOps). MLOps uses models to systematize the ML lifecycle by defining processes to make ML development more productive and reliable. But that alone is no guarantee of success. Business leaders need to understand several important aspects of MLOps to make it work for their enterprises.

Difficulties with MLOps

Enterprises need to dedicate people and resources to MLOps. Sometimes, enterprise leaders just assume it will all magically work out, but that's not a given. Specifically, you need to invest in people with the right skill sets. MLOps is a skill that needs to be developed, because doing it well requires understanding both the machine learning portion of the AI models and the operations portion. It can be difficult to find people who understand all of this.

To implement MLOps successfully, you'll need to do some planning in advance. Carefully consider the various contingencies and possible outcomes before you initiate deployment so your organization is prepared.

MLOps success requires culture change

Machine learning projects generally involve many different people. One of the most important adjustments to an organization's culture when introducing MLOps is being able to demonstrate a clear separation of duties. Traditionally, AI was seen as a project for the AI group or the data science group, but that's no longer true.

You'll need to make sure you can carefully separate duties, because in some cases the priorities of the data scientists aren't the same as those of the business leaders. And those might not match the priorities from an operations standpoint.

These days, there are many stakeholders involved: operations, engineering, line of business, even the regulatory compliance people. Each of them will have different priorities, so a one-size-fits-all approach won't work; how you bridge this priorities gap is a key question. Because everyone needs to work together, this is the culture adjustment to aim for first.

MLOps: Understanding whats important

It's crucial to understand that when someone develops the first version of the model, it's not the final version; it's the first draft. When stakeholders push the draft to production and see how it behaves, they learn from it and then take what they learn back to the development environment. It's a highly iterative process.

In a production environment, many things are changing, including the data and the behavior of users. The things teams observe in development, they may not observe in production, and they may unlock new insights. So it's important to remember that this is always an iterative process.
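As a toy illustration of why production diverges from development (this is not any particular MLOps product; the function name, threshold, and data are all invented), a crude drift check might compare a feature's production distribution against the one seen during development:

```python
# Toy sketch of the kind of drift check that motivates the
# iterate-and-redeploy loop: has a feature's production distribution
# shifted away from what the model saw in development?

from statistics import mean, stdev

def drift_score(dev_values, prod_values):
    # Shift of the production mean, normalized by the spread seen in dev.
    return abs(mean(prod_values) - mean(dev_values)) / stdev(dev_values)

dev = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]   # values seen during development
prod = [12.0, 11.7, 12.4, 11.9]            # users behave differently in prod

if drift_score(dev, prod) > 2.0:           # crude threshold, tuned per feature
    print("drift detected: send the model back to development")
```

Real deployments use more robust statistics over far more data, but the loop is the same: detect the divergence in production, then iterate in development.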

A second important point is that you need to adopt MLOps processes. This journey is key to AI success, because things get harder the longer it takes to adopt these best practices. Here's one example: if data scientists have the right set of tools in their development environment, they can move quickly and iterate fast on their models, but they often don't see the same thing in the production environment once more people get involved. That mismatch creates unnecessary friction between the data science teams and other parts of the organization: engineering, operations, and infrastructure. The faster you can adopt best practices and standardize, the better off you'll be in easing friction that could occur down the line.

The third important point is that auditing happens across huge volumes of data and across different business units, and it can happen for models, too. You need to be able to show evidence and accountability for any questions the auditing team might ask. For instance, if the model loses money during a particular time period, you need to be able to explain why that happened and what actions, if any, were taken.
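The audit trail described above can be sketched in plain Python (the `predict_and_log` helper, the `fraud-v1.3` version string, and the stand-in model are all invented for illustration, not taken from Datatron): log every prediction together with its model version, inputs, and timestamp, so a later question from auditors can be answered from evidence rather than memory.

```python
# Minimal sketch of a model audit trail: every prediction is recorded
# with the model version and a UTC timestamp.

import json
from datetime import datetime, timezone

audit_log = []

def predict_and_log(model_version, features, predict_fn):
    prediction = predict_fn(features)
    audit_log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "features": features,
        "prediction": prediction,
    })
    return prediction

# Stand-in "model": flag transactions above a fixed amount.
risky = predict_and_log("fraud-v1.3", {"amount": 950.0},
                        lambda f: f["amount"] > 900)
print(json.dumps(audit_log[-1], indent=2))
```

In production, such records would go to durable, queryable storage rather than an in-memory list, so that "why did the model do X that week?" can be answered months later.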

Reap the MLOps benefits

Why do many AI and ML deployments fail? It's more often an issue of culture and process than a technology issue. To successfully adopt ML, you need the right systems, resources, and skills, and this is where MLOps can provide a significant advantage. The above recommendations will help you make the needed shifts in culture and remind you of the iterative process involved. These changes will help you deploy MLOps and reap all the ensuing business benefits.

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 https://twitter.com/InsideBigData1
