Category Archives: Data Science

Coatue Data Science Head Alex Izydorczyk Is Leaving the Company – Business Insider

Former rising star and data wunderkind Alex Izydorczyk is parting ways with Coatue Management, the $48 billion hedge fund where he's led data science since 2016, according to sources familiar with the matter.

Izydorczyk's departure follows an extended leave of absence this year, the sources said, as well as a turbulent year in 2020 that saw the dissolution of the quant fund the young partner had overseen.

It isn't clear who, if anyone, will replace Izydorczyk, though his top deputy, head of quantitative research Justin Bleich, remains at the firm.

A representative for Coatue declined to comment.

Izydorczyk first worked at Coatue as an intern in 2014 while he was still an undergrad at the University of Pennsylvania. He impressed billionaire founder Philippe Laffont and joined the firm full time after graduation.

A talented statistician and coder, Izydorczyk helped launch the firm's data-science efforts, transforming troves of data sets, often raw and unstructured, into value for the broader fund, both the public trading side and the private investing side. He became head of the promising new division at age 23.

But cultural issues festered under his leadership and the division saw numerous departures, Insider previously reported. Former employees accused Izydorczyk of fostering a miserable workplace with a brash, mercurial management style that featured micromanagement as well as frequent outbursts and insults directed at subordinates who upset or disappointed him.

A quant fund led by Izydorczyk with more than $350 million in client funds hit bumps in 2019 and 2020 and was mothballed in the summer of 2020 amid lackluster performance, Insider previously reported.

Izydorczyk remained at the firm, but earlier this year he took an extended leave of absence, people familiar with the matter told Insider.

View original post here:

Coatue Data Science Head Alex Izydorczyk Is Leaving the Company - Business Insider

Data Science Is a Key Weapon in the Fight Against Fraud – Built In

Businesses, health systems, utilities and even governments run on data: unseen torrents of it constantly sloshing back and forth across globe-spanning networks, at a speed and volume too large for any human brain to fathom. Fraudsters often lurk within that hidden world, exploiting weaknesses in systems or using crafty techniques to mask their activities.

Organizations can use big data technology to guard against various types of fraud, from data breaches to false claims that an item never arrived, by training algorithms to recognize what is and is not normal behavior within a system. The technology required for such operations is complex and requires big investments from e-commerce experience builders like Signifyd, which uses data science technology to protect its customers from abuse. Meanwhile, cybersecurity outfits like ActZero use similar technology to help businesses recognize the potential presence of hackers within their own systems.

To learn more about what data science-driven cybersecurity looks like in practice, we caught up with data science leaders at Signifyd and ActZero.

Diana Rodriguez

Senior Director of Data Science

Company background: Signifyd helps retailers produce e-commerce experiences for customers. The company provides a financial guarantee against approved orders that turn out to be fraudulent, which places data security for customers and vendors front and center.

Describe the data sets your technology runs on and how that data is collected.

Our data sets consist of hundreds of billions of dollars' worth of transaction data from thousands of online merchants selling in every retail vertical in more than 100 countries around the world. If you want to envision that data set of transactions, think of a top 10 online merchant. Now think bigger, and bigger still. That commerce network is at the core of what we do, and I constantly work with and on the technology that drives it. Our data is enriched with data sets from third-party providers, which amplifies our ability to understand the identity and intent behind every order placed on our global commerce network. Our commerce network data is collected through custom API integrations with some retailers and through standard Signifyd applications available on all the major e-commerce platforms, such as Shopify Plus, Magento, Salesforce, BigCommerce and others.

Signifyd's models harvest their most valuable insights from behavior patterns that indicate whether an e-commerce order is legitimate or fraudulent, or whether a customer complaint involves an honest failure on a retailer's part or a dishonest attempt by a fraudster to take advantage of the retailer. Those anomalies can come in the form of disparities between shipping addresses and billing addresses, transaction history, device ID and location, among thousands of other signals.

What are the most valuable insights or patterns you look for in the data?

Our challenge is to apply the latest developments in machine learning to turn fuzzy concepts like trust into solvable, quantitative problems. Do I trust this order and the consumer behind it? Secondly, fraud is an adversarial problem. Unlike, say, a self-driving car, where the innovators, developers and society at large all share an interest in making sure the artificial intelligence involved works well and evolves into better versions rapidly, fraudsters are out to foil our artificial intelligence. As we become better and our solutions become more effective, fraudsters shift their targets and tactics. We need to anticipate these moves and stay ahead of the competition. It certainly makes life interesting.

By analyzing transactions on multiple data points, we are able to see the full picture and better distinguish the good from the bad, as well as enable merchants to properly enforce policies that meet their business goals. Watching the changes in behavior through the data has inspired Signifyd to develop additional solutions for merchants, including those tackling unauthorized reselling, fraudulent returns and false claims that an ordered item never arrived, to name a few. All the while, as we analyze patterns, adjust the model and evolve our view of fraud, we need to keep in mind that the primary goal is not stopping fraudsters but enabling good buyers to buy. That makes life better for end consumers and also for our merchant-customers, who see higher revenue and build customer lifetime value for their enterprises.

It can be dangerous to rely too heavily on one data point.

What are some of the potential drawbacks of using data science to solve problems like fraud, and how can technologists avoid or mitigate them?

We can never forget that the machine learning models we build to make data understandable and actionable are the products of human minds. Human minds are wonderful things, but they come with biases and preconceived notions of outcomes, and they are not flawless. Because humans conceive of and develop models, they can also inject human biases into the models, which by extension introduces biases into the outcomes the models produce.

Data scientists apply various model interpretability methods to balance high model performance with accountability and explainability. Data scientists are building very complex models. Employing interpretability methods is one way to better explain how our models function and to diagnose issues of bias or fairness that might otherwise go undetected.
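As a concrete illustration, here is a minimal sketch of one widely used model-agnostic interpretability method, permutation importance: shuffle each feature in turn and measure how much the model's test accuracy drops. The model, features and data below are synthetic stand-ins, not Signifyd's actual tooling.

```python
# Minimal sketch of permutation importance on a synthetic fraud-like
# dataset. Everything here is invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                 # four synthetic signals
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # label driven by two of them

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature and record the mean drop in test accuracy;
# a large drop means the model leans heavily on that signal.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for i, mean_drop in enumerate(result.importances_mean):
    print(f"signal_{i}: importance {mean_drop:.3f}")
```

A report like this makes it visible when a model is leaning on one or two signals far more than intended, which is exactly the kind of hidden bias interpretability methods are meant to surface.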

One advantage to drawing on a vast number of signals in our decisioning is that the anomalous patterns we look for are based on thousands of signals, and therefore we do not ascribe too much weight to any one or two of them. It can be dangerous to rely too heavily on one data point. For example, consider the case of account takeover fraud, a misfortune in which a fraudster, by obtaining passwords for instance, takes control of a legitimate consumer's account. If decisions were based solely on the behavior of that email address, the account would appear suspicious and the legitimate consumer's ability to transact would be compromised. That's why our models base their decisions on a multitude of signals.
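A toy sketch of that idea, that no single signal should decide an order's fate, might look like the following. The signal names, weights and threshold are invented for illustration and are not Signifyd's model.

```python
# Toy multi-signal decisioning: the risk score aggregates many weak
# indicators, so one anomaly alone cannot block a legitimate buyer.
# All names and weights are hypothetical.
SIGNAL_WEIGHTS = {
    "shipping_billing_mismatch": 0.15,
    "new_device_id": 0.10,
    "unusual_purchase_amount": 0.20,
    "geo_velocity_anomaly": 0.25,
    "sparse_transaction_history": 0.10,
}

def risk_score(signals: dict) -> float:
    """Sum the weights of the signals that fired, capped at 1.0."""
    score = sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))
    return min(score, 1.0)

order = {"new_device_id": True}
print(risk_score(order))  # -> 0.1: one anomaly alone stays below any sane threshold
order.update(geo_velocity_anomaly=True, shipping_billing_mismatch=True)
print(risk_score(order))  # -> 0.5: several weak signals compound into real suspicion
```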

How do you think data science technology will evolve over the next year?

Like all science, data science will continue to advance through experimentation, trial and error, vigorous debate and rigorous study. Most of all, it will continue to advance through collaboration. Data science is moving fast and its power is being extended to more and more facets of our everyday lives. New ideas and techniques are being shared every day. Understanding how we can build off of the collaboration in the space to continue to refine and explore new ways of building models will be an ongoing pursuit. Flexibility in experimentation will be key, with clear goals guiding the continuing exploration.

Alexis Yelton

Director of Data Science

Company background: ActZero provides an AI-powered security platform for small and medium-sized businesses, detecting anomalies that may betray the presence of bad actors within a system. The company raised $40 million in funding earlier this year, which it is using to publicly launch its product.

Describe the data sets your technology runs on and how that data is collected.

We build ML and mathematical models on computer, network and cloud software logs in order to detect and respond to cyber threats. This data is semistructured and very big: we see terabytes of data per day.

What are the most valuable insights or patterns you look for in the data?

We look for two kinds of patterns in the data. The first is patterns known to be indicative of cyberattacks, including but not limited to known suspicious commands, suspicious processes executing and command line character entropy. The second is patterns that indicate anomalous behavior. For these, we learn what is normal behavior for a user, machine or customer and highlight unusual behavior using anomaly detection algorithms.
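To make the entropy signal concrete, here is a minimal sketch of Shannon character entropy over a command line. Encoded or obfuscated payloads typically score well above plain shell commands; the threshold below is illustrative, not ActZero's production value.

```python
# Minimal sketch of the "command line character entropy" signal
# mentioned above: empirical Shannon entropy in bits per character.
# The cut-off value is hypothetical.
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

ENTROPY_THRESHOLD = 4.5  # illustrative cut-off

commands = [
    "ls -la /var/log",                                                # ordinary admin command
    "powershell -enc aHR0cHM6Ly9iYWQuZXhhbXBsZS9wYXlsb2FkLnBzMQ==",  # encoded payload
]
for cmd in commands:
    e = shannon_entropy(cmd)
    status = "flag for review" if e > ENTROPY_THRESHOLD else "ok"
    print(f"{e:.2f} bits/char -> {status}: {cmd[:50]}")
```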

We build simple features that can serve as output in their own right, then more complex ones.

What are some of the potential drawbacks of using data science to solve problems, and how can technologists avoid or mitigate them?

The biggest drawback is obvious but essential to understand: Machine learning models and even simpler mathematical models are costly to build, run and maintain. You need a robust code framework and infrastructure to process data and to run and monitor algorithms. These types of models also require substantial maintenance.

We have spent a great deal of time mitigating these issues at ActZero, and we do so in a number of ways. Firstly, we take an iterative but also additive approach to rolling out models. We build simple features that can serve as output in their own right, then more complex ones. All of these are input into a feature store from which future pipelines can draw. Then we build heuristics based on logic and statistics that use these features to produce meaningful predictions. Finally, we build an ML model if there is a business case for one. We reuse our framework mainly via our feature store and data science pipelines as much as possible to accelerate and simplify productionization.
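A schematic sketch of that additive rollout, with invented names and thresholds rather than ActZero's actual framework, might look like this: simple features feed a shared feature store, heuristics draw on them, and an ML model is added only when a business case exists.

```python
# Schematic sketch of an additive rollout: features -> feature store ->
# heuristics -> (eventually) an ML model. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class FeatureStore:
    """Shared store that heuristics and later ML pipelines both draw from."""
    features: Dict[str, Callable[[dict], float]] = field(default_factory=dict)

    def register(self, name: str, fn: Callable[[dict], float]) -> None:
        self.features[name] = fn

    def compute(self, event: dict) -> Dict[str, float]:
        return {name: fn(event) for name, fn in self.features.items()}

store = FeatureStore()
# Step 1: simple features, useful as output in their own right.
store.register("bytes_out", lambda e: float(e.get("bytes_out", 0)))
store.register("failed_logins", lambda e: float(e.get("failed_logins", 0)))

# Step 2: a heuristic based on logic and statistics over those features.
def exfiltration_heuristic(feats: Dict[str, float]) -> bool:
    return feats["bytes_out"] > 1e9 and feats["failed_logins"] > 5

event = {"bytes_out": 2_500_000_000, "failed_logins": 8}
feats = store.compute(event)
print(feats, "->", "alert" if exfiltration_heuristic(feats) else "ok")
# Step 3 (not shown): train an ML model on the same stored features
# once there is a business case for one.
```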

How do you think data science technology will evolve over the next year?

There has been a proliferation of companies that sell machine learning frameworks for feature stores, ML pipelines and auto-ML. Over the next year, I see these dropping in cost and experiencing significant market adoption as the need grows for data science solutions.

Here is the original post:

Data Science Is a Key Weapon in the Fight Against Fraud - Built In

Grads from MIT, IITs, IIMs and other Renowned Varsities Working on Building LIVEY, a World-Class Data Science and AI-powered Healthcare Solution -…

Cureck Technologies to unveil LIVEY, an innovative artificial intelligence-powered healthcare solution for skin treatment

Still, there is a gross mismatch between the huge burden of skin diseases and the skills necessary to manage it, leaving a high number of patients untreated or poorly treated. An AI-powered healthcare solution can help with timely diagnosis. Physicians now have a powerful new ally in the fight against skin diseases.

Gaurav, co-founder of LIVEY, says, "Skill development and digital ways of working, led by advances in technology, will be the two mantras that can help address the large infrastructural gaps in primary healthcare and deliver healthcare to the last mile. With this clear vision in mind, our team, with graduates from MIT, IITs, IIMs and other globally renowned universities, is working on building a world-class data science and artificial intelligence-powered solution under the name LIVEY. This AI-enabled tool is a dermatology-related mobile application that will play the role of an informative assistant, enabling doctors to get a first-level classification of skin diseases while sitting thousands of kilometers away from the patients. This can eventually save a lot of time, effort and costs through easy access to unbiased, consistent, good-quality diagnosis and treatment."

"Human skin is the largest body organ and as per the Global Burden of Disease research, skin diseases continue to be the fourth leading cause of all human diseases, affecting around 1.9 billion people at a time, almost one-third of worlds population. In India, burden due to skin diseases increased by more than 54% over the past 3 decades.

"LIVEY has emerged from years of research, tons of historical data and proprietary deep-learning algorithms, resulting in meaningful patterns across hundreds of skin disease classes and helping in the identification of various skin conditions. In the coming months, the team plans to build on this healthcare solution so that more patients and individuals can use it to answer questions about skin issues," says Rohan, co-founder of LIVEY.

"LIVEY is only intended for differential diagnosis, as many conditions require clinical review, in-person examination, or additional testing like a biopsy. For patients who may be using the search bar as their first resource, this tool can provide some useful information about dermatologic issues for a variety of skin conditions. LIVEY will give users access to authoritative information so they can make a more informed decision about their next step and keep their skin healthy and happy," says Arushi, co-founder of LIVEY.

Innovations from Cureck Technologies will extend well beyond skin treatment as the team wishes to build futuristic, intelligent solutions in health care.

Read more:

Grads from MIT, IITs, IIMs and other Renowned Varsities Working on Building LIVEY, a World-Class Data Science and AI-powered Healthcare Solution -...

Tim Kao: Infusing Data Science to Revolutionize the Functionalities of Navy and Marine Industries – Analytics Insight

The Center for Naval Analyses (CNA), a Federally Funded Research and Development Center (FFRDC), is sponsored by the Department of the Navy and headquartered in Arlington, Virginia. CNA's mission is to help Navy and Marine Corps leaders ensure America's defense in the 21st century. CNA is unique because of its real-world, empirical, data-driven approach to operations analyses and its use of data scientists as expert observers of military operations. CNA is the nation's oldest operations research organization, with a lineage tracing back to World War II (1942), when CNA scientists went to sea aboard Navy ships to observe and analyze anti-submarine warfare operations in order to help defeat German U-boats. Its research staff spans diverse educational disciplines, with nearly 70% holding Ph.D.s and 92% holding graduate-level degrees.

Tim Kao is the Vice President of Data Science at CNA. He served in the United States Marine Corps for 20 years, including two combat tours to Iraq and Afghanistan. During those two decades, Tim served in artillery, special operations, space, jungle warfare, and operations analysis commands. He received a B.S. from the Air Force Academy, an MBA from the University of Colorado, and a Master's in Operations Research from the Naval Postgraduate School. Tim started working at CNA in 2015. As VP, he has the privilege of leading over 25 data scientists who are collaborative partners with Navy and Marine Corps sponsors and committed to supporting America's naval services.

Data scientists rightfully focus on relationships between features and variables. However, leadership, even for data scientists, is fundamentally about relationships with people. As a young jumpmaster instructing parachuting at the Air Force Academy, Tim quickly learned that the safety and success of personnel learning to jump out of airplanes depended on the trust and relationship they had with him as their instructor. If that was lacking, the probability of students refusing to jump increased significantly. In the Marines, Tim worked for some great leaders and some extremely bad ones. The difference between those two types of leaders came down to the relationships they were able to establish with the people they led. Leading data scientists is no different. Tim values a data scientist's technical skills, but their ability to exhibit the extremely hard-to-master soft skills is just as important. That is how CNA tries to hire (and is hiring!), and it contributes to the team's success.

Tim notes that the ever-evolving discipline of data science requires a constant refresh on emerging algorithms, technologies, and processes. He says that when he joined CNA, he was coming from a USMC assignment as commander of the only jungle warfare training center in the Department of Defense, as well as a recent deployment to Afghanistan. While Tim was able to use some of his data science knowledge to support those military units, many of his data science skills quickly atrophied because he did not use them regularly. According to him, overcoming this challenge required a growth mindset to catch up quickly on emerging machine-learning techniques and code. This continues to be a challenge today, as new things in data science are released every day. "A growth mindset is critical for a data scientist's continued success," highlights Tim.

Tim opines that data science leaders have to be curious and caring. He adds that being curious is important because leaders won't know everything but need to know enough to search out an answer through research and collaboration with others. Being caring is also vital: Tim remarks that he really believes in the truism that no one cares how much you know until they know how much you care. "Leading data scientists really isn't much different than leading people in combat. You have to know yourself (what are your data science capabilities), know your enemy (what is the problem being solved), and take care of your troops (data scientists) so they want to do a good job for each other and you," asserts Tim.

Tim explains that CNA uses concepts like developing minimum viable products (MVPs) to iterate with its Navy and Marine Corps sponsors. The sponsors are an integral part of the process in the creation of any data science solution CNA is developing. CNA not only works with them on the user interface, but collaborates with them to understand the data, how it is collected, what is missing, and the assumptions the company needs to make. A successful data science effort requires both superb data scientists and a tight partnership with the end-users of the analytics.

Tim stresses that leaders have to ensure their data scientists have opportunities to leverage disruptive technologies in their work. Like other professions, and even more so, data scientists want to do innovative and meaningful work. It's incumbent on data science leaders to provide that opportunity. He adds that even if data science team members think they are too busy doing work to learn new things, by definition there is a major inherent risk in overlooking disruptive technologies. That risk will manifest itself in the team not being very busy much longer, as its work goes to people who have learned those new things.

"A rising tide lifts all boats, and such is the case for many data science organizations," says Tim. The leaders that separate themselves in this environment are the ones who can couple the hard skills (i.e., statistics, programming, machine learning) with the soft skills (i.e., communication, empathy, collaboration). According to Tim, at CNA the team works on both sets of skills every day so it can help provide a competitive advantage for America's defense.

"Don't forget about your people," asserts Tim. He continues that people spend more of their waking hours at work than with their families, so Tim insists emerging data science leaders do everything they can to help their people look forward to coming to work every day. Tim asks them to be the kind of leader that provides the psychological safety for people to share their ideas and problems. According to Tim, data scientists get better solutions and outcomes if they build an inclusive learning environment in their workplace, where the team will work as long and as hard as it takes to accomplish the mission. He concludes by saying, "Mission first, people always."

More:

Tim Kao: Infusing Data Science to Revolutionize the Functionalities of Navy and Marine Industries - Analytics Insight

COVID-19: Compliance to household mixing restrictions in England decreased with each lockdown – EurekAlert

Household mixing significantly decreased in the first lockdown in England and remained relatively low in the second lockdown, but increased during the third lockdown, reports a study published in Scientific Reports. The authors observed that the increase in household mixing by mid-February 2021, during the third lockdown, coincided with the wider COVID-19 vaccine rollout across England.

Professor Ed Manley and colleagues used GDPR-compliant mobile phone data from over one million anonymous users who provided consent for their data to be used for research purposes. The authors compared household mixing across the pandemic to baseline levels, calculated from average household visits in the eight weeks before the pandemic began in England. The authors observed the largest decrease in household mixing, 54.4%, during the first lockdown (starting in March 2020), which gradually increased across 2020 as restrictions were lifted. The authors also observed that household mixing was reduced by 15.28% in the second lockdown (starting in November 2020) and by 26.22% in the initial month of the third lockdown (January 2021). Household mixing varied across regions, with some urbanised areas, including London, Manchester and Cambridge, associated with increased household mixing.
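For readers unfamiliar with how such figures are derived, here is a minimal sketch of the percent-change-from-baseline calculation. The weekly visit counts are invented to illustrate the method and reproduce the reported percentages; they are not the study's data.

```python
# Minimal sketch of a baseline comparison like the one in the study:
# lockdown household mixing expressed as a percent change from the
# average of eight pre-pandemic weeks. All counts are invented.
pre_pandemic_weeks = [102, 98, 101, 99, 100, 103, 97, 100]    # weekly visits
baseline = sum(pre_pandemic_weeks) / len(pre_pandemic_weeks)  # = 100.0

lockdown_visits = {
    "first lockdown": 45.6,
    "second lockdown": 84.72,
    "third lockdown (January)": 73.78,
}
for period, visits in lockdown_visits.items():
    change = (visits - baseline) / baseline * 100
    print(f"{period}: {change:+.2f}% vs baseline")  # -54.40%, -15.28%, -26.22%
```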

The significant increase in household mixing by mid-February 2021 rose above baseline levels by between 1.4% and 23.3% during the third lockdown, despite national restrictions remaining in place. The increase in household mixing coincided with the announcement that the most vulnerable had been vaccinated, and the wider rollout of the vaccination programme across England. The authors propose this significant increase in household mixing during the third lockdown may reflect the widespread perception of safety from vaccinations. The authors also suggest that lockdown fatigue contributed to higher levels of household mixing in later lockdowns.

The authors conclude their study of mobile phone data may provide a useful privacy-preserving tool in helping assess the effectiveness of COVID-19 public health policies at local and national scales. The authors did not predict any associations between potential future restrictions or booster vaccinations and household mixing.

###

Article details

Household visitation during the COVID-19 pandemic

DOI: 10.1038/s41598-021-02092-7

Corresponding Author:

Ed Manley
University of Leeds, Leeds, UK
The Alan Turing Institute for Data Science and Artificial Intelligence, London, UK
Email: e.j.manley@leeds.ac.uk

Please link to the article in online versions of your report (the URL will go live after the embargo ends): https://www.nature.com/articles/s41598-021-02092-7

Scientific Reports

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.

Read the original here:

COVID-19: Compliance to household mixing restrictions in England decreased with each lockdown - EurekAlert

Data science approaches to confronting the COVID-19 pandemic: a narrative review – DocWire News

This article was originally published here

Philos Trans A Math Phys Eng Sci. 2022 Jan 10;380(2214):20210127. doi: 10.1098/rsta.2021.0127. Epub 2021 Nov 22.

ABSTRACT

During the COVID-19 pandemic, more than ever, data science has become a powerful weapon in combating an infectious disease epidemic, and arguably any future infectious disease epidemic. Computer scientists, data scientists, physicists and mathematicians have joined public health professionals and virologists to confront the largest pandemic in a century by capitalizing on the large-scale big data generated and harnessed for combating the COVID-19 pandemic. In this paper, we review the newly born data science approaches to confronting COVID-19, including the estimation of epidemiological parameters, digital contact tracing, diagnosis, policy-making, resource allocation, risk assessment, mental health surveillance, social media analytics, drug repurposing and drug development. We compare the new approaches with conventional epidemiological studies, discuss lessons we learned from the COVID-19 pandemic, and highlight opportunities and challenges of data science approaches to confronting future infectious disease epidemics. This article is part of the theme issue "Data science approaches to infectious disease surveillance".

PMID: 34802267 | DOI: 10.1098/rsta.2021.0127

Excerpt from:

Data science approaches to confronting the COVID-19 pandemic: a narrative review - DocWire News

Bayer and Microsoft partner to build new cloud-based digital tools – IT Brief New Zealand

Bayer has partnered with Microsoft to build new cloud-based digital tools and data science solutions for agriculture and adjacent industries.

The strategic partnership aims to bring new infrastructure and foundational capabilities to accelerate innovation, boost efficiency and support sustainability across value chains.

Bayer says agriculture and agribusinesses today benefit from a wide range of digital tools and data-powered insights through platforms such as Bayer's Climate FieldView, which is used today on more than 180 million farming acres across more than 20 countries. But the company says there is still work to do when it comes to optimising the entire food, feed, fuel and fibre value chain and its use of the precious natural resources required to power the planet, while ensuring broader efforts to combat climate change.

Under the new agreement, Bayer will work with Microsoft to co-develop new solutions that address critical industry scenarios such as farming operations, sustainable sourcing, manufacturing and supply chain improvement, and ESG monitoring and measurement.

The companies plan to achieve this by developing the go-forward infrastructure for digital farming solutions and data science capabilities. These new solutions and capabilities will be available to businesses from startups to global enterprises in agriculture and adjacent industries for use in their offerings. Bayer will also migrate its digital farming core capabilities to the new infrastructure for its customer-facing solutions.

The partnership builds upon a long-standing relationship between Bayer and Microsoft and a shared commitment to data privacy, cybersecurity and customer trust.

"This partnership comes at a unique point in time when increased innovation is sorely needed across the food and fibre value chain," says Bayer member of the board of management and president of Crop Science, Liam Condon.

"As we cope with an ongoing global pandemic, fragile supply chains and the continuing climate catastrophe, status quo will not suffice. We need collaboration, shared vision and action," he says.

"For those reasons, Bayer and Microsoft are taking action to make a positive impact, both through our collaboration and by offering off-the-shelf infrastructure and digital capabilities for other companies to address the enormous challenges facing our society."

Jeremy Williams, head of The Climate Corporation and Bayer Digital Farming Solutions, says, "As agriculture and technology entrepreneurs and organisations work to advance the security and sustainability of the value chain, supporting farmers at its foundation, collaboration is required."

"Bayer is pioneering digital innovation within agriculture. Microsoft is setting the standard in trusted, global cloud solutions. Together, we caninnovate and implement as a team to deliver the food, feed, fibre and fuel needed to power our planet," he says.

Microsoft corporate vice president for Azure Global Industry, Ravi Krishnaswamy, says that, like every industry, farming and the food sector are undergoing rapid digital transformation.

"We're excited to partner with Bayer to accelerate this transformation byhelping them increase the adoption of precision and sustainable agriculture techniques using data-driven insights and the power of Microsoft Azure," he says.

This partnership is a significant, strategic step forward in accomplishing Bayer's target of 100-percent digitally enabled sales in the Crop Science division by 2030 and accelerating its ability to deliver outcomes-based, digitally enabled solutions to customers.

Read more from the original source:

Bayer and Microsoft partner to build new cloud-based digital tools - IT Brief New Zealand

Elizabeth Holmes testifies in Theranos trial & Athenahealth acquisition 2.0 – STAT

You're reading the web edition of STAT Health Tech, our guide to how tech is transforming the life sciences. Sign up to get this newsletter delivered to your inbox every Tuesday and Thursday.

Elizabeth Holmes aims to undercut the prosecution's claim of deception

Elizabeth Holmes testified in the Theranos fraud trial Monday that the company's blood-testing technology performed well in early studies with drug makers such as Merck, AstraZeneca, and Bristol Myers Squibb. The line of questioning by the defense, fielded with slow, steady replies from Holmes, was meant to rebut testimony by prosecution witnesses suggesting the company sent investors falsified reports on validation testing by drug companies. Holmes also countered testimony from a Pfizer director, Shane Weber, who told jurors that Theranos' replies to his technical questions were at times deflective or evasive.


By calling Holmes to testify, her lawyers are betting that she will be able to cast doubt on prosecutors' narrative that she deliberately misled investors and the company's potential clients. The big question is how she will hold up under cross-examination.

Biotechs big data problem is all too human


As biological datasets have exploded in size, biotechs need data scientists to process their plentiful petabytes. But many are struggling to hire enough researchers with the computational know-how needed for the job. Executives believe "the gap might [become] fourfold unless they start acting now," Parag Patel, a partner at McKinsey and Company focusing on the life sciences industry, told STAT. Part of the problem is supply and demand: There are only so many top-tier data scientists, and pharma is competing for them against companies like Google, Apple, and Facebook that can not only pay more, but are often more data-proficient, too. "It can get frustrating just cleaning up dirty data," said Neal Cheng, a data scientist who left biotech to work at eBay. Read more about the biotech brain crunch in our colleague Angus Chen's new story.

Verily's clinical trials business leans into data

At the STAT Summit last week, Katie spoke with Verily's leaders about their plans for their clinical trials infrastructure, just one arm of the life science company's sprawling and sometimes unfocused business. Amy Abernethy, who joined the Alphabet subsidiary over the summer after championing real-world data and evidence at the FDA as principal deputy commissioner, described her next steps as president of clinical trials platforms. "Practically speaking, we are going to focus on building longitudinal datasets that can be put to use for understanding the contours of illness," said Abernethy, adding that such data could be used to test multiple different treatments simultaneously. Abernethy said the company hopes to create a vast repository combining real-world data, medical records, and traditional clinical trial data. Mario has more details.

The dream of data-sharing in 2030

By the end of this decade, public health response and preparedness will be driven by access to real-time data. In this day and age, that seems like a fairly basic desire. But it is among many distant goals that made it onto a new interoperability wish list compiled by the Office of the National Coordinator for Health IT. To help guide its policy roadmap, ONC asked providers and other stakeholders to list outcomes they hope to achieve through enhanced data interoperability by 2030. The agency received more than 700 responses. Among the other priorities on the list: Patients and doctors will be able to compare the costs of drugs, tests, and procedures online before the bill comes in.

Epic's sepsis alerts skyrocketed during the pandemic

A general rule for AI models is that they should be trained on data from the kind of patients they are likely to see in real clinical settings. But what happens when the AI encounters a fundamental change, like the onset of Covid-19? A new study of a sepsis algorithm developed by Epic Systems found that the number of alerts it generated in two dozen hospitals jumped 43% in the first three weeks after Covid-19 hit. The study by researchers at the University of Michigan did not measure the accuracy of the alerts, but pointed out that the increased volume may contribute to alert fatigue and prove detrimental at a time of constrained resources. The finding also points to the need for careful monitoring and governance of AI algorithms after they are implemented.

Meanwhile, clinical predictive tools continue to be adopted around the country, as new survey data from the College of Healthcare Information Management Executives shows. In the last year, the percentage of surveyed acute care organizations with predictive tools integrated into clinician workflows increased from 44% to 55%, using a mix of homegrown tools along with those from EHRs and third parties. Among ambulatory care providers, adoption was at 52%, and at 40% in long-term post-acute care facilities.

Athenahealth's acquisition & moves for AI drug discovery

Movers & shakers

What we're reading

See the original post:

Elizabeth Holmes testifies in Theranos trial & Athenahealth acquisition 2.0 - STAT

Who are the current generation of Chief Data Officers? – ComputerWeekly.com

This is a guest blogpost by Peter Jackson, Chief Data and Analytics Officer at Exasol.

The growing mainstream appreciation for quality data and its resulting insights has made the Chief Data Officer (CDO) role more acclaimed than ever. There is now a CDO in almost two thirds (65%) of companies, a huge jump since 2012 (12%).

The CDO understands the processes and drivers behind data-driven success, making them distinct from the CIO. They provide a central point of responsibility and accountability for business clarity, efficiency, and performance, and play an increasingly important role in shaping and influencing the strategic direction of a company.

As this role evolves, Exasol spoke to 250 active CDOs across the UK, the US and Germany to find out more about these individuals and their experiences. One promising finding is that the role is opening C-suite doors to women, who have been overlooked for senior executive roles for many decades. Of the CDOs we spoke to, 26% identified as female and 73% as male, slightly bucking the IT industry's gender talent gap. For example, across the wider technology workforce, 19% are female, while for data science roles in general, as few as 15% are women.

Commenting on this trend, talent specialists for C-suite appointments, Savannah Group, told us at Exasol: "Organisations love to hire female talent. There are some incredible female CDOs and they are able to command a premium and cherry pick and shape the role more. It's an amazing time to be a female CDO!"

Another surprise we uncovered in our research is that today's generation of CDOs are strikingly young. According to M&A Executive Search, the average age of a typical C-suite member is 56, yet the majority of the CDOs we spoke to (55%) were between the ages of 25 and 34 when they were first appointed to the role. Interestingly, 18% of the German respondents we spoke to were between 18 and 24 when they took on the role, more than double the number from the UK and the US in that age bracket.

This more diverse pool of talent is encouraging, as it's not always easy for companies to find candidates. UK government research found that almost half of businesses (46%) have struggled to hire for roles that require data skills.

The opportunity for change

Diversity isn't just about demographics, however. It's also about having a diverse skillset, which is increasingly important in tech, particularly in data science.

As you might expect, 73% of the CDOs we spoke to had come from a technical background, and 69% say that most people in their teams are from a technical background too. But businesses suffering from tunnel vision and only hiring CDOs from purely technical backgrounds are potentially missing out.

Diversity is crucial when liberating data, as well as when improving data accessibility and literacy across all levels of the organisation. If we concentrate too much on STEM in education, then we'll have people who understand the numbers and stats, but they won't necessarily have the skills to articulate the meaning of that data, turn it into actionable insights or communicate next steps around those insights to the business.

Derek Danois, CDO at GE Healthcare, for example, has seen real benefits from employing a diverse team. "I have a combination of people on my team: a physician, someone with a healthcare IT background and another with a marketing background and an MBA," he said. "This makes for great conversation between them and a cross-pollination of ideas. A marketing background is a good thing as we have to communicate across a number of business functions, and they are a strong liaison to the commercial part of the business to gather the data analysis they need. And they are someone that is credible at the table, for example, when discussing customer engagement."

The CDO role is no longer simply about governing data; it's about liberating it. Transforming data from a confusing, gated asset into an open, useful tool, accessible and actionable to everyone, pays off on the top line (revenue) and the bottom line (cost reduction and efficiency).

It's exactly for this reason that it will be exciting to see the impact that this generation of CDOs, and the next, will have on business as we know it, as organisations open their minds to new candidates.

Go here to see the original:

Who are the current generation of Chief Data Officers? - ComputerWeekly.com

Analysis in Government Awards 2021 Shortlist – GOV.UK

We'd like to say a big thank you to everyone who has submitted a nomination for the Analysis in Government Awards. We have received over 130 nominations, an indication of just how important government analysis has been this year.

There are six separate awards which will be presented to individuals and teams from across the Analysis Function and you can find the shortlisted entries under each category.

This award recognises an individual who has displayed excellence in championing, promoting or growing the Analysis Function. This could be by highlighting analysis across government, developing our people, inspiring analysts, or delivering better outcomes for the citizen.

The shortlisted entries are:

This award recognises collaboration between teams, departments, other professions and/or external organisations/researchers to deliver a piece of analysis or analytical project.

The shortlisted entries are:

This award recognises innovative methods of analysis which led to new insight, answered complex questions, or improved efficiency.

The shortlisted entries are:

This award celebrates those who have successfully used clear communications to present analysis, considering needs of the audience. This could be an example of public-facing or internal communication within teams, departments or across professions.

The shortlisted entries are:

This award recognises analysis which has been impactful through use, influenced decision-making and/or has contributed to public debate.

The shortlisted entries are:

This award recognises an outstanding contribution in making the Analysis Function a more inclusive function, reflective of the citizens we serve.

The shortlisted entries are:

These shortlisted nominations will now be passed to an expert judging panel to decide on two highly commended nominations, one runner-up and one winner in each of the six categories. The winners will be announced at an exclusive awards ceremony in December.

Get your tickets for the Analysis in Government Awards ceremony 2021 today. The event is open to all to attend, so come along to celebrate with your colleagues.

If you are part of the Analysis Function and you would like to join the judging panel, email the team at Analysis.Function@ons.gov.uk.

Hear from previous winners

Collaboration Award 2020: COVID-19 infection survey

Collaboration Award 2020: SPLINK

And some colleagues on what makes them proud to be a government analyst

Proud to be a government analyst

See the original post here:

Analysis in Government Awards 2021 Shortlist - GOV.UK