
One CEO's Foundations for Success: Core Values, Flexibility and Conviction – Worth

Some entrepreneurs can point to a single moment that defined their trajectory, but that is not the case for me. In fact, my career path has been defined by understanding when a new path was necessary and seizing the opportunity to take it.

I started my career as a carpenter, then went to university to become a quantum physicist. Not a typical occupational decision, to be sure. But I found inspiration again during my schooling (or perhaps it found me). After a personal incident, which I'll explain later, a light bulb moment occurred: What if we reimagine the medical records retrieval process? Let's simplify access for patients and make it easier for them to share their information with whoever needs it. That idea led me to where I am today.

Over time, three lessons have stuck with me and have guided me through some of the most challenging career decisions I've had to make: Define and stay true to your mission, embrace the concept of conviction and be willing to bet big. While the journey to where I am today hasn't been the easiest, I've found that dedicating myself to these three principles has served me well.

So many times, you know what you want to do, but do you know why you want to do it? Every entrepreneur should spend time answering this question honestly. There are many motivating factors for starting a business, but there must be a reason for doing it that resonates deeply within you.

That is your mission, and it must be grounded in your core values and beliefs. Your mission will serve as your company's North Star, guiding your next moves. But just identifying your mission isn't enough; you have to execute it consistently, even if that means passing on an opportunity that isn't in line with the mission statement.

Because you'll also be surrounding yourself with those who believe in the mission, staying true to it is key to maintaining the culture you're trying to nurture. Your staff must also feel empowered to share ideas that will reinforce the mission and speak up when they think specific plans don't live up to it.

If you don't believe in what you're doing, no one else will either. So if you're all-in on your mission, you must exude that conviction. For example, a VC firm or potential customer is not only investing in your idea, but also in your ability to execute on that idea. That same sense of confidence goes for keeping a staff who can move the company forward. Your people won't embrace the mission if they don't see that passion from leadership.

To be fair, this suggestion might seem to contradict previous advice about being willing to pivot. While it is undoubtedly a fine line, I believe the two suggestions are equally crucial. Someone wise once told me you should have strong opinions, weakly held. Yes, you should believe 100 percent in your idea, but you should challenge yourself to see if evidence suggests that you could be on the wrong path, and move decisively.

While conviction is critical, there are times when evidence indicates that another idea simply makes more sense. I learned this lesson more than once.

Nine years ago, my father-in-law was diagnosed with terminal cancer and fought a long battle across multiple hospital systems, providers and caregivers. Amid unimaginable stress, my mother-in-law was also responsible for retrieving many of his medical records from different sources to help direct the next steps of his care. It was during this time that inspiration hit.

I decided to start a company to simplify the medical records retrieval process for patients and their families. When I was pursuing quantum physics, I could never have imagined what might catapult me from my lab. When I faced the challenge to help my in-laws, the force was great and served as new inspiration to step out of my comfort zone.

Another difficult decision came a few years later, after cofounding the company. Once we got the business up and running, we came to a cold realization: The market wasn't ready for this kind of company when patients are the customers. It was a hard pill to swallow since we were dedicated to staying true to our original mission of helping patients gain greater control of their medical records.

At this point, it was clear that we had to be open to the possibility that our original idea, our first big bet, would not lead us to success. We shifted to a more viable business model that still helped patients gain greater control of their data while targeting specific companies that use that data to make patient lives better. Instead of rolling a snowball uphill, we realized that we could relent to the market forces while staying true to our mission. Fortunately, we bet on another model, which has put us on a very successful track and which will take us through future stages in our plan.

While my resume is unconventional, it has created a textured tapestry of experiences. More than anything, my career has reaffirmed my belief that a commitment to core values, flexibility and conviction are the foundation for long-term success.

James Bateman is cofounder and CEO of Medchart. He and his team are on a mission to simplify access to patient-authorized information for businesses beyond care.


Excerpt from:

One CEO's Foundations for Success: Core Values, Flexibility and Conviction - Worth

Read More..

Condensed Matter Seminar – Professor Joe Gomes | Physics and Astronomy | The University of Iowa – Iowa Now

Professor Joe Gomes, University of Iowa, Department of Chemical and Biochemical Engineering. Bio: Joe Gomes is an assistant professor in the Chemical and Biochemical Engineering Department at the University of Iowa. He received a B.S. in Chemical Engineering from the University of Illinois-Chicago and a Ph.D. in Chemical Engineering from the University of California-Berkeley. He was a postdoctoral researcher in the Chemistry and Bioengineering Departments at Stanford University. Dr. Gomes conducts research in the areas of theoretical chemistry, machine learning, energy materials, and catalysis.

Abstract: Many important challenges in science and technology can be cast as optimization problems. The development of improved heuristic algorithms for determining approximate solutions to these optimization problems has high potential for impact across many disciplines. We propose an efficient heuristic algorithm for solving hard optimization problems which we refer to as Classical Quantum Optimization (CQO). Our approach consists of: (1) converting the optimization problem of interest into a (classical or quantum) spin glass Hamiltonian where the ground state configuration of this system encodes the optimal solution to the problem at hand, and (2) the variational optimization of a neural network representation of the ground state of the many-body system given by the problem-specific Hamiltonian. We demonstrate the utility of CQO on optimization problems found in graph theory. We compare CQO against other widely used heuristic solver algorithms and exact results when possible. The results show that CQO achieves state-of-the-art approximation ratio solutions for the MaxCut problem. We highlight potential applications of CQO towards solving the quantum chemical electronic structure problem.
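The encoding step named in the abstract is well established: a MaxCut instance maps onto an Ising spin glass whose minimum-energy configuration is the optimal cut. The sketch below is our own illustration, not the CQO code; brute-force search stands in for the variational neural-network optimization the abstract describes, and the toy graph is arbitrary.

```python
import itertools

# A minimal sketch: MaxCut encoded as an Ising Hamiltonian
# H = sum_{(i,j)} w_ij * s_i * s_j with spins s_i in {-1, +1}.
# The minimum-energy spin configuration maximizes the cut, since
# every cut edge contributes -w_ij to the energy instead of +w_ij.

def ising_energy(spins, edges):
    return sum(w * spins[i] * spins[j] for i, j, w in edges)

def cut_weight(spins, edges):
    return sum(w for i, j, w in edges if spins[i] != spins[j])

edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0)]  # 4-node cycle

# Exhaustive search over all 2^4 spin assignments stands in for the
# variational neural-network ansatz used in CQO.
best = min(itertools.product([-1, 1], repeat=4),
           key=lambda s: ising_energy(s, edges))
print("ground state:", best, "cut weight:", cut_weight(best, edges))
```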

Read more from the original source:

Condensed Matter Seminar - Professor Joe Gomes | Physics and Astronomy | The University of Iowa - Iowa Now

Read More..

This chemist is reimagining the discovery of materials using AI and automation – MIT Technology Review

In time, as companies develop more powerful quantum computers, the VQE could enable chemists to run strikingly accurate simulations. These models might be so precise that scientists won't need to synthesize and test the materials at all. "If we ever reach this point," Aspuru-Guzik says, "my work in materials science will be done."

When Donald Trump was elected president of the United States in 2016, Aspuru-Guzik's career was flourishing, but suddenly the prospect of remaining in the country no longer appealed to him. One week after the election, he began emailing colleagues in Australia and Canada, looking for a new job.

The University of Toronto offered him a prestigious government-funded position meant to lure top-tier researchers to the country and a cross-appointment at the Vector Institute for Artificial Intelligence, a nonprofit corporation cofounded by machine-learning pioneer Geoffrey Hinton that is quickly making Toronto a global hub for AI. The biggest inducement, however, was a promise to build a radical new materials lab called the Matter Lab, a project Aspuru-Guzik had dreamed of for years.

"In the Matter Lab, we only attack a problem after asking three questions," says Aspuru-Guzik. "Does it matter for the world? If not, then fuck it. Has somebody else already done it? If the answer is yes, there's no point. And is it remotely possible?" Here, the word "remotely" is key. Aspuru-Guzik wants to tackle challenges that are within the range of feasibility, but barely so. "If a material is too easy," he says, "let other people find it."

Located in a postwar brick building in downtown Toronto, the lab is unlike any other at the university. The ceiling is adorned with maroon and burgundy acoustic panels, an homage to the beloved Mexican architect Luis Barragán. Tucked away in an inconspicuous corner is a typical lab bench, a table with flasks, scales, and beakers beneath a fume hood, where graduate students can practice chemistry in much the same way their grandparents' generation did. One gets the sense that this workstation isn't often used.

In the center is a $1.5 million robot: a nitrogen-filled glass-and-metal enclosure housing a mechanical arm that moves back and forth along a track. The arm can select powders and liquids from an array of canisters near the sides of the enclosure and deposit the contents, with exacting accuracy, in one of a number of reactors. "The robot is like a tireless lab assistant who mixes chemicals 24/7," says Aspuru-Guzik. It can make 40 compounds in a mere 12 hours.

Continue reading here:

This chemist is reimagining the discovery of materials using AI and automation - MIT Technology Review

Read More..

XtalPi and Signet Expand AI Drug Discovery Collaboration to Novel Cancer Target – PRNewswire

A significant challenge in developing new therapeutics is expanding the search beyond known structures and accurately screening through a copious supply of novel molecules to identify top candidates with a desirable drug property profile indicative of development potential. Pharmacodynamics is a key factor in lead optimization and drug design. However, traditional cell-based in vitro studies have considerable limitations in modeling drug effects in the human body and often produce unreliable efficacy data that can lead to clinical failure.

XtalPi has developed an AI drug discovery workflow that integrates its algorithm-driven platform with expert domain knowledge and targeted small-batch experiments. This three-pronged approach can generate novel scaffolds beyond the conventional boundaries of known chemical space and predict molecular behaviors as well as important physicochemical and pharmaceutical properties with enhanced accuracy. The generative and prediction models continue to improve their outcome through iterations in a closed-loop feedback process, with insights from XtalPi's team of medicinal chemists and high-quality data from its high-throughput wet lab, until promising candidates are validated in experiments. This workflow has been shown to substantially cut down the research time, costs, and experiments needed between target identification and IND-enabling experiments.

Signet Therapeutics was founded by scientists from Dana-Farber/Harvard Cancer Center, with extensive experience and unique expertise in oncology research. Using real-world cancer genomics data, the Signet team developed novel organoid disease models specific to cancer subtypes that simulate the unique 3D environment of organ tissues, yielding data with much higher clinical relevance. The two companies' first collaboration successfully combined XtalPi's AI-powered one-stop drug discovery capabilities with Signet's insight and functional biology platform and identified novel molecules with superior in vitro performance that are now quickly advancing toward clinical trials.

Building upon existing success, the two companies will continue to apply the tried-and-true collaboration model of "AI drug discovery + novel disease models". XtalPi's AI platform will generate an extra-large chemical space containing millions of molecules with high binding affinity to the cancer target discovered by Signet. After assessing these molecules by their predicted key drug properties such as selectivity, drug-likeness, novelty, and synthesizability, a small batch of top-ranking molecules is synthesized in XtalPi's lab and passed on to Signet's platform for biological and functional evaluations. XtalPi will then use the data from organoid-based and biochemistry tests to further finetune its AI models and recommend increasingly potent drug candidates. Through such Design-Make-Test-Analyze cycles, XtalPi's AI platform and team of medicinal chemists work together to zero in on molecules of strong bioactivity and a balanced drug property profile with minimal synthetization experiments.
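The Design-Make-Test-Analyze loop described above can be pictured as a simple closed loop. The sketch below is our own schematic, not XtalPi's platform; every function is a hypothetical stub standing in for the generative model, property predictors, small-batch wet-lab assays and model retraining.

```python
import random

# Schematic DMTA loop: design candidates, rank by predicted properties,
# synthesize and assay only a small batch, then retrain and repeat.

def design(n):                      # stub for a generative chemistry model
    return [f"molecule_{random.randrange(10**6)}" for _ in range(n)]

def predict_score(mol):             # stub for affinity / ADMET predictors
    return random.random()

def make_and_test(batch):           # stub for synthesis + organoid assays
    return {mol: random.random() for mol in batch}

def retrain(state, assay_data):     # stub for finetuning the models
    state["rounds"] += 1
    return state

state, best = {"rounds": 0}, None
for cycle in range(3):                         # a few DMTA iterations
    candidates = design(1000)                  # large in-silico chemical space
    ranked = sorted(candidates, key=predict_score, reverse=True)
    assay_data = make_and_test(ranked[:10])    # only a small batch reaches the lab
    state = retrain(state, assay_data)
    top = max(assay_data, key=assay_data.get)
    if best is None or assay_data[top] > best[1]:
        best = (top, assay_data[top])

print("best candidate after", state["rounds"], "cycles:", best)
```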

Dr. Shuhao Wen, XtalPi's co-founder and chairman, says, "We are excited to expand our collaboration with Signet, which allows us to develop XtalPi's AI platform into new application areas and accelerate the growth and progression of Signet's first-in-class pipeline to provide much-needed treatment options for cancer patients worldwide. XtalPi aspires to be a strong partner for innovative biotech companies like Signet and empower the quick translation of new biological discoveries into promising new clinical candidates."

"XtalPi's AI drug discovery platform and Signet's novel disease models platform are highly complementary," says Dr. Haisheng Zhang, founder and CEO of Signet. "The value of XtalPi's AI is not only reflected in its incredible efficiency, but more importantly, in the discovery of de-novo molecules with strong clinical potential, helping us reach milestones in record speed. We look forward to working closely with XtalPi as an important partner in developing our first-in-class pipeline and bringing forth more targeted drugs to underserved disease markets."

About XtalPi Inc.

We are a quantum physics-based, AI-powered drug R&D company with the mission to revolutionize drug discovery and development by improving the speed, scale, novelty, and success rate. With operations in both China and the U.S., we strive to deploy the best capabilities and resources available to us in each market to meet the needs of our customers and collaborators.

We operate an integrated technology platform that combines the mutually informing and reinforcing cloud supercomputing-powered in silico tools and our wet lab, and enables discovery and development of innovative therapeutics at a pace and scale beyond traditional alternatives. We are among the pioneering AI-powered drug R&D companies in the world that have established a platform with an iterative feedback loop between quantum physics-based dry lab and wet lab capabilities.

About Signet Therapeutics

Signet Therapeutics is developing new medicines to improve the lives of patients diagnosed with cancer, especially those insensitive to chemotherapy and radiotherapy.

By strategically collaborating with XtalPi, we bring together the expertise of an AI-powered drug discovery platform and our unique novel disease models to discover and optimize promising new candidates for novel targeted cancer drugs. By taking these advantages, we hope to revolutionize traditional drug discovery for small molecules.

Contact: Ruyu Wang (617) 717-9867 [emailprotected]

SOURCE XtalPi Inc.

xtalpi.com

Original post:

XtalPi and Signet Expand AI Drug Discovery Collaboration to Novel Cancer Target - PRNewswire

Read More..

'I knew that was going to happen': The truth about premonitions – The Guardian

Around seven years ago, Garrett was in a local Pizza Hut with his friends, having a day so ordinary that it is cumbersome to describe. He was 16 or thereabouts and had been told by teachers to go around nearby businesses and ask for gift vouchers that the school could use as prizes in a raffle. There were five other teenagers with Garrett, and they'd just finished speaking to the restaurant manager when suddenly, out of nowhere, Garrett's body was flooded with shock. He felt cold and clammy and had an overwhelming sense that something had happened. He desperately tried to stop himself crying in front of his peers.

"It was like I'd just been told something terrible," the now 23-year-old from the southwest of England says (his name has been changed at his request). "I couldn't tell you exactly what it was, but I just knew something had happened." Garrett returned home and tried to distract himself from a feeling he describes as grief. The phone rang. His mum answered it. A few hours earlier, around the time Garrett was in the restaurant, his grandfather had died from a sudden heart attack while on a cruise.

Although there's no way of knowing how many people worldwide feel that they sensed a loved one's death before being told, it's a phenomenon that's been explored in everything from Star Wars to Downton Abbey to Kung Fu Panda 2. Perhaps one of your own relatives has a story similar to Garrett's; perhaps you dismissed it, perhaps you treat it as family lore. Is there any evidence to suggest this phenomenon is real, that humans can sense one another's passing from a distance, that Garrett's emotional afternoon was anything more than a coincidence? In a word, no. Meanwhile, it is well documented that the human mind is a bundle of bias: false memories, grief hallucinations and confirmation bias can easily explain these experiences. Besides which, for every person who feels a shiver when their loved one dies, there are hundreds more who were quietly eating pizza or happily riding a rollercoaster or bored doing maths homework, completely unaware of their loss.

But are these dismissals too quick? Too easy? Some scientists claim that the complex world of quantum physics could be used to explain the paranormal (other scientists say they're unbelievably wrong). What can stories like Garrett's tell us about what we do and don't know? What we are and aren't willing to believe? About the disconnect between what some claim to experience and others claim is impossible?

Brian Josephson is your prototypical professor. With tufts of white hair atop his head, a knitted top and a glasses chain keeping his specs safe, he says via Zoom: "The academic community is a kind of club. You're supposed to believe certain things and you run into problems if you disagree." In 1973, he was awarded the Nobel Prize in physics for his work on superconductivity. Later, during his time as a professor at the University of Cambridge, he began using quantum mechanics to explore consciousness and the paranormal.

Quantum entanglement, nicknamed "spooky action at a distance" by Albert Einstein, describes the (proven) phenomenon of two spatially separated particles influencing each other, even over large distances. While the phenomenon is subatomic, academics such as Josephson have theorised that quantum entanglement could explain phenomena like telepathy and psychokinesis.

"There are many accounts of crisis telepathy," says Dean Radin, a parapsychologist and author of Entangled Minds: Extrasensory Experiences in a Quantum Reality. "Does entanglement explain these effects? No, in the sense that entanglement as observed today in the physics lab, between pairs of photons, is extremely fragile and typically lasts only minuscule fractions of a second. But also, yes, in that we are at the earliest stages of understanding entanglement."

Radin says studies in quantum biology show that entanglement-type effects are present in living systems (academics from Oxford have successfully entangled bacteria) and he believes the human brain could in turn have quantum properties. "If that is subsequently demonstrated, and I think it's just a matter of time, then that would go a long way towards providing a physical mechanism for telepathy," he says.

Put down your pen, scrunch up your letter to the editor. You only need an explanation for telepathy if you believe in telepathy in the first place, and experiments purporting to demonstrate its existence have been widely debunked. Josephson and Radin are regularly criticised by peers. In 2001, when Royal Mail released a set of stamps to celebrate the 100th anniversary of the Nobel Prize, there was outrage when Josephson wrote in an accompanying booklet that quantum physics may lead to an explanation for telepathy. In this very newspaper, academics branded the claim "utter rubbish" and "complete nonsense".

When reviewing Entangled Minds for The Skeptic's Dictionary, philosophy professor and professional sceptic Robert Carroll wrote that Radin's book was "aimed at non-scientists who are likely to be impressed by references to quantum physics".

Garrett has no idea what happened to him on the day his grandad died, but he is certain that it happened. He believes in some kind of interconnectedness between people. "I think if it's happened to you, then there's an underlying accepting of it," he says.

This is a sentiment shared by the self-described "naturally sceptical" Cassius Griesbach, a 24-year-old from Wisconsin who lost his grandfather in 2012. Griesbach says that he shot awake on the night his grandad passed and began to sob uncontrollably. "It felt like something just rocked me, physically," he says. When his dad called moments later to say his grandad had died, a teenaged Griesbach replied: "I know."

Griesbach doesn't blame anyone for being sceptical of his story. "The further you get away from it, the more I would like to write it off as a coincidence," he says, "But every time I sit down and think about it, it feels like it's something else." Griesbach is not super religious and doesn't believe in ghosts. "If it is something to do with actual science, I would think that would be science that we are nowhere near yet, you know?"

Many would disagree, arguing that the answer lies in the social sciences. In 2014, Michael Shermer married Jennifer, who had moved from Köln to California and brought with her a 1978 radio belonging to her late grandfather. Shermer tried in vain to fix it before tossing it in a drawer, where it lay silent until the couple said their wedding vows at home months later. Just as Jennifer was keenly feeling the absence of her grandfather, the radio began to play a romantic song. It continued all night before it stopped working for good the next day.

"It's just one of those anomalous experiences," says Shermer, a science historian, professional sceptic and author of The Believing Brain: From Spiritual Faiths to Political Convictions. How We Construct Beliefs and Reinforce Them as Truths. "Randomness and chance play a big role in life and in the world, and our brains are designed to see patterns, not randomness." Shermer argues that experiences like Garrett's and Griesbach's are statistically more likely than we think.

"You have billions of people worldwide having dozens of dreams [each] at night," he says. "The odds are pretty good that on any given night, somebody's going to have a dream about somebody dying who actually dies. That's inevitable." At the same time, he argues, we ignore all the times we suddenly sob or shudder and it turns out that no one's died, or the times when someone does die and we don't feel anything at all.
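As a back-of-envelope illustration of Shermer's point (every number below is our own assumption, not his), even tiny per-person probabilities multiply out to a steady nightly stream of coincidences:

```python
# Rough expected count of "I sensed it the night they died" coincidences
# per night, worldwide. All inputs are illustrative assumptions only.
population = 8e9                  # people who might have such a dream or feeling
p_dream_of_death = 1e-4           # chance a given night features a loved one dying
close_relatives = 10              # loved ones whose death would "count"
p_relative_dies_tonight = 1 / (75 * 365)   # ~75-year life expectancy, per night

expected = population * p_dream_of_death * close_relatives * p_relative_dies_tonight
print(f"expected coincidences per night: ~{expected:,.0f}")
# Roughly 300 per night under these assumptions, enough that striking
# stories accumulate even if nothing but chance is at work.
```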

There are other prosaic explanations. While Garrett's grandfather's death was sudden and unexpected, Griesbach's grandfather had been hospitalised the week before he died. When Griesbach shot awake in the middle of the night, his first thought was, "It happened": he knew his grandfather had passed. But is that surprising when he'd spent a week by his bedside?

John Bedard, a 36-year-old in Los Angeles, woke suddenly on the night his parents died. He was 10 and sleeping at a friend's house when he awoke, just knowing something was wrong. He called his brother, sobbing. When his brother picked him up, he told Bedard their parents had died in a motorcycle accident.

And yet, there were clues that something was wrong much earlier. The sleepover wasn't planned: Bedard had gone to a friend's to play when it started getting later and later and nobody came to pick him up. It was a Sunday night, an unusual night to have a sleepover. Bedard was uneasy when he went to bed.

Despite these answers, explanations continue to be toyed with. Rupert Sheldrake is a biologist and parapsychologist who conceived of "morphic resonance", the idea that interconnections exist between organisms. He believes the human mind has fields that stretch beyond the brain, much like electromagnetic fields. This, he says, explains why we can seemingly tell when someone behind us is staring at us, or why we sometimes think of someone right before they call. (Sheldrake's work has been called "heresy" in the journal Nature.)

"I'm not talking about the supernatural; I think these things are totally natural. I think they're normal, not paranormal," he says. When it comes to experiences like Garrett's, he says empirical studies are impossible. "You can't ask somebody to die at a randomly selected time to see if their nearest and dearest respond ... So unfortunately, the evidence for cases to do with death has to be circumstantial."

Shermer is not a Sheldrake fan. "The idea that a biologist like Rupert Sheldrake is going to uncover some new force of nature that somehow Einstein and everybody else has missed is just so unlikely to have happened, that almost any explanation like the ones I've been giving you are way more likely." Josephson's rebuke of such criticisms: "People say that [science is] always subject to revision and yet they're secretly convinced that certain things can't happen."

What can and can't happen doesn't change what many feel has happened: Garrett, Griesbach and Bedard all believe something strange and unexplainable occurred when they lost their loved ones. At the very least, these stories undeniably offer comfort.

"As far as looking into it, I don't even know what there is to look into," Griesbach says; after all, the phenomenon doesn't even have a name. "I think the best thing that we could do for people is validate how they feel and let them grieve. Because whenever people have that happen, they're also grieving. That is one of the most important times to just be a kind human to somebody."

This article was amended on 24 October 2021. In an earlier version Professor Josephson, speaking about the academic community, was quoted as saying: "You're supposed to believe certain things and you run into problems you disagree with." His actual quote was: "You're supposed to believe certain things and you run into problems if you disagree."

More here:

'I knew that was going to happen': The truth about premonitions - The Guardian

Read More..

Learn more about the mysteries of the universe on Dark Matter Day – University News: The University of Western Australia

Think the universe is just made up of stars, planets, asteroids, comets and space dust? Think again.

Scientists now believe dark matter, so far only detected through its gravity-based effects in space, makes up about a quarter of the total energy of the universe, and about 80 per cent of all mass.

However, it is composed of particles that do not absorb, reflect or emit light, so it can't be detected by observing electromagnetic radiation; dark matter is material that cannot be seen directly.

Dr Ben McAllister is a physicist at the UWA Node of the ARC Centre of Excellence for Dark Matter Particle Physics (CDMPP) and The University of Western Australia's Quantum Technology and Dark Matter Research Lab.

He spends a large part of his working life trying to unlock the mysteries of dark matter and is at the forefront of a free public event for Dark Matter Day this Saturday, 30 October.

"It's the biggest mystery in the universe; many scientists around the world are trying to unlock the big cosmic mystery of what dark matter actually is using lots of different methods," Dr McAllister said.

The Forrest Prospect Fellow is one of the West Australian researchers at the forefront of the search, as part of the Australia-wide ORGAN Experiment to detect a particle called the axion.

"To do this we use what's called a microwave cavity haloscope, which is basically an empty copper can placed in a very strong, very cold magnetic field," Dr McAllister said.

"If axions are dark matter and exist all around us, one might enter the resonant cavity, react with the magnetic field and transform into a particle of light, a photon."
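As an aside not taken from the article, the frequency a haloscope cavity must be tuned to follows directly from the assumed axion mass, since the converted photon carries the axion's rest energy (E = hf). The example masses below are arbitrary.

```python
# Convert a hypothetical axion mass to the photon frequency a haloscope
# cavity must resonate at: E = m_a * c^2 = h * f.
PLANCK_EV_S = 4.135667696e-15       # Planck constant in eV*s

def axion_frequency_ghz(mass_microev):
    energy_ev = mass_microev * 1e-6
    return energy_ev / PLANCK_EV_S / 1e9    # Hz converted to GHz

for m in (5, 25, 100):                       # example masses in micro-eV
    print(f"{m:>4} ueV axion -> {axion_frequency_ghz(m):6.2f} GHz cavity")
```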

Image: Members of the Quantum Technology and Dark Matter Research lab use UWA's haloscope in a bid to unlock the mysteries of dark matter.

If you're interested in hearing more about Dr McAllister's work and the search for dark matter, tune in to Dark Matter Day, a live, virtual and all-ages event this Saturday 30 October from 2pm to 4.30pm.

"There will be lots to see and explore including an introduction to dark matter, talks about dark matter, live Q&As and kids' activities and competitions, as well as a chance for a virtual tour of the UWA labs," Dr McAllister said.

To register and receive a link to the ARC Centre of Excellence in Dark Matter Particle Physics Gather town (the virtual meeting place), click here.

See the article here:

Learn more about the mysteries of the universe on Dark Matter Day - University News: The University of Western Australia

Read More..

Data Preprocessing in Data Mining – GeeksforGeeks

Preprocessing in Data Mining: Data preprocessing is a data mining technique used to transform raw data into a useful and efficient format.


Steps Involved in Data Preprocessing:

1. Data Cleaning: Raw data often contains irrelevant and missing parts. Data cleaning handles these issues, chiefly missing data and noisy data.

2. Data Transformation: This step transforms the data into forms appropriate for the mining process, using techniques such as normalization, attribute selection, discretization and concept hierarchy generation.

3. Data Reduction: Data mining routinely handles huge volumes of data, and analysis becomes harder as that volume grows. Data reduction addresses this by increasing storage efficiency and cutting data storage and analysis costs.

The main approaches to data reduction include data cube aggregation, attribute subset selection, numerosity reduction and dimensionality reduction.
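As a rough, generic illustration (not code from the original article), the sketch below strings the three steps together on a toy table: cleaning missing and noisy values, normalizing attributes, then reducing dimensionality. The column names and values are made up for the example.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from sklearn.decomposition import PCA

# Toy dataset with a missing value and an obvious outlier ("noisy" data).
df = pd.DataFrame({
    "age":    [25, 32, np.nan, 41, 29, 300],          # 300 is a noisy entry
    "income": [40_000, 52_000, 48_000, 61_000, np.nan, 58_000],
    "visits": [3, 5, 2, 8, 4, 6],
})

# 1. Data cleaning: clip impossible values, then fill missing entries with the median.
df["age"] = df["age"].clip(upper=120)
df = df.fillna(df.median(numeric_only=True))

# 2. Data transformation: normalize every attribute to the [0, 1] range.
scaled = MinMaxScaler().fit_transform(df)

# 3. Data reduction: project the three attributes down to two principal components.
reduced = PCA(n_components=2).fit_transform(scaled)
print(reduced.shape)  # (6, 2)
```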

Read the original here:

Data Preprocessing in Data Mining - GeeksforGeeks

Read More..

Miners Are The Optimal Buyers: The Data Behind Bitcoin-Led Decarbonization In Texas – Bitcoin Magazine

Recently, Ars Technica published an article from staff writer and environmental science PhD Tim de Chant, aiming to rebut Texas Senator Ted Cruz's comments from the Texas Blockchain Summit earlier this month.

De Chant took issue with the following statement from Cruz:

Because of the ability of bitcoin mining to turn on or off instantaneously, if you have a moment where you have a power shortage or a power crisis, whether it's a freeze or some other natural disaster where power generation capacity goes down, that creates the capacity to instantaneously shift that energy to put it back on the grid.

De Chant offered a number of responses, but generally seems to misunderstand the substance of Senator Cruz's point. Additionally, he made a significant mathematical error (later retracted) that called into question his literacy on Bitcoin mining.

But first, it's worth quoting Cruz in full, as the intent of his claims is lost without the full context. We have included a transcript excerpt of Cruz's comments on mining from his conversation at the summit with Jimmy Song below, in which he referenced a recent winter storm that left many in Texas without access to power for days:

There were lots of things that went wrong [during the winter storm] that I think are worthy of study, but I do think that Bitcoin has the potential to address a lot of aspects of that. Number one, from the perspective of Bitcoin, Texas has abundant energy. You look at wind, we're the number one wind producer in the country, by far. Number two, I think there are massive opportunities when it comes [indistinct audio]. If you look at natural gas right now, in West Texas the amount of natural gas that is being flared: 50% of the natural gas in this country that is flared is being flared in the Permian right now in West Texas. I think that is an enormous opportunity for Bitcoin, because that's right now energy that is just being wasted. It's being wasted because there is no transmission equipment to get that natural gas where it could be used the way natural gas would ordinarily be employed; it's just being burned.

And so some of the really exciting endeavors that people are looking at is: can we capture that gas instead of burning it? Use it to put in a generator right there on site. Use that power to mine Bitcoin. Part of the beauty of that is, the instant you're doing it, you're helping the environment enormously because rather than flaring that natural gas you're putting it to productive use. But secondly, because of the ability of Bitcoin mining to turn on or off instantaneously, if you have a moment where you have a power shortage or a power crisis, whether it's a freeze or some other natural disaster where power generation capacity goes down, that creates the capacity to instantaneously shift that energy to put it back on the grid. If you're connected to the grid, they become excess reserves that can strengthen the grid's resilience by providing a significant capacity of additional power to be available for critical services if and when it is needed. So I think that has enormous potential and it's something that in five years I expect to see a dramatically different terrain, with Bitcoin mining playing a significant role as strengthening and hardening the resilience of the grid.

It's a weird point. A lot of the discussion around Bitcoin views Bitcoin as a consumer of energy. A lot of the criticism directed at it is the consumption of energy. The perspective I'm suggesting is very much the reverse, which is as a way to strengthen our energy infrastructure. And it also has one of the exciting things about crypto also, is the ability to unlock stranded renewables. So there are a lot of places on earth where the sun shines a lot and the wind blows a lot but there aren't any power lines. And so it's not economically feasible to use that energy. And the beauty of Bitcoin mining is that if you can connect to the internet, you can use that energy and derive value from those renewables in a way that would be impossible otherwise. And I think we're going to see in the next five years massive innovations in that regard as well.

De Chant made a number of points in reaction to Cruz's statements. We will tackle them in turn.

De Chant starts with the admission that it stands to reason that bitcoin mining could create enough demand that investors would be enticed to build new power plants. Those plants could theoretically be tasked with providing power to the grid in cases of emergency.

But this isn't really the point that Cruz and the Bitcoin community are making. Instead, we are pointing out that power providers will have improved economics from the existence of bitcoin mining as an additional source of offtake. These improved economics could induce extra construction. But we haven't come across the suggestion that mining would finance the construction of bitcoin-only plants that would be directed to the grid in emergency situations.

The other claim is that bitcoin miners represent a unique type of interruptible load, whose ability to dial back energy consumption can help safeguard the grid from instability.

De Chant continued by pointing out that the February blackout in Texas was caused by significant winter storms in conjunction with a poorly weatherized grid, although Cruz completely acknowledged this in his remarks. This doesn't score a point against Cruz: he's fully aware of why the grid failed: significant winter storms in conjunction with a poorly weatherized grid, alongside other contributing factors such as natural gas delivery. What happened was that, alongside power plant failures, the natural gas infrastructure was unable to deliver natural gas to power plants. Additionally, the Electric Reliability Council of Texas (ERCOT) underforecasted its high case peak load scenario by around 10 gigawatts (GWs), which was a huge miss.

Cruz was not claiming that Bitcoin would prevent a black swan weather-driven grid meltdown. Ultimately, only better planning can do this.

De Chant continued by pointing out that Bitcoin miners wouldn't spend extra cash to winterize their operations. But this is a confusing point: De Chant appears to be conflating miners and energy producers. In practice, the two are distinct. General grid failures have nothing to do with Bitcoin, and no one is suggesting that Bitcoin will cause power plants to fully avoid two-sigma tail events.

The main line of argument from De Chant is simply his claim that the economics of mining don't support curtailment, even when prices are high. In his words: "Bitcoin miners would be unlikely to offer their generating capacity to the grid unless they were sufficiently compensated." In the first version of his article, he originally claimed that miners would need to be paid $31,700 per megawatt hour (MWh) during the February 2021 winter storm to turn off their machines, an estimate which he revised to $600 per MWh later on. But both estimates are erroneous.

Even for the highest-end equipment (Antminer S19s), the turn-off point in February 2021 for miners would have been $480 per MWh. Older equipment has a lower turn-off threshold as it is more sensitive to electricity prices. When electricity prices reach a certain threshold, miners are no longer breaking even and turn off their machines whether or not they are enrolled in a formal grid program to compensate them for downtime.

Miners are acutely aware of their economics and can adjust to grid conditions in real time. De Chant was off by a factor of 66 in his initial estimate. In his revised estimate, he maintained erroneously that miners would turn off their rigs at $600 per MWh, which is still an overestimate. Put simply, Bitcoin miners are highly price sensitive and engage in economic dispatch, meaning that they react to prices and simply do not run their equipment if electricity prices get too high. This is independent of whether they are participating in a demand response program, which formally employs power consumers to curtail their usage during periods of electricity scarcity.
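A quick sanity check on that turn-off point: it is simply the electricity price at which expected mining revenue per MWh consumed falls to zero. The sketch below uses our own assumed figures, chosen to roughly approximate February 2021 conditions; the exact threshold moves with the bitcoin price and network hashrate, so it will not reproduce the $480/MWh figure exactly.

```python
# Illustrative break-even ("turn-off") electricity price for one ASIC.
# All inputs are assumptions, not figures from the article.
machine_ths = 95            # Antminer S19 hashrate, TH/s
machine_kw = 3.25           # power draw, kW
btc_price_usd = 50_000      # assumed spot price
network_ehs = 150           # assumed network hashrate, EH/s
block_reward_btc = 6.25     # subsidy per block (fees ignored)
blocks_per_day = 144

# Expected revenue/day is the machine's share of network hashrate times daily issuance.
share = machine_ths / (network_ehs * 1e6)            # convert EH/s to TH/s
revenue_per_day = share * block_reward_btc * blocks_per_day * btc_price_usd

mwh_per_day = machine_kw * 24 / 1000
turn_off_price = revenue_per_day / mwh_per_day       # $/MWh at which mining nets zero
print(f"break-even electricity price: ${turn_off_price:,.0f}/MWh")
# With these assumptions the threshold lands in the same few-hundred-dollar
# range as the ~$480/MWh cited above for top-end machines.
```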

In the below chart, you can see that miners would have turned off their machines well before the $9,000 per MWh price cap was reached for electricity in ERCOT.

The precise threshold at which miners curtailed their usage depends on the types of machines employed: higher-end machines have a higher opportunity cost, and are hence kept online through more expensive periods of power pricing.

Electricity is generally cheap in ERCOT, which might imply relatively few instances in which miners would curtail their usage. But of course, the average doesn't tell the story. The nature of the spot-driven grid is that much of the time, energy is cheap or even free (depending on where it's being consumed), and a small fraction of the time it's very scarce and expensive (this is a feature: the high prices are a signal to incentivize new generation to be built).

It's during those right-tail events that Bitcoin miners can significantly benefit the grid by interrupting their load. Running the rest of the time means that energy is generally more abundant, because the presence of miners is an economic pressure that improves grid economics, making it more worthwhile to build new energy projects (who can now for the first time have the option to sell their full generation capacity to the grid or to Bitcoin).

For example, Lancium is a Houston-based technology company that is creating software and intellectual property solutions that enable more renewable energy on the grid. In 2020, it was the first company ever to qualify a load as a controllable load resource (CLR) (more on these later).

As of today, the company owns and/or operates all load-only CLRs in ERCOT with approximately 100 MWs of Bitcoin mining load under control for CLR. These mining facilities are being optimized on both a daily and hourly basis to mine when it is economic to do so and to turn down when it is not.

It's worth diving into the distribution of power prices on a grid like ERCOT to fully understand how miners engage with the grid. Much of the time, energy is abundant and cheap. In West Texas, prices are routinely negative, as the supply of wind and solar periodically vastly outstrips demand, and there's a limited ability to export the supply to load centers elsewhere in Texas.

What the miners do is provide a load resource which eagerly gobbles up negatively priced or cheap power (everything on the left side of the chart), while interrupting itself during those right-tail events (you can see the winter storm to the right).

On the one hand, this improves the economics of energy producers who for the first time have a new buyer to sell their electricity to, beyond just the inflexible grid. This promotes the construction of more renewable energy infrastructure and improves the prospects for existing installations. On the other hand, a highly interruptible load that can tolerate downtime means there's more power available for households and hospitals during periods of scarcity, when supply trips offline through weather or other interruptions.

From the miners' perspective, accepting interruptions to their service is actually an economically rational decision, for two reasons: curtailing during the priciest hours dramatically lowers their average cost of power, and demand response or ancillary service programs pay a premium for the willingness to curtail.

The below table shows the average yearly electricity price for consumers willing to tolerate various amounts of downtime. You can see that if you strategically avoided high-priced periods (as miners are motivated to do), you dramatically saved on power overall.

In 2021, with the right-tail event due to the winter storm causing prices to spike, if you reduced your uptime expectation from 100% to 95%, you were able to drive your overall power cost for the year from $178 per MWh to a mere $25 per MWh. So, the grid does not need to rely on the beneficence of miners to expect them to turn off their machines during times of grid stress: as profit-maximizing entities, they have a clear economic motive to do so.
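A minimal sketch of that effect, using synthetic hourly prices rather than actual ERCOT data: drop the most expensive slice of hours and average what remains.

```python
import numpy as np

# Synthetic stand-in for a year of hourly wholesale prices ($/MWh):
# mostly cheap hours plus a fat right tail of scarcity pricing.
rng = np.random.default_rng(0)
prices = rng.lognormal(mean=3.0, sigma=0.6, size=8760)        # bulk of the year
prices[rng.choice(8760, size=80, replace=False)] *= 100       # ~1% extreme spike hours

def average_cost(prices, uptime):
    """Average $/MWh paid by a load that skips the priciest (1 - uptime) of hours."""
    cutoff = np.quantile(prices, uptime)
    return prices[prices <= cutoff].mean()

for uptime in (1.00, 0.99, 0.95, 0.90):
    print(f"uptime {uptime:>5.0%}: avg price ${average_cost(prices, uptime):7.2f}/MWh")
# Even a 5% reduction in uptime strips out most of the spike hours,
# mirroring the $178 -> $25 per MWh effect described above for 2021.
```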

For a more holistic look at what prices in ERCOT have done during the last five years, we have included a chart showing the cumulative distribution by year below. Given that it hosted the winter storm, 2021 has the fattest right tail, with 5% of hours being priced over $100 per MWh.

You can see that wholesale spot prices are low much of the time, but are characterized by extreme spikiness as you get to the last 15% of the distribution. Neither tail is desirable: negative or low prices indicate an excess of supply causing a mismatch, and imply poor economics for energy producers; extremely high prices are indicative of blackouts and households not getting the energy they need. The presence of flexible load on the grid chops off both tails of the distribution. It is not a panacea and it cannot stop poorly-winterized equipment from failing during once-a-century storms, but the net effect is positive regardless.

Additionally, the existence of flexible load is so useful to grid operators that they have designed specific programs to pay these load centers for a type of grid insurance. Broadly, these programs are known as demand response (DR). This term covers a range of load responses that generally reduce load at the instruction of the grid operator. Virtually all independent system operators maintain demand response programs, but most of them have programs that require 10 to 30 minutes of response time on the load.

In fact, on a percentage of peak demand basis, ERCOT lags its peers like MISO (the Midcontinent Independent System Operator) when it comes to enrolling utilities in demand response.

As ERCOT is a single balancing authority interconnection that is not synchronously connected with any other interconnection, it is essentially an islanded electrical grid. This means that ERCOT cannot lean on its neighbors for help when faced with an expected energy shortfall and instead must balance on its own.

Texas leads all states in having the highest levels of installed wind generation capacity in the country and is expected to double its renewable capacity over the next three to five years. Being an islanded grid with a significant portion of energy supply coming from renewables requires ERCOT to procure and utilize more responsive DR products, with requirements to respond in seconds or even at the sub-second frequency in addition to the more traditional 10-to-30-minute response times.

What De Chant simply failed to mention, but Ted Cruz hinted at, is the remarkable ability of miners to act as these controllable load resources.

In ERCOT parlance, this is a type of power consumer that can dial down their consumption and back up again in response to grid operator commands on a second-by-second basis. Most data centers can't do this; in fact, the selling point for many data centers is precisely their high uptime and non-interruptibility.

The Bitcoin network is a much more forgiving client: it doesn't really care if you interrupt the action of mining, because each successive hash is statistically independent of the last (this is known as memorylessness). Aside from making slightly less revenue, nothing adverse happens if a Bitcoin mining data center only runs at 60% or even 0% capacity for a few minutes or hours. Compare that to a hospital, a smelter, a factory, or commercial real estate. These sources of load need constant uptime, and cannot tolerate interruptions.

Due to the statistical properties of mining and the physical tolerance for mining hardware to deal with interruption, Bitcoin data centers can therefore dial up and down their consumption on a highly-granular basis and on short notice.

For a grid operator, such a load type is a dream, because it gives them the ability to balance supply and demand from the demand side, rather than having to tweak supply (typically by spinning up and down natural gas turbines). There have historically been some semi-interruptible loads that grid operators relied on for similar programs, like arc furnaces, wood pulp production, cement mills, or aluminum electrolysis, but none could provide the flexibility or response times Bitcoin miners can.

The industries mentioned are industrial loads which cannot easily power up and down, and certainly not on extremely short notice, as is necessary for a modern CLR. For context, CLRs have to be able to curtail their targeted load reduction by 70% within 16 seconds. Before Bitcoin mining, no load type qualified in ERCOT.

You can think of a CLR as a power generator in reverse. Instead of adding expensive power to the grid during a period of scarcity, the CLR receives a real-time price signal from the grid operator and, if it's above its economic turn-off point, it will automatically dispatch down (curtail consumption) to make way for other, more critical loads. Therefore, instead of only having flexible (and CO2-emitting) thermal energy from coal or natural gas generation available to the grid operator during peak demand periods, the CLR capacity not reserved as grid insurance is offered into ERCOT's security-constrained economic dispatch (SCED) and will automatically dispatch down when the real-time price is higher than the turn-off point for the bitcoin mining load.

An added benefit to ERCOT in having bitcoin mining loads as a "load resource" is that during local shortages or system emergencies, ERCOT can directly turn down the load. This is a very big deal. For the data center, it's a great deal, because they can sell ancillary services (basically, a bundle of products that give the grid operator the right to curtail the data center's production should they need to), collect a premium for doing that, and mine the rest of the time. So, they collect a premium on an ongoing basis (even if not called upon to curtail their usage), effectively lowering their all-in power cost, while also providing a valuable service to the grid.

In contrast, a generation resource which sells ancillary services has a real opportunity cost: it has to run below its maximum in order to retain some slack in case it is called upon to increase its power.

So, when Cruz mentioned the possibility of Bitcoin mining playing a significant role as strengthening and hardening the resilience of the grid, he is likely referring to the strong benefits that interruptible load offers to a grid operator. The existence of qualifying CLRs means that policymakers can target structurally-higher renewable penetration and feel comfortable in the grid operator's ability to procure more insurance against adverse events. As grids become increasingly renewable and move from fossil-fuel-powered steady baseload to more volatile wind and solar power, these kinds of controllable loads will become increasingly critical.

Additionally, the ability of Bitcoin miners to colocate with renewable assets and act as an independent buyer when the grid has no demand provides a base level of monetization which was not available previously. This incentive means that intermittent energy sources like wind and solar (which are often curtailed, as they are frequently distant from load centers) have improved economics.

Indeed, an analysis from Dr. Joshua Rhodes and Dr. Thomas Deetjen with IdeaSmiths LLC demonstrated that flexible data centers would actually promote the stability of an increasingly-renewable grid and allow for more renewable penetration than the grid could otherwise support.

The analysis from Rhodes and Deetjen found that operating data centers in a flexible manner can result in a net reduction of carbon emissions and can increase the resilience of the grid by reducing demand during high stress times (low reserves) on the grid.

Under the scenario where 5 GWs of flexible data center growth was added to the base case with a range of uptimes between 85% and 87%, the flexible data center consumes about 35.5 million MWh, but supports the deployment of an additional 39.5 million MWh of wind and solar energy.

In simple terms, the incremental MWh output from solar and wind is greater than the incremental MWh consumption from the flexible data centers hence, carbon negative.

For a visualization of how the intermittency of wind and solar affects electricity pricing, we have assembled real data from earlier in October in West Texas in the chart below:

Source: Lancium, ERCOT

Between October 8 and October 10, wind and solar generation averaged over 20 GWs with the power lines that connect West Texas to the major load centers in the east being at maximum capacity. With so much wind and solar online, West Texas power prices averaged $3 for these three days with several hours settling negative. On October 11, wind generation was 20 GWs at around midnight, 2 GWs by noon and back up to 20 GWs by end of day. As wind dropped, power flowing across the West Texas power lines also dropped, which caused West Texas prices to be on par with the rest of ERCOT.

This drop in wind means natural gas must yet again pick up the slack. A grid with even higher wind and solar penetration would face these problems in abundance. Having a significant quantity of flexible loads on the grid to dial down consumption during rapid drawdowns in renewable generation would help attenuate spikes in power prices, without requiring as much support from less efficient combustion turbine peakers.

In our view, contemporary industrial Bitcoin mining has four key properties that make the industry an extremely suitable buyer of energy for increasingly renewable grids. These are interruptibility, unconstrained location agnosticism, scale independence, and attenuation.

Among industrial load centers, these qualities are unique. Prior to Bitcoin, there simply wasnt a load resource that satisfied these four qualities. Some industrial consumers of load have some of these features, but none with such fidelity. Aluminum smelting, for instance, has a degree of location agnosticism, as has been well-documented with Alcoa smelters being colocated with abundant energy resources, effectively exporting energy via the refined metal.

Certain types of factories like aluminum arc furnaces and paper mills have a degree of interruptibility, but only for short periods of time, and only with significant latency. Certain power companies offer households and industrial consumers the ability to participate in demand-response programs, but only in a reduced capacity and never as a Controllable Load Resource. Non-Bitcoin data centers also exhibit location agnosticism to a certain degree, but require high throughput internet and cannot tolerate interruptibility.

By satisfying these qualities, Bitcoin represents a truly novel industrial load center, and offers superlative benefits to grid operators and policymakers aiming to decarbonize the grid. This permits Bitcoin miners to effectively sell high-quality insurance into the grid by participating in the ancillary service market in ERCOT, something that was mostly limited to the supply-side previously (through thermal generation resources).

Even without these more advanced CLR products, it's clear that miners have a strong economic motive to curtail their consumption during periods of high prices, allowing households to get more power during periods of scarcity.

A joint realization is currently underway. First, Bitcoin miners are learning that there are significant economic benefits to accepting reduced uptime and going from dumb load to smart load. Beyond the merit of economic dispatch, participating in formal demand response programs is an additional source of revenue and hardens the grid, too. Additionally, independent system operators are beginning to discover the possible importance of Bitcoin miners as a flexible load resource, one most unlike those they have historically considered for demand-response programs.

We hope and expect that many more grid operators will realize the significant benefits offered by flexible data centers and begin to design with Bitcoin mining in mind. Their ambitions for increasingly renewable grids may well depend on it.

This is a guest post by Nic Carter and Shaun Connell. Opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc or Bitcoin Magazine.

The rest is here:

Miners Are The Optimal Buyers: The Data Behind Bitcoin-Led Decarbonization In Texas - Bitcoin Magazine

Read More..

The Benefits of Data Analytics in Clinical Reporting – Diagnostic and Interventional Cardiology

Accurate reporting is essential to producing the reliable analytics needed to meet lab accreditation standards. This is a process Susan Wayne, RRT, RVT, RCDS, is quite familiar with.

Wayne is clinical coordinator for noninvasive cardiovascular services at Carolina Pines Regional Medical Center in Hartsville, S.C. As of this writing, she is working through the re-accreditation process for both the echocardiogram and vascular labs, all while providing personal bedside care to cardio patients each day. "Both our labs are accredited through IAC, the Intersocietal Accreditation Commission, and we go through this process every three years," she said. "I'm in my window for both of these right now. I need to provide data on volume by sonographer and for each procedure. That all used to be a manual process before."

Wayne's team previously relied on manual log books. "There was always the possibility that a technologist would forget to log something," she recalled. "Each day, we'd have to do charge reconciliation by manually tabulating a report from the previous day to make sure the correct charge went to the right patient. We'd also need to check that each procedure was logged. We might miss billing as many as four or five procedures a month because they were never recorded in our log book."

Wayne had heard about other facilities that were using Fujifilm's VidiStar structured reporting to produce data analytics. As Carolina Pines was looking to transition to a PACS, they invited Fujifilm to come onsite and demo its capabilities. It was an easy choice. "Once we were onboarded, the vendor started teaching me about the data mining the system was able to do, something I'd been doing manually before."

Carolina Pines' transition to Fujifilm Data Analytics was highly seamless. Because it's a cloud-based system, the analytics option was simply added to their existing account so it was accessible when they logged in. There are useful pre-built analytics that come with the system, but VidiStar also allows custom queries to be saved as presets. "Once we had access, the Fujifilm representative helped me tailor the reports I need so that they run automatically each month," noted Wayne.

For the re-accreditation process, data must be provided that shows reporting is logged within 24 hours for inpatients and within 48 hours for outpatients. In addition to the data that must be supplied about volume by sonographer and by procedure type, the accrediting body requires submission of case studies for each physician. This involves selecting individual exams that are as technically perfect as possible. "This used to be more difficult," Wayne said, "because that information needs to be collected and logged over a period of several years. It was a laborious process to keep track of." Now, she can check a box on a case that looks like a good candidate and have that propagate into a report she can later review in order to select the exams that best meet IAC's criteria.

Wayne is still learning how to build new reports, but Fujifilm has remained a resource during the ramp-up process. "They're always available to guide me through," she said. "They have fantastic customer support. I can either call a toll-free number or send an online message to their help desk. Any time I get stuck building a new report, they'll respond within a couple of hours and help me get it done. They're very responsive."

Another advantage of Carolina Pines' analytics capability is how easy it is to respond to an ad hoc request for data. "I occasionally receive an inquiry about physician statistics from the credentialing department," said Wayne. "It's so easy to pull these with analytics."

She can also pull clean, professional-looking reports for the reading physicians offices to ensure they are able to bill their professional fees.

The amount of manual labor required to produce reports used to be onerous. For Carolina Pines, that has definitely changed. "I don't know how labs function without data mining," said Wayne. The streamlining provided by data analytics has made her more efficient, better able to support facility administrators' and physicians' needs for information, and even spend more time with patients.

Wayne started out as a respiratory therapist, but was asked to get into the cardiovascular field 24 years ago. Her response at the time: "If the hospital needs me to do it, then I will." That can-do attitude has persisted; combined with the power of Fujifilm Data Analytics, she's more empowered than ever to meet the needs of Carolina Pines.

Editor's note: This blog is the fourth in a four-part series about the benefits of cardiovascular information systems. The first blog is here, the second blog is here, and the third blog is here.

To learn more visit https://hca.fujifilm.com/it

Go here to see the original:

The Benefits of Data Analytics in Clinical Reporting - Diagnostic and Interventional Cardiology

Read More..

Ex-WNBA player Alana Beard joins effort to bring expansion team to Oakland – ESPN


Mechelle Voepel, ESPN.com

Former WNBA player Alana Beard has joined the effort to bring a WNBA expansion team to Oakland, it was announced Thursday. The 14-year WNBA veteran will partner with the African American Sports and Entertainment Group (AASEG) in pursuit of a new franchise.

Beard, the No. 2 overall draft pick out of Duke in 2004, played six seasons for the Washington Mystics and then another eight with the Los Angeles Sparks, with whom she won the 2016 league championship.

"As a professional athlete who made the transition into the business world, I understand now more than at any point the importance of having a great team and strong partnerships," said the 39-year-old Beard, who moved to the Bay Area to work in venture capital after her retirement following the 2019 WNBA season. "I've always envisioned being an owner of a WNBA team. It made sense to come together to partner on this."

The WNBA, which just completed its 25th season, has 12 franchises. It has not had an expansion team since the Atlanta Dream in 2008. The Dream, who were sold in February, have another former WNBA player, Renee Montgomery, as part of their ownership group.

"That was something I truly admired," said Beard, a two-time winner of the WNBA's Defensive Player of the Year award and a four-time All-Star. "Kudos to Renee and other women who are pursuing this exact dream."

One of the league's original eight franchises in 1997 was in Sacramento, but that team folded after the 2009 season. The Bay Area has been considered a possibility for expansion since. The short-lived women's American Basketball League had a team in San Jose in 1996-98, and the reigning national champion Stanford Cardinal are perennially one of the best women's college programs in the country.

Beard appeared on a video call Thursday along with Oakland Vice Mayor Rebecca Kaplan and others who support bringing a franchise to Oakland, including Gina Johnson Lillard, the mother of Portland Trail Blazers guard Damian Lillard and the director for the western region of Mothers of Professional Basketball.

"The leadership of Black women here is incredible," said Alicia Garza, co-founder of the International Black Lives Matter Movement. "WNBA, we definitely need and want you in Oakland. This project aligns on every level in terms of my values and the things I prioritize."

The AASEG said the Oakland City Council, Alameda County Board of Supervisors and the Joint Powers Authority Commission have approved a path for a WNBA team to play at Oakland Arena, which was home of the Golden State Warriors from 1971 to 2019. The WNBA said it had no statement on Thursday's announcement from the Oakland group. Commissioner Cathy Engelbert told ESPN earlier this month that the league is hopeful of expansion within the next five years.

"We're doing all that data mining now," Engelbert said. "I suspect by next summer or this time next year, in our 26th season, we'll be talking about the number of teams and a list of where."

Follow this link:

Ex-WNBA player Alana Beard joins effort to bring expansion team to Oakland - ESPN

Read More..