Category Archives: Data Mining

Andrew Hopkins of Exscientia: the man using AI to cure disease – The Guardian

It was early one morning in 1996 when Andrew Hopkins, then a PhD biophysics student at Oxford University, had a brainwave as he walked home from a late-night lab meeting.

He was trying to find molecules to fight HIV and to better understand drug resistance.

"I remember this idea struck me that there must be a better way to do drug discovery other than the complex and expensive way everyone was following," he says. "Why couldn't we design an automated approach to drug design that would use all the information in parallel so that even a humble PhD student could create a medicine? That idea really stuck with me. I remember almost the exact moment to this day. And that was the genesis of the idea that eventually became Exscientia."

It was to prove a lucrative brainwave. Hopkins set up the company in 2012 as a spinout from the University of Dundee, where he was by then working as a professor. It uses artificial intelligence (AI) systems, which are being trained to mimic human creativity, to develop new medicines. This involves the use of automated computer algorithms to sift through large datasets to design novel compounds that can treat diseases, and to help select the right patients for each treatment.

Age 50

Family Married with a 10-year-old daughter. He met his wife, Iva Hopkins Navratilova, at Pfizer. Her business, Kinetic Discovery, merged with his to create the experimental biology labs at Exscientia.

Education Dŵr-y-Felin comprehensive and Neath College in south Wales; degree in chemistry at Manchester; doctorate in molecular biophysics at Oxford.

Pay £415,000

Last holiday Czech Republic to visit his wife's family at Easter.

Best advice he has been given "My dad worked in a factory. He said to me: get a good education and get a job you enjoy doing. It's worth an extra six grand a year. And I definitely got a job I enjoy doing."

Biggest career mistake "It's too early to tell." He quotes Miles Davis: "It's not the note you play that's the wrong note; it's the note you play afterwards that makes it right or wrong."

Words he overuses Fundamentally; the heart of the matter.

How he relaxes Reading and dog walking. I am a bibliophile. I immerse myself in books to relax.

This approach drastically cuts the time of drug development. Hopkins says that for Exscientia's pipeline it has typically taken 12 to 15 months from starting a project to identifying a drug candidate, compared with four and a half years in the traditional pharmaceutical industry.

The average cost of developing a medicine is $2bn, according to Deloitte's latest pharma report, and many drugs fail: the failure rate is 90% for medicines in early clinical studies (where they are tested on humans).

Typically, pharma companies make 2,500 compounds to test them against a specific disease, while AI enables Oxford-based Exscientia to whittle down that number to about 250, Hopkins says. "It's a much more methodical approach."

Last autumn, the Welsh scientist became one of Britain's richest entrepreneurs, with a paper fortune of £400m after the company achieved a $2.9bn stock market debut on Nasdaq in New York, making it one of Britain's biggest biotech firms. Hopkins's stake of nearly 16% is now worth £170m, as the share price has lost 60% of its value in a bloodbath for Wall Street stocks.

Exscientia was part of a transatlantic trend that is defying government attempts to build a biotech powerhouse in the UK. Abcam, a pioneering Cambridge antibody company, recently announced it was moving its stock market listing from the UK to the US. "We are a British company; we choose to be in Oxford because we can attract global talent," Hopkins says. "But to be seen as a global company, we listed on what is the global technology index, which is Nasdaq. What we have now is an incredibly international shareholder base from across the world."

The business came up with the first AI-designed drug to enter clinical trials, a treatment for obsessive-compulsive disorder developed in partnership with Japan's Sumitomo, although Sumitomo later decided not to proceed with it. The Japanese firm is currently studying another drug developed by Exscientia, for the treatment of Alzheimer's disease psychosis, in early human trials.

Hopkins, now 50, fell in love with science thanks to an inspirational chemistry teacher. He has worked as a scientist since the age of 16, when he did a stint in industrial chemistry at the Port Talbot steelworks in south Wales, which he says taught him about the benefits of automation in boosting productivity.

He spent nearly a decade at the US drug giant Pfizer, where he was on a data warehouse project that led to some of the first machine-learning applications in the pharmaceutical industry, with the findings published in Nature in 2006.

During the subsequent five years at Dundee University, he further researched applying data mining and machine learning to drug discovery. He says being a professor is actually one of the best jobs in the world and gave him the freedom to research AI methods at length. He maintains his links with the university, where he is honorary chair of medicinal informatics at the School of Life Sciences.

Exscientia (which means "from knowledge" in Latin) soon moved to the Schrödinger Building at the Oxford science park, and now employs 450 people worldwide, from Vienna to Boston, Miami and Osaka, equally split between AI engineering, chemistry and biology.

It is building a new robotics laboratory at Milton Park near Oxford, focused on the automation of chemistry and biology to accelerate drug development, and its declared goal is "drugs designed by AI, made by robot". Other pharma companies have also introduced some automation into their processes, but generally lab technology is similar to how it was when he was a student in the 1990s, Hopkins says.

The firm is involved in 30 projects, some in partnership with big pharmaceutical companies including France's Sanofi and the US firm Bristol Myers Squibb (BMS). It is also working with Oxford University on developing medicines that target neuroinflammation for the treatment of Alzheimer's disease. Among the firm's solo projects, a cancer drug for solid tumours is about to go into early clinical trials.

Exscientia is also working on a broader coronavirus pill to rival Paxlovid, the Covid-19 treatment made by Hopkins's former employer Pfizer. This work is funded by a $1.5m grant from the Bill and Melinda Gates Foundation, which took a stake in Exscientia. The company's other investors include BMS, Celgene (now a BMS subsidiary) and Germany's Evotec, as well as Japan's SoftBank, the US fund manager BlackRock and the life science investor Novo Holdings.

Hopkins says the team has identified a set of molecules that could work as a broader treatment for Covid-19, new mutations and other coronaviruses, and that there will be more news later this year. The firm is aiming for a low-cost pill that could be distributed globally and given quickly to people who fall ill to prevent serious illness and hospitalisation. Covid-19 infections are rising again in 110 countries and the World Health Organization's director general, Tedros Adhanom Ghebreyesus, has warned that the pandemic is far from over.

Firms across the pharmaceutical industry have started using AI in recent years. AstraZeneca is investing heavily in it for its entire research and development infrastructure, and GSK has built an AI team of 120 engineers, with plans to reach 160 next year, making it the largest such in-house team in the industry.

AI systems require a lot of computing power and enormous datasets. Their use should boost the number of new drugs being approved every year, typically 40 to 50 in the US, to many more. Hopkins confidently predicts: "This is the way all drugs will be designed in the future. In the next decade, this technology will become ubiquitous."


Digital Twinning in Buildings Goes Well Beyond the Digital – Commercial Observer

At first glance, NASA and commercial real estate have little in common. Yet in strategizing for a more efficient future, outer space and built space aren't all that different. To advance their respective galactic and environmental endeavors, both utilize the same technology: digital twins.

Digital twins are an increasingly common technology in commercial real estate, thanks in part to the boom in proptech and the growing emphasis on sustainability. Yet the concept of digital twins first gained traction in the 1960s, when NASA introduced an early version. Following the Apollo 13 mission (yes, the one from that movie starring Tom Hanks), NASA employed a digital twin to assess and evaluate engine intricacies.

Since then, digital twins have come down to Earth, literally. Within the last five to six years, they've been gradually integrated into the dialogue surrounding commercial real estate, and the technology is being used to represent real-time models of real-life buildings.

"A digital twin is creating a digital profile of the building in a way that you can use technology to anticipate and monitor and plan for your building's life cycle, both in energy and in flow and in people," said Michael Phillips, principal and president at real estate investment and management firm Jamestown.

James Wynn, director of intelligent places at architecture firm Gensler, referred to the same concept as "operational twins." When real estate companies talk about digital twins, they're typically citing real-time models that represent already-built buildings. These digital models help companies understand how the built environment is currently performing and gauge how tenants are realistically engaging with the space in question.

As interactive models of buildings, operational twins can therefore help companies adopt the best practices for maximizing a building's efficiency, a boon to sustainability efforts. Twins can map, track and quantify flow patterns, access points and energy consumption, which helps buildings react quickly to changing environmental needs.

During the summer, for example, landlords will likely confront new patterns of tenant engagement. People tend to go on vacation, so rather than cool and light an entire building, landlords can use ongoing feedback from a digital twin to decide which floors to open, light and cool, according to Wynn.
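
Neither Gensler nor Jamestown publishes the logic behind these decisions, but the idea is easy to sketch. The toy Python below uses invented class and field names (an OperationalTwin with per-floor occupancy and energy readings, none of which come from any vendor's real API) and flags floors that are drawing power while effectively empty, the kind of conclusion Wynn describes a twin feeding back to a landlord.

```python
# Toy operational twin: static geometry plus live per-floor sensor readings,
# with one simple actionable output. All names here are illustrative only.

from dataclasses import dataclass, field

@dataclass
class FloorReading:
    occupancy: int        # people currently detected on the floor
    kwh_last_hour: float  # metered energy use over the last hour

@dataclass
class OperationalTwin:
    floor_area_m2: dict[str, float]                      # from the 3D / BIM model
    live: dict[str, FloorReading] = field(default_factory=dict)

    def ingest(self, floor: str, reading: FloorReading) -> None:
        self.live[floor] = reading

    def floors_to_power_down(self, min_occupancy: int = 1) -> list[str]:
        """Flag floors that are drawing energy while effectively empty."""
        return [f for f, r in self.live.items()
                if r.occupancy < min_occupancy and r.kwh_last_hour > 0]

twin = OperationalTwin(floor_area_m2={"3F": 1200.0, "4F": 1200.0})
twin.ingest("3F", FloorReading(occupancy=0, kwh_last_hour=45.0))
twin.ingest("4F", FloorReading(occupancy=60, kwh_last_hour=52.0))
print(twin.floors_to_power_down())  # ['3F'] - a candidate to switch off in summer
```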

Such tools remain in the early stages of integration, but their climate-related impact bodes well for the long term.

"I think digital twinning will become a base standard for buildings as they move forward," said Phillips, who expects digital twins to grow alongside the built environment at a pace that depends on data access, eventually becoming ubiquitous.

That's because digital twins are particularly useful in retrofits. Though the design and construction steps surely shape a building's later efficiency, developers can't always start from scratch. Rather, the real estate industry must get creative to tinker with developments already in use.

"Ninety percent of the buildings that will exist in 2030 already exist," said Michael Jansen, CEO and founder of urban digital twin platform Cityzenith. "The real problem in the country is not new buildings, so it has to be retrofitting, which is the elephant in the room."

For companies unsure where to begin in retrofitting buildings, digital twins present a workable roadmap to rebuild and repurpose their assets.

"They are systems of systems," Jansen said: tools that allow multiple users to integrate multiple software platforms and data systems. He pointed to some combination of 3D models, public data and sensors that work together to simulate various outcomes. Based on analyses of that data, digital twins can then draw actionable conclusions that direct landlords in managing and optimizing operations.

When it comes to making greener buildings, these conclusions are all-encompassing. According to Jansen, digital twins can help optimize the energy management of a building; evaluate the renewable options for that particular property; and calculate the carbon offsets required for a building to achieve net-zero status or other climate targets, as well as monitor emissions in real time.
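
The offset arithmetic Jansen describes is, at its core, simple bookkeeping. Here is a minimal sketch, assuming a placeholder grid carbon-intensity figure and invented inputs; it illustrates the calculation, not Cityzenith's model.

```python
# Back-of-the-envelope net-zero bookkeeping: grid electricity use times an
# assumed emission factor, minus onsite renewables, equals offsets to buy.

GRID_KG_CO2_PER_KWH = 0.4   # assumed grid carbon intensity, not a real figure

def offsets_needed_tonnes(annual_kwh: float,
                          onsite_renewable_kwh: float,
                          other_emissions_tonnes: float = 0.0) -> float:
    """Tonnes of CO2 a building would need to offset to reach net zero."""
    grid_kwh = max(annual_kwh - onsite_renewable_kwh, 0.0)
    electricity_tonnes = grid_kwh * GRID_KG_CO2_PER_KWH / 1000.0
    return electricity_tonnes + other_emissions_tonnes

print(offsets_needed_tonnes(annual_kwh=2_000_000, onsite_renewable_kwh=500_000))
# 600.0 tonnes of CO2 to offset under these assumptions
```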

An example of this widespread promise is Cityzenith's Las Vegas digital twin, a virtual copy of part of that city's downtown, which aims to help building owners achieve net-zero carbon emissions. Still in its early stages, the twin has begun to integrate data from Las Vegas's downtown region in conjunction with Cityzenith's Clean Cities Clean Future initiative. According to the Cityzenith website, the project aims to transform "mobility, air quality, noise pollution [and] water management," as well as emissions.

Although operational twins have a clear role to play, they are limited to addressing already-built space. As such, new buildings require models of their own and a digital upgrade in sync with a portfolio's emerging ESG goals.

As an architecture firm, Gensler is primarily interested in what Wynn deemed "design twins," as opposed to those operational twins. Wynn noted that these twins aren't typically defined as digital twins, as they don't monitor a building in real time. Instead, they are akin to energy models, or are used alongside operational (read: digital) twins.

For the purposes of sustainability, however, design twins have a similar usage and impact as strict digital ones.

"The way I'm using that definition is saying that any digital representation of a space, whether in design or built, is really a digital twin," said Wynn, broadening the definition of twin technology.

Semantics aside, design twins have similar goals, techniques and outputs to digital twins. They also approach sustainability from a unique vantage point; they can evaluate and react to a building's potential environmental performance before it's ever been built. This perspective allows developers, architects and engineers to understand building operations prior to spending money, beginning construction and releasing carbon via inefficient materials.

In this vein, design twins have financial benefits, in addition to environmental ones. Many projects are slow-moving and take years to come to fruition. Design twins can therefore act as trial runs, helping developers understand how a building may function and adjust accordingly for assessed setbacks. The simulations can only predict so much, though.

"[A digital twin is] real-time feedback from different sensors," said Violet Whitney, senior product manager at Delve by Sidewalk Labs, an artificial intelligence-powered, Google-owned neighborhood planning product. "But what it doesn't capture is maybe more of the long-term consequences that could've prevented something like that in the first place."

Delve is essentially a design twin that front-loads a project with pro-environmental considerations. Such technology works in tandem with operational twins; employing various models from all stages of development is likely to yield a well-rounded and thorough understanding of a buildings energy systems and emissions. When used in conjunction with real-time twins, products like Delve can ensure that architects, developers and landlords understand how a given building operates and may continue to operate throughout its lifespan.

"We can help [developers] by generating many designs that vary, that show them a bunch of scenarios," said Whitney.

Delve relies on greenhouse gas modeling, as well as other quality-of-life variables, to help developers decide what to build onsite. Models assess the interplay between the natural environment and the tenant's experience; understanding how light, shadow and solar will interact within a space is crucial, both from the standpoint of ultimate tenant happiness and of environmental integrity.

The gradual consideration of these factors reflects a change in developer and owner goals alike. Whether employing a digital twin before or after a buildings inception, real estate companies across the industry are increasingly focused on outcomes, Whitney said.

It is this goal-driven mentality that will become one of ESG's greatest assets. By assessing what exactly a building's outputs are and where they're coming from, digital twins, whether design or operational, hold developers accountable to change.

To get these results, however, a company must first create a digital twin. Gensler begins with a Building Information Model, widely known as BIM and a standard technology used by architects. Once this model is ready, data is then layered into a building's geometry for the sake of simulating and analyzing a potential design's performance.

"For an energy use study, the BIM can assess how daylight affects design," said Wynn. "It taps into where the sun is located relative to the building." Design twins synthesize various aspects of a project's surroundings, neighborhood, materials and potential uses to allow for educated and intentional construction.
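
The article doesn't detail how Gensler's daylight studies work internally, but the "where is the sun relative to the building" step can be illustrated with a crude solar-position approximation. The sketch below assumes a simplified declination formula and only computes elevation at solar noon; real BIM-driven energy studies use full solar-position algorithms combined with the building's geometry.

```python
# Very simplified stand-in for a daylight study's solar-position step:
# approximate solar declination, then the sun's elevation at solar noon.

import math

def solar_declination_deg(day_of_year: int) -> float:
    """Approximate solar declination using Cooper's formula."""
    return 23.45 * math.sin(math.radians(360.0 / 365.0 * (day_of_year + 284)))

def noon_solar_elevation_deg(latitude_deg: float, day_of_year: int) -> float:
    """Sun's elevation above the horizon at solar noon for a given latitude."""
    return 90.0 - abs(latitude_deg - solar_declination_deg(day_of_year))

# Roughly the summer and winter solstices at a latitude of ~37.8 N
print(noon_solar_elevation_deg(37.8, 172))  # ~75.6 degrees in late June
print(noon_solar_elevation_deg(37.8, 355))  # ~28.8 degrees in late December
```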

"[Digital twin is] a term that is being defined as we speak," said Wynn. "So I think somebody could make the argument that a really, really defined BIM model is actually kind of a rudimentary digital twin."

Yet to create models that actually help the environment, Gensler, as well as the likes of Jamestown, Cityzenith and Delve, must collect ample and accurate data. Finding this information isn't so simple, a challenge that has become the norm in attempts to implement pro-ESG policies.

"We're at the stage where we've got all this data, but it's actually harder than it sounds to put it into one coherent picture," said Wynn. "Some manufacturers have really good data; some don't."

The lack of consistent data presents a challenge in deciding between multiple designs. Yet Whitney stated that data collection proves more difficult for operational twins than it does for design models. Because design twins rely on upfront simulation, data mining is more straightforward and less subjective.

Still, the overarching emphasis on data will skew the integration of digital twins toward larger corporate portfolios. Bigger companies with more significant holdings tend to have richer and greater quantities of statistics, so they are therefore at an advantage in creating and utilizing accurate and comprehensive digital twins.

"Data-driven asset classes represent the high end of the market," said Jansen. These include retail, offices and Class A residential buildings, as well as campuses. Yet he predicts that this limited, upper integration of digital twins is only a short-term pattern.

"Eventually digital twin products, including ours, will have the T.J. Maxx version for the homeowner," Jansen said.

Although digital twins do require energy in order to process heaps of data, they ultimately save far more energy than they consume. "The marginal amount of energy that would be consumed by the hardware in the course of making these calculations is nothing in comparison to the savings that are generated by the assets themselves as a result," Jansen said.

Given these reductions, tenants have been especially receptive. Sustainability is increasingly important for the savings and the public perception, never mind the ethics, so employers must attract and retain a workforce that increasingly cares about its carbon footprint. A sustainable building is therefore a win-win-win for tenants, developers and the environment.

This central motivation encompasses the heart of digital twin usage.

"What really matters is," Whitney said, "ultimately, do we end up building a place that ends up being better for the people that live there and the people that would experience the place?"


HotSpot Therapeutics Appoints Paul Thibodeau, Ph.D., as Chief Business Officer – PR Newswire

BOSTON, July 25, 2022 /PRNewswire/ -- HotSpot Therapeutics, Inc., a biotechnology company pioneering the discovery and development of small molecule allosteric therapies for the treatment of cancer and autoimmune diseases, today announced the appointment of Paul Thibodeau, Ph.D., as Chief Business Officer. Dr. Thibodeau brings nearly 20 years of business development experience in the biopharmaceutical industry.

"We are very pleased to welcome Paul to HotSpot as Chief Business Officer," said Jonathan Montagu, Co-Founder and Chief Executive Officer of HotSpot Therapeutics. "Paul's significant industry experience and proven track record in biopharmaceutical business development will be critical as we advance our oncology and autoimmune programs and expand our pipeline through partnerships enabled by our Smart Allosteryplatform."

Dr. Thibodeau joins HotSpot from Novartis, where he held numerous business development positions, most recently leading M&A transactions across all of Novartis' therapeutic areas, where he was responsible for deal sourcing, evaluation, and execution of acquisitions and divestments. Prior to joining Novartis in 2016, he served as Senior Director of Global Business Development at Teva Cephalon, heading up all out-licensing and discovery deals for its specialty pipeline. Previously, he held numerous business development positions with increasing responsibilities at Sanofi Genzyme across its oncology, transplant and multiple sclerosis franchises. Dr. Thibodeau was a Postdoctoral Research Fellow at the Institut national de la santé et de la recherche médicale (INSERM) in Paris, France. Dr. Thibodeau received his Ph.D. in Cell Biology and Radiobiology from the Université de Sherbrooke in Quebec, Canada, and his B.S. in Biology and Biochemistry from the Université de Moncton in New Brunswick, Canada. He also completed an MBA at the MIT Sloan School of Management.

"I am thrilled to join HotSpot's driven and innovative team at this important juncture for the Company," said Dr. Thibodeau. "I look forward to collaborating with the team as they seek to bring forward novel therapies for patients with significant unmet need."

About HotSpot Therapeutics, Inc.
HotSpot Therapeutics, Inc. is targeting naturally occurring pockets on certain proteins that it refers to as "natural hotspots" that are decisive in controlling cellular protein function. Largely unexploited by industry, these pockets have significant potential for drug discovery and provide for the systematic design of highly potent and selective small molecules that exhibit novel pharmacology. The company's proprietary Smart Allostery platform utilizes computational approaches and AI-driven data mining of large and highly diverse data sets to identify natural hotspots, integrated with a tailored pharmacology toolkit and bespoke chemistry which the company believes will enable rapid delivery of superior hotspot-targeted small molecules. HotSpot has established a pipeline of differentiated allosteric small molecules for the treatment of cancer and autoimmune diseases. To learn more, visit www.hotspotthera.com.

SOURCE HotSpot Therapeutics


Machine learning hiring levels in the mining industry rose in June 2022 – Mining Technology

The proportion of mining industry operations and technologies companies hiring for machine learning-related positions rose significantly in June 2022 compared with the equivalent month last year, with 21.5% of the companies included in our analysis recruiting for at least one such position.

This latest figure was higher than the 10.7% of companies that were hiring for machine learning-related jobs a year ago and the same as the figure of 21.5% in May 2022.

When it came to the rate of all job openings that were linked to machine learning, related job postings dropped in June 2022 from May 2022, with 0.6% of newly posted job advertisements being linked to the topic. This latest figure was a decrease compared to the 0.7% of newly advertised jobs that were linked to machine learning in the equivalent month a year ago.

Machine learning is one of the topics that GlobalData, from which our data for this article is taken, has identified as being a key disruptive force facing companies in the coming years. Companies that excel and invest in these areas now are thought to be better prepared for the future business landscape and better equipped to survive unforeseen challenges.

Our analysis of the data shows that mining industry operations and technologies companies are currently hiring for machine learning jobs at a rate lower than the average for all companies within GlobalData's job analytics database. The average among all companies stood at 1.2% in June 2022.

GlobalData's job analytics database tracks the daily hiring patterns of thousands of companies across the world, drawing in jobs as they're posted and tagging them with additional layers of data on everything from the seniority of each position to whether a job is linked to wider industry trends.
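
GlobalData's schema isn't public, so the sketch below only illustrates how the two figures quoted above could be computed from tagged postings: the share of companies with at least one machine-learning posting, and the share of all postings linked to the topic. The sample postings and field names are hypothetical.

```python
# Two hiring metrics over a list of tagged job postings (illustrative data).

from collections import defaultdict

postings = [
    {"company": "MinerCo A", "topics": ["machine learning"]},
    {"company": "MinerCo A", "topics": []},
    {"company": "MinerCo B", "topics": []},
    {"company": "MinerCo C", "topics": ["automation"]},
]

def share_of_companies_hiring_for(topic: str, posts: list) -> float:
    """Fraction of companies with at least one posting tagged with `topic`."""
    by_company = defaultdict(list)
    for p in posts:
        by_company[p["company"]].append(topic in p["topics"])
    hiring = sum(1 for flags in by_company.values() if any(flags))
    return hiring / len(by_company)

def share_of_postings_linked_to(topic: str, posts: list) -> float:
    """Fraction of all postings tagged with `topic`."""
    return sum(topic in p["topics"] for p in posts) / len(posts)

print(share_of_companies_hiring_for("machine learning", postings))  # 1 of 3 companies
print(share_of_postings_linked_to("machine learning", postings))    # 1 of 4 postings
```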

You can keep track of the latest data from this database as it emerges by visiting our live dashboard here.


How an abortion case turns the screw on Big Tech – EJ Insight

The United States Supreme Court's recent decision to overturn Roe vs. Wade has firmly engulfed American society in heated debates. Beyond the strife over the volte-face on abortion rights, this epochal case might now turn data-driven companies into the means to a political end and force businesses to affirm or retract their positions on privacy.

Location data can now be weaponized by abortion-ban advocates to convict women seeking pregnancy termination in abortion-banning states. Many mobile applications nowadays track a user's geolocation data in the name of optimizing the user's experience and the app's functionality. Individuals who conduct or undergo abortions might be subject to criminal witch-hunting, further exacerbating the entrenched moral stigma associated with abortion.

Location data is a small fraction of the entire digital weapon arsenal. Other obvious targets include menstrual tracking, ovulation and pregnancy data. These examples may barely scratch the surface of all possible data harvesting activities. Download any application from an app store, and it is possible to find an alarming level of data, tangential or otherwise, harvested for commodification. Surveillance capitalism, a system centered around companies capturing and monetizing personal data, may now be a foregone conclusion.

To avoid being caught in legal crosshairs, Google announced that it would automatically delete users' location history for visits to places such as abortion clinics. Sure, Google earned cautious applause in a move that is palatable to abortion advocates. But it also begs the question: why did it take so long for companies like Google to recognize the effect of over-zealous data mining practices? Or was the move a fig leaf intended to cover the cracks in its system? From a pragmatic perspective, it is also unclear how, and how dependably, Google can identify sensitive locations to guarantee automatic data deletion.

Businesses have long understood that hoovering up a treasure trove of users' data creates a fast track to power and competitive edge. The Roe decision shows, for the first time, that even aggressive data collectors might see the value in a less-is-more approach. Frequent run-ins with legal enforcement have long sounded alarms to market players, signaling that their data monopolies are far from secured. Didi's recent RMB 8 billion fine, levied by the Cyberspace Administration for violations of China's data laws, is likely to imbue companies possessing large volumes of data with a sense of fear. As the public begins to come to grips with the peril of surrendering too much data, can they now trust companies to have the bottle to take a stand to stamp out further abuse?

While most companies do not decidedly take sides in controversial legal battles, their present modus operandi means that they may be unable to avoid getting involved. How would firms react when receiving legitimate requests from law enforcement? Under what obligations, legal or ethical, are companies like Google responsible or liable to relinquish information? Companies drawn into this conflict might assume different and often tendentious positions on a case-by-case basis. A small pivot in managerial direction could change the fate of millions of people. It must be asked how, and in what ways, can companies truly insulate themselves from legal crossfire?

Roe provides a live example of how decisions about bodily autonomy can cut across elusive digital boundaries with grave consequences. One can argue that our government watchdog should strictly enforce the principle of data minimization, the practice of limiting the collection of personal information. This might also prove to be illusory at best. Determining exactly what data a company needs to carry out a task properly has always been challenging. Whipsawed by cutthroat competition and bulging consumer demands, companies typically adopt a toxic collect-first, think-later culture at the expense of individual privacy. There is also the irreconcilable duality between data privacy and data collection. The more shackles we place on data mining activity, the fewer insights we can gather, hindering innovation and creativity.

Privacy, in its permutations, crisscrosses multiple legal territories. It involves giving individuals the right to control decisions about themselves. As can be gleaned from Roe, a judgment about abortion can introduce an unavoidable element of unpredictability into privacy principles. The abortion debate quickens the need for companies to decide what to do with sensitive data within that regulated boundary, whether for good or for evil. But the debate also shows how elastic our wider digital ecosystem is to the ebbs and flows of real-life frictions. As we broaden our viewfinders, we must be prepared to accept that societal realities could sometimes bend our understanding of the borderless digital experiences. And only through coordinated efforts and carefully-designed policies could we expect to shake off the yoke of a grimly uncertain future.

-- Contact us at [emailprotected]

The writer is a lawyer based in Hong Kong.


Is Viscount Mining (CVE:VML) In A Good Position To Deliver On Growth Plans? – Yahoo Finance

Even when a business is losing money, it's possible for shareholders to make money if they buy a good business at the right price. For example, biotech and mining exploration companies often lose money for years before finding success with a new treatment or mineral discovery. Having said that, unprofitable companies are risky because they could potentially burn through all their cash and become distressed.

So, the natural question for Viscount Mining (CVE:VML) shareholders is whether they should be concerned by its rate of cash burn. In this report, we will consider the company's annual negative free cash flow, henceforth referring to it as the 'cash burn'. The first step is to compare its cash burn with its cash reserves, to give us its 'cash runway'.

View our latest analysis for Viscount Mining

You can calculate a company's cash runway by dividing the amount of cash it has by the rate at which it is spending that cash. When Viscount Mining last reported its balance sheet in February 2022, it had zero debt and cash worth CA$1.6m. In the last year, its cash burn was CA$2.6m. So it had a cash runway of approximately 8 months from February 2022. That's quite a short cash runway, indicating the company must either reduce its annual cash burn or replenish its cash. Depicted below, you can see how its cash holdings have changed over time.

[Chart: Viscount Mining debt and equity history]
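
The runway arithmetic described above is simple to reproduce. Here is a minimal sketch using the figures quoted in the article (CA$1.6m of cash against CA$2.6m of annual burn); the function is a generic illustration rather than Simply Wall St's own model.

```python
# Cash runway = cash on hand divided by the rate at which it is being spent.
# Figures below are the article's, in CA$ millions.

def cash_runway_months(cash: float, annual_cash_burn: float) -> float:
    """Months of runway, given cash on hand and the annual burn rate."""
    monthly_burn = annual_cash_burn / 12.0
    return cash / monthly_burn

runway = cash_runway_months(cash=1.6, annual_cash_burn=2.6)
print(f"Runway: {runway:.1f} months")  # ~7.4 months, i.e. roughly 8 months
```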

Viscount Mining didn't record any revenue over the last year, indicating that it's an early stage company still developing its business. So while we can't look to sales to understand growth, we can look at how the cash burn is changing to understand how expenditure is trending over time. Over the last year its cash burn actually increased by 9.1%, which suggests that management are increasing investment in future growth, but not too quickly. However, the company's true cash runway will therefore be shorter than suggested above, if spending continues to increase. Viscount Mining makes us a little nervous due to its lack of substantial operating revenue. We prefer most of the stocks on this list of stocks that analysts expect to grow.

While its cash burn is only increasing slightly, Viscount Mining shareholders should still consider the potential need for further cash, down the track. Issuing new shares, or taking on debt, are the most common ways for a listed company to raise more money for its business. Many companies end up issuing new shares to fund future growth. By looking at a company's cash burn relative to its market capitalisation, we gain insight on how much shareholders would be diluted if the company needed to raise enough cash to cover another year's cash burn.

Viscount Mining has a market capitalisation of CA$26m and burnt through CA$2.6m last year, which is 10.0% of the company's market value. That's a low proportion, so we figure the company would be able to raise more cash to fund growth, with a little dilution, or even to simply borrow some money.

Even though its cash runway makes us a little nervous, we are compelled to mention that we thought Viscount Mining's cash burn relative to its market cap was relatively promising. Summing up, we think the Viscount Mining's cash burn is a risk, based on the factors we mentioned in this article. Taking a deeper dive, we've spotted 4 warning signs for Viscount Mining you should be aware of, and 3 of them make us uncomfortable.

Of course, you might find a fantastic investment by looking elsewhere. So take a peek at this free list of companies insiders are buying, and this list of growth stocks (according to analyst forecasts).


This article by Simply Wall St is general in nature. We provide commentary based on historical data and analyst forecasts only using an unbiased methodology and our articles are not intended to be financial advice. It does not constitute a recommendation to buy or sell any stock, and does not take account of your objectives, or your financial situation. We aim to bring you long-term focused analysis driven by fundamental data. Note that our analysis may not factor in the latest price-sensitive company announcements or qualitative material. Simply Wall St has no position in any stocks mentioned.



Denmark is banning Chromebooks in schools: here's why I'm on board – Laptop Mag

Denmark is banning Chromebooks in schools after Helsingør municipality officials were ordered to undergo a data-risk assessment regarding how Google processes personal information.

Datatilsynet, Denmark's data protection agency, released a verdict last week stating that Google's cloud-based Workspace doesn't meet the European Union's GDPR data-privacy regulations. Why? The investigation discovered that Google transfers data from Europe-based servers to US servers, which clashes with GDPR's security and privacy standards.

The ruling is currently specific to Helsingør; the municipality has until Aug. 3 to rid itself, and its students, of Chromebooks and Google Workspace. However, because the Datatilsynet report stated that there was a personal data breach in 2020, the protection agency noted that the ruling would most likely apply to other municipalities in the future.

The main cause for this situation is the now defunct EU-US Privacy Shield that controlled how data was shared between the US and the EU. With no regulation currently in place, this leaves EU users vulnerable to the less-stringent data regulations of the United States, which are less concerned with anonymizing personal data.

Interestingly enough, data transfers between the US and Europe have been considered illegal since a previous ruling, "the Schrems II case" in 2020, which nullified the existing US-EU Privacy Shield agreement due to it not meeting the standards of the GDPR. These announcements and rulings follow a trend of European nations, such as Italy, France, and Austria, determining that websites using Google Analytics violated European data-privacy protection rules.

A Google spokesperson recently told TechCrunch the following:

"Schools own their own data. We only process their data in accordance with our contracts with them. In Workspace for Education, students data is never used for advertising or other commercial purposes. Independent organizations have audited our services, and we keep our practices under constant review to maintain the highest possible standards of safety and compliance."

This is gobbledygook for "no harm, no foul" and doesn't address the issue.

Recently, Ireland's DPC (Data Protection Commission) looked into how Meta, Facebook's parent company, has been transferring data between Europe and the US, which is possibly impacting WhatsApp and Instagram users.

While nations within the EU fight for citizens' data protection, it leaves US residents to wonder how much of their personal information is being mined.

As a tech reviewer, I've always found Chromebooks less attractive than PCs and Macs. When you add that many EU nations are putting their legal foot on Google's neck due to data security issues, what's the point of buying one?

Chrome OS devices are ideal for checking emails, watching YouTube, and diving into Google's productivity suite of apps (e.g., Docs, Sheets and Slides); however, you can't do any serious gaming, video editing or photo editing. As such, Google's prevalent security and privacy issues further weaken the case for purchasing a Chromebook.

Sure, Chromebooks are more affordable, but so was my second-hand car, which turned out to be utterly useless and dangerous to drive. Lastly, one of the biggest faults with Chrome OS is the Google Play store, which is filled with malicious, malware-infested apps used for data mining and for stealing personal information and money.

Google must take a page from Apple and tighten up its security vulnerabilities before consumers jump ship and seek alternative operating systems. If you want to know what you can do to protect yourself, I suggest looking into setting up a VPN, or maybe even a double VPN, which one of our contributors recently wrote about. Stay vigilant, my friends!

h/t TechCrunch


Crypto miners moved over $300 million of bitcoin in one day, and some are dropping out altogether – CNBC

New data from blockchain analytics firm CryptoQuant shows that miners are rapidly exiting their bitcoin positions.

Some 14,000 bitcoin, worth more than $300 million at the current price, were transferred out of wallets belonging to miners in a single 24-hour period at the end of last week, and in the last few weeks miners have offloaded the largest amount of bitcoin since January 2021. The phenomenon is called "miner capitulation," and it typically indicates that miners are preparing to sell their previously mined coins in order to cover ongoing mining expenses.

Bitcoin is currently trading around $21,600, up about 3% in the last 24 hours. Still, the wider crypto market has been in a slump for months, with bitcoin down nearly 70% from its all-time high of around $69,000 in Nov. 2021.

Meanwhile, inflation is on a tear, and the cost of energy is hitting record highs as the war between Russia and Ukraine rages on.

Lower bitcoin prices and higher energy costs are compressing profit margins for miners, which is part of why some are selling bitcoin at current prices to try to contain exposure to continued volatility in the sector and mitigate against further risk to their bottom line.

"Given rising electricity costs, and bitcoin's steep price decline, the cost of mining a bitcoin may be higher than its price for some miners," Citi analyst Joseph Ayoub wrote in a note on July 5.

"With high-profile reports of resignations from mining companies, as well as miners that have used their equipment as collateral to borrow money, the bitcoin mining industry could be under growing pressure," the note continued.

Core Scientific, which is one of the largest publicly traded crypto mining companies in the U.S., sold nearly all its bitcoin in June. CEO Mike Levitt tells CNBC that just like any other business, bitcoin miners need to pay their bills.

"We mine and earn or produce bitcoin, but our costs, expenses, and liabilities are in dollars," said Levitt.

It's still profitable to mine bitcoin, Levitt says, with around 50% margins across the industry. That's down from 80% margins at its peak.

Last month, Core sold 7,202 bitcoin at an average price of $23,000. Levitt tells CNBC they invested the proceeds of approximately $167 million primarily into growth-oriented activities, including new ASIC servers and additional data center capacity for their self-mining and colocation businesses.

But they also deployed some of that capital to repay debt and to help settle five years of employee stock grants.

Long-term, Levitt is optimistic because there's tremendous positive operating leverage in the business. Over certain levels, every dollar increase in the price of bitcoin is 100% operating income to bitcoin miners.

"We would all be cheering loudly if bitcoin were to get back to $35,000, $40,000. There is no doubt about that," he said.

But productivity per unit of electricity also matters, and when prices are low, large-scale miners like Core Scientific tend to face less competition from hobbyists and small operations.

"As prices fall, the global hashrate or the competition for the production of bitcoin decreases, as less efficient miners come off the network," explained Levitt.

The hashrate is a term used to describe the computing power of all miners in the bitcoin network, and it is down 15% in the last month. That is ultimately a good thing for the large-scale miners who can afford to weather the downturns.

As less efficient miners come off the network and global hashrate declines, machines that continue to mine bitcoin get more productive.

"And thus, the cost of energy, if you will, per bitcoin produced, goes down," said Levitt.


Digital health investment is just cooling off after a scorching year – MedCity News

Last year was a record year for venture capital investment in digital health startups: the sector raised $29.1 billion across 729 deals, with an average deal size of $39.9 million. This market boom has ended, but the digital health investment space has not come crashing down by any means, according to a recent report from Rock Health. It's just cooling off after a scorching year.

Digital health startups raised $10.3 billion across 329 deals during the first half of this year, with an average deal size of $31.2 million. This puts the sector on track to rake in $21 billion in 2022, about $8 billion less than the total amount raised last year.
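
A quick check of the run-rate math behind the "on track to rake in $21 billion" claim, using the first-half figures quoted above and a simple pro-rata doubling assumption.

```python
# Annualising the first-half totals reported by Rock Health.

h1_total_bn = 10.3   # raised in H1 2022, in billions of dollars
h1_deals = 329

avg_deal_size_mn = h1_total_bn * 1000 / h1_deals   # ~31.3, article says $31.2m
full_year_estimate_bn = h1_total_bn * 2            # ~20.6, rounded to roughly $21bn

print(round(avg_deal_size_mn, 1), round(full_year_estimate_bn, 1))
```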

Though the pace of investments has decreased in the first half of this year compared to 2021, venture capital funding for healthcare companies is still pacing ahead of where it was in 2020, David Blumberg, founder and managing partner of venture capital firm Blumberg Capital, pointed out in an interview. He said investment firms are definitely still consistently pouring money into the digital health space, and these startups are still developing technology that is attractive to health systems.

As health systems continue to generate a tremendous amount of data, funding is flowing towards startups that are deploying AI and machine learning solutions to harness this data for prevention, diagnosis and treatment, he said.

Health systems are also continuing to adopt technology from startups focusing on telehealth and remote patient monitoring. Blumberg noted that this trend was strengthened by the pandemic, which accelerated our societal migration toward a more virtualized mode of living and working.

When choosing which healthcare companies to invest in, Blumberg Capital only considers startups that can prove they leverage data to improve clinical outcomes and lower costs. Blumberg said the firm looks at six Ts when evaluating companies: theme, team, terrain, technology, traction and terms. All companies receiving Blumberg Capital funding must be able to explain these thoroughly.

Ferrum and Theator are examples of two companies that made it into Blumberg Capital's portfolio in the past two years. The former uses algorithms to automatically and inexpensively scan hospitals' radiology diagnoses as a second opinion for quality assurance, error reduction and training purposes. The latter sells an AI-powered surgical tool that extracts and annotates key moments from real-world procedures to help train surgeons for future operations.

Blumberg Capital is an early-stage investment firm, so Blumberg said it will always be able to choose from a steady flow of healthcare startups promising to harness data in new ways to benefit patients and healthcare professionals. The investment space looks a bit different for later-stage startups. Later-stage digital health startups are using this market moment to reconsider valuations, reduce expenses and design their go-to-market strategies, according to the report.

Some of these companies may need to adjust their expectations following large Series A or B deals, which could mean selling shares at a lower price than they were sold for in a previous financing round. But this isn't true for all digital health startups that experienced rapid growth in their early years; Rock Health's report noted that some of the companies within its own portfolio have exceeded pre-pandemic financial projections by significant margins.

One thing that is certain is that 2022 won't see as many startups enter public markets as last year. So far this year, no startups have gone public, compared to 23 exits in 2021.



Neuromorphic Computing Market SWOT Analysis, Latest Innovations, Emerging Trends, Industry Size, Growth Outlook and Forecast 2029 – Digital Journal

The reliable Neuromorphic Computing Market report provides actionable market insights from which companies can build long-lasting and profitable strategies. Needless to say, the report helps companies to make different strategies by analyzing general market conditions such as product price, profit, capacity, production, supply, demand and market growth rate. A SWOT analysis was performed while formulating this market document, along with many other standard research, analysis and data collection stages. Additionally, key players, key collaborations, mergers, acquisitions, innovation trends and business policies are also re-evaluated in the Neuromorphic Computing Market report. It makes it easier to understand brand awareness and knowledge about your brand and products among potential customers.

Market Analysis and Overview

The Neuromorphic Computing market is expected to witness market growth at a rate of 52.50% during the forecast period 2022-2029 and is expected to reach a value of $34.61 billion by 2029. The Data Bridge Market Research report on the Neuromorphic Computing Market provides analysis and insights into the various factors that are expected to prevail during the forecast period, along with their impact on market growth. Growing demand for technology globally is accelerating the growth of the neuromorphic computing market.
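
Assuming, as such reports usually intend, that the 52.50% figure is an annual compound growth rate applied over 2022-2029, the headline numbers imply a 2022 base of roughly $1.8 billion; the short sketch below shows how the compounding works.

```python
# How a CAGR headline compounds, and the starting market size it implies.

def implied_base(value_end: float, cagr: float, years: int) -> float:
    """Back out the starting value implied by an end value and a CAGR."""
    return value_end / (1 + cagr) ** years

def project(value_start: float, cagr: float, years: int) -> float:
    """Compound a starting value forward at the given annual rate."""
    return value_start * (1 + cagr) ** years

base_2022 = implied_base(34.61, 0.525, 7)      # ~1.8 (billion USD)
print(round(base_2022, 2))
print(round(project(base_2022, 0.525, 7), 2))  # 34.61, by construction
```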

Neuromorphic computing, also known as neuromorphic engineering, refers to the use of large, integrated systems made up of various analog circuits. These systems allow replication of the neurobiological behaviors present in the human nervous system. The Neuromorphic Computing Platform includes two critical systems based on custom hardware architectures. These systems are designed to program neural microcircuits by applying brain-like thought processes to cognitive computing.

The growing number of applications in automation worldwide is one of the major factors driving the growth of the neuromorphic computing market.

Get Sample Copy of Report @ https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-neuromorphic-computing-market

Segmentation:

What benefits can DBM research studies bring?

Latest Trends Affecting Industry and Development Scenarios

Open new markets

Seizing strong market opportunities

Key decisions to plan and increase market share

Identify key business segments, market propositions and gap analysis

Marketing investment allocation support

Leading companies reviewed in the Neuromorphic Computing Market report are:

Key players operating in the Neuromorphic Computing Market report are: Intel Corporation, IBM, BrainChip, Qualcomm Technologies, Inc., Hewlett Packard Enterprise Development LP, SAMSUNG, HRL Laboratories, LLC, General Vision Inc., Applied Brain Research, Vicarious, Numenta, Aspinity analogML, BrainCo, Inc., Bitbrain Technologies, Halo Neuroscience, Linux Kernel Organization, Inc., Nextmind SRL, Cognixion, Inc., NeuroPace, Inc., MindMaze and Innatera Nanosystems BV, among others.

In the full-scale Neuromorphic Computing Market report, industry trends are detailed at a macro level, which helps to explain the market landscape and possible future issues. This market research report makes a systematic analysis of the market and various relevant factors, ranging from market drivers, market restraints, strategies, market segmentation, opportunities, challenges and market revenues to competitive analysis. This report analyzes and estimates the general drivers of the market in the form of consumer buying patterns and consequently consumer demand, government policies and demand related to the growth and development of the market.

Find more information at https://www.databridgemarketresearch.com/reports/global-neuromorphic-computing-market

Highlights of the Neuromorphic Computing Market Report:

The latest market dynamics, development trends and growth opportunities are presented along with industry barriers, development threats, and risk factors.

The forecast data of Neuromorphic Computing market will help in feasibility analysis, market size estimation, and development.

This report serves as a comprehensive guide to micro-monitoring all important aspects of the Neuromorphic Computing Market.

A brief view of the market will facilitate understanding.

A competitive perspective on the Neuromorphic Computing market will help players make the right choices.

Country-level analysis

The Neuromorphic Computing market is analyzed, and market size insights and trends are provided, by country, distribution channel, end user and connectivity, as mentioned above.

The countries covered in the Neuromorphic Computing market report are the United States, Canada and Mexico in North America; Brazil, Argentina, Peru and the rest of South America; Germany, Italy, the United Kingdom, France, Spain, the Netherlands, Belgium, Switzerland, Turkey, Russia, Hungary, Lithuania, Austria, Ireland, Norway, Poland and the rest of Europe; Japan, China, India, South Korea, Australia, Singapore, Malaysia, Thailand, Indonesia, the Philippines, Vietnam and the rest of Asia-Pacific (APAC); and South Africa, Saudi Arabia, the United Arab Emirates, Kuwait, Israel, Egypt and the rest of the Middle East and Africa (MEA).

A few points from the table of contents

Part 01: Summary

Part 02: Report Scope

Part 03: Neuromorphic Computing Market Outlook

Part 04: Neuromorphic Computing Market Size

Part 05: Neuromorphic Computing Market Segmentation by Product

Part 06: Five Forces Analysis

Part 07: Customer Environment

Part 08: Geographical Landscape

Part 09: Decision-making framework

Part 10: Drivers and Challenges

Part 11: Market Trends

Part 12: Supplier Status

Part 13: Supplier Analysis

Request TOC @ https://www.databridgemarketresearch.com/toc/?dbmr=global-neuromorphic-computing-market

Data Bridge Market Research Information

Data Bridge Market Research has identified itself as a non-traditional and innovative market research and consulting firm with an unmatched level of resilience and an integrated approach. We are committed to uncovering the best market opportunities and nurturing effective information to help your business thrive in the marketplace.

Data Bridge is committed to providing the right solutions to complex business problems and initiates an easy decision-making process. We think about heterogeneous markets according to the needs of our clients and find the best solutions and detailed information on market trends. Data Bridge serves markets in Asia, North America, South America and Africa.

Data Bridge knows how to create satisfied customers who depend on our services and are confident and satisfied with our efforts. We are pleased with our glorious 99.9% customer satisfaction rating.

Contact us:

Data Bridge Market Research

USA: +1 888 387 2818

UK: +44 208 089 1725

Hong Kong: +852 8192 7475

Email [emailprotected]
