Category Archives: Data Mining

Allorion Debuts with $40 Million to Enhance and Discover Precision Targets – BioSpace

Allorion Therapeutics is thankful for the $40 million in Series A financing it announced Wednesday morning, a day ahead of Thanksgiving.

With headquarters in both Natick, Massachusetts, and Guangzhou, China, Allorion's drug discovery engine combines advances in protein structure, big data, machine learning and gene editing to discover and develop highly selective small molecules.

The precision medicine company is developing mutant-selective and isoform-specific drugs in non-conventional ways for well-known targets in the oncology and autoimmune spaces, with the aim of improving efficacy and preventing resistance. Alongside existing targets, Allorion's proprietary technology systematically screens for synthetic lethality targets and allosteric inhibitors.

The company believes that carefully cultivating the synergy of these two approaches will allow it to build a robust pipeline with the intention of transforming the paradigm for cancer and autoimmune disease.

Founder and CEO Peter Ding expressed optimism about where Allorion currently stands and where he sees it heading.

"Over the past year, Allorion has built up R&D capabilities in Boston and Guangzhou and formed a strong management and R&D team. Multiple projects achieved their milestones," Ding said in a statement. "We are grateful to all the investors for their trust and support. Allorion will leverage cutting-edge technologies and strive to make precision medicine more precise and accessible to more patients."

The funding was led by Chinese VC Qiming Venture Partners, with participation from IDG Capital, Octagon Capital, Firstred Capital and Elikon Venture. Original investors TF Capital and Med-Fine Capital continue to like what they see and returned for this round.

"There are huge unmet medical needs for autoimmune disease and cancer therapy globally. Based on data mining and an in-depth understanding of disease biology, Allorion focuses on the early discovery and development of precision medicines. We have confidence in the team's strong R&D capabilities. We hope to support Allorion to grow into a globally-recognized company and improve patients' life quality," said Qiming principal Chen Kan.

Allorion will apply the new funds to advance its preclinical projects and Investigational New Drug (IND)-enabling studies, and to support the IND applications for two drug candidates. The company will also ramp up its investment in its novel screening technologies and further build out both its clinical and business development teams.

"The completion of this financing round shows investors' recognition of the progress and the support for the company's long-term strategy on highly innovative platforms for best- or first-in-class drug discovery," Ding added.

The oncology space can be thankful this year for a number of other innovative new players. Elucida Oncology launched in January to develop drug conjugates with its C-Dots, which precisely target and penetrate tumors, and ArriVent Biopharma debuted in June with $150 million in Series A financing and an epidermal growth factor receptor tyrosine kinase inhibitor (EGFR TKI) candidate for lung cancer.

Read the original:

Allorion Debuts with $40 Million to Enhance and Discover Precision Targets - BioSpace

Needed: Discoveries to feed green economy – www.mining-journal.com

A paradox of the mineral exploration sector is that it is both distinct and inseparable from the broader mining industry. Although the odds of a greenfields exploration project ever becoming a mine have been estimated at around one in one thousand, every mine begins life as an exploration project.

With this in mind, explorers should always be mindful of the challenges facing the industry. In the coming decade, there will be no two greater challenges than the need to consistently demonstrate strong sustainability credentials, including appropriate environmental, social and governance (ESG) principles and practices, and to meet the growing material needs of the world economy and emerging green economy.

ESG: An opportunity, not a hindrance

More projects than ever are failing to advance not because of technical issues, but because of environmental and social ones. It is incumbent on the exploration sector to address sustainable work practices, including ESG, early in the project life cycle rather than setting them aside until projects reach the development stage. Integrating sustainability into decision-making right from the start of an exploration project provides the foundation for constructive long-term engagement with all stakeholders, thereby significantly de-risking a project. Sustainability should therefore be viewed not as a hindrance to exploration but as an opportunity.

Establishing a respectful, open, engaged and supportive relationship with the local community, where shared values can be established, is critical to securing a sustained social licence to operate (or explore). Part of that engagement may include highlighting the types of activities involved in exploration, and thereby the differences between exploration and mining, but there are many other aspects to it: responsible behaviour, being open to considering alternatives (e.g., non-invasive versus invasive activities, how teams move around properties, etc.), looking for collaborative opportunities, understanding the rights and perspectives of various stakeholders, and maintaining open, regular and effective communication, to note just a few.

Establishing a positive and constructive track record at the early stages builds trust and respect, and helps lay the foundation should discovery and exploration success occur. Success in this area, coupled with discovery success, should lead to better outcomes with investors (and, by extension, exploration funding).

Uptake of non-invasive, lower-impact and more environmentally sustainable on-site technologies will make projects more environmentally friendly and assuage the fears of communities who may only know mining from negative depictions in film or the wider media.

These non-invasive to less-invasive technologies include:

Drones, for low-impact acquisition of geophysical or other remote sensing imagery;

Passive geophysics, techniques that allow for a greater understanding of subsurface geology and structure without the use of disruptive seismic or electrical sources;

Non-invasive geochemical surveys; most surface geochemical surveys are relatively non-invasive, but technology such as ionic geochemistry allows for rapid anomaly detection with the smallest possible impact;

Deep 3D geophysical inversion modelling, utilising new geophysical techniques to assess deep signatures, with geological models used to help constrain the inversions;

Remote sensing and hyperspectral platforms, greater use of these tools early on to guide exploration ensures fewer areas require invasive exploration; and

Downhole monitoring, using downhole tools and probes to maximise data capture and utilise every borehole to its full extent.

Greater adoption of non-invasive technologies offers the added benefits of reducing costs and improving technical efficiency, KPIs that will only grow in importance as explorers go to increasingly greater depths to discover the minerals needed to fuel the post-COVID economic recovery and assist in the decarbonisation of the global economy.

Better technical efficiency reduces risk

The exploration sector is also a rare case of an industry whose job gets harder the more successful it becomes. As the number of viable deposits near surface dwindles, especially in well-established mining jurisdictions, explorers need to go under cover in search of new discoveries, increasing uncertainty and risk. This will increase our reliance on new technology and on geological concepts such as systems science. This is where technical efficiency and effectiveness become critical.

Adoption of non-invasive technologies like those mentioned above will help. So too will new invasive technologies such as coil drilling, which reduces the cost and increases the speed of drilling, allowing us to drill at greater depths where the next Tier 1 deposits are more likely to be found. Coil drilling technology also provides an added benefit by reducing the drilling footprint and thereby minimising environmental impacts. Couple this with new lab-at-rig technology and other developments like portable XRFs and PhotonAssay and we have the potential to dramatically speed up mineral exploration.

Data science, and specifically machine learning, is another important growth area for the exploration sector and the mining industry in general. Today, our industry collects vast quantities of data, too much for any geologist or team of geologists to analyse efficiently in a reasonable timeframe. A machine learning model can be trained to examine reams of data, whether proprietary company data or historical pre-competitive data in jurisdictions where it is available, in a much shorter period of time. It is important to note, however, that machine learning is not a silver bullet; it performs the grunt work for the geologist, much like a paralegal does for an attorney. But it is also much more than that, having, for example, the potential to detect connections between disparate data sources that would be virtually invisible to the human eye.
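
To make the paralegal analogy concrete, here is a minimal, purely illustrative sketch (not from the article): it trains a random forest on a synthetic table of hypothetical geochemical and geophysical features to rank drill targets. All column names, thresholds and data are invented for demonstration.

```python
# Illustrative sketch only: ranking hypothetical exploration targets with a classifier.
# Feature names, thresholds and data are invented for demonstration, not from the article.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000  # pretend we have 2,000 historical drill sites

# Synthetic "reams of data": geochemistry, geophysics and remote-sensing proxies
data = pd.DataFrame({
    "cu_ppm": rng.lognormal(3.0, 1.0, n),        # soil copper assay
    "mag_anomaly": rng.normal(0, 50, n),          # magnetic anomaly (nT)
    "grav_residual": rng.normal(0, 1.5, n),       # residual gravity (mGal)
    "alteration_index": rng.uniform(0, 1, n),     # hyperspectral alteration proxy
})
# Hypothetical label: did the hole intersect mineralisation?
data["hit"] = ((data["cu_ppm"] > 40) & (data["alteration_index"] > 0.6)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    data.drop(columns="hit"), data["hit"], test_size=0.25, random_state=0
)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Rank untested targets by predicted probability of a hit
print("holdout accuracy:", model.score(X_test, y_test))
print(dict(zip(X_train.columns, model.feature_importances_.round(2))))
```

The point is the division of labour: the model does the bulk screening across thousands of records, while the geologist judges whether the ranked targets make geological sense.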

Discovery success is not just about having a large geological team with access to these technologies. Better efficiencies will also require the adoption of certain organisational principles. Having a technically strong team is fundamental to exploration success, as Robert Friedland has demonstrated with Turquoise Hill Resources and Ivanhoe Mines and others have shown with various exploration ventures. Coupling technical capabilities with a strong team culture has also been a successful combination, as the leaders of Western Mining Corporation (later acquired by BHP) knew from the late 1960s to the 1990s, when they made numerous discoveries including Kambalda and St Ives in Western Australia and the giant Olympic Dam copper-uranium deposit in South Australia.

Finally, the need to improve technical efficiency demands new ways of thinking. One potential new approach is scenario planning, a strategy of planning for multiple future scenarios pertaining to one's business. Building on this concept, researchers at the University of Western Australia's Centre for Exploration Targeting are developing a 'multiple hypothetical reserves' approach, whereby explorers would develop multiple scenarios about currently undiscovered mineral accumulations that could be extractable in the future.

Discovery rates must improve

Countless observers have noted that the continued growth of the world economy, and particularly of low-carbon technologies like batteries, wind turbines and solar panels, will require much larger quantities of certain commodities, such as copper, cobalt, lithium and graphite, than can be produced from existing (or known) resources and reserves. It falls to the exploration sector to meet the growing needs of this new green reality, and to do so, our success and discovery rates must improve. Together with adequate funding, this will require the sector to embrace and successfully integrate new technologies and new mindsets with sound geological knowledge and thinking.

Read more:

Needed: Discoveries to feed green economy - http://www.mining-journal.com

Lore has it that there's a lull leading up to Santa Claus rallies; here's what the statistics show – MarketWatch

The stock market does not suffer from seasonal weakness between now and Christmas.

You may never have heard of a supposed pre-Santa lull on Wall Street. Given the media's relentless focus on a so-called Santa Claus Rally, you have probably been focused instead on the possibility of stock market strength in the weeks leading up to Christmas.

But you should never underestimate analysts' appetite for slicing and dicing the data in new ways. And one such slice of the historical data is suggesting that, because we're in the first year of the presidential cycle, stocks will go sideways between now and Christmas, which is when year-end strength kicks in.

Don't believe it.

There's no seasonally based reason to expect the stock market's performance leading up to Christmas to be any different than it is over any other five-week stretch of the calendar. That doesn't mean the stock market won't exhibit weakness in coming weeks. But if it does, it won't have anything to do with it being late November and the first weeks of December in the first year of the presidential cycle.

The accompanying chart focuses on the Dow Jones Industrial Average (DJIA) back to its creation in 1896, measuring its average return between Nov. 17 and Dec. 24. Its average gain over this five-week period is 0.54%, versus an average of 0.72% for all five-week periods across the entire calendar. Had I stopped the analysis at this point, you would have some basis for thinking there is a pre-Santa lull, though the difference of 18 basis points is of doubtful statistical significance.

But notice the results when segregated by year of the presidential cycle. The first year of that cycle (this year, in other words) has produced an average DJIA gain of 0.74% in this pre-Santa period. While the 2-basis-point margin above the overall average is not statistically significant, the more important takeaway is that there is no reason to expect this year will be a below-average one for the stock market.
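
For readers who want to try this kind of slicing themselves, a rough sketch along these lines is shown below. It assumes a hypothetical CSV of daily DJIA closes (djia_daily.csv) and approximates both the Nov. 17 to Dec. 24 window and the presidential-cycle grouping; it is not Hulbert's exact methodology.

```python
# Rough sketch of the "pre-Santa" seasonal slice, not Hulbert's exact method.
# Assumes a hypothetical file djia_daily.csv with columns: date, close.
import pandas as pd

px = pd.read_csv("djia_daily.csv", parse_dates=["date"]).set_index("date")["close"]

rows = []
for year in range(1896, 2021):
    window = px.loc[f"{year}-11-17":f"{year}-12-24"]
    if len(window) < 2:
        continue  # skip years with missing data
    ret = window.iloc[-1] / window.iloc[0] - 1.0
    # Approximate cycle position: 1897, 1901, ... (post-election years) count as year 1.
    cycle_year = (year - 1897) % 4 + 1
    rows.append({"year": year, "cycle_year": cycle_year, "pre_santa_return": ret})

df = pd.DataFrame(rows)
print("all years:", f"{df['pre_santa_return'].mean():.2%}")
print(df.groupby("cycle_year")["pre_santa_return"].mean().map("{:.2%}".format))
```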

Even if the stock market's performance over the five weeks prior to Christmas differed in a statistically significant way from the long-term average, however, the pattern would have to jump over another hurdle before it would make sense to bet on it. That hurdle is the need for a theoretical justification for why the pattern should exist in the first place.

I am aware of no such explanation in the case of a pre-Santa lull in the first year of the presidential cycle. And I'm not holding my breath that any will ever be found.

That's because this pattern, like most other seasonal patterns, is the result of shameless data-mining exercises. As any statistician can attest, if you torture the data long and hard enough, you can get it to say almost anything you want.

My favorite example of data mining comes from David Leinweber, head of the Center for Innovative Financial Technology at the Lawrence Berkeley National Laboratory. He found that you could explain 99% of the variation in the S&P 500's (SPX) return with a simple model containing just four inputs: butter production in Bangladesh, American cheese production, and American and Bangladeshi sheep populations.
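
Leinweber's point is easy to reproduce with nothing but noise: give a regression enough unrelated candidate predictors relative to the number of observations, keep only the best-looking ones, and the in-sample fit climbs toward 1. The sketch below uses purely synthetic random data (no real butter, cheese or sheep statistics) to show the effect.

```python
# Demonstration of data mining with pure noise: high in-sample R-squared, zero real signal.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_years, n_candidates = 12, 200              # few observations, many candidate predictors

returns = rng.normal(0.08, 0.15, n_years)                # fake annual "index returns"
candidates = rng.normal(size=(n_years, n_candidates))    # fake "butter/cheese/sheep" series

# "Mine" the data: keep the 4 noise series most correlated with returns, then fit.
corr = np.array([abs(np.corrcoef(candidates[:, j], returns)[0, 1])
                 for j in range(n_candidates)])
best4 = candidates[:, np.argsort(corr)[-4:]]

model = LinearRegression().fit(best4, returns)
print(f"in-sample R^2 with 4 hand-picked noise series: {model.score(best4, returns):.2f}")
# Out of sample, of course, these "predictors" have no power at all.
```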

This isn't to say that no seasonal patterns exist. A few do, and they rest on strong theoretical foundations.

But most do not, so you should be skeptical whenever you read or hear of another analyst discovering some uncanny pattern in the stock market. Your first instinct should be to think back to Leinweber's example and how obviously irrelevant butter, cheese and sheep in Bangladesh and the U.S. are to the stock market.

Mark Hulbert is a regular contributor to MarketWatch. His Hulbert Ratings tracks investment newsletters that pay a flat fee to be audited. He can be reached at mark@hulbertratings.com.

Excerpt from:

Lore has it that there's a lull leading up to Santa Claus rallies; here's what the statistics show - MarketWatch

Who Says AI Is Not For Women? Here Are 6 Women Leading AI Field In India – SheThePeople

"I don't see tech or AI as hostile to women. There are many successful women in AI, both at the academic and industry levels," says Ramya Joseph, the founder of Pefin, an AI-based entrepreneurial start-up and the world's first AI financial advisor. "And even on my team at Pefin, women hold senior technology positions. There tends to be a misconception that tech attracts a geeky or techy kind of personality, which is not the case at all."

Joseph has a bachelor's degree in computer science and master's degrees in Artificial Intelligence, Machine Learning and Financial Engineering. As a wife, mother and daughter, Joseph could closely relate to the need for sound financial advice when planning for the future. The idea for Pefin came when her father lost his job and, for lack of financial advice, jeopardised his retirement plans. Navigating and solving his problems, Joseph realised that many people faced the same issue, and so she set out to build an AI-driven financial adviser.

Artificial Intelligence is undoubtedly one of the fastest-growing professional fields. As new inventions and developments knock at our doors, the relationship between humans and computers is being reassessed. With the expansion of AI, new skills and exceptional human labour are in high demand. But the problem is that, despite the evolution of society, the gender gap is not shrinking. According to the World Economic Forum, only 22 per cent of AI professionals are women; the report suggests a gender gap of around 72 per cent.

Despite this, many women are breaking the glass ceiling and reshaping the field of Artificial Intelligence. Through their skills and leadership, these women are carving a path for other women to participate as AI professionals. So in this article, I am going to list some women AI professionals in India who are changing gender dynamics through their excellence.

Amarjeet Kaur is a research scientist at TechMahindra. She has a PhD in Computer Science and Technology. Kaur specialises in research techniques and technologies like graph-based text analysis, latent semantic analysis and concept maps among others. She also has expertise in experimentation and field research, data collection and analysis and project management. She is known for her organisational skills and willingness to take charge.

Kaur has also worked with the Department of Science and Technology under its Women Scientist Scheme, helping to develop a technique to automatically evaluate long descriptive answers. With more than ten years of research and teaching experience, she has strong academic credentials: her work earned her a gold medal and a topper's position at Mumbai University, and her course material has been incorporated into Mumbai University's artificial intelligence and machine learning courses.

Sanghamitra Bandyopadhyay works at the Machine Intelligence Unit of the Indian Statistical Institute, where she also completed her PhD and served as director from 2015 to 2020. Bandyopadhyay is also a member of the Science, Technology and Innovation Advisory Council of the Prime Minister of India (PM-STIAC). She specialises in fields like machine learning, bioinformatics, data mining, and soft and evolutionary computation.

She has received several awards for her work, including the Bhatnagar Prize, the Infosys award, the TWAS Prize and the DBT National Women Bioscientist Award (Young). She has written around 300 research papers and has edited three books.

Ashwini Ashokan is the founder of MadStreetDen, an artificial intelligence company whose image recognition platform powers retail, education, health, media and more. Founded in 2014, the venture is headquartered in California with offices across Chennai, Bangalore, Tokyo, London and more. She co-founded the company with her husband. Speaking to SheThePeople, Ashokan said, "It's only natural that the AI we build mimics what we've fed it, until it has an agency of its own, which could be good or bad. As an industry, we need to think about what we're teaching our AI." She also added, "Every line of code we write, every feature we put in products, we need to ask ourselves: what effect does this have on the way the world will be interacting with it?"

Apurva Madiraju is a vice president at Swiss Re Global Business Solutions India in Bangalore, where she leads the data analytics and data science team of the audit function. In that role, she is responsible for building machine learning and text analytics solutions to deal with audit compliance risk.

Madiraju brings 11 years of experience across diverse fields such as artificial intelligence, data science, machine learning and data engineering. She has developed multiple AI- and ML-driven solutions, such as ticket volume forecasting models and turnaround-time prediction tools, and has worked with companies globally to lead the conceptualisation, development and deployment of many AI and ML-based solutions for enterprises.

With more than 20 years of experience as a data scientist, Bindu Narayan serves as a Senior Manager in Advanced Analytics and AI at the EY GDS Data and Analytics practice, where she is the AI competency leader for EY's Global Delivery Services. She and her team offer virtual assistant solutions to clients across industries. With her skills, Narayan has developed many innovative AI solutions and leads work in machine learning, customer and marketing analytics and predictive modelling. She completed her PhD at IIT Madras on modelling customer satisfaction and loyalty.

Here is the original post:

Who Says AI Is Not For Women? Here Are 6 Women Leading AI Field In India - SheThePeople

No-Code Analytics: The Best Introduction to Data Science – Analytics Insight

No-code analytics has been one of the best additions to the data science space.

Introduction

Although reading books and watching lectures is a great way to learn analytics, it is best to start doing. However, it can be quite tricky to start doing with languages such as Python and R if you do not have a coding background. Not only do you need to know what you are doing in terms of analytical procedures, you also need to understand the nuances of the programming language, which adds to the list of things to learn just to get started. Therefore, the best middle ground between knowledge acquisition (books, videos, etc.) and conducting advanced analytics (Python, R, etc.) is open-source analytics software. These tools are great for both knowledge acquisition and actually doing analysis, as documentation is built into the software and you can start doing relatively complex tasks with only mouse clicks. Even if you know how to code, the same analysis is usually conducted faster using these types of software.

The term data analytics has become synonymous with programming languages such as Python and R. Although these powerful languages are necessary for conducting advanced analytics with the latest and greatest algorithms, they are not necessary to start analyzing complex datasets! Data analytics software can either be open-source (Orange) or have a free version associated with it (RapidMiner). These tools are great for beginners, as the time it takes to learn the nuances of coding languages can instead be spent on the data analytics process and statistical theory, which is important for Python and R users as well. Think about it: if you woke up one day and knew everything about Python and R, would you still be able to conduct thorough and accurate analysis? Even if your code works and an output is given, the output may be wrong due to a lack of knowledge within the data analytics domain.

We live in a beautiful world where very smart people create completely free software so the public can use it without a price barrier. A great website that illustrates the trend toward open-source software is alternativeto.net. On this website, you can type in any paid commercial software, and it will recommend open-source alternatives that serve as substitutes for the commercial product.

The purpose of this article is to provide the ideal introduction to data analytics for anyone who is interested in this fascinating subject. The software we will be covering can handle analytical tasks such as regression, classification, clustering, dimensionality reduction, association rules mining, deep learning/neural networks, ensemble methods, text mining, genetic algorithms, network analysis, image analytics, time series, bioinformatics, and spectroscopy. Some of the software listed can also be connected to a SQL database. In this article, we will go over the various no-code software options that are either completely open-source or have a powerful free/academic version associated with them.

RapidMiner was founded in 2007 and is still widely used today. It is used by over 40,000 organizations and has been doing well according to the Gartner Magic Quadrant. The types of analyses that can be done are quite broad, ranging from simple regression to genetic algorithms and deep learning. It is a point-and-click interface where widgets are placed and connected to one another in order to perform analytics; these are essentially pre-written blocks of code that conduct certain operations. Hyperparameters can be tuned in a side panel after clicking a widget. One thing that makes RapidMiner unique is its automated machine learning functionality. With just a couple of clicks, various algorithms will run and output performance metrics so you can compare the results and choose the best model. RapidMiner believes in no black boxes, so it is possible to see how the algorithm works after running the automated machine learning. Other capabilities, such as text mining and big data (e.g., Radoop), are available through the various extensions it provides.

In my opinion, the strongest part of RapidMiner is how rapidly (pun intended) one can learn the theory and underlying mechanisms of how a model works. The documentation is built into the software, so you can right-click on each functionality/algorithm and get a description of it. Each description covers a synopsis, a brief overview of the algorithm, a description of each hyperparameter, and a tutorial on how to use it. The tutorial is extremely useful because you can use it as a template for your own dataset: widgets come pre-configured with sample data so you are given a working example. Just plug in your own data, make a few changes, and you are good to go! RapidMiner also incorporates a wisdom-of-crowds functionality where statistics are given on hyperparameter tuning and widget creation. For instance, are you trying to determine the number of trees for your random forest? RapidMiner will state something like "50% chose a value between 100 and 149", along with a bar graph showing what percentage of RapidMiner users chose what. This streamlines the learning process by showing what the professionals are choosing. Overall, I highly recommend RapidMiner for learning analytics; it should be one of the first tools someone uses when starting to learn.

Orange is probably the most visually pleasing software on this list and has some of the best data visualizations. It also has the most features of any completely free, open-source software covered here, which means you can take the knowledge you learn into the corporate world, since it is free and open-source for everyone. Interestingly, the software is built on Python, so a lot of the visualizations should look familiar. The creators of this software are biostatisticians, so more scientific packages, such as biostatistics and spectroscopy, are included. Orange also uses widgets, similar to RapidMiner, and can be downloaded under the Anaconda environment or as stand-alone software.

JASP (Jeffreys's Amazing Statistics Program) is mostly used for traditional statistics in the social sciences but has machine learning functionality as well. It is more of a substitute for SPSS, and the user interface looks very similar to it. The interesting thing about JASP is that the R language works under the hood, so the data visualizations should look similar to R's. It is a great way to learn traditional statistics, as you can load a workflow based on certain statistical techniques, where an already-conducted analysis is downloaded along with explanations of why each analysis was done. The documentation is also built into the software, so you can easily learn about the statistical techniques and how to use them correctly, with pre-loaded example datasets. Academic papers and books are cited under each statistical technique for further reading, and the relevant R packages are listed for each technique as well. In JASP, it is possible to conduct t-tests, ANOVA, regression, factor analysis, Bayesian statistics, meta-analysis, network analysis, structural equation modeling, and other classical statistical techniques as well as machine learning.

Voyant Tools specializes in corpus analytics, which relates to text data. To get started with minimal effort, you can pre-load corpus data from Shakespeare plays and have a dataset ready for analysis. The software offers a great number of functionalities and is unique among the tools listed here in that it takes the form of a dashboard where you can swap each tile for another form of analysis. Most of the analytical techniques are unique ways of visualizing textual data, and statistical techniques such as topic clustering are also possible.

This one is a little different from the others, as it pertains to obtaining data as opposed to analyzing it. Web scraping is a popular way to obtain data from webpages, since it gives you more control over how the data is collected compared to using secondary data. There are plenty of free web scraping services, but my favorite is DataMiner. With the free version, you can scrape up to 500 pages a month (although some websites, such as Glassdoor, are restricted unless you pay a small monthly fee). It is very intuitive and comes with live customer support to help with your web scraping projects. The software works by clicking on certain parts of the screen, where the underlying HTML is detected. It then finds similar areas on the website, gathers each instance as a row, and puts them all in one column. This can be repeated for other areas until you end up with a nice, personalized dataset.

https://dataminer.io/
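
The same row-per-repeated-element pattern that point-and-click scrapers use can be reproduced in a few lines of code. The sketch below is a generic illustration using requests and BeautifulSoup; the URL and CSS selectors are placeholders rather than anything specific to DataMiner, and you should check a site's terms of service and robots.txt before scraping it.

```python
# Generic scraping sketch: one row per repeated page element, like the point-and-click tools.
# The URL and selectors are placeholders; adapt them to a page you are allowed to scrape.
import csv
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/listings"          # placeholder page
resp = requests.get(URL, timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

rows = []
for card in soup.select("div.listing"):       # each repeated block becomes one row
    title = card.select_one("h2")
    price = card.select_one("span.price")
    rows.append({
        "title": title.get_text(strip=True) if title else "",
        "price": price.get_text(strip=True) if price else "",
    })

with open("listings.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "price"])
    writer.writeheader()
    writer.writerows(rows)

print(f"scraped {len(rows)} rows")
```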

We live in a fascinating world where talented people create software to help newcomers in certain fields, which exponentially increases the collective knowledge of society. There are other great analytical tools that I didn't mention, such as KNIME, Weka, QGIS, and Jamovi, since I'm not as familiar with those, so go out there and explore! Five, ten, one hundred years from now, this list will be outdated and new types of code-free software will enter the field, each with a core competency. For instance, I can see the future having specific software for each data type (image, audio, etc.) or each type of data mining technique. We also have access to free real-world datasets from websites such as Kaggle, where you can easily start exploring datasets that intrigue you. Datasets can range from Pokémon statistics to healthcare, so the possibilities for analysis are endless!

So, if you are interested in analytics, download a dataset that fascinates you and immediately start conducting advanced analytics with just mouse clicks using the software above. If it hooks you, add a keyboard to your toolkit and move on to more advanced methods in Python and R. The latest and greatest methods of analysis (which can be found on GitHub) can only be run with code, so that would be the next step. Then, you can try to replicate scientific papers from paperswithcode.com. I hope this article serves as a good introduction to this field. Welcome to the world of analytics!

Dennis Baloglu graduated from UNLV with Bachelor's degrees in Finance and Marketing, along with Master's degrees in Hotel Administration and Management Information Systems. He taught at the UNLV William F. Harrah College of Hospitality for two years and is currently an Enterprise Marketing Analyst at MGM Resorts International. He is passionate about the intersection of data analytics and the hospitality industry, believing that data-driven decision-making and algorithms are the keys to industry success.

Company Designation: MGM Resorts International

Location: Las Vegas, NV

Link to Website: https://sites.google.com/view/dennis-baloglu/home

Social Media: https://www.linkedin.com/in/dennisbaloglu/


The rest is here:

No-Code Analytics: The Best Introduction to Data Science - Analytics Insight

Manchin and Cortez Masto kill chances of reforming outdated hardrock mining law – Grist

This story was originally published by High Country News and is reproduced here as part of the Climate Desk collaboration.

Amid the recent skirmishes over revising the reconciliation bill, known as the Build Back Better Plan, lawmakers once again skipped a chance to reform the General Mining Law of 1872.

Under this outdated law, hardrock miners can extract profitable minerals such as gold and silver from public lands without having to pay any federal royalties. Though it has been challenged several times over the past few decades, mainly by Democrats, the law has not been significantly updated in the nearly 150 years since its passage.

In August, a House committee chaired by Raúl Grijalva, D-Ariz., tried to modernize the legislation by adding language to the reconciliation bill to establish federal royalties of between 4% and 8% on these mines. This would have been the most consequential update the mining law has received in the nearly 15 decades since President Ulysses S. Grant signed it into existence.

However, hardrock royalty reform never even reached a vote, thanks to Democratic Sens. Catherine Cortez Masto, D-Nev., and Joe Manchin, D-W.V., who made his personal fortune in coal mining. Manchin initially signaled support for the royalty provisions in October, when he told the Senate Committee on Energy and Natural Resources that he could "never imagine that we don't receive royalties on so many things we produce in this country." But he later reversed course and reportedly promised Cortez Masto that he'd block any mining royalties, effectively killing reform before it even reached the full Senate. On Nov. 4, royalty reform was officially out of both the House and Senate bills.

These senators' actions all but guarantee that the U.S. public will continue to miss out on billions of dollars in revenue that could have supported the Build Back Better Plan's priorities, including paid family leave and important climate investments. The bill also would have held companies accountable for cleaning up the abandoned mines that pockmark the West. Instead, mining companies will continue to exploit public land for their own financial gain.

The General Mining Law of 1872 was passed in the wake of the mid-19th-century California gold rush as part of a push to encourage white settlement of the West. Previously, prospectors sometimes staked claims to land without the permission of the federal government, let alone that of the Indigenous people who were being dispossessed of the land in question.

In order to regulate the blossoming industry, Congress passed a few early mining laws beginning in 1866. The General Mining Law of 1872 took their place. It established the location system, which permitted individual miners and corporations to stake claims to mineral discoveries on the public domain, on land that had never been in private ownership.

A long list of royalty-free minerals besides gold and silver fall under this location-system regulation, including lithium and copper, which are becoming more valuable due to their use in green energy technologies like solar panels and electric vehicles. The industry has extracted some $300 billion worth of these minerals from public lands since 1872, according to Earthworks. And though mining companies have evolved tremendously since the days of digging with pickaxes and now use some of the largest machinery on earth, the return they make to the American public remains as paltry as ever.

This is why a broad base of critics, from conservation organizations to lawmakers, think it is high time to reform the 1872 law. Currently, the government earns hardrock mining fees for things like registration and annual maintenance, which generated about $71 million in revenue in fiscal year 2019, but that is a small amount compared to the money that would be derived from royalties.


For example, in September, the House Natural Resources Committee proposed a new royalty that would have raised $2 billion over 10 years. And that's likely a conservative estimate: the federal government has no data on the amount or value of the hardrock minerals extracted from public lands, which account for more than 80% of the mineral mines on federal lands, according to the Government Accountability Office.

In contrast, mines operating under the more heavily regulated leasing system, for resources like coal and oil shale, account for just 17% of mining on federal lands, but generate much more revenue through royalties. In fiscal year 2018 alone, they brought in $550 million. Coal is by far the primary revenue generator under leasing-system mining.

The proposed reforms also would have added a reclamation fee for abandoned mines and increased the yearly maintenance fee for claims from $165 to $200 per claim, adding another combined $1 billion in revenue over the next decade.
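
To make the scale of these proposals concrete, here is a back-of-envelope sketch. The production value and claim count are illustrative placeholders, not figures reported in the article; only the 4% to 8% royalty range and the $165 to $200 fee change come from the proposals described above.

```python
# Back-of-envelope arithmetic only; inputs marked ASSUMED are illustrative, not reported figures.
def annual_royalty(gross_value, rate):
    """Royalty owed on the gross value of minerals extracted in a year."""
    return gross_value * rate

def maintenance_fee_revenue(n_claims, fee_per_claim):
    """Annual revenue from per-claim maintenance fees."""
    return n_claims * fee_per_claim

ASSUMED_ANNUAL_PRODUCTION = 5_000_000_000   # $5B/yr of hardrock minerals (ASSUMED)
ASSUMED_ACTIVE_CLAIMS = 400_000             # number of active claims (ASSUMED)

for rate in (0.04, 0.08):                   # the 4%-8% range proposed in the House bill
    print(f"{rate:.0%} royalty on ${ASSUMED_ANNUAL_PRODUCTION:,}: "
          f"${annual_royalty(ASSUMED_ANNUAL_PRODUCTION, rate):,.0f}/yr")

extra_fees = (maintenance_fee_revenue(ASSUMED_ACTIVE_CLAIMS, 200)
              - maintenance_fee_revenue(ASSUMED_ACTIVE_CLAIMS, 165))
print(f"extra maintenance-fee revenue from the $165->$200 increase: ${extra_fees:,.0f}/yr")
```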

This money could, among other things, provide funding to address a myriad of environmental and health threats across the Western U.S. caused by past mining. Before the 1970s, for example, companies abandoned mines once work was complete, leaving behind tens of thousands of often-toxic scars on the land that could cost over $50 billion to address.

Attempts to reform the General Mining Law have been going on for years, but a well-funded network of lobbyists and special interest groups has continued to thwart any success. Mining interests regularly spend north of $16 million annually on lobbying; this year, they've already spent over $13 million.

The National Mining Association spent the most in 2021, coming in at $1.5 million, according to data from OpenSecrets, a nonprofit campaign finance and lobbying watchdog organization. Several companies that would be directly impacted by mining law reform have lobbied against it, including Newmont Corp., a gold-mining company that has invested over $800,000 to fight efforts to change the law.

This helps explain why one ongoing effort to reform the law, the Hardrock Mining and Reclamation Act, has stalled in recent years. Democrats have introduced the legislation in Congress at least six times since 2007. The bill's most recent iteration, in 2019, failed amid a major industry-led lobbying blitz. Among those fighting it were mining giant BHP Group and the National Mining Association, which targeted the bill in a $1.2 million lobbying campaign.

And mining industry lobbyists have power beyond their financial influence: they are also intricately linked to the government. According to OpenSecrets, nearly 65% of the industry's lobbyists previously worked in government, many in positions related to mining.

The lobbying campaigns help illuminate why Manchin, who said in October that it was time to bring the outdated law into the 21st century, was willing to suddenly reverse course. According to OpenSecrets, he received more campaign donations from the mining industry than anyone else in Congress, raising nearly $50,000 from the industry in the current fundraising cycle. Cortez Masto's campaign also benefited: both the National Mining Association trade group and Barrick Gold Corp., one of Nevada's largest mining companies, have recently donated to her campaign.

Nevada's economy depends on gold mining; nearly $8.2 billion worth of the metal was extracted in the state in 2020. Cortez Masto's predecessor, former Nevada Democrat Harry Reid, was against any challenges to the 1872 Mining Law, calling them ill-conceived reform efforts that would have hurt rural Nevada in a 2009 op-ed. It seems that Cortez Masto is picking up right where Reid left off, protecting the industry in an attempt to keep rural voters.

Neither Manchin nor Cortez Masto responded to requests for comment.

This story was produced in collaboration with the Project on Government Oversight, a nonpartisan independent watchdog that investigates and exposes waste, corruption and abuse of power.

Read more here:

Manchin and Cortez Masto kill chances of reforming outdated hardrock mining law - Grist

Seeing the Future: How to Use Predictive Analytics in Your Business – Silicon UK

Your business has been collecting masses of data for several years, but is your enterprise using that information to drive your company forward? Data for its own sake is useless. However, when data forms the basis of a well-designed analytical process, tangible and actionable information can be revealed.

According to IBM, among survey respondents who had implemented predictive analytics, 66% say it provides very high or high business value. Predictive analytics initiatives show a median ROI of 145%, compared with a median ROI of 89% for non-predictive business intelligence initiatives.

Having a predictive analytics strategy for your company is now a commercial imperative. You have already completed the work to rationalise and connect datasets; the next critical step is to make that information work for your business.

Speaking to Silicon UK, Andy Crisp, senior vice president and Global Data Owner at Dun & Bradstreet, explained what makes predictive analytics reliable enough to base accurate predictions on.

"The reliability of any predictive analytics varies based on a number of factors. Firstly, how many data points you have to build your model, given that the more corroborating data points available, the more effective the analytics," said Crisp. "The second is how accurate the data collection mechanism for those data points is and, as a consequence, the quality of the data itself. And lastly, how much history you have for these data points and how far into the future you're trying to predict. If all these factors are considered, and the data itself is of a high enough quality, predictive analytics can be extremely reliable and hugely useful for businesses."

How reliable can predictive analytics be, especially when a business is basing major strategic decisions on these predictions? Predictive analytics needs data, but are a predictive analytics engine's results only as good as the questions you ask it?

Ash Finnegan, digital transformation officer at Conga, also offered this advice to any business leaders looking to improve how they use predictive analytics: "It is crucial that companies first establish their digital maturity. This is where they currently stand in their digital transformation journey and how their data is currently being processed and stored. To do this, companies must first evaluate their operational model, assess its suitability, and identify any pain points along the entire data cycle. The key is to arrive at a clear understanding of how and where change needs to occur, in a phased manner, to progress and improve the organisation's overall operability and unify the data cycle."

Finnegan continued: "True business intelligence enables organisations to take themselves to the next level. By establishing their digital maturity and recognising which areas of the operational model need to be improved, predictive analytics will be empowered. Leaders will have far greater visibility of their revenue operations and will have established true business and data intelligence; they will be able to identify other areas that may need to be tweaked or fine-tuned, leaving them far more agile and adaptable for any given outcome."

Implementing predictive data analytics is clearly how businesses can improve their bottom line, but as the Harvard Business Review Pulse Survey concludes, there are obstacles to overcome:

"There are many interrelated obstacles to improving the use of analyzed data. The top three barriers cited by survey respondents were lack of employees with necessary skills or training (61%), lack of high-quality data (43%), and a company culture that tends to limit access to information (24%). While the first two challenges are significant, the third one might be the most pressing.

"In fact, many of the barriers cited by respondents indicate lack of access, including lack of necessary technology (22%), an organizational structure that impedes access to or use of analysed data (20%), lack of leadership prioritization (19%), and hard-to-access analysed data (18%)."

This analysis is telling as it reveals that infrastructure, leadership and even business culture can have an impact on how successful any predictive analytics program could be.

And there is an urgency to embrace these technologies. In its Technology Vision report, Accenture concluded that 92% of executives report that their organization is innovating with urgency and a call to action this year, and that over half (63%) of executives state that the pace of digital transformation is accelerating.

"Leadership demands that enterprises prioritize technology innovation in response to a radically changing world," says Accenture. "Small pilots and incremental scaling are an obsolete luxury, and the friction between research, development, and large-scale deployment must diminish or disappear."

However, the Pulse Survey concluded that less than a quarter (24%) of respondents rate their organisation's effectiveness at using analysed data as less than optimum. Clearly, more work needs to be done by some enterprises to become data-driven businesses that use the insights predictive analytics can bring to them.

Dun & Bradstreet's Andy Crisp also outlined clearly defined ways predictive analytics can be used: "Businesses can leverage predictive analytics in many ways, but three of the most powerful areas in which to use it would be to predict risk, identify opportunities and improve efficiency. Crucially, leveraging predictive analytics on client payment behaviour can reduce the risk of bad debt, thereby improving a business's cash flow.

"Perhaps just as important is using predictive analytics to analyse customer behaviour and identify opportunities to improve a product's usability which, as a consequence, can increase revenue. Finally, analysing client consumption can improve business efficiency, ensuring businesses have what the client wants when they want it while still reducing any waste."

Ana Pinczuk, Chief Development Officer at Anaplan, also explained: "The past year has proven that predictive analytics is critical in helping companies anticipate and react to disruption. We've seen more customers leverage predictive capabilities for everything from forecasting to territory planning as they realise how imperative that external view really is to their operations.

"One area where we've seen this really take off is within the sales organisation. The pandemic threatened traditional revenue streams at a time when businesses were highly focused on cash flow and liquidity. Now, with a fluctuating economy and job market, revenue leaders are dealing with high attrition rates among their sales reps. Predictive analytics allows sales leaders to augment internal data with predictive attributes on things like profile fit and buyer intent so they can target and prioritise accounts that are more likely to want to buy from them. This makes it easier to build fair territories and set more realistic quotas so they can optimise their sales resources and, ideally, retain top sales talent."

Predictive analytics has a wide application across many business processes and customer touchpoints. What is clear for all enterprises is they must have a well-defined predictive analysis strategy that must be high on their agendas as we move into the post-pandemic era.

The strategic importance of data can't be overstated. When data is properly harnessed, it can deliver tangible benefits right across a business. However, when data is used as the basis for prediction, new opportunities can often reveal themselves.

David Sweenor, a senior director of product marketing at Alteryx, says that placing every piece of analytical data into its proper context is critical: "Automation and Machine Learning (ML) have one primary limitation: context. Without context, making insightful, timely, and accurate predictions is a challenge. While automated analysis functions are extremely effective, they are hollow without knowing how, and where, to apply these learnings most efficiently.

"The current skills gap continues to be an issue for businesses looking to make the most of their data, with data prep still one of the biggest issues. On average, data workers leverage more than six data sources, 40 million rows of data and seven different outputs along their analytic journey. Much of the emphasis in ML has been on the technology, not the people, and that's where failed projects are rooted."

Sweenor concluded: "Another pitfall is the ethics and bias consideration. Artificial intelligence doesn't make moral judgements. It is not inherently biased, but historical data and the creators of the model could be. When using machine learning and advanced analytic methods for predictive analytics, we need to be careful that inputs don't bias the outcomes. Today it's data, and not instinct, that facilitates most business decisions."

And what does the future of predictive analytics look like? For Nelson Petracek, global CTO at TIBCO, there are several strands to the development of this technology. "Based on my conversations with customers and partners, predictive analytics will continue to evolve in a number of different ways: the technology will become more immersive and embedded, where predictive analytics capabilities will be blended seamlessly into the systems and applications with which we interact.

"The technology will be made available to broader audiences (not just data scientists) through the use of continuously improving tools and model-driven approaches to development. Tools will broaden in capability to include not only model development, but also additional functions such as data collection, data quality, and model operationalisation into an end-to-end data fabric or data mesh (depending on your point of view).

"Open data and AI/ML model ecosystems will grow, supported by technologies such as APIs, federated learning, and blockchain. And predictive analytics will drive new, emerging use cases around the next generation of digital applications, including metaverse applications (the convergence of digital and physical worlds, powered by technologies such as IoT, digital twins, AI/ML, and XR) and the next generation of composable applications."

Can businesses really see the future? It's certainly now possible to make accurate guesses over a wide range of critical business questions. The data needed to make these predictions is embedded within every enterprise. The key is to ask the right questions.

Ana Pinczuk, Chief Development Officer at Anaplan, concluded: "Business has never been more unpredictable, so companies need access to more data sources and signals than ever before to model future outcomes and react quickly to disruption. We are going to see more integrations and partnerships to access data sources and make predictive analytics and intelligence capabilities, from Machine Learning and AI to advanced data science techniques, more accessible to the average business user. We need to democratise access to predictive analytics, and technology partnerships are a key part of that equation."

Tim El-Sheikh, CEO and co-founder of Nebuli.

Tim El-Sheikh is a biomedical scientist, an entrepreneur since 2001, and CEO and co-founder of Nebuli, the world's first Augmented Intelligence Studio. A self-taught coder since the age of 10, he has a real passion for designing enhanced human experiences through intelligent algorithms. After a master's degree in Computer Science and Information Technology, Tim combined his experience in design, neuroscience, and engineering to begin his entrepreneurial career in online multitier system architectures across the media and advertising sectors, scientific publishing, and social enterprises.

What is the difference between business intelligence and predictive analytics?

The conventional definition of business intelligence is the application of various data mining techniques, visualisation tools, and analytics to overview a company's current position as well as its performance within the marketplace. Whilst business intelligence is about asking "Where are we now as a business?", predictive analytics involves a more detailed analysis of past and current data trends to predict, in an educated way, what will happen in the future. However, our view at Nebuli is that modern business intelligence must also involve predictive analytics, and one should not be used without the other.

How reliable can predictive analytics be, especially when a business is basing major strategic decisions on these predictions?

Any educated forecast depends entirely on past experiences and accumulated data to tell us about the future. What would predictive analytics tell us about coping with future pandemics after what we have experienced? In other words, if we state our assumptions that the future prediction depends on the previous data, then we can have a better understanding of what the analytics are able to predict.

The problem comes when people do not understand the underlying assumptions in the forecast and end up producing inaccurate predictions. That is why it is essential to combine as many data sources as possible, including those generated from business intelligence, to maximise the accuracy of any assumptions.

What are businesses most interested in using predictive analytics for? For example, pricing or product design?

Predicting the future is the holy grail of making money in all sectors! Predictive analytics is heavily used in the finance sector to understand and assess risk. The future path of interest rates, for instance, is of key importance to commercial lenders as well as borrowers, and for any company that trades globally, predicting exchange rates is essential to holding funds in strong currencies.

Another example is investment markets, where predicting the price of an asset is critically important for investors. Other sectors also make important use of these analytics, such as sports, where bookmakers predict the possible outcomes of football matches, or retail, where predicting sales of a product under a price change is critical because of market price elasticity.

Predictive analytics needs data, but are a predictive analytics engine's results only as good as the questions you ask it?

To some extent, it is, however, getting good results from the predictive analytics engine is not that simple. It is, for example, essential to remain critical of the assumptions made from the given data and always get verification from an experienced data expert about the way these data are being used. Businesses should also have a robust quality management (QM) process around the use of data and include the outcomes in the risk register for the company.

Crucially, decisions should not be made without those elements reinforcing the validity of the data, or solely based on the instincts of leaders in the business. In addition, including elements of behavioural analysis of target customers and employees' productivity is something we also encourage at Nebuli to ensure we are happy with the outcome. Overall, it is more about building a comprehensive blueprint of your forecast.

What are the pitfalls to watch out for when using predictive analytics across an enterprise?

The biggest pitfall we see is enterprises believing that their data is structured and holds the answers to all of their questions about the future. Most of the time, this is not the case, and I would go as far as saying that there is no such thing as structured data! Why? Because your company's data can be seen as structured if it complies with your data-inputting policies, even though those policies may have been around for several years.

Your predictions are then based on your data output, which may also include newly acquired data from other channels that were never part of the original input processes. Hence, your combined output might not match the key questions you need to answer in an ever-changing, data-driven world. That is why we actively encourage enterprises to adopt a comprehensive data strategy as early as possible; it is the foundation for successful business intelligence and predictive analytics.

What do you think the future of predictive analytics looks like?

Avoid the hype! Machine learning and AI are the two buzzwords being pushed around as the holy grail of modern business intelligence and predictive analytics. While both AI and machine learning algorithms can add significant value, the critical point goes back to your data. Without a clear data strategy, no matter how many AI or machine learning algorithms you apply, your analysis will not be any better. In fact, AI can amplify prediction errors and biases much further if the data structure is not scrutinised, optimised and analysed in detail as part of your data strategy.
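A toy example of that amplification effect, using entirely made-up data: the same simple trend model is fitted once to clean demand figures and once to figures in which the final six months were silently under-recorded by 20%. The model happily projects the recording error into the future, which is the point about scrutinising the data before reaching for algorithms.

```python
# Hypothetical "garbage in, garbage out" sketch: a data fault biases the forecast.
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(24)
true_demand = 50 + 2 * x + rng.normal(0, 1.5, size=x.size)

# Suppose the last six months were logged by a faulty process that
# under-records demand by 20% - a data problem, not a model problem.
biased_demand = true_demand.copy()
biased_demand[-6:] *= 0.8

future = np.arange(24, 30)
for label, series in [("clean data", true_demand), ("biased data", biased_demand)]:
    slope, intercept = np.polyfit(x, series, deg=1)
    forecast = slope * future + intercept
    print(f"{label}: mean forecast over the next 6 months = {forecast.mean():.1f}")
```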


See more here:

Seeing the Future: How to Use Predictive Analytics in Your Business - Silicon UK

Bullish: Analysts Just Made A Significant Upgrade To Their Evolution Mining Limited (ASX:EVN) Forecasts – Simply Wall St

Evolution Mining Limited (ASX:EVN) shareholders will have a reason to smile today, with the analysts making substantial upgrades to this year's statutory forecasts. Consensus estimates suggest investors could expect greatly increased statutory revenues and earnings per share, with the analysts modelling a real improvement in business performance.

Following the upgrade, the latest consensus from Evolution Mining's 16 analysts is for revenues of AU$2.1b in 2022, which would reflect a decent 13% improvement in sales compared to the last 12 months. Per-share earnings are expected to bounce 23% to AU$0.23. Before this latest update, the analysts had been forecasting revenues of AU$1.9b and earnings per share (EPS) of AU$0.20 in 2022. So we can see there's been a pretty clear increase in analyst sentiment in recent times, with both revenues and earnings per share receiving a decent lift in the latest estimates.
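For readers who want to see how those numbers hang together, here is a short back-of-the-envelope check, assuming (the article does not say so explicitly) that the quoted growth rates are measured against the trailing twelve-month results:

```python
# Back-of-the-envelope check of the quoted consensus figures (assumption:
# the growth rates are measured against the trailing twelve-month results).
new_revenue, revenue_growth = 2.1e9, 0.13   # AU$2.1b forecast, +13% vs last 12 months
new_eps, eps_growth = 0.23, 0.23            # AU$0.23 forecast, +23% vs last 12 months
old_forecast_revenue, old_forecast_eps = 1.9e9, 0.20  # previous consensus

implied_trailing_revenue = new_revenue / (1 + revenue_growth)   # ~AU$1.86b
implied_trailing_eps = new_eps / (1 + eps_growth)               # ~AU$0.19
revenue_upgrade = new_revenue / old_forecast_revenue - 1        # ~+10.5%
eps_upgrade = new_eps / old_forecast_eps - 1                    # ~+15%

print(f"Implied trailing revenue: AU${implied_trailing_revenue / 1e9:.2f}b")
print(f"Implied trailing EPS:     AU${implied_trailing_eps:.3f}")
print(f"Upgrade vs previous consensus: revenue {revenue_upgrade:.1%}, EPS {eps_upgrade:.1%}")
```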


It will come as no surprise to learn that the analysts have increased their price target for Evolution Mining by 5.9% to AU$4.37 on the back of these upgrades. That's not the only conclusion we can draw from this data, however, as some investors also like to consider the spread in estimates when evaluating analyst price targets. Currently, the most bullish analyst values Evolution Mining at AU$5.60 per share, while the most bearish prices it at AU$3.50. Analysts clearly have varying views on the business, but the spread of estimates is not wide enough, in our view, to suggest that extreme outcomes could await Evolution Mining shareholders.

One way to get more context on these forecasts is to look at how they compare both to past performance and to how other companies in the same industry are expected to perform. It's clear from the latest estimates that Evolution Mining's rate of growth is expected to accelerate meaningfully, with the forecast 13% annualised revenue growth to the end of 2022 noticeably faster than its historical growth of 7.5% p.a. over the past five years. Compare this with other companies in the same industry, which are forecast to see a revenue decline of 0.3% annually. So it's clear that, with this acceleration in growth, Evolution Mining is expected to grow meaningfully faster than the wider industry.

The most important thing to take away from this upgrade is that analysts upgraded their earnings per share estimates for this year, expecting improving business conditions. Fortunately, they also upgraded their revenue estimates, and our data indicates sales are expected to perform better than the wider market. Given that the consensus looks almost universally bullish, with a substantial increase to forecasts and a higher price target, Evolution Mining could be worth investigating further.

Still, the long-term prospects of the business are much more relevant than next year's earnings. We have estimates - from multiple Evolution Mining analysts - going out to 2024, and you can see them free on our platform here.

Another way to search for interesting companies that could be reaching an inflection point is to track whether management are buying or selling, with our free list of growing companies that insiders are buying.

This article by Simply Wall St is general in nature. We provide commentary based on historical data and analyst forecasts only using an unbiased methodology and our articles are not intended to be financial advice. It does not constitute a recommendation to buy or sell any stock, and does not take account of your objectives, or your financial situation. We aim to bring you long-term focused analysis driven by fundamental data. Note that our analysis may not factor in the latest price-sensitive company announcements or qualitative material. Simply Wall St has no position in any stocks mentioned.


See the article here:

Bullish: Analysts Just Made A Significant Upgrade To Their Evolution Mining Limited (ASX:EVN) Forecasts - Simply Wall St

New legal framework evolving on tech, internet; data protection bill step towards that: MoS IT – ETCIO.com

NEW DELHI: A new legal framework will start taking shape in the country around technology and the internet, and the data protection bill is the first step towards that, minister of state for electronics and IT Rajeev Chandrasekhar said on Thursday.

Chandrasekhar further said digitalisation of the government and public services is going to be "rapid", and that the soon-to-be-unveiled Digital India 2.0 will seek to accelerate the gains made over the past few years.

The government is committed to ensuring that the internet and technology remain open, safe, secure and accountable as 1.2 billion Indians come online in the next few years, the minister added.

He also supported the need to construct "trusted common digital ID" for every citizen of India.

Issues of national security are "real issues", the minister said, and so there will be a growing demand and need for a trusted ID and the ability to validate citizenship.

Originally posted here:

New legal framework evolving on tech, internet; data protection bill step towards that: MoS IT - ETCIO.com

Data-Mining to Serve Your Patient, Your Team and Your Business Better – InvisionMag

I GO ON RETREATS with many practices, from one location to eight or more, across the U.S. and Canada. I'm often surprised that the majority of these practices do not routinely data-mine their clinic and optical KPI metrics - capture rate, revenue per patient, and other optical KPIs - to grow intentionally.

Many of these businesses rely on their Practice Management Software (PMS) to provide these metrics. Some data-mine the system monthly, a few weekly, and many only when things seem off.

But data/KPIs are king. In our offices, we mine the data daily and weekly; this has led to a very intentional focus by the whole team. We use EdgePro by GPN (I do not endorse, nor collect any gratuities from, any vendor). Glimpse is another program available.

These software systems simplify the process of gathering the data needed to drive growth.

What I have learned about using software like this is simplicity: in a few clicks of the mouse, we can parse the data by individual, product, dollars, or percentages. What we measure, we manage and grow.

Do you know, off hand, your capture rate? Ocular Nutrition? Average Frame sale? Patient Own Frame? Photochromics? Multiple pairs? Plano sun? Rx Sun? Screening? You grow when you know.

These data-mining programs are fast and specific. We can pull out trends by viewing the dashboards. We can view lines, graphs, or pie charts, and these charts show us trends that we could never spot using only raw data from our PMS dropped into a spreadsheet - and in a fraction of the time it takes to run PMS reports and transfer the data to a spreadsheet. KPI metric software is like looking through the windshield, seeing what's ahead. PMS data is like looking through the rearview mirror. I can change the future of the business; the past is gone.

In our offices, and those I work with, the doctors and opticians print their own personal KPI metrics every Monday morning so they can compare their production last week to the previous week. This is invaluable; when people know their numbers and percentages and continue to track them, their eye is clearly on the ball. Team members don't want to fall below their new norm. Conversely, when a team doesn't know their metrics, they guess, and often guess high. We all want to imagine we perform higher than we likely do.

We are goal focused for the benefit of the patient first, and making a profit second. The data helps us focus on better communication. For example, when the doctor prescribes Rx sunglasses to 10 patients and only one of them includes the Rx sunglasses in their purchase, that data tells us a few things. One in 10 patients is 10%. Weekly updated data gave our team a clear target to focus our process on and helped us develop smarter communication to engage and influence the patient. As an example of communicating far better with the patient, we stopped using terms like "UV protection" because we found almost all patients did not clearly understand UV or what was being protected. We switched to "sun damage", and the Rx and plano sunglass percentage increased from 10% to 16% almost immediately.
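As a rough, hypothetical illustration of the weekly pull described above, the sketch below computes a per-provider Rx-sunglass capture rate from raw visit records. The field names and figures are invented for the example and are not tied to EdgePro, Glimpse, or any particular practice-management system.

```python
# Hypothetical weekly KPI pull: Rx-sunglass capture rate per provider.
from collections import defaultdict

# One row per patient: (provider, week, rx_sun_prescribed, rx_sun_purchased).
visits = [
    ("Dr. A", "2022-W01", True, False),
    ("Dr. A", "2022-W01", True, True),
    ("Dr. A", "2022-W01", True, False),
    ("Dr. B", "2022-W01", True, True),
    ("Dr. B", "2022-W01", True, True),
]

prescribed = defaultdict(int)
purchased = defaultdict(int)
for provider, week, rx_prescribed, rx_purchased in visits:
    if rx_prescribed:
        prescribed[(provider, week)] += 1
        if rx_purchased:
            purchased[(provider, week)] += 1

for key in sorted(prescribed):
    rate = purchased[key] / prescribed[key]
    print(f"{key[0]} {key[1]}: Rx sunglass capture rate {rate:.0%}")
```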

The simple reason to measure KPI metrics data weekly is threefold: to serve the patient better, to serve the team better, and to serve the business better. You run the business, or the business runs you. Data-mining your clinic and optical KPIs weekly and planning growth goals is a winning combination for success.

Read this article:

Data-Mining to Serve Your Patient, Your Team and Your Business Better - InvisionMag