Category Archives: Data Mining

Preserve Data History In The Cloud — Or Lose It – ITPro Today

In a discussion I had with a spokesperson for a cloud services vendor several months ago, the representative said something that stuck with me. "One thing you don't get with the cloud is a history of how your data changes over time," the spokesperson said. "There's no way you can look at a record [at a point in time] and compare it to other time periods. What we're doing is we're preserving the historical data that is [otherwise] lost in the cloud."

The spokesperson was not affiliated with a cloud data lake, data warehouse, database or object storage vendor. It seemed that the company hadn't previously considered that cloud subscribers could use one of these services to collect and preserve the historical data produced by cloud applications. Or, if the company had previously considered this, it was rejected as an undesirable, or nonviable, option.

In my conversations specific to the data mesh architecture and data fabric markets, the terms "history" and "historical" tend to come up infrequently. Instead, the emphasis is on (a) enabling business domain experts to produce their own data and (b) making it easier for outsiders -- experts and non-experts alike -- to discover and use data. And yet, discussion of the technologies that underpin both data mesh architecture and the data fabric (viz., data virtualization, metadata cataloguing and knowledge discovery) focuses on connectivity to operational databases, applications and services; i.e., resources that do not preserve data history.

Historical data is of crucial importance to machine learning (ML) engineers, data scientists and other experts, of course. It is not an afterthought. And there are obvious schemes you can use to accommodate historical data in data mesh architecture -- a historical repository that is instantiated as its own domain, for example.

But these two data points got me wondering: Are we forgetting about data history? In the pell-mell rush to the cloud, are some organizations poised to reprise the mistakes of past decades?

Most cloud apps and services do not preserve historical data. That is, once a field, value or record changes, it gets overwritten with new data. Absent a routinized mechanism for preserving it, this data is lost forever. That said, some cloud services do give customers a means to preserve data history.

This option to preserve data history might seem convenient, at least so far as the customer is concerned. But there is a myriad of reasons why organizations should consider taking on and owning the responsibility of preserving historical data themselves.

The following is a quick-and-dirty exploration of considerations germane to the problem of preserving, managing and enabling access to historical data produced by cloud applications and services. It is not in any sense an exhaustive tally, but it does aspire to be a solid overview.

If you use cloud apps and services, then you need a plan to preserve, manage and use the data they produce.

The good news is that it should be possible to recover historical data from extant sources. Back in the early days of decision support, for example, recreating data history for a new data warehouse project usually involved recovering data from backup archives, which, in most cases, were stored on magnetic tape.

In the cloud, this legacy dependency on tape may go away, but the process of recreating data history is still not always straightforward. For example, in the on-premises environment, it was not unusual for a backup archive to tie into a specific version of an application, database management system (DBMS) or operating system (OS). This meant that recovering data from an old backup would entail recreating the context in which that backup was created.

Given the software-defined nature of cloud services, virtual abstraction on its own does not address the problem of software dependencies. So, for example, in infrastructure as a service, you have the same dependencies (OS, DBMS, etc.) as you did in the on-premises data center. With respect to platform as a service (PaaS) and software as a service (SaaS), changes to newer versions of core cloud software (e.g., deprecated or discontinued APIs) could also complicate data recovery.

The lesson: Develop a plan to preserve and manage your data history sooner rather than later.

You should still have a plan. When you use your provider's offerings to preserve data history, it creates an unnecessary dependency. That is, do you really own your data if it lives in the provider's cloud services?

Moreover, your access to your own data is mediated by the tools and APIs -- and the terms of service -- that are specified by your cloud provider. But what if the provider changes its terms of service? What if you decide to discontinue use of the provider's services? What if the provider is acquired by a competitor or discontinues its services? How much will it cost you to move your data out of the provider's cloud environment? What formats can you export it in?

In sum: Are you comfortable with these constraints? This is why it is incumbent upon customers to own and take responsibility for the historical data produced by their cloud apps and services.

Even in the era of data scarcity -- first, scarcity with respect to data volumes; second, scarcity with respect to data storage capacity -- savvy data warehouse architects preferred to preserve as much raw historical data as possible, in some cases using change-data capture (CDC) technology to replicate all deltas to a staging area. Warehouse architects did this because having raw online transaction processing (OLTP) data on hand made it relatively easy to change or to maintain the data warehouse. For example, they could add new dimensions or rekey existing ones.

Today, this is more practicable than ever, thanks to the availability (and cost-effectiveness) of cloud object storage. It is likewise more necessary than ever, due to the popularity of disciplines such as data science and machine learning engineering. These disciplines, along with traditional practices such as data mining, typically require raw, unconditioned data.

A caveat, however: If you use CDC to capture tens of thousands of updates an hour, you will ingest tens of thousands of new, time-stamped records each hour. Ultimately, this adds up.

The lesson is that not all OLTP data is destined to become historical. If for some reason you need to capture all updates -- e.g., if you are using a data lake to centralize access to current cloud data for hundreds of concurrent consumers -- you do not need to persist all these updates as part of your data history. (Few customers could afford to persist updates at this volume.) What you should do is persist a sample of all useful OLTP data at a fixed interval.
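
As a rough sketch of that interval-based approach, assume the CDC feed has already landed as time-stamped change records; the table and column names below are hypothetical. The idea is to keep only the last state of each record per interval, so the history grows with the number of records and intervals rather than with the raw update rate.

```python
# Sketch: persist an hourly snapshot of OLTP changes instead of every CDC delta.
# Assumes a pandas DataFrame of change events with hypothetical columns:
# record_id, updated_at (timestamp) and the remaining business fields.
import pandas as pd

def hourly_snapshots(changes: pd.DataFrame) -> pd.DataFrame:
    """Keep only the last state of each record within each hour."""
    changes = changes.sort_values("updated_at")
    changes["snapshot_hour"] = changes["updated_at"].dt.strftime("%Y-%m-%d %H:00")
    # Within each (record, hour) pair, only the final change is worth keeping.
    return changes.groupby(["record_id", "snapshot_hour"], as_index=False).last()

if __name__ == "__main__":
    events = pd.DataFrame({
        "record_id": [1, 1, 1, 2],
        "updated_at": pd.to_datetime(["2021-12-01 09:05", "2021-12-01 09:40",
                                      "2021-12-01 10:10", "2021-12-01 09:15"]),
        "status": ["open", "pending", "closed", "open"],
    })
    print(hourly_snapshots(events))  # three snapshot rows instead of four raw deltas
```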

On its own, it is possible to query against data produced by cloud applications or services to establish a history of how it has changed over time. Data scientists and ML engineers can trawl historical data to glean useful features, assuming they can access the data. But data is also useful when it is combined with (historical) data from other services to create different kinds of multidimensional views: you know, analytics.

For example, by combining data in Salesforce with data from finance, logistics, supply chain/procurement, and other sources, analysts, data scientists, ML engineers and others can produce more useful analytics, design better (more reliable) automation features and so on.
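
As a rough illustration of that kind of combination, assuming each source has already been persisted as time-stamped history (the sources, values and column names below are hypothetical), an as-of join lines up the state of one domain with the most recent prior state of another:

```python
# Sketch: join the history of one source (e.g., CRM pipeline) with the most recent
# prior state of another (e.g., finance snapshots). All values are hypothetical.
import pandas as pd

crm = pd.DataFrame({
    "as_of": pd.to_datetime(["2021-10-01", "2021-11-01", "2021-12-01"]),
    "open_pipeline": [1_200_000, 1_350_000, 1_500_000],
})
finance = pd.DataFrame({
    "as_of": pd.to_datetime(["2021-09-28", "2021-10-29", "2021-11-30"]),
    "cash_on_hand": [4_000_000, 3_800_000, 4_200_000],
})

# merge_asof matches each CRM snapshot with the latest finance snapshot at or before it.
combined = pd.merge_asof(crm.sort_values("as_of"), finance.sort_values("as_of"),
                         on="as_of", direction="backward")
print(combined)
```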

By linking sales and marketing, finance, HR, supply chain/procurement, logistics, and other business function areas, executive decision makers can obtain a complete, synoptic view of the business and its operations. They can make decisions, plan and forecast on that basis.

This just scratches the surface of historical data's usefulness.

The purpose of this article was to introduce and explore the problem of capturing and preserving the data that is produced by cloud apps and services -- specifically, the historical operational data that typically gets overwritten when new data gets produced. There are several reasons organizations will want to preserve and manage this data, as outlined above.

There is another reason that organizations will want to capture and preserve all the data their cloud apps produce, however. Most SaaS apps (and even many PaaS apps) are not designed for accessing, querying, moving, and/or modifying data. Rather, they are designed to be used by different kinds of consumers who work in different types of roles. The apps likewise impose constraints, such as API rate limits or, alternatively, per-API charges, that can complicate the process of accessing and using data in the cloud.
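
To make the rate-limit point concrete, here is a minimal sketch of a paginated extraction loop, assuming a generic REST endpoint that returns HTTP 429 with a Retry-After header when the limit is hit; the URL, parameters and response shape are hypothetical, not any specific vendor's API.

```python
# Sketch: page through a hypothetical SaaS export endpoint while respecting rate limits.
# The endpoint, parameters and response fields are illustrative assumptions.
import time
import requests

def fetch_all_pages(base_url: str, token: str) -> list:
    """Collect every page, backing off whenever the API signals it is rate limited."""
    records, page, attempt = [], 1, 0
    while True:
        resp = requests.get(base_url,
                            headers={"Authorization": f"Bearer {token}"},
                            params={"page": page, "per_page": 200})
        if resp.status_code == 429:
            # Honor the server's Retry-After hint if present, otherwise back off exponentially.
            wait = int(resp.headers.get("Retry-After", 2 ** min(attempt, 6)))
            attempt += 1
            time.sleep(wait)
            continue
        resp.raise_for_status()
        attempt = 0
        payload = resp.json()
        records.extend(payload.get("records", []))
        if not payload.get("has_more"):
            return records
        page += 1
```

Per-API charges argue for the same discipline: the fewer calls an extraction needs, the cheaper and more reliable it is.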

In a follow-up article, I will delve into this problem, focusing specifically on API rate limits.

Link:

Preserve Data History In The Cloud -- Or Lose It - ITPro Today

Top Ten: Disruptive supply chain technologies – Supply Chain Digital – The Procurement & Supply Chain Platform

Disruptive technology is an advancement that changes the way consumers, industries, or businesses operate. Disruptive techs have the potential to replace age-old practices. Few areas of business are seeing more of such changes than supply chains. These are evolving to become digital networks that are on-demand and always-on. As such, they have the potential to deliver massive economic and environmental rewards, boosting productivity and sustainability, driving new markets, and encouraging innovation.

As with all disruptive tech, choosing the right time to invest is a key consideration. Some disruptive techs - such as Industry 4.0 advancements (Internet of Things) - are more embedded than others (3D printing). But in ten years' time, most of the techs listed here will have had a seismic impact on the way companies move goods from A to B.

Here they are, in order of their impact on the supply chains of today:

3D printing has come a long way since the early 1980s, when it produced only plastic prototypes. Now, 3D printing produces finished goods from materials including metal, body tissue, concrete and food.

The Chartered Institute of Procurement and Supply says: "The 3D printer could be set to revolutionise future supply chains by reducing supply complexities, increasing the speed to market of products, reducing global impact by limiting the shipping of goods around the globe, whilst enabling the printing of replacement parts in remote locations. Whilst 3D printing is not set to displace mass production, it will open up a world of product personalisation that could shape the demands of consumers."

We all know about the cloud but what about the fog? This is a new concept that spreads data across multiple servers with no reliance on any single location. It uses virtual buffers to endlessly relocate data packets, meaning a file is never completely in one place. Such a network has the potential to make big-data analytics more secure and stable, because at no point is any given single piece of data in a single location.

In a decade's time, lorry drivers will be driving in an entirely new way. Through the integration of vehicle-to-vehicle comms and radar-based collision avoidance systems, fleets of trucks will drive in platoons, using aerodynamics to save fuel. The lead truck sets the pace. The tech synchronizes braking and acceleration between pairs of trucks. This communication ensures lorries travel at a safe distance.

Platoons are managed by a cloud-based network that connects trucks through mobile comms and Wi-Fi. Cloud-based supervision limits platoons to specified roads in safe driving conditions.

It's hardly an original idea. "Platoon" is just another word for "convoy". In the 1970s, truck drivers used CB radio to do exactly the same thing. There was even a hit song about convoys in the 1970s. "We've got ourselves a platoon" doesn't have quite the same ring, somehow.

The increasing demand for contactless deliveries is seeing drones being used to deliver packages. There are an estimated 20,000 drones carrying out retail deliveries today, and an estimated 500,000 drones are registered for commercial use with the Federal Aviation Administration (FAA) in the US. The drone package-delivery market is expected to grow from $528 million in 2020 to $39 billion. Gartner estimates drone delivery will not only reduce last-mile delivery costs by 70 per cent but will also make the delivery process more energy efficient.

Wearable tech consists of smart, hands-free electronic devices that automatically deliver information to the wearer. Connected glasses and clothing are becoming increasingly common in warehouse facilities. Companies such as UPS use wearables to streamline their supply chain processes and save time.

A recent IDC study revealed 160 million wearables shipped in 2019, although 120 million of these ended up on the wrists of the general public, in the guise of smart watches and digital lifestyle devices.

Blockchain is set to significantly change the way retailers and consumer-packaged-goods manufacturers run their supply chains. The universal need for secure transactions and the demand for transparency are already steering how buyers relate to brands. Blockchain is meeting these needs by guaranteeing the origin of goods, as well as the security of the transaction.

No longer the stuff of Isaac Asimov novels, robotics is revolutionising the supply chain and delivering significant value. It improves speed and accuracy of operations, particularly in warehousing and manufacturing.

Robots are also helping companies increase worker productivity, reduce error rates, cut picking, sorting and storage times, and increase access to difficult or dangerous locations. And with the current labour shortage - and people's general disinclination to work in warehouse environments - robots are the last-mile workers of the future.

Big Data has long ceased to be a buzzword; it's now a guiding principle of supply chain management. But having data and making sense of it are different things. Enter predictive analytics. This uses applied statistical modeling and data mining to give supply chain bosses a 360-degree view of the supply landscape. Research shows the decision making of high-performing organizations is five times likelier to be analytics-driven than intuition-based, with intuition being the habit of low-performing firms.
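
As a toy illustration of the kind of model involved, a simple trend fit over historical weekly demand can project the next period; the figures are invented, and real supply chain models are far richer than a straight line.

```python
# Toy sketch of predictive analytics for supply chains: fit a linear trend to
# historical weekly demand (invented numbers) and project the next week.
import numpy as np

weeks = np.arange(12)  # 12 weeks of history
demand = np.array([100, 104, 103, 110, 115, 112, 118, 125, 123, 130, 134, 131])

slope, intercept = np.polyfit(weeks, demand, deg=1)  # least-squares trend line
forecast = slope * 12 + intercept                    # project week 12
print(f"Projected demand for week 12: {forecast:.0f} units")
```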

These types of tools help businesses produce, store and distribute products more efficiently and effectively. Customer service is the watchword here, with all manner of businesses now demanding speed, visibility and transparency from supply chain partners with a view to satisfying customer expectation. Once mostly a retail-centric notion, on-demand is now mission-critical for all companies, regardless of size, industry or position in the supply chain. Solutions that optimize inventory or supply chain networks are now de rigueur.

Not so long ago, complex supply chains were seen as more of a barrier to progress. But no more, thanks in the main to the Internet of Things (IoT). IoT is really little more than a collective noun for the billions of sensors that are embedded in various devices (within the supply chain but also the wider world, such as in fridges and cars). The IoT allows for data exchange - both within and between systems - over the Internet, and can leverage actionable data from every step of the supply chain. It's now used for locating materials, servicing equipment and monitoring productivity and efficiency.

Back in 2013, there were around 20 million smart sensors in use in supply chains, feeding live data back to those controlling supply levers. In 2022, that number is expected to climb to 1 trillion, and by 2030, Deloitte says 10 trillion sensors will be deployed.

Originally posted here:

Top Ten: Disruptive supply chain technologies - Supply Chain Digital - The Procurement & Supply Chain Platform

Shares of companies in the metals and mining space are trading lower as omicron concerns and worse-than-expected US job growth data weigh on economic…

This headline-only article is meant to show you why a stock is moving, the most difficult aspect of stock trading. Every day we publish hundreds of headlines on any catalyst that could move the stocks you care about on Benzinga Pro, our flagship platform for fast, actionable information that promotes faster, smarter trading.

Benzinga Pro has an intuitively designed workspace that delivers powerful market insight, and is the solution of choice for thousands of professional and retail traders across the world.

Stop Googling for information and check out Benzinga Pro. You will never again be left in the dark when a stock moves. You'll have what you need to act in real time before the crowd.

Start your FREE 14-day trial of Benzinga Pro today.

See original here:

Shares of companies in the metals and mining space are trading lower as omicron concerns and worse-than-expected US job growth data weigh on economic...

Value of data mining key in propelling Ghana’s digitisation efforts Expert – GhanaWeb

Data mining is key for growth and development in the ICT sector

Chief Executive of Afrifanom, an ICT solutions provider, Nana Osei Kwasi Afrifa has underscored the value of data in propelling Ghana's digitisation efforts.

According to him, current beneficiaries and persons with knowledge of data mining have used it to turn around the economic and developmental fortunes of countries.

Speaking at the 2021 Ghana Internet Conference, Nana Osei Kwasi Afrifa said the lack of trust in Ghanaian ICT firms to undertake key projects remains an obstacle to the growth and capacity development of the local ICT space.

"The biggest government contracts on tech development are seeded to foreign brands and sadly, of the 20-plus banks in Ghana, none is using a bank software system developed by a Ghanaian," Afrifa is quoted as saying by Citi Business News.

He further called on state actors and stakeholders in the telecommunication and data industry to ramp up efforts in safeguarding their data space in order to leverage its predictability to spur economic growth.

Meanwhile, Communications and Digitalisation Minister, Ursula Owusu-Ekuful has urged tech developers in the country to create new apps for communication to serve Ghanaian users and Africa at large.

The call by the Minister came after a global outage of the social media giants Facebook, WhatsApp and Instagram was reported earlier on Monday, October 4, 2021.

"I'm all for government using its purchasing power to stimulate the tech ecosystem. We've demonstrated that by procuring the National ID system from a local company. The digital address system and the Ghana.gov payment platform were developed by a local company. Our SIM registration platform was developed by a local company on existing infrastructure that we already have," she explained in an earlier interaction with journalists.

She added, "So if our young people are able to utilize technology to address challenges that we face and come up with solutions, I don't see why yesterday's incident can't also be an opportunity for them to innovate and develop something that we can use, if not as our main source of communication, as our alternative source that over time can also grow to rival these global giants."

Here is the original post:

Value of data mining key in propelling Ghana's digitisation efforts Expert - GhanaWeb

Agency suffered data breach in September on mining permit applications – ABC 36 News – WTVQ

Notice given as required by law, some info was unredacted

FRANKFORT, Ky. (WTVQ) - On September 8, 2021, the Kentucky Energy and Environment Cabinet (EEC) discovered that original, un-redacted mining permit applications containing some mine owners' and controllers' personal information were available for public inspection at Department for Natural Resources field offices and on an EEC-hosted, public-facing website.

Under federal law (30 C.F.R. 773.6), EEC is required to make permit information such as the owners' and controllers' identifying information available for public inspection.

Although internal EEC policy and procedures require redaction of certain personal information (including SSNs) before permit information is made publicly available, some un-redacted permit materials have been available: 1) since sometime in 2015 at public reading rooms located at DNR field offices, and 2) since January 16, 2021, on a public, internet-accessible database maintained by EEC. As a result, individuals or software programs may have accessed permit information.

On September 8, 2021, EEC discovered the security issue, and immediately disabled access to the files. After investigation, EEC was unable to determine whether personal information was accessed or downloaded during either the time in which it was available in EEC's regional offices or when it was hosted on the EEC's website.

In order to meet its obligations under federal and state law and prevent this issue from reoccurring, EEC has implemented further staff training, and reviewed and appropriately removed the subject materials such that documents containing sensitive personal information are no longer publicly accessible. Sensitive personal information will be redacted prior to providing those documents in response to an open records request.

EEC does not have any information indicating that any personal information has been misused. However, out of an abundance of caution, given that the files were hosted for an extended period of time, EEC initiated the personal information security breach protocols as required by Kentucky law, which includes notifying impacted individuals through personal communication and by notifying local, regional and statewide media, including broadcast media.

More here:

Agency suffered data breach in September on mining permit applications - ABC 36 News - WTVQ

Data Shows a Myriad of Crypto Networks Are More Profitable to Mine Than Bitcoin Mining Bitcoin News – Bitcoin News

As the end of the year approaches, digital currency values have risen a great deal in 2021 and crypto asset miners have been profiting as a result. According to statistics, the most profitable coin to mine at the end of November is kadena, as an 18 terahash (TH/s) machine can get up to $326 per day. Scrypt coins are the second most profitable these days with up to $110 per day and Ethash miners can make up to $105 per day.

Close to 13 years ago when Bitcoin first launched, the cryptocurrency could be mined with a central processing unit (CPU). This means that anyone with a decent computer at the time could mine and find bitcoin (BTC) block rewards. After that phase, people started to leverage devices with specialized electronic circuits called graphics processing units (GPUs).

Today, bitcoin miners utilize application-specific integrated circuit (ASIC) devices to mine BTC. Bitcoin mining rigs dedicate processing power to the SHA256 algorithm and this means a bitcoin ASIC mining device cannot mine coins like ethereum, litecoin, or kadena.

Those networks leverage different consensus algorithms and there's a slew of machines manufactured to mine specific crypto asset networks with unique consensus algorithms. SHA256 is a consensus algorithm used by Bitcoin, but SHA256 miners can also mine coins like bitcoin cash (BCH), bitcoinsv (BSV), peercoin (PPC), and unbreakable (UNB).

SHA256 cryptocurrencies are the fifth most profitable to mine at the end of November 2021. The top four most profitable consensus algorithms to mine today include Kadena, Scrypt, Ethash, and Eaglesong.

A Kadena-based ASIC miner can get up to $326 per day with 18 TH/s at $0.12 per kilowatt hour (kWh), according to asicminervalue.com stats. A Scrypt-based miner with 9.5 gigahash per second (GH/s) can get $110 per day with the same electrical costs.

750 megahash per second (MH/s) ASIC machines mining the algorithm Ethash (ethereum, ethereum classic, pirl) can get up to $52 per day. Eaglesong-based mining rigs that mine nervos (CKB) can get $45 per day with 12 TH/s.

There are also consensus algorithms such as Blake2bsia, X11, Blake256R14, and Equihash. Blake2bsia compatible mining rigs mine sia (SIA) and handshake (HNS), while X11 compatible rigs mine dash (DASH) and cannabiscoin (CANN). Blake256R14 mines decred (DCR) while Equihash-based machines can mine zcash (ZEC), hush (HUSH), and zencash (ZEN).

SHA256 miners mining bitcoin (BTC) with around 100 TH/s at $0.12 per kWh, can get up to $27 per day mining. The top bitcoin miners process at speeds up to 100 TH/s but SHA256 miners with at least 11.5 TH/s can turn a small profit. During the next few months, a number of next-generation miners are slated to launch.
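
As a rough sketch of the arithmetic behind per-day figures like these, the number a miner ultimately cares about is gross revenue minus electricity cost; the revenue and wattage inputs below are illustrative assumptions, not published specifications.

```python
# Rough sketch: net daily mining profit = gross revenue minus electricity cost.
# The revenue and wattage inputs are illustrative assumptions, not published specs.
def daily_profit(gross_usd_per_day: float, power_watts: float, usd_per_kwh: float = 0.12) -> float:
    kwh_per_day = power_watts / 1000 * 24  # energy consumed in a day
    return gross_usd_per_day - kwh_per_day * usd_per_kwh

# e.g., a machine earning $36 per day gross and drawing 3,250 W at $0.12 per kWh
print(f"${daily_profit(36.0, 3250):.2f} net per day")  # -> $26.64 net per day
```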

Upcoming mining rig releases that pack a lot more hashpower will be dedicated to consensus algorithms like Ethash and SHA256, according to a few prior announcements. For instance, in July 2022, Bitmain is expected to release the Antminer S19 XP (140 TH/s) and Innosilicon's A11 Pro ETH (2,000 MH/s) is reportedly coming soon as well.

What do you think about the top consensus algorithms today and the profits these networks can obtain? Let us know what you think about this subject in the comments section below.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

Read the rest here:

Data Shows a Myriad of Crypto Networks Are More Profitable to Mine Than Bitcoin Mining Bitcoin News - Bitcoin News

No Credit Score? No Problem! Just Hand Over More Data. – The New York Times

The company is also something of a regulatory guinea pig: Upstart was the first business to receive a no-action letter from the Consumer Financial Protection Bureau. The letter essentially said the bureau had no plans to take any regulatory action against the company in return for detailed information about its loans and operations.

Though the bureau didn't recreate Upstart's results on its own, it said the company had approved 27 percent more applicants than the traditional model, while the average interest rates they paid were 16 percent lower. For example, near-prime customers with FICO scores from 620 to 660 were approved about twice as frequently, according to company data. Younger and lower-income applicants also fared better.

Upstart, which also agreed to be monitored by two advocacy groups and an independent auditor, takes into account more than 1,000 data points inside and outside a consumer's credit report. It has tweaked its modeling at times: it no longer uses the average incoming SAT and ACT scores of a borrower's college but includes the person's college, area of study and employment history. ("Nurses rank well, for example, because they're rarely unemployed," Mr. Girouard said.) The amount that borrowers are asking for may also be a factor: If they are seeking more than Upstart's algorithms believe is appropriate, that may work against them.

Other companies work in a similar way, although the methods and data they use vary.

TomoCredit, for example, will issue a Mastercard credit card to applicants - even those with no credit score - after receiving permission to peer at their financial accounts; it analyzes more than 50,000 data points, such as monthly income and spending patterns, savings accounts and stock portfolios. Within two minutes, consumers are approved for anywhere from $100 to $10,000 in credit, to be paid off weekly. On-time payments help build users' traditional credit files and scores.
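
A heavily simplified sketch of the general idea behind this kind of cash-flow underwriting follows; it is a generic illustration, not TomoCredit's or any lender's actual model, and the features, weights and thresholds are invented.

```python
# Heavily simplified sketch of cash-flow-based underwriting. The features, weights
# and thresholds are invented for illustration; no real lender's model is shown here.
from dataclasses import dataclass

@dataclass
class Applicant:
    monthly_income: float
    monthly_spending: float
    savings_balance: float

def credit_limit(applicant: Applicant) -> float:
    """Score an applicant from account data alone and map the score to a credit limit."""
    surplus = applicant.monthly_income - applicant.monthly_spending
    if surplus <= 0:
        return 0.0
    # Toy score: weight free cash flow plus a capped savings cushion.
    score = 0.7 * surplus + 0.3 * min(applicant.savings_balance / 6, surplus)
    return max(100.0, min(10_000.0, round(score, -2)))  # clamp to the $100-$10,000 range

print(credit_limit(Applicant(monthly_income=4200, monthly_spending=3100, savings_balance=9000)))
```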

Zest AI, a Los Angeles company that already works with banks, auto lenders and credit unions, is also working with Freddie Mac, which recently began using the companys tools to evaluate people who may not fit squarely inside traditional scoring models.

Jay Budzik, Zest AI's chief technology officer, said the company went deep into applicants' credit reports, and might incorporate information from a loan application, such as the mileage or potential resale value of a used car. It can also look at consumers' checking accounts.

"How frequently are they getting close to zero?" Mr. Budzik said. "Those things are helpful in creating an additional data point on a consumer that is not in the credit report."

Continued here:

No Credit Score? No Problem! Just Hand Over More Data. - The New York Times

Will Palantir Be a Trillion-Dollar Stock by 2040? – The Motley Fool

Palantir Technologies (NYSE:PLTR) has been a volatile and polarizing investment since its direct listing last September. The bulls claimed its data-mining platforms would continue to grow as it signed more government and enterprise contracts.

The bears pointed out that Palantir was too heavily dependent on government clients, its enterprise business faced too many competitors, it was deeply unprofitable, and its stock was too expensive.

Palantir's stock price has experienced some wild swings over the past year, but it has still more than doubled since its first trade at $10 per share.

Image source: Getty Images.

Today, Palantir is valued at $41.3 billion, or 27 times this year's sales. The bears will argue that the high price-to-sales (P/S) ratio will limit its upside, especially as rising interest rates and inflation make many high-growth tech stocks less attractive.

But let's look beyond the near-term noise and see if Palantir can still generate big multibagger gains, or even become a trillion-dollar stock, over the next two decades.

Palantir expects to grow its revenue by at least 30% annually between fiscal 2021 and 2025. That forecast implies its revenue will rise from its target of $1.5 billion this year to at least $4.3 billion in 2025.

The company expects that growth to be driven by its new and expanded contracts with government agencies, as well as the growth of its Foundry platform for large commercial customers. The accelerating growth of its commercial business over the past year, which notably outpaced the growth of its government business last quarter, supports that thesis.

Palantir hasn't provided any longer-term targets beyond 2025. But based on the growth trajectory of other big data companies like Salesforce (NYSE:CRM), its annual revenue increase could potentially decelerate and stabilize at about 20% over the following 10 years.

If it hits its target for 2025, then continues to grow its revenue at an average rate of 20% over the following 10 years, it could generate nearly $27 billion in revenue in 2035.

If Palantir's revenue growth then slows down to 15% per year, which would be more comparable to Microsoft's (NASDAQ:MSFT) current rate, it could generate over $53 billion in revenue in 2040.

Assuming the company is still valued at over 20 times sales, its market cap could surpass $1 trillion. But most tech giants that grow their revenue 15% to 25% annually aren't valued at more than 20 times sales.

Microsoft, which is expected to generate 17% sales growth this year, trades at 13 times that estimate. Salesforce, which is expected to generate 24% sales growth this year, trades at just 11 times this year's sales.

Therefore, Palantir's market cap could potentially hit $1 trillion by 2040, but it seems highly unlikely. Instead, it will likely be closer to $500 billion (which would still be a 12-bagger gain from its current valuation) if its stock is trading at a more reasonable P/S ratio of 10.
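
A quick back-of-the-envelope check of that scenario's compounding; the growth rates and multiples are the assumptions laid out above, not forecasts.

```python
# Back-of-the-envelope check of the growth scenario above (assumptions, not forecasts):
# ~30% annual growth to 2025, ~20% through 2035, ~15% through 2040, then a P/S multiple.
revenue_2025 = 1.5 * 1.30 ** 4            # 2021 -> 2025 at 30% per year  (~$4.3B)
revenue_2035 = revenue_2025 * 1.20 ** 10  # 2025 -> 2035 at 20% per year  (~$27B)
revenue_2040 = revenue_2035 * 1.15 ** 5   # 2035 -> 2040 at 15% per year  (~$53B)

for ps in (20, 10):
    cap = revenue_2040 * ps / 1000  # market cap in $ trillions
    print(f"2040 revenue ~${revenue_2040:.0f}B -> market cap ~${cap:.2f}T at {ps}x sales")
```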

Instead of focusing on Palantir's path toward joining the 12-zero club, investors should focus on its ability to generate sustainable growth.

The company has gained a firm foothold with the U.S. government, but it still faces competition from internally developed systems. Immigration and Customs Enforcement (ICE), for example, has been developing its own platform to replace Palantir's Falcon. If other agencies follow ICE's lead, the company's dream of becoming the "default operating system for data across the U.S. government" could abruptly end.

Palantir is making solid progress in the commercial market, but its Foundry platform still faces plenty of indirect competitors like C3.ai, Salesforce's Tableau, and Glue from Amazon Web Services.

The company likely believes its reputation as a battle-hardened platform for the U.S. military and government agencies will attract more enterprise customers. But there's no guarantee that this appeal will last for decades or fend off newer, hungrier, and more disruptive players in the data-mining market.

I still believe Palantir's stock is a promising long-term investment on the secular growth of the data-mining and analytics market. However, there's a lot of growth already baked into the stock, and its high valuations could limit its near-term and long-term potential. Palantir probably won't hit a trillion-dollar valuation within the next two decades, but it could still outperform the market and generate impressive multibagger gains.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Follow this link:

Will Palantir Be a Trillion-Dollar Stock by 2040? - The Motley Fool

HotSpot Therapeutics Closes $100M Series C to Advance First-in-Class Allosteric Drug Discovery Platform to the Clinic – PRNewswire

BOSTON, Nov. 29, 2021 /PRNewswire/ -- HotSpot Therapeutics, Inc., a biotechnology company pioneering the discovery and development of first-in-class allosteric therapies targeting regulatory sites on proteins referred to as "natural hotspots," today announced the close of its oversubscribed $100 million Series C financing, bringing its total funding to $190 million.

Led by Pivotal bioVenture Partners, with significant participation by LSP and B Capital Group, the round includes new investors Monashee Investment Management, LLC, CaaS Capital Management, Revelation Partners and Pavilion Capital, as well as participation from previous investors, Atlas Venture, Sofinnova Partners, SR One Capital Management, funds managed by Tekla Capital Management, LLC, and MRL Ventures Fund. Ash Khanna, PhD, of Pivotal bioVenture Partners, and Fouad Azzam, PhD, of LSP, will join the HotSpot Board of Directors.

The new financing will be used to continue advancing HotSpot's Smart Allostery platform, with a focus on undrugged and poorly druggable targets, as well as the company's existing pipeline.

"With the support of top-tier healthcare and technology investors, we will expand on the significant productivity of the Smart Allostery platform to enable a treasure trove of sought-after disease targets and thereby develop medicines that broadly benefit patients," said Jonathan Montagu, Co-founder and Chief Executive Officer of HotSpot Therapeutics. "We have diligently established a deep pipeline of product opportunities in cancer and autoimmune disease, each offering a clear path to clinical value through precision and patient-targeted trial design."

HotSpot's Smart Allostery platform unlocks a vast range of disease-relevant proteins for the first time through the identification of protein pockets called "natural hotspots." Natural hotspots are pockets that are decisive in protein function yet previously unexploited using conventional allostery approaches. The Smart Allostery platform encompasses a broad suite of AI-enabled technologies and the industry's largest and most diverse chemical library tailored to hotspots. The company is leveraging the Smart Allostery platform to uncover, capture, and drug natural hotspots across a wide array of disease-causing proteins.

"Within 36 months, HotSpot has established a new paradigm for allostery drug discovery that is reproducibly delivering small molecules across multiple target classes, including transcription factors and E3 ligases," said Ash Khanna, PhD, Venture Partner at Pivotal bioVenture Partners. "We are thrilled to be partnering with HotSpot as they advance their pipeline to the clinic and realize the promise of delivering potentially lifesaving therapeutics to patients."

Earlier this month, HotSpot announced new data validating its Smart Allostery platform in the elucidation and preclinical evaluation of a novel allosteric inhibitor of the E3 ubiquitin ligase CBL-B, an important target in cancer immunotherapy. The data show successful targeting with the first and only selective small molecule inhibitors of CBL-B. Identified via the company's Smart Allostery platform, these small molecules promote T cell responses in vitro and in mice. Through allosteric inhibition of this negative regulator of immune cells, HotSpot's small molecule CBL-B inhibitors offer the potential for increased immunotherapy efficacy for cancer patient populations that have historically exhibited poor treatment responses.

About HotSpot Therapeutics: HotSpot Therapeutics is targeting naturally occurring pockets on proteins called "natural hotspots" that are decisive in the control of cellular protein function. Largely unexploited by industry, these pockets are highly attractive for drug discovery and enable the systematic design of highly potent and selective small molecules that exhibit novel pharmacology. The company's Smart Allostery technology platform utilizes AI-driven data mining of large and highly diverse data sets to identify pockets that matter on proteins, integrated with a tailored pharmacology toolkit and bespoke chemistry to rapidly deliver superior hotspot-targeted small molecules. The company has successfully exploited natural hotspots across multiple classes, including E3 ligases, kinases, and transcription factors. HotSpot has established a product pipeline of first-in-class small molecules for the treatment of cancer and autoimmune diseases, each enabled by precision and patient-targeted clinical design. To learn more, visit www.hotspotthera.com.

SOURCE HotSpot Therapeutics

Read more:

HotSpot Therapeutics Closes $100M Series C to Advance First-in-Class Allosteric Drug Discovery Platform to the Clinic - PRNewswire

Bigger contracts and crypto mining ignite DC Twos FY22 vision – Stockhead

It takes a special business to capitalise on remote work trends and cryptocurrency's emergence. Data centre and cloud computing play DC Two is tracking to do exactly that.

DC Two (ASX:DC2) is a vertically integrated data centre, cloud and software business which last quarter generated a quarter-on-quarter recurring revenue increase of 20% to $568,264, impressive growth by any measure.

But management believes there's potential for exponential growth in the business, as it moves closer to securing lucrative Tier III accreditation for its flagship Bibra Lake data centre in WA, and simultaneously explores a modular space attracting attention from crypto mining plays locally and abroad.

"There really is a few different aspects of this business that we are particularly excited about," DC Two executive director Blake Burton told Stockhead.

"Over the last year we've been building out that data centre at Bibra Lake to get it to a point of Tier III accreditation from the Uptime Institute."

"At the same time, we've got the DC Two regional modular data centre side of the business, which is growing and attracting interest from overseas because it allows us to provide energy in an ESG-friendly way."

It's clearly an exciting time for DC2 as it grows from its small business roots to something much larger.

We were all thinking it. The world of digital infrastructure is governed by an organisation called the Uptime Institute which measures facilities across the world using a tier system.

It's a measurement which allows customers to better understand the level of facility they're using for their data. Tier III is the second-highest certification achievable.

"Tier III is really important to us, because it will allow us to access contracts with a lot of government enterprises and medium-sized businesses," Burton said.

"Tier III basically means that the way your data centre is built allows it to keep your services online for at least 99.982% of the time. It's all about keeping your data and services online."

"Smaller data centres can't do that, which bigger customers view as a risk to their business. Tier III certification will allow us to actually start targeting the medium- and larger-sized businesses that our current facilities don't allow us to."
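
(For context, 99.982% availability works out to at most 0.018% of 8,760 hours, or roughly 1.6 hours of downtime per year.)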

The Bibra Lake facility has already received ISO27001 certification, an internationally recognised security management standard regarded as best practice for information security management systems.

Grouping Tier III and ISO27001 is a very good thing when it comes to attracting new business. Which is timely, since a $2.5 million capital raise announced in September flagged the expansion of DC2s sales team.

Business growth: you love to see it. The data storage industry appears to be going exactly the same way.

"The whole world has experienced changes with COVID; there's been that big shift from having everything stored on a server at the office to people connecting remotely," Burton said.

"People are now working from home, they're connecting remotely to an email server or the program they use, and to facilitate that remote connection you need the data centres and cloud platforms."

"People have really started to see this shift from the physical to the virtual over the last few years, and we feel it highlights the importance of this industry and what we're looking to do."

At first glance this subhead may look like a list of 2021 investment buzzwords, but in DC Two's case it's a real factor driving interest and growth in the company's modular offerings.

DC Twos regional modular facilities are containerised data centres which are placed on renewable energy sites, tapping the power generated onsite to run the data centre equipment.

The first of these is located in WAs Midwest, where the company recently won contracts worth more than $1.7 million and is expanding capacity to around 2 megawatts by the end of 2021.

DC Two is also delivering a modular data centre in Victoria next year, in addition to its data centre project in Western Melbourne.

"What we're doing mostly at the moment with modular is hosting digital currency mining equipment," Burton said.

"We've got people coming from overseas, where there's been a big clampdown in terms of crypto mining, and we're able to offer it in an ESG way with renewable energy."

"At the moment we've got the Midwest, which is a wind farm, we're working on a site in Victoria, which is biogas, and there's another site in Collie under a non-binding MOU which is a solar farm."

"We place customer equipment on these renewable energy sites, and we're able to offer really cheap power to them that they can't get in the metro area. In some cases, we actually also sell crypto mining equipment to the customer."

Having built itself primarily in WA in the early stages of the business, DC Two is also looking at the east coast post Tier III accreditation at Bibra Lake.

The company currently has a Northern Territory cloud platform in operation, with Victoria potentially the next point of call according to Burton.

"Our next push is going to be over east," he said.

"Victoria is a really interesting state for us, because we're already working on modular there."

"Now we can potentially look at running our own cloud platform and services over there as well."

"That's likely shaping as our first relatively big investment outside of Western Australia."

This article was developed in collaboration with DC Two, a Stockhead advertiser at the time of publishing.

This article does not constitute financial product advice. You should consider obtaining independent advice before making any financial decisions.


Read this article:

Bigger contracts and crypto mining ignite DC Twos FY22 vision - Stockhead