Category Archives: Data Mining
Springer Nature becomes first publisher to partner with CiteAb – Research Information
Reagents - core components in chemical, biochemical and related lab work - are essential for researchers and academics to plan their experiments and develop their research. However, many experiment reproducibility problems can be tied directly to reagents. It is estimated that up to $28 billion is wasted annually on irreproducibility in pre-clinical research, which has a large global impact on the development of high-quality research to tackle medical and environmental problems, owing to the time wasted correcting the reagents the work is based on.
The partnership between Springer Nature and CiteAb, the first with a publisher for the data company, will see selected journal and book citation data from the publisher integrated into CiteAb's search engine, helping researchers make more informed decisions when identifying the best reagent for their experiments. With over 3 million citations and 10 million+ reagents now included in the platform, researchers are provided with a more comprehensive view of the most commonly used products in their research field, enabling them to: see how and where they are used and to what effect; search for what they need in one place; and have access to information on the suppliers of reagents, all of which helps to save them time and money when identifying the products that are most appropriate for their own experiments.
Commenting on the partnership, Robin Padilla, PhD, director of product management, Digital Life Science Solutions at Springer Nature said: 'We are delighted to be the first publisher to partner with CiteAb, as we look to utilise the expertise of the Springer Nature Data Solutions team alongside CiteAb's fresh approach to the market, to help find solutions to some of the data and resource challenges that face researchers when conducting experiments.
'We know that it is incredibly time intensive for researchers to find not only high quality content, but the most suitable resources - data and products - that they need for their lab, experiment and research work. This partnership seeks to address that challenge by combining our high quality data and content with CiteAb's innovative platform and search functionality, and using state of the art text and data mining APIs to streamline that search process.'
CiteAb's CEO, Dr Andrew Chalmers added: 'We are extremely excited to be partnering with Springer Nature. Their commitment to publishing research of the highest quality aligns with our commitment to generating the highest quality data. Accessing their content will allow us to generate unique data and use it to inform researchers and suppliers, ultimately helping them to advance science more quickly.'
Read more here:
Springer Nature becomes first publisher to partner with CiteAb - Research Information
The global automotive artificial intelligence market is projected to grow from USD 2.3 billion in 2022 to USD 7.0 billion by 2027; it is expected…
ReportLinker
The key factors contributing to the growth of the automotive artificial intelligence market include the growing adoption of ADAS technology by OEMs and the increasing use of AI to make buying decisions.
New York, Sept. 05, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Automotive Artificial Intelligence Market by Offering, Technology, Process, Application, Component and Region - Global Forecast to 2027" - https://www.reportlinker.com/p05090296/?utm_source=GNW

Safety is becoming a prime concern in terms of vehicle features. A majority of accidents occur because the driver is either distracted or has lost focus due to drowsiness. The World Health Organization (WHO) estimates that approximately 1.3 million road traffic deaths occur globally each year. Several driver assistance systems have been developed to assist the driver and significantly reduce the number of accidents. Such systems warn inattentive drivers of approaching danger and help increase safety. Driver assistance systems have gained significant importance over the past few years. Hence, the demand for adaptive cruise control (ACC) and ADAS is increasing, a trend estimated to continue for the next five years. Improving infrastructure, intensifying efforts by automobile manufacturers to offer improved features, and changing lifestyles worldwide have boosted the overall sale of premium passenger cars. However, beyond a certain degree, these factors will not influence the demand for driver assistance systems with automotive AI in the overall market.
Increasing use of AI to make buying decisions
AI is not only used in the production of cars and the delivery of improved user experiences; it also makes the process of buying and selling cars far easier. Future customer interface systems with AI capabilities may even provide recommendations for the best cars based on the driver's insurance, health, and penalties received.
Next-generation automobiles can also receive real-time information on traffic jams or any other emergency, and they can use AI to build detailed 3D images of actual roadways.
APAC is the fastest-growing region in the automotive artificial intelligence market
The automotive artificial intelligence market in the Asia Pacific is expected to grow at the highest CAGR from 2022 to 2027, owing to technological advancements and financial support from governments. Increasing population, improving lifestyles, and growing economies have accelerated the pace of passenger car demand in the Asia Pacific.
China, Japan, India, and South Korea are the key countries in the Asia Pacific region for the automotive artificial intelligence market. Toyota (Japan), Hyundai Motor Company (South Korea), and Honda Motor Company (Japan) are among the top companies in the region operating in the automotive artificial intelligence market.
The breakup of primaries conducted during the study is depicted below:
By Company Type: Tier 1 - 55%, Tier 2 - 25%, and Tier 3 - 20%
By Designation: C-Level Executives - 60%, Directors - 20%, and Others - 20%
By Region: North America - 40%, Europe - 30%, APAC - 20%, Rest of World - 10%
Research Coverage
The report segments the automotive artificial intelligence market and forecasts its size, by volume and value, based on Region (North America, Europe, Asia Pacific, and RoW), Offering (Hardware, Software), Technology (Deep Learning, Machine Learning, Computer Vision, Context-aware Computing, and Natural Language Processing), Process (Signal Recognition, Image Recognition, and Data Mining), Application (Human-Machine Interface, Semi-autonomous Driving, Autonomous Vehicle, Identity Authentication, Driver Monitoring, and Autonomous Driving Processor Chips), and Component (GPU, Microprocessors (incl. ASIC), FPGA, Memory and Storage Systems, Image Sensors, Biometric Scanners, and Others).
The report also provides a comprehensive review of market drivers, restraints, opportunities, and challenges in the automotive artificial intelligence market. The report also covers qualitative aspects in addition to the quantitative aspects of these markets.
Key Benefits of Buying This Report
1. This report includes market statistics pertaining to the offerings, technology, process, application, components, and region.
2. An in-depth value chain analysis has been done to provide deep insight into the automotive artificial intelligence market.
3. Major market drivers, restraints, challenges, and opportunities have been detailed in this report.
4. Illustrative segmentation, analyses, and forecasts for the market based on offerings, technology, process, application, components, and region have been conducted to provide an overall view of the automotive artificial intelligence market.
The report also includes an in-depth analysis and ranking of key players.
Read the full report: https://www.reportlinker.com/p05090296/?utm_source=GNW
About ReportLinker
ReportLinker is an award-winning market research solution. ReportLinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.
How Data Mining Can Positively Impact Your Business – mitechnews.com
DETROIT - Data mining is an often misunderstood term. Consumers hear it and think privacy violation. Small businesses hear it and assume it's a luxury only Fortune 500 companies can afford. In fact, neither assumption is strictly accurate. Appropriately sourced data mining can be ethical and safe for consumers and businesses alike.
And, even modest-sized companies use data mining to their advantage with the right tools and understanding.
In this article, we take a look at some of the many benefits of data mining and reflect on how your business can use it advantageously.
Data mining is the process of extracting insights from large volumes of raw information, usually by identifying and interpreting patterns within the data set. It's a process that typically involves a combination of machine learning and human-driven interpretation, combined to deliver actionable insights for business professionals.
The data sets themselves can be generated by a wide variety of sources, including internal data and even customer surveys. While often seen as a practice exclusive to large businesses, with the proliferation of data technology, as well as the ever-increasing number of professionals with master's degrees in data science, the practice has become more democratized than ever before.
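At its simplest, the pattern-identification step can be sketched as counting which items co-occur across transaction records, a stripped-down form of market-basket analysis. The basket data below is invented purely for illustration; real pipelines use dedicated algorithms such as Apriori or FP-Growth over far larger datasets:

```python
from collections import Counter
from itertools import combinations

# Toy purchase records (hypothetical data for illustration only)
baskets = [
    {"coffee", "milk", "sugar"},
    {"coffee", "milk"},
    {"bread", "milk"},
    {"coffee", "sugar"},
]

# Count how often each pair of products appears together in a basket
pair_counts = Counter(
    pair
    for basket in baskets
    for pair in combinations(sorted(basket), 2)
)

print(pair_counts.most_common(2))  # the most frequent pairs hint at buying patterns
```

Frequent pairs like these are the raw material for the customer insights the article describes.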
Data mining helps businesses better understand their customers by extracting granular insights into their buying and product usage habits. This information is key when it comes to forming closer relationships with customers but it can also be used from a marketing perspective.
Marketers with reliable data access are not only able to enjoy customized marketing campaigns based on the specific interests of their customers, but they can also determine how and when to implement those insights.
For example, data mining might inform a business of how and when its customer base is most active on social media. With this information, it can run highly specific ads on Twitter or Facebook that cater precisely to its demographic.
If their customer base consists primarily of teenagers, they may target ads for the middle of the day, during school lunch breaks. Or, late afternoon, for when students get out of school.
If, on the other hand, the demographic consists mostly of business people, the ads might get posted around 7 PM when adults are most likely to be on social media.
Data mining also allows you to make calculated adjustments to your product. By extracting information directly from customers who have used what you are selling, it's easy to get a clear idea of what people like and what they don't use as much.
For example, software companies update their products every several years. Though they are usually selling different versions of the same product, features are routinely tweaked to increase marketability.
This might mean tweaking features to make them more in line with what consumers are looking for. It might also mean adjusting how they describe their product, fixing ad language to make it more compatible with how consumers think of the software.
Looking at historic customer data, businesses can anticipate future product demand, using projections to accurately predict incoming revenue for a fiscal period. While it is no crystal ball, it is accurate enough for businesses to make reasonable adjustments to their spending habits, equipping them to buckle down in lean periods or invest aggressively during times when cash is flowing.
Mined data can also reveal important customer insights.
Customer service has become very personalized. Nearly 80% of consumers expect this personal touch when dealing with businesses. Data mining helps you provide it, increasing profitability and making your customers happier with one move.
Data mining can also help your business recognize fraud by identifying abnormalities within patterns, and flagging unusual activity. While not a catch-all, data can help differentiate between legitimate transactions and ones that might not be.
The more transactions an algorithm can analyze, the better it gets at detecting fraud, meaning that theoretically, businesses that use data mining will be less susceptible to fraud as time goes on.
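As a minimal sketch of the abnormality-flagging idea described above, a simple statistical rule can flag transactions that deviate sharply from the norm. The function, threshold, and transaction amounts here are all illustrative assumptions, not any vendor's actual fraud model:

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Flag amounts more than `threshold` standard deviations from the mean.

    A deliberately simple anomaly rule; production fraud detection uses
    far richer features and learned models.
    """
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

# Hypothetical transaction amounts; one is wildly out of pattern
txns = [12.5, 14.0, 13.2, 11.8, 950.0, 12.9]
print(flag_anomalies(txns))  # only the $950.00 outlier is flagged
```

Real systems improve on this by learning per-customer baselines from ever-larger transaction histories, which is exactly the "more data, better detection" effect the article notes.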
Data mining does come with its drawbacks. Most notably, complexity. Interpreting large sets of data is difficult, both as a practice and because of the technology it requires. For true data mining success, businesses must consider the services of a data professional.
Scalability is another issue. The tools required to do data mining impactfully are complicated, necessitating large databases that can be difficult for the layperson to manage.
It's also worth keeping in mind that data mining practices have been the source of some controversy amongst consumers. Though the information involved in data mining is typically anonymous ("X customers like banana smoothies," not "Sally likes banana smoothies"), it can still generate bad press, particularly in instances where that data has been sold.
Addressing the former concern is mostly a matter of hiring the right people and using the right tools, while the latter can be addressed with transparency. Handle data responsibly, with the respect and attentiveness it requires, and there won't be any issues.
Bio: Ryan Ayers has consulted a number of Fortune 500 companies within multiple industries including information technology and big data. After earning his MBA in 2010, Ayers also began working with start-up companies and aspiring entrepreneurs, with a keen focus on data collection and analysis.
Here is the original post:
How Data Mining Can Positively Impact Your Business - mitechnews.com
Can AI Be A Patent Inventor? UK Justices To Weigh In – Law360
By Alex Baldwin (September 2, 2022, 5:27 PM BST) -- Britain's highest court will hear a researcher's high-profile attempt to get an artificial intelligence listed as an inventor on a patent application as he pursues his global IP litigation campaign.
The U.K. Supreme Court said Thursday that it will consider the case after the Court of Appeal ruled in September 2021 that only a person could be classified as an inventor under the Patents Act 1977....
The rest is here:
Can AI Be A Patent Inventor? UK Justices To Weigh In - Law360
How does Bitcoin mining keep the lights on? Some companies may have solutions – Technical.ly
Picture that it's June 30, 2021, Baltimore's hottest day of the year at 99°F.
Imagine every house with AC cranking it to 65. Suddenly, everything goes dark and the cool air cuts off from all the demand on the power grid. A second later, electricity returns because, somewhere, a major power consumer (like a factory, industrial plant or Bitcoin mine) shut off the power.
Also in 2021, China, which had more crypto mining activity than any other country, decided to ban Bitcoin mining because it deemed the industry highly pollutive. This opened the door for the United States to become the top location for Bitcoin mining in the world. States like New York, Texas and Pennsylvania, as well as small towns like Coshocton, Ohio, eventually became places where industrial-size Bitcoin mines want to set up shop.
Bitcoin mining essentially involves computers doing intense math to validate transactions, because the decentralized nature of cryptocurrencies needs a verification process to prevent bad actors from manipulating the currency. The process is called proof of work. The first person to solve the math puzzle and add their validated transactions to the overall ledger (that is, the blockchain) receives cryptocurrency as their reward.
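The proof-of-work loop can be illustrated with a toy sketch. Real Bitcoin mining double-hashes an 80-byte block header with SHA-256 against a far harder numeric target; the string format and the difficulty level below are deliberate simplifications:

```python
import hashlib

def mine(block_data: str, difficulty: int):
    """Search for a nonce whose SHA-256 digest starts with `difficulty`
    leading zero hex digits -- a toy version of proof of work."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("alice pays bob 1 BTC", 4)
print(nonce, digest)  # the winning nonce and its qualifying hash
```

Each additional zero of required difficulty multiplies the expected number of hash attempts by 16, which is why mining at network scale consumes so much computation and, in turn, so much electricity.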
The computational power needed to be first and reap the benefits is what drives Bitcoin mining energy demands to 132.48 terawatt-hours (TWh) annually across the world, according to estimates from the University of Cambridge that were reported by Business Insider. Those energy demands in the US, where 35.4% of Bitcoin mining takes place, translate to 0.85 pounds of carbon dioxide per kilowatt-hour (kWh) and 40 billion pounds of carbon dioxide.
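The figures quoted above are internally consistent, as a quick back-of-the-envelope check shows (all numbers are taken from the estimates in the paragraph, not independent measurements):

```python
global_twh = 132.48    # estimated annual worldwide Bitcoin mining consumption (TWh)
us_share = 0.354       # share of Bitcoin mining located in the US
lb_co2_per_kwh = 0.85  # pounds of CO2 emitted per kWh

us_kwh = global_twh * 1e9 * us_share  # convert TWh to kWh, take the US share
us_co2_lb = us_kwh * lb_co2_per_kwh

print(f"{us_co2_lb / 1e9:.1f} billion lb CO2")  # roughly 40 billion pounds
```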
That energy usage is not lost on utility services companies like Baltimore's CPower Energy Management, a company that acts as a liaison between Bitcoin miners and power companies.
"They use a lot of energy," David Chernis, a CPower account executive, told Technical.ly. "That's both a problem and a solution, in that they can also turn off very quickly if needed to."
CPower offers grid reliability and demand response services. On days over 90 degrees or during intense cold, there's stress on the power grid; high-energy consumers need to curtail power consumption so the community at large continues to get power smoothly.
"If they were to turn off a traditional data center, imagine turning off Google. What's going to happen? You're going to see webpages go down," said CEO Daniel Lawrence of Baltimore-based cryptocurrency software firm OBM Inc. "If you turn off a Bitcoin mine, the mine stops generating Bitcoin, but that's about it. There's no user impact."
OBM Inc. created the crypto-mining management platform Foreman to help miners better manage the power consumption of their hundreds of devices that process transactions on the blockchain. This process aims to offset an environmental impact that has prompted bans in countries like China, Egypt, Iraq, Qatar, Oman, Morocco, Algeria, Tunisia and Bangladesh.
The software allows companies like Standard Power, which builds and maintains enterprise-level Bitcoin mines for clients, to automate shutting down Bitcoin mines in five minutes (it used to take two hours to get this done manually).
In 2018, the company bought 125 acres of what used to be a paper mill to create a data center for Bitcoin mining in Coshocton, a town roughly 78 miles northeast of Columbus. The WestRock Paper Mill had shut down in 2015, leaving hundreds jobless. It's an age-old story of manufacturing leaving a small town and its residents out to dry. Places like Coshocton are hoping the new market of cryptocurrency infrastructure can revitalize the town. Coshocton's Mayor Steve Mercer even boasted that refurbishing the plant into a cryptocurrency operation would lead to 100 jobs and $100 million of technological infrastructure.
"At that level of scale, we have a community development project, essentially," said Maxim Serezhin, CEO of Standard Power.
What makes towns like Coshocton attractive to Bitcoin miners is the old industrial infrastructure, the proximity to natural resources (like the Utica and Marcellus Shale gas formations) that creates excess energy capacity for the state, and the proximity to fiber data centers to reduce latency. In rural Pennsylvania, Bitcoin mines have set up shop using waste coal as fuel, sparking debates about issues ranging from the carbon footprint to the noise pollution from Bitcoin mine power generators using natural gas. These debates echo a larger conflict about what it takes to actually mine Bitcoin that has played out elsewhere as the recent crypto crash dovetailed with calls for more oversight.
OBM, CPower and Standard Power want to change the stigma around Bitcoin mining by offering a solution to energy concerns instead of a problem.
"It's a partnership with the community where we give the community the thing it needs most, at the time it needs it most, right at the time that the community needs power the most," Serezhin said. "At those times, we are able to turn our business off and say, we can share the resource that we own with the community, and our ability to do that fundamentally changes everything."
Read more here:
How does Bitcoin mining keep the lights on? Some companies may have solutions - Technical.ly
California AG Asks Hospital CEOs for More Information on Their SDOH Technology – HealthLeaders Media
California Attorney General Rob Bonta is investigating whether data mining software used by healthcare providers is hindering underserved populations in accessing healthcare.
Bonta has sent a letter to the CEOs of 30 California hospitals asking for information on how their clinical software is addressing racial and ethnic disparities in care delivery and whether those algorithms may be discriminating against minorities.
"Our health affects nearly every aspect of our lives, from work to our relationships," Bonta said in a press release. "That's why it's so important that everyone has equal access to quality healthcare."
"We know that historic biases contribute to the racial health disparities we continue to see today," he continued. "It's critical that we work together to address these disparities and bring equity to our healthcare system. That's why we're launching an inquiry into healthcare algorithms and asking hospitals across the state to share information about how they work to address racial and ethnic disparities when using software products to help make decisions about patient care or hospital administration."
Healthcare organizations across the country are tackling social determinants of health by identifying and addressing the barriers to healthcare access, such as home and family concerns, employment, food insecurity and digital literacy. But some are questioning whether the technology they're using to identify and take on those issues is instead causing more problems.
"While there are many factors that contribute to current disparities in healthcare access, quality, and outcomes, research suggests that algorithmic bias is likely a contributor," Bonta's office says in the press release. "For example, data used to construct a commercial algorithmic tool may not accurately represent the patient population for which the tool is used. Or the tools may be trained to predict outcomes that do not match the corresponding healthcare objectives. For example, researchers found one widely used algorithm that referred white patients for enhanced services more often than Black patients with similar medical needs. The problem was that the algorithm made predictions based on patients' past records of healthcare services, despite widespread racial gaps in access to care. Whatever the cause, these types of tools perpetuate unfair bias if they systematically afford increased access for white patients relative to patients who are Black, Latino, or members of other historically disadvantaged groups."
In his letter to hospital CEOs, Bonta is asking for:
"As healthcare technology continues to advance, we must ensure that all Californians can access the care they need to lead long and healthy lives," he said.
Eric Wicklund is the Innovation and Technology Editor for HealthLeaders.
View post:
California AG Asks Hospital CEOs for More Information on Their SDOH Technology - HealthLeaders Media
No longer Fintech v/s bank: It is Fintech with banks – Times of India
Over the past couple of years, fintech in the country has grown at a pace that has helped expand financial services to the masses, making India a benchmark for many global markets. India is labelled as Asia's top fintech hub, with a fintech adoption rate of 87% against the global average of 64%, as explained by RBI Governor Mr Shaktikanta Das.
This present-day enthusiasm shared by Indian financial institutions about the fintech industry is well connected to a natural evolution of the banking sector in India. For the better part of the last decade, traditional financial institutions, which gradually realised the potential of fintech companies, have redefined their time-tested strategies of conducting business. And the biggest enabler is the tech, or technology, of fintech, which is motivating traditional financial institutions to rethink their perspective and forge a partnership for the future.
Embracing the new approach
Over the past 75 years of India's Independence, a majority of the country's financial institutions have made significant strides to increase the pace of their customer operations, and this has been further accelerated with the entry of fintechs into this area.
Leveraging each other's strengths: while traditional financial institutions have garnered immense inherent customer trust riding on their vintage and distribution, the new-age fintechs bring agile data mining capabilities to help them stay connected and relevant to their customers. The combined capabilities can now process historical data on customer behaviour, gradually improving processes and bandwidth, reducing errors, and creating value-added services suited to the needs of today's customers across income levels.
Erasing customer fear and bringing confidence
The partnership also helps new-age fintech companies, with their wide catalogue of financial products designed to meet the needs of modern customers, gain adoption faster. A significant contributor to this phenomenon is the Indian consumer's affinity to bank only on trust. This partnership has helped the fintechs pinpoint customer pain points and devise solutions to bridge past process gaps, making the end customer happier and showcasing the same bank as much more customer-friendly. And through the easy availability of these services on banks' mobile applications, the end consumer is much less inhibited by the traditional way of transacting in cash and is welcoming the digitised process with increased confidence. The ease and effortless usage have helped more and more customers who didn't adopt the banking system in the past to happily use it today. SMEs, kiranas and farmers largely fall in this segment in the new Bharat.
Technology as the enabler
A partnership between financial institutions and fintech institutions visible today is built on the shared agenda of better addressing the problems of the end customers and steered by their common vision of increasing financial inclusion using technology as a great enabler.
Over the past couple of years, the popularity of PaaS has grown exponentially among banks and financial institutions mainly because they offer scalability, ready availability and cost-benefit to the banks. Due to this, not only are banks better able to leverage advanced capabilities, but also meet customer expectations leading to more profitability and retention.
Big data, artificial intelligence and technology, powered by fintechs, generally have been accelerating change in payments, lending, insurance and wealth management.
Opportunities ahead
While the bank-fintech partnership spans the entire spectrum of financial services, the merchant payments space has of late seen increased activity. The new approach combines integrated omnichannel payment solutions with value-added adjacencies, leveraging payments data to deliver relevant and easy-to-use offerings for MSMEs/SMEs.
The result is one platform that solves everything from payments to business growth and working capital, delivered by combining the best capabilities of the bank (payment rails, distribution, underwriting) and the fintech (ease of use, data mining and agility) for a particular segment such as MSMEs.
Redefining the way banks operate
By leveraging the PaaS model and forging equal opportunity partnerships, not only can banks better understand the exact needs of the modern customer, but also positively impact their bottom line. However, it is important to reiterate that for this partnership to truly realize its ultimate potential, banks and fintechs need to work together, as the partnership between trust and technology is at the heart of its success.
Views expressed above are the author's own.
END OF ARTICLE
Continued here:
No longer Fintech v/s bank: It is Fintech with banks - Times of India
‘Big data’ is now the automotive retail sector’s biggest driver, opinion – AM
The newly-appointed head of Keyloop's Alliances division, Megan Harvey, has a vision of automotive retail which sees 'big data' join the dots between sales, aftersales, customer relations and mobility.
In this exclusive Automotive Management (AM) 'guest opinion' article, she outlines the direction she sees the sector heading.
The automotive industry is undergoing a period of unprecedented change. Ownership and retail models are changing with OEMs and retailers redefining their relationships with each other and the consumer.
If the benefits of new ways of working are to be realised, the industry will need to fundamentally reappraise how it uses data if it's to improve the customer experience and find a more efficient way of operating.
OEMs need to connect dealers and a long chain of supply partners to a real-time, shared, single customer view. It's an approach that will deliver a seamless experience to the customer but will also drive the efficient operation of the sales process and aftersales.
If automotive manufacturers don't improve and digitise their client experiences, they won't be in a position to grow into tomorrow's mobility providers.
A vast amount of value remains untapped in automotive data, from richer consumer and vehicle data to the use of AI in advanced analytics and predictive modelling. Data can be used to unlock sales opportunities and drive efficiencies through all processes.
Big data in the automotive sector comprises information on consumer behaviour, preferences, and information on locations and driving habits. This can be combined with vehicle data to anticipate SMR issues and opportunities to generate aftersales revenue. It will also help guide the manufacturing process with real-time feedback on performance.
Automotive engineers are already using data analytics and AI to model a vehicle's performance. It is likely that, with an increased flow of data, model iterations and advances will be made more quickly through digital twin technologies that create a digital version of the vehicle that can be tested rapidly in the virtual environment. The industry will see a marked acceleration in the roll-out of new technologies and models as a result.
Predictive analytics is widely used to understand consumer buying trends and to forecast the future utilising methods like data mining/modelling, machine learning, and AI. It enables OEMs and retailers to better anticipate consumer trends in the future and make better-informed decisions.
Customers can be proactively engaged with the right offer at the right time to add value to their ownership journey. Customer lifetime value (CLV) is a measure of the average customer's revenue generated over their entire relationship with a company. It will increasingly be important as OEMs seek to drive revenue from in-life sales such as over-the-air subscriptions and MaaS models.
Data will help OEMs and dealers anticipate customer needs and fulfil them before the customer looks elsewhere.
The change is happening now. Industry estimates show the potential car-buying cohort in 2025 will consist of more than 45% millennials. They will make up the largest segment of new-car buyers, making it crucial for OEMs and retailers to comprehend and cater to their tastes.
To predict the needs of this first generation of integrated-mobility clients over the next five to ten years, automotive players need to look beyond the automotive sector.
The consumer technology industry is a wonderful location to seek ideas since businesses like Airbnb, Amazon, and Uber consistently raise the bar for what constitutes a great digital customer experience.
Today's consumers demand seamless and dependable service, capable advisors, personalised omnichannel communication, round-the-clock assistance, and relevant social media engagement.
The industry will need to work together to deliver these changes. Technology teams will be required to work in partnership, and data must flow seamlessly between companies to enable the benefits to be unlocked.
It's an approach that underpins the mission of the Alliances division at Keyloop, which is bringing together people and technology from across the industry to accelerate change.
Together we can deliver the change that is required and keep the automotive industry at the forefront of technology and customer experience.
Author: Megan Harvey is EVP of Alliances and OEMs at Keyloop.
View original post here:
'Big data' is now the automotive retail sector's biggest driver, opinion - AM
Seismic Noise Analysis and its Role in Mining Hazard Prevention – AZoMining
Seismic noise consists of continuous vibrations at different frequencies inside the Earth. Typically, three common seismic noise sources are encountered in the field: natural seismic noise, man-made electromagnetic interference (EMI), and man-made seismic noise. This noise can be used to monitor a mine's seismicity and structural integrity and to detect potential ground hazards.
The mining process is non-spontaneous and can induce stresses in the mining region. On average, the mining-induced deformation rate exceeds the average slip rate of tectonic plates by at least two orders of magnitude.
Rock extraction creates underground openings, and the resulting deformation and stress relaxation initiate bulk seismic activity in the mines.
During stress-relaxation, the accumulated potential energy in the rock mass may be released suddenly or gradually.
A sudden inelastic deformation will lead to a seismic event.
The subsequent seismic waves vary in amplitude and frequency depending on the rock's strength and stress state, the size of the seismic source, and the magnitude of the deformation rate during fracture.
The seismic response of rocks in mines can be monitored to quantify, prevent, control, and even predict potential rock mass destabilization resulting in rock bursts. Furthermore, blasting vibrations, energy release due to rock crack propagation, and mechanical vibrations induce microseismic events in underground mines.
Increased mining-induced stress can also cause the filling up of cracks, fractures, pores, and voids, which increases the elastic wave velocity.
Microseismic events of varying magnitudes due to rock fracturing can excite elastic waves in the surrounding rocks.
Although these events are of smaller magnitudes, they occur at shallow depths close to the surface and at a high rate, so they can inflict major damage and loss of life.
Although past data can be useful for comparing the seismic response of rock masses under various mining scenarios, it is not the optimum indicator of future hazards due to the irregular nature of mass loading. Seismic monitoring is widely used to detect, locate, and estimate the size or strength of ground movements.
Seismicity recording in United States mines has ranged from small-scale arrays collecting analog signals with geophones to systems based on personal-computer seismic unified data structures (PC-SUDS).
In Germany, seismicity monitoring at underground coal mines is conducted by 60 seismic stations, which use three-component 4.5-Hz geophones operating at 500 samples per second. About 20 mines in Poland use the seismic network developed by the Central Mining Institute (CMI), comprising conventional and broadband 1-Hz short-period triaxial sensors.
In Australia, fracturing processes due to longwall caving have been delineated using microseismic monitoring systems developed from numerous temporary research networks. Most of these networks utilized a triaxial geophone sensor. These sensors are grouted into the drilled surface boreholes that provide real-time analysis of the microseismic data.
Methods such as ambient noise tomography record dispersive surface waves and create a three-dimensional (3D) model of the S-wave velocity structure. These passive methods provide a low-cost solution to image the subsurface, delineate structural integrity, and help avoid potential hazards.
Techniques such as velocity tomography use seismic waves produced by mining-induced microseismic activity, such as explosions, to image the stress state or the rock mass conditions. However, the quality of results from velocity tomography depends on factors such as accurate arrival-time picking, high-resolution source location, ray tracing, and efficient inversion algorithms.
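One of those inputs, arrival-time picking, is commonly automated with a short-term-average over long-term-average (STA/LTA) trigger. A minimal NumPy sketch on a synthetic trace (the window lengths, threshold, and fabricated signal are illustrative assumptions, not parameters from any cited study):

```python
import numpy as np

def sta_lta_pick(trace, fs, sta_win=0.05, lta_win=0.5, threshold=4.0):
    """Return the first sample index where the ratio of short-term to
    long-term average energy exceeds `threshold`, or None if it never does."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    energy = trace ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))  # prefix sums: O(1) windows
    for i in range(nlta, len(trace) - nsta):
        sta = (csum[i + nsta] - csum[i]) / nsta   # short window just after i
        lta = (csum[i] - csum[i - nlta]) / nlta   # long window just before i
        if lta > 0 and sta / lta > threshold:
            return i
    return None

# synthetic trace: weak background noise, then a stronger "arrival" at 1.0 s
fs = 1000.0
rng = np.random.default_rng(0)
trace = rng.normal(0, 0.05, 2000)
trace[1000:] += rng.normal(0, 1.0, 1000)
pick = sta_lta_pick(trace, fs)  # should trigger near sample 1000
```

Production pickers (e.g. in ObsPy) refine this basic ratio with tapering and recursive averaging, but the trigger logic is the same.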
Velocity tomography is primarily used to monitor rock mass stability, detect geological structure and discontinuities, and prevent rock burst hazards.
A recent study by Wu and team evaluated noise cross-correlation functions (CCFs) by analyzing one month of data from a 17-geophone surface array at a longwall coal mine. Their work established that CCFs related to the ambient noise could be isolated and used to trace mining-induced signals, enabling further research on structural integrity.
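The core operation behind such CCFs is cross-correlating simultaneous records from pairs of stations: the lag of the correlation peak gives the travel time of energy between them. A toy NumPy sketch, where the shared noise source and the 0.2-second inter-station delay are fabricated purely for illustration:

```python
import numpy as np

fs = 100.0                       # samples per second
n = 60 * int(fs)                 # one minute of synthetic "ambient noise"
rng = np.random.default_rng(1)
source = rng.normal(size=n)

lag_samples = 20                 # station B sees the wavefield 0.2 s later
station_a = source
station_b = np.roll(source, lag_samples)

# circular cross-correlation via FFT; the peak lag recovers the delay
spec = np.fft.rfft(station_b) * np.conj(np.fft.rfft(station_a))
ccf = np.fft.irfft(spec, n)
peak = int(np.argmax(ccf))
lag = peak if peak < n // 2 else peak - n
print(lag / fs)                  # recovered inter-station delay in seconds
```

Real ambient-noise workflows stack many days of such correlations over many station pairs before inverting the recovered travel times for a velocity model.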
Other recent studies have reported using passive seismic data to model the subsurface as an attractive approach for implementing mine safety and preventing hazards.
Seismic noise tomography is another environmentally friendly, inexpensive technique used in exploration and subsurface structure detection. This, coupled with the latest advancements in compact autonomous seismic recording stations and low-cost data collection spanning over a month, makes it an attractive prospect in future mining applications.
Methods such as velocity tomography are advantageous due to their broad monitoring range, allowing them to be utilized at a mine or regional scale and to carry out long-term continuous observations. However, the technique is limited by deviations of the calculated source locations from their actual positions and can be less reliable in mined-out areas with sparser ray coverage.
Velocity tomography has been used effectively to analyze mining-induced stress distribution in hard rock and underground coal mines. Hence, this method's ability to assess rock instability and disturbance can help in the early prediction and evaluation of mining-induced hazards and in enhanced ground control. However, its results are not always reliable, and improvements in timeliness and accuracy are still required.
Swanson P, Boltz MS, and Chambers D. Seismic monitoring strategies for deep longwall coal mines. Spokane, WA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health, DHHS (NIOSH) Publication No. 2017-102, RI 9700. https://www.cdc.gov/niosh/mining/works/coversheet1944.html
Mendecki, Aleksander & Lotter, Ernest. (2011). Modelling Seismic Hazard for Mines. https://www.researchgate.net/publication/236183155_Modelling_Seismic_Hazard_for_Mines
Mendecki, Aleksander & Van Aswegen, Gerrie & Mountfort, Peter. (1999). A guide to routine seismic monitoring in mines. https://www.researchgate.net/publication/236204233_A_guide_to_routine_seismic_monitoring_in_mines
Durkin, John, Greenfield, Roy J., Evaluation of the Seismic System for Locating Trapped Miners, Bureau of Mines Report of Investigations/1981, United States Department of the Interior. https://www.cdc.gov/niosh/mining/UserFiles/works/pdfs/ri8567.pdf
Nicola Ramm, Tjaart de Wit & Gerrit Olivier (2019) Passive Seismic Imaging for Mineral Exploration, ASEG Extended Abstracts, 2019:1, 1-3, DOI: 10.1080/22020586.2019.12073177. https://doi.org/10.1080/22020586.2019.12073177
Li, Y.; Deng, H.; Wen, L.; Qin, Y.; Xu, X. Method for Identifying and Forecasting Mining-Induced Earthquakes Based on Spatiotemporal Characteristics of Microseismic Activities in Fankou Lead/Zinc Mine. Minerals 2022, 12, 318. https://doi.org/10.3390/min12030318
Zhu, Qiankun & Zhao, Xingdong & Westman, Erik. (2021). Review of the Evolution of Mining-Induced Stress and the Failure Characteristics of Surrounding Rock Based on Microseismic Tomography. Shock and Vibration. 2021. 1-19. 10.1155/2021/2154857. https://www.researchgate.net/publication/354647108_Review_of_the_Evolution_of_Mining-Induced_Stress_and_the_Failure_Characteristics_of_Surrounding_Rock_Based_on_Microseismic_Tomography
Santiago Rabade, Sin-Mei Wu, Fan-Chi Lin, Derrick J. A. Chambers; Isolating and Tracking Noise Sources across an Active Longwall Mine Using Seismic Interferometry. Bulletin of the Seismological Society of America 2022; doi: https://doi.org/10.1785/0120220031
Read this article:
Seismic Noise Analysis and its Role in Mining Hazard Prevention - AZoMining
Digital Identities Will Change The Nature Of Online Reputation – Forbes
The internet is changing. The era of Web 2, dominated by big tech, social media, streaming, and subscription-based service models, is quickly fading away and giving rise to Web 3. Ownership and control of user data in Web 2 rests firmly in the hands of centralized tech companies.
By contrast, Web 3 allows individuals to seamlessly transfer their data and assets over multiple platforms privately, securely, and transparently. Most importantly, it doesn't expose an individual's information and metadata to commoditization unless the individual wishes to provide it, leaving them with complete control. While this self-sovereign approach to individual ownership and control will apply to most forms of personal information, such as financial and medical history, it will also be incredibly pertinent to our future digital reputations.
A New Type of Identification
Digital reputations can be curated but are often anchored directly to the social media platform, group, or community upon which they were created. Furthermore, an individual's social media persona and following are not directly portable from one platform to another and can easily be revoked at the discretion of the company running the platform. However, in Web 3, individuals will avoid such issues through direct ownership of a single digital identity that goes wherever they go, both online and in the real world.
Similarly, individuals will retain control over visibility and access to the personal information stored within their digital identity, otherwise known as a Self-Sovereign Identity (SSI). This personal data can also be provided via homomorphic encryption or to Secure Enclave processing environments, which allow the data to be used without disclosing any private user information. If medical information does need to be shared with a healthcare provider, for example, it can be selectively disclosed or, ideally, provided as zero-knowledge proofs (ZKPs). Permission can even be time-based, so that the data is removed once a predetermined expiry point is reached.
Conventional reputation could even involve an individual's credit history or credit score. Credit history can be proven by the individual in the form of verifiable credentials or receipts held by the individual proving their transaction history and eliminating the need to store it centrally with a third party.
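One simple way to see how verifiable credentials support selective disclosure is the salted-hash-commitment trick used by several credential formats: the issuer signs only digests, and the holder later reveals a chosen field plus its salt for the verifier to recompute. This is an illustrative sketch only (the field names are invented, and a salted hash is a commitment, not a full zero-knowledge proof):

```python
import hashlib
import secrets

def commit(value: str, salt: bytes) -> str:
    """Salted SHA-256 commitment to a single credential field."""
    return hashlib.sha256(salt + value.encode()).hexdigest()

# issuer side: one salted commitment per field (the issuer would sign these)
credential = {"name": "A. Holder", "over_18": "true", "credit_ok": "true"}
salts = {k: secrets.token_bytes(16) for k in credential}
commitments = {k: commit(v, salts[k]) for k, v in credential.items()}

# holder side: disclose only `over_18`, withholding the other fields
disclosure = {"over_18": (credential["over_18"], salts["over_18"])}

# verifier side: recompute the digest and check it against the signed set
value, salt = disclosure["over_18"]
assert commit(value, salt) == commitments["over_18"]
```

Without a field's salt, its commitment reveals nothing practical about the withheld value, which is the property the credit-history example above relies on.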
Another more current area in which digital identity and portable reputation could prove the most useful is within Decentralized Autonomous Organizations or DAOs. DAOs allow for cooperation between members in a way that has no central authority. There are a variety of different ways in which DAOs can be structured, but they increasingly require trust.
Having a merit-based reputation is essential in this case. With blockchain-based verified digital identity, it is possible to remain anonymous yet have cryptographic proof of a specific human's verified, merit-based credentials, without revealing any of their secure personal information. The digital identity can act as a reputation scorecard, updated in real time, proving what they have contributed along with other relevant reputation data.
What Digital Identities Could Look Like, And Offer
So, what would a system such as this look like?
User metadata and consumer spending information, in the form of verifiable credentials and receipts, can be blind-signed for the owner, encrypted within a digital identity, and then provided across networks and platforms. Financial transaction data, account balances, digital assets, permissions, and social network interactions would all be applicable. The data shared or requested could be customized to specific apps, and all data would be cryptographically secured so that no third party could access it without permission.
Perhaps most importantly, this digital identity could use an individuals biometric data, such as face, fingerprints, or similar. These biometric credentials can be supported with storage of the individual's biometric template in a self-sovereign manner, allowing only the user to access and control them. This would make it impossible for anyone besides the actual owner of the identity to utilize the credentials.
This could then act as a form of universal Verifiable Credentials (VCs) that any platform could accept to confirm who someone is, but that wouldn't give access to any information the individual elected to restrict. As an individual demonstrates experience, earns certifications, or expands their credentials, this digital identity would instantly reflect that, evolving their online reputation.
In fact, the potential for this type of system goes way beyond just the internet. There is also a vast case for Digital IDs in more traditional offline settings. Most likely, through smartphone integration of a supporting ID wallet, individuals could access workplaces, entertainment venues, festivals, and events with verifiable credentials or NFT tokenized access. This will make security at such locations tighter, as only approved persons will be able to gain entry.
Giving Power Back to the Individual
The benefits of digital ID for user security are many and will have major implications for combating scams, fraud, and money laundering. However, perhaps the biggest boon of digital ID adoption is its role in empowering individuals. Centralized Web 2 businesses have had full access to and control over user data for too long; Web 3 will change this.
Some have shared concerns about placing such responsibility in the hands of users. At worst, they can lose access to the digital identity; however, even in this instance, the ID would still be secure and unusable by others. Moreover, users could simply create another digital identity, as they are the only ones in control of the assets required to recreate it, such as photo IDs and biometrics, which others could not substantiate.
In addition to protection from data mining, digital IDs will also protect people from scammers, hackers, and other malicious activity. Data leaks, identity theft, and malware attacks that all too often cause havoc for Web 2 users will be all but eradicated. Even age verification requests can be actioned without revealing a person's age, as the individual's verified credentials are seamlessly tied to their digital ID through zero-knowledge proofs.
SSIs and their integration with Web 3 are sure to dramatically impact how we all interact, both online and in our day-to-day lives. A person's reputation will become a form of currency because, unlike for most of human history, it won't be able to be falsified or obscured. This marks the end of the age when big tech companies govern our information and the beginning of an era where individual control over personal data is the standard and a verified self-sovereign digital identity is the gateway to the internet.
About the author:
Alastair Johnson is Founder and CEO of Nuggets, an award-winning decentralized, self-sovereign identity and payment platform. It's the only platform of its kind that truly brings together payments and digital identity, utilizing self-sovereign data principles.
Follow this link:
Digital Identities Will Change The Nature Of Online Reputation - Forbes