Category Archives: Data Mining
Two technicians work at a bitcoin mining facility in Quebec.
Lars Hagberg | AFP | Getty Images
For years, bitcoin critics have maligned the world's biggest cryptocurrency for polluting the planet. But new data from Cambridge University shows that the geography of mining has drastically changed over the last six months, and experts tell CNBC this will improve bitcoin's carbon footprint.
China's big crypto crackdown this spring set off a chain reaction in the mining world.
For one, it took half the world's bitcoin miners offline practically overnight. Fewer people mining has meant fewer machines running and less power being consumed overall, which slashed bitcoin's environmental impact.
Beijing's new crypto rules also permanently took a lot of older and more inefficient gear offline.
And crucially, China shutting its doors to crypto mining has set off a massive migration. Miners are now heading to the cheapest sources of energy on the planet, which more often than not are renewable.
"The bitcoin network is ruthless in its drive for the lowest cost," said Mike Colyer, CEO of digital currency company Foundry. "Miners around the world are looking for stranded power that is renewable. That will always be your lowest cost. Net-net this will be a big win for bitcoin's carbon footprint."
China has long been the mecca of the crypto mining world, accounting for nearly three-quarters of all bitcoin miners at its peak, according to the Cambridge Centre for Alternative Finance. But after Beijing decided to expel its miners in May, more than 50% of the hashrate (the collective computing power of miners worldwide) dropped off the network.
Today, bitcoin draws roughly 70 terawatt hours of energy per year, or 0.33% of the world's total electricity production. That is almost half of what it was in May and is roughly equivalent to the annual energy draw of countries like Bangladesh and Chile.
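The figures in this paragraph can be cross-checked against each other. A quick back-of-the-envelope calculation, using only the article's own numbers, recovers the world electricity production they imply:

```python
# Sanity-check of the consumption figures quoted above.
BITCOIN_TWH_PER_YEAR = 70.0   # annual draw cited in the article
SHARE_OF_WORLD = 0.0033       # 0.33% of world electricity production

# World production implied by the article's own two numbers.
implied_world_twh = BITCOIN_TWH_PER_YEAR / SHARE_OF_WORLD
print(f"Implied world production: {implied_world_twh:,.0f} TWh/year")

# "Almost half of what it was in May" puts the May figure near:
may_estimate_twh = BITCOIN_TWH_PER_YEAR * 2
print(f"Implied May consumption: ~{may_estimate_twh:.0f} TWh/year")
```

The implied world production of roughly 21,000 TWh per year is in the right ballpark for global electricity output, so the two quoted figures are at least mutually consistent.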
The exodus from China also means that a lot of older mining equipment that was probably long-past due for retirement will never be turned back on.
"It took off, likely forever, a large amount of the most energy inefficient rigs," explained Alex Brammer of Luxor Mining, a cryptocurrency pool built for advanced miners.
Colyer says the overall bitcoin network will now be mostly made up of more efficient rigs that get about double the hashpower for the same amount of electricity. "This continues to significantly improve the security-to-energy ratio of the bitcoin network," he said.
But not all of China's miners are going dark. Many have begun to relocate elsewhere, gravitating to the world's cheapest sources of power.
"The cool thing about bitcoin that is under appreciated by a lot of the naysayers is that it's...like a portable market; you can bring it right to the source of energy," explained Steve Barbour, founder of Upstream Data, a company that manufactures and supplies portable mining solutions for oil and gas facilities.
Because miners at scale compete in a low-margin industry, where their only variable cost is typically energy, they are incentivized to migrate to the world's cheapest sources of power.
"They need to constantly reduce their electricity costs, which is their number one expense, in order to be competitive," said Ria Bhutoria, former director of research for Fidelity Digital Assets.
The data shows that a whole lot of these miners are headed for cheaper pastures in the U.S.
The United States has fast become the new hotspot for the world's global crypto miners. In the last six months, the country has jumped from fifth to second place and now accounts for nearly 17% of all global bitcoin miners. Although China was still solidly in first place as of April, with 46% share, America's share of the market is likely a lot higher now since the Chinese government booted miners in May.
U.S.-based bitcoin mining operators have seen a huge uptick in business. Whit Gibbs, CEO and founder of Compass, a bitcoin mining service provider, says that retail hardware and hosting sales have increased nearly 300% since mid-June.
Darin Feinstein, founder of Blockcap and Core Scientific, says he's seen a rapid rise in mining operations looking to relocate to North America, mostly in the U.S. Fred Thiel of Marathon Digital, another major player in the U.S. mining industry, tells CNBC that if the roughly 500,000 formerly Chinese mining rigs looking for homes in the U.S. are deployed, North America would account for close to 40% of the global hashrate by the end of 2022.
Long-term, this is good news for bitcoin's carbon footprint.
Energy consumption is not equivalent to carbon emissions. While it is relatively easy to determine the amount of energy that is consumed by the bitcoin network, it is much harder to determine its carbon footprint.
An accurate read of bitcoin's carbon emissions would require exact knowledge of the energy mix used to generate electricity used by each bitcoin mining operation. One unit of hydropower, for example, does not have the same environmental impact as the equivalent amount of power sourced from coal. And China's bitcoin mining operations were known for both.
But on the whole, the market is pushing North American energy sources to get greener.
Each year, investment bank Lazard releases a breakdown of energy costs by source. Its 2020 report shows that many of the most common renewable energy sources are either equal to or less expensive than conventional energy sources like coal and gas. And the cost of renewable power keeps going down.
Thiel says that most miners new to North America will be powered by renewables, or gas offset by renewable energy credits. Gibbs estimates that bitcoin mining in the U.S. is more than 50% powered by renewables.
Miners migrating to North America are also preparing for a future in which their energy usage is questioned by prospective investors, and possibly regulated.
Brammer has been helping Chinese clients find new homes. He says that most are aware of the political and normative winds in North America and want to hedge themselves against regulatory risks in the future by establishing new facilities in primarily renewable-powered locations.
"The largest of them are also looking at the potential of going public or are looking for investors to help them grow," Brammer told CNBC. "They realize that public markets nowadays have no appetite for proof of work mining that is powered by non-renewable [energy sources]. I have yet to even have a discussion about a deal involving coal power, which is heartening to us."
Bitcoin mining engineer Brandon Arvanaghi tells CNBC that in the long run, the migration to the U.S., where innovation around bitcoin and renewables is already underway, will be an overwhelming positive for bitcoin's energy mix.
"Places like Texas have cheap electricity, in large part because of subsidies toward wind power," according to Arvanaghi.
Miami Mayor Francis Suarez has also popularized the idea of mining bitcoin with nuclear power in Florida.
"And all this is largely voluntary the federal and state governments haven't even gotten involved to require any renewable mix," continued Arvanaghi.
Not all miners, however, are headed to renewable destinations.
Kazakhstan is now just behind the U.S. in terms of its share of the global bitcoin mining market, with about 8% of all crypto mining. It's home to coal mines that provide a cheap and abundant supply of energy but also ample carbon dioxide emissions.
However, several mining experts tell CNBC they think that Kazakhstan, which neighbors China, is just a temporary stopover on a longer migration west.
Brammer sees large miners going there in the short-term with older-generation equipment. "But as older-generation machines reach the end of their service lives, those companies will likely deploy new machines into more stable and energy efficient and renewable jurisdictions," he said.
Also likely to put a damper on Kazakhstan's popularity is a law newly signed by the president that will introduce extra taxes for crypto miners starting in 2022.
"This will significantly change the incentives for people to deploy capital in Kazakhstan," said Brammer.
New data from Cambridge University shows that the geography of Bitcoin mining has drastically changed over the last six months in the wake of China's massive crackdown.
CNBC has been gathering the opinions of industry experts to get their take on the great miner migration and how it has altered the environmental impact of the industry. They have come to the consensus that crypto mining is now environmentally friendlier than it was when China dominated global hash power.
The report noted that as many as half the world's Bitcoin miners went dark in a matter of days. This was confirmed by the 65% slump in hash rate between May 13 and June 28.
Beijing's heavy-handed approach has forced miners to relocate, and many of them have found friendlier jurisdictions such as Texas that have lower-cost renewable energy sources.
Mike Colyer, CEO of digital currency company Foundry, confirmed that it had been a big plus for Bitcoin.
"Miners around the world are looking for stranded power that is renewable. That will always be your lowest cost. Net-net this will be a big win for bitcoin's carbon footprint."
According to the Cambridge data, almost 17% of all BTC mining is now conducted in North America, and at least 50% of that uses renewable energy. China's previously 65%+ share of the hash pie had dwindled to 46%, according to the data released in April, and it has fallen a lot more since then.
The exodus from China also had an effect on a lot of older and less energy-efficient mining equipment that has now been shut down for good. Alex Brammer of Luxor Mining said, "It took off, likely forever, a large amount of the most energy inefficient rigs."
Alternative energy sources such as hydropower, solar, wind, nuclear, and even flared gas are now being sought as the burning of fossil fuels is phased out.
In a related development, the first-ever green mining exchange-traded fund (ETF) was launched on the New York Stock Exchange yesterday by Viridi. The investment product aims to attract mainstream and institutional investors with a focus on environmental, social, and governance (ESG) issues.
At the time of writing, Bitcoin had regained its psychological support zone at $30K and was trading up 3.2% on the day at $30,700, according to CoinGecko.
The world's leading crypto asset has lost 10% over the past week and is now down 52% from its mid-April all-time high.
Spain already has its first Chief Data Officer to respond to the challenges of the data economy – thedailyguardian.net
The government has appointed Alberto Palomo Lozano as Spain's first chief data officer, a role that already exists in other countries such as France, the United States, the United Kingdom, and Canada. Palomo will be responsible for launching the Data Office, which reports to the Secretary of State for Digitization and Artificial Intelligence and must respond to the major challenges of the data economy and define legal and policy frameworks for engagement and governance.
On Tuesday, the government clarified that the office should establish principles and standards to ensure that data flows across sectors while protecting privacy and respecting the rights of citizens. The Executive emphasized that the office was created to become a key agent in strengthening a framework for data sovereignty in Spain and at the European level, promoting the construction of data centers as well as the processing and storage of this data in Spain.
Specifically, the Data Office's functions include designing strategies and frameworks of reference for data management, and creating spaces for sharing data between businesses, citizens, and public administrations in a secure, well-governed, and usable way, putting data to work in the productive sectors of the economy through big data technologies and artificial intelligence.
The government insisted that this role will be central to establishing clear management of the data ecosystem in Spain, in which the public and private sectors can cooperate with confidence to grow the Spanish data economy.
The new data manager will also be responsible for representing Spain at GAIA-X, a community-based European cloud project. They noted that the Data Office, together with the GAIA-X Regional Center in Spain, will support the deployment of sectoral data spaces, with a particular focus on the sectors of tourism, health, agri-food or sustainable mobility.
Palomo Lozano holds a PhD in theoretical physics from the Autonomous University of Madrid and CSIC, with additional training in data mining and statistical learning. He worked at Iberdrola and then at Huawei, most recently in the deep learning R&D lab of its Canadian branch, where he worked on hardware design to speed up the training of neural networks.
Carme Artigas, Minister of State for Digitalization and Artificial Intelligence, announced a year ago that she would appoint a data manager to advance the government's strategy in this area.
After the appointment was announced, some companies in the technology sector welcomed it. In a statement, Sothis CEO Raul Martinez said: "Having a role to coordinate a national strategy for data management and use is a major boost, making it easier for our companies and public administrations to adopt methodologies that allow them to extract the most value possible from data and, ultimately, be better able to compete as a country."
The director added that in the digital economy information is the most important resource, and that this appointment will help promote the digitization of business processes and public administration, strengthening a genuine data culture in Spain.
Along the same lines, Rafael Quintana, general manager of analytics firm Qlik for Spain and Portugal, noted that data will be a pillar of Spain's economic recovery and that it is vital to have a role that coordinates the strategy.
Why Your Business Needs to Treat Your Edge Data as Capital – SPONSOR CONTENT FROM DELL TECHNOLOGIES AND INTEL – Harvard Business Review
Data is one of the most powerful resources at your organization's disposal, a resource that may seem both infinite and underused.
It seems infinite because it is everywhere and it's growing exponentially, as sectors including finance, telecommunications, pharma, health care, and manufacturing gather more data from more sources. And it seems underused because organizations gather far more of it than they can turn into new strategies to innovate, compete, and grow.
And the pressure is only mounting, with the proliferation of edge computing: all the cameras, sensors, and other Internet of Things (IoT) tools powered by artificial intelligence (AI) and machine learning (ML) that document activity from the remote locations where and when it happens.
Edge computing is yielding a deluge of data, and without the infrastructure to manage it or an organizational culture that embraces it, this incoming data can turn into wasted capital.
That conflict is at the heart of a data paradox. Many organizations say they need more data than they can get their hands on, but they're already collecting more than they can process and analyze. While 64% of respondents to a recent Forrester study, commissioned by Dell Technologies, consider data the lifeblood of their organization, only 23% said they treat data as capital and prioritize its use across the business. And 87% of organizations are neglecting their data technology and processes, their data culture and skills, or both.
Your Capital, Your Risks
What are the risks of squandering this capital?
Organizations operating with legacy infrastructure that locks up their data in silos prevent their workforces from sharing information across practices and mining it for the integrated insights that could boost real-time understanding of their audiences and support their customer experience (CX).
Legacy operations may also leave an organization vulnerable to unnecessary storage costs, data loss or theft, and reputational damage. And even beyond the unpredictable, any routine maintenance and planned inspections that take infrastructure offline amount to costly downtime.
Reducing Dwell Time with Data
With 28% of U.S. products now transported by rail, it's critical to keep the trains moving.
One of the primary metrics railroads operate with is velocity: the total distance traveled (miles) divided by the total travel time (hours). When railcars are moving and operational, railroads can maximize and expand capacity, increase equipment availability, shorten cycle times, and boost on-time performance. But the total travel time also includes any dwell time, the hours when a train is not moving.
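As a toy illustration with invented numbers (not Duos's actual figures), velocity improves directly as dwell time shrinks, because the dwell hours sit in the denominator of the metric:

```python
def train_velocity_mph(distance_miles: float,
                       travel_hours: float,
                       dwell_hours: float) -> float:
    """Average velocity over a trip, counting dwell time
    (hours the train sits still) in the total elapsed time."""
    total_hours = travel_hours + dwell_hours
    return distance_miles / total_hours

# Hypothetical trip: 900 miles, 20 hours moving, 10 hours dwelling.
before = train_velocity_mph(900, 20, 10)
# Cutting dwell time in half raises average velocity:
after = train_velocity_mph(900, 20, 5)
print(f"{before:.1f} mph -> {after:.1f} mph")
```

Halving dwell time on this hypothetical trip lifts average velocity from 30 to 36 mph without the train ever moving faster, which is exactly the lever automated inspection pulls.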
Freight-railroad industry leaders are improving performance by working with Duos Technologies (Nasdaq: DUOT), which provides turnkey technology solutions to automatically inspect in-transit railcars and includes AI-enabled edge-computing servers to capture images, model data, and handle sophisticated analytics and reporting on locationall while the trains keep running at track speeds up to 90 mph.
In strategic locations along 140,000 miles of rail track throughout North America, Duos's network of sensor-agnostic inspection portals runs on edge-computing processors, data servers, storage, and networking that collect, process, and analyze images and data sets of railcar defects. Each location can generate as much as 30 TB of data per day.
By streamlining the previously labor-intensive manual railcar inspection process, this technology dramatically reduces the amount of time a railcar is stopped and in-dwell, which increases the velocity of the train.
Beyond converting this wealth of data into powerful capital for analytics and efficiency, the automated AI-powered edge-computing solution can increase employee safety and security.
Duos's edge-computing portals can automatically detect issues at track speed. Using remote server management tools, Duos can access system logs and event notifications to monitor trends, hardware failures, and firmware and basic input/output system (BIOS) updates, and can rebuild a server remotely, saving $3,000 per instance.
Duos also applies this edge-computing technology to other transportation industries (trucking and intermodal) and transit railroads across North America.
Putting Data to Work
Even organizations outside the freight industry can see that valuable insights no longer come exclusively from data centers and the public cloud. Increasingly, they also come from the edge, where the data is generated.
Supporting your edge infrastructure with the optimal suite of compute, storage, and analytics solutions can help your organization get insights in real time and on locationand uncover hidden growth opportunities.
However plentiful your organization's data from the edge may seem, and however you may be getting by without tapping it to its fullest, a suite of edge-computing solutions can help you convert your data into invaluable capital. Your edge data is capital that could be helping you grow your business. Don't let it go to waste.
Learn more about the Dell solutions that can help your organization use your edge data to its fullest.
Process mining helps organisations discover and improve the performance of their processes, identifying bottlenecks and other areas of improvement.
When implemented correctly, process mining reduces costs, improves service level agreements (SLAs) and empowers teams to solve inefficiencies quickly. It's become an asset for Business Analysts and Operations Excellence Consultants, but crucially, it provides management with rich insights about the performance of their operation.
For those organisations that have successfully implemented process mining, creating and analysing current-state processes is entirely automated, so process discovery is no longer a time-consuming and resource-intensive practice. But what about those organisations that haven't yet seen the benefits of process mining?
If you employ people to execute business processes using IT systems, then you should be exploring process mining.
How does process mining work?
Process mining combines data science and process analytics to mine data from information systems such as Enterprise Resource Planning (ERP) or Customer Relationship Management (CRM) tools. These systems create an event log for every transaction, providing an audit trail of processes: what work is being done, when, and by whom.
Process mining software then uses this information to create a process model. This allows the end-to-end process to be examined, showing the detailed steps taken and any variations. Built-in machine learning models then help surface the root causes of deviations; for example, they might point out that every time a new customer needs proof of address, the process is slowed down. These models enable management to see whether their processes are performing efficiently and, if they aren't, provide the information needed to optimise them.
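A minimal sketch of the discovery step, using a toy event log with invented case IDs and activity names (real process mining tools consume ERP/CRM audit trails and go much further, including the conformance checks and root-cause analysis described above). It builds the simplest kind of process model, a directly-follows graph counting how often one activity immediately follows another within a case:

```python
from collections import Counter, defaultdict

# Toy event log as (case_id, activity) rows, assumed in chronological
# order per case. In practice these rows come from system audit trails.
event_log = [
    ("c1", "receive_order"), ("c1", "check_credit"), ("c1", "ship"),
    ("c2", "receive_order"), ("c2", "check_credit"),
    ("c2", "request_proof_of_address"), ("c2", "check_credit"), ("c2", "ship"),
    ("c3", "receive_order"), ("c3", "check_credit"), ("c3", "ship"),
]

# Group events into per-case traces.
traces = defaultdict(list)
for case, activity in event_log:
    traces[case].append(activity)

# Discover the directly-follows graph: (A, B) -> how often B
# immediately follows A across all cases.
dfg = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {n}")
```

Even on this tiny log, the counts expose the variation: the proof-of-address detour in case c2 shows up as a low-frequency branch off the main receive-check-ship path, which is exactly the kind of deviation a root-cause model would flag.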
Choosing the right places to apply process mining is important. Organisations that apply it to processes that have already been digitised, i.e. processes that use core IT systems, tend to see the best results. It provides an evidence-based view of how processes are performing, and it's an easy sell to senior management once they see where problems and opportunities lie.
Different Types of Process Mining
There are three basic types of process mining: discovery, conformance, and enhancement. Discovery builds a process model purely from the event log; conformance checks the log against an existing model to find deviations; enhancement uses information in the log to extend or improve an existing model.
Why is it so Important?
Lean Six Sigma is a method that relies on a collaborative team effort to improve performance by systematically removing waste and reducing variation, and it has proved to be an effective methodology for reducing operating costs and increasing return on investment. However, identifying opportunities and measuring the effects of improvements has been difficult. Process mining helps by identifying and quantifying the inefficiencies in processes and showing how effective any changes have been. This not only reduces costs, but also drives more innovation, higher quality, and better customer retention.
Process mining's success can be demonstrated by the experience of a large insurance company. A major source of inefficiency and cost for the company was their end-to-end claims process: from FNOL (First Notice of Loss) through claims assessment to final claims payout. Process mining was used to understand how approximately 300,000 claims were routed, which steps in the process had the longest lead times, which had the most variation, and why. The company found that a combination of manual data processing, handling various documents and managing multiple hand-offs between third parties during the claims assessment was adding operating costs whilst also impacting the customer experience negatively. Within eight weeks, process mining provided a rich map of the end-to-end claims process, with insights on pain points and a number of opportunities for improvement. A combination of user training, process automation and process improvement initiatives followed. The result was an astounding 43% improvement in cycle times and over 1,200 hours of time savings generated each month.
Challenges to overcome
But process mining is still a relatively new discipline, so there are still some challenges to overcome.
Organisations that are striving to become digital businesses need to enhance the ability to investigate and analyse processes. The adoption of new automation technologies, such as RPA (robotic process automation), machine learning and NLP (natural language processing), has proven that business leaders want to invest in technologies that improve business performance. Process mining is another tool that organisations will increasingly lean on to achieve their business outcomes.
About the Author
Dan Johnson, Director of Automation, Future Workforce. Dan's expertise in intelligent automation began in 2011 when he was part of the team delivering the first RPA projects into COOP Financial Services. He later joined Accenture as the Insurance Process Automation Lead for UK&I before leading an Automation and AI team at a UK Bank. Clients trust his exceptional experience to deliver their intelligent automation programmes.
IDDP's research fellowships support projects by Ph.D. or other terminal degree holders at any stage of their career. The research should align with IDDP's mission to help the public, journalists and policymakers understand digital media's influence on public dialogue and opinion and to develop sound solutions to disinformation and other ills that arise in these spaces.
This is the second of three Q&As in which GW Today introduces the fellows to the university community.
Kai Shu is a Gladwin Development Chair Assistant Professor in the computer science department at Illinois Institute of Technology. His research and computational tool development address challenges ranging from big data to social media to AI, including disinformation, responsible machine learning, trustworthy social computing, and social media mining. He is a recipient of the Arizona State University (ASU) Fulton Schools of Engineering 2020 Dean's Dissertation Award and the 2020/2015 ASU School of Computing, Informatics and Decision Systems Engineering Doctoral Fellowship. He also is a winner of the 2018 SBP Disinformation Challenge. He has interned at Microsoft Research AI, Yahoo Research and HP Labs.
Q: What is the aim of your research project with IDDP?
A: The goal of my research project is to study the scientific underpinnings of disinformation and to develop a computational framework to detect, adapt and explain disinformation for policy making. I am motivated to advance interdisciplinary research to discover knowledge, enhance understanding and inform actions for enabling trust and truth online.
Q: What is your favorite platform to study? Why?
A: I used Twitter the most for my current research as it provides publicly accessible application programming interfaces to obtain rich social media data. Even though many of the developed algorithms are general to other social media platforms, I often build proof-of-concept frameworks using Twitter data.
Q: What do you recommend social media platforms do to build trust in online environments?
A: Social media platforms are playing an important role to ensure a safe and healthy online information space. I think it is important for these tech giants to discuss and collaborate with third-party fact checkers and researchers, and to implement functionalities to identify and mitigate disinformation and other forms of information operation effectively.
Q: Are there any pertinent policy solutions you would like platforms to adopt to better identify and rectify mis- and disinformation?
A: I think the regulation policies and technologies for combating disinformation are still in the early stages. As a computational scientist, I believe we can benefit from useful policy rules to design effective platform services. The techniques we develop can also facilitate policy design.
Q: Should governments be involved in regulating online spaces? How similar should policies look between social media platforms and state governments?
A: As disinformation grows at unprecedented volumes on social media, it is now viewed as one of the greatest threats to democracy, justice, public trust, freedom of expression, journalism and economic growth. Governments can provide useful insights and can benefit from effective technologies on combating disinformation, on mitigating foreign influence and ensuring national security. A consensus on the policies between social media platforms and governments is desired in the future to better combat disinformation.
Q: Outside of your research with IDDP, what subject matters interest you?
A: My research interests include machine learning, data mining and social computing. I am also interested in inventing machine learning algorithms on weak (noisy, limited and unreliable) data, and building fair, robust and interpretable models that tackle problems in real-world applications.
Inside a Chinese bitcoin mine
New data shows Bitcoin mining in China was already in sharp decline before the latest crackdown by the government.
The research by the Cambridge Centre for Alternative Finance (CCAF) found China's share of mining fell from 75.5% in September 2019 to 46% in April 2021.
It also revealed Kazakhstan was now the third most significant Bitcoin mining nation.
Miners earn money by creating new Bitcoins, but the computing used consumes large amounts of energy.
They audit Bitcoin transactions in exchange for an opportunity to acquire the digital currency.
Global mining requires enormous computing power, which in turn uses huge amounts of electricity, and consequently contributes significantly to global emissions.
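The energy cost comes from the proof-of-work search itself: miners hash candidate blocks over and over until a hash falls below a target. A toy sketch of the idea, vastly easier than real Bitcoin mining (which uses double SHA-256 against a far harder target, on specialized ASIC hardware rather than a CPU):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: find a nonce such that SHA-256(data + nonce)
    starts with `difficulty_bits` zero bits. Every failed attempt is
    wasted work, which is where mining's electricity goes."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = mine(b"example block", 16)
print(f"found nonce {nonce} after ~{nonce + 1} hash attempts")
```

At 16 difficulty bits this takes tens of thousands of hashes on average; the real network performs on the order of 10^20 hashes per second, which is why the aggregate electricity draw is so large.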
The CCAF's Cambridge Bitcoin Electricity Consumption Index shows that at time of writing Bitcoin consumed almost as much electricity annually as Colombia.
In June the Chinese authorities took strong action against Bitcoin.
The authorities told banks and payments platforms to stop supporting digital currency transactions, causing prices to tumble.
The data from the CCAF covers a period before the crackdown, but it shows China's share of global mining power was already in significant decline prior to the action by the Chinese authorities.
The Cambridge researchers observed that the crackdown, once enacted, effectively led to all of China's mining power "disappearing overnight", suggesting that miners and their equipment are on the move.
Kazakhstan is a heavy user of coal-fired generation
Experts say the miners are highly mobile.
"Miners pack shipping containers with mining rigs", said David Gerard, author of Attack Of The 50 Foot Blockchain, "so that in effect they are mobile computer data centres, and they are now trying to ship those out of China".
It's not clear where they will go, but even before the crackdown the geography of mining was shifting.
Kazakhstan, a country rich in fossil fuels, saw an almost six-fold increase in mining - increasing its share from 1.4% in September 2019 to 8.2% in April 2021.
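The "almost six-fold" figure follows directly from the two shares quoted:

```python
# Growth in Kazakhstan's share of global Bitcoin mining power,
# per the Cambridge (CCAF) figures cited above.
sep_2019_share = 1.4   # percent, September 2019
apr_2021_share = 8.2   # percent, April 2021

growth = apr_2021_share / sep_2019_share
print(f"{growth:.1f}x growth")  # roughly 5.9x, i.e. almost six-fold
```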
According to the US Department of Commerce, 87% of Kazakhstan's electricity "is generated from fossil fuels" with coal accounting for more than 70% of generation.
The country is now the third largest miner of Bitcoins, behind the US, which saw its share of global mining power also rise significantly - to 16.8%.
The data also revealed the close ties between sources of cheap electricity and Bitcoin mining.
Researchers found a seasonal movement of mining between Chinese provinces in response, it was suggested, to the availability of hydro-electric power.
Mining moved from the coal-burning northern province of Xinjiang in the dry season, to the hydro-abundant southern province of Sichuan in the rainy season.
The researchers noted that "this seasonal migration has materially affected the energy profile of Bitcoin mining in China", adding that it illustrated "the complexity of assessing the environmental effects of mining".
Sichuan banned Bitcoin mining in June.
Read more from the original source:
Like many of us, schools in the United States are active on social media. They use their accounts to share timely information, build community and highlight staff and students. However, our research has shown that schools' social media activity may harm students' privacy.
As a researcher who specializes in data science in education, I and my colleagues came to the topic of student privacy unintentionally. We were exploring how schools used social media during the early days of the COVID-19 pandemic, specifically March and April of 2020. In the course of this research, we noticed something surprising about how Facebook worked: We could view the posts of schools, including images of teachers and students, even when not logged in to our personal Facebook accounts.
The ability to access pages and pictures even when we were not logged in revealed that schools' posts could not only be viewed by anyone, but also be systematically harvested using data mining methods: research techniques that use computers and statistics to discover patterns in large, often publicly accessible datasets.
Since practically all U.S. schools report their websites to the National Center for Education Statistics, and many schools link to their Facebook pages from their websites, these posts could be accessed in a comprehensive manner. In other words, not only researchers but also advertisers and hackers could use data mining methods to access all of the posts by any school with a Facebook account. This comprehensive access allowed us to study phenomena like violations of students' privacy at a massive scale.
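To illustrate why such systematic access is feasible, a crawler needs only a list of school website URLs (which the NCES directory supplies) and a pattern match to find each school's linked Facebook page. The sketch below is purely hypothetical: the HTML snippets, URLs and helper function are invented for illustration, and the actual tooling used in the study is not described in this article.

```python
import re

# Hypothetical HTML snippets standing in for pages fetched from school
# websites listed in the NCES directory (no real network access here).
school_pages = {
    "https://example-elementary.example.edu": (
        '<footer><a href="https://www.facebook.com/ExampleElementary">'
        "Follow us on Facebook</a></footer>"
    ),
    "https://no-social.example.edu": "<footer>Contact: office@example.edu</footer>",
}

# Match links to a public Facebook page.
FACEBOOK_LINK = re.compile(r"https://www\.facebook\.com/[A-Za-z0-9._-]+")

def find_facebook_pages(pages):
    """Map each school site to the Facebook page it links to, if any."""
    found = {}
    for url, html in pages.items():
        match = FACEBOOK_LINK.search(html)
        if match:
            found[url] = match.group(0)
    return found

print(find_facebook_pages(school_pages))
```

Once a comprehensive list of pages exists, every public post on those pages can be enumerated the same way, which is the sense in which the access is "systematic" rather than manual.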
The easy access to student photos that we encountered comes despite broader concerns about individuals' privacy on social media. Parents, for instance, have expressed concerns about teachers posting about their children on social media.
Fortunately, our search of news coverage and academic publications did not reveal any harms that have come to students because their schools posted about them. However, there are a number of possible risks that identifiable posts of students could pose. For instance, would-be stalkers and bullies could use the postings to identify individual students.
Also, there are newer threats that students may face. For instance, the facial recognition company Clearview collects internet data and social media data from across the World Wide Web. Clearview then sells access to this data to law enforcement agencies, who can upload photos of a potential suspect or person of interest to view a list of potential names of the individual depicted in the uploaded photo. Clearview already accesses identifiable photos of minors in the U.S. from public posts on Facebook. It is possible that photos of students from schools Facebook pages could be accessed and used by companies such as Clearview.
Even though we are not aware of these harms actually occurring, that is no reason not to be concerned. At a time when our privacy is often threatened in surprising ways, as technology journalist Kara Swisher writes, only the paranoid survive. My fellow researchers and I think this cautious view, even a paranoid one, is particularly justified when it comes to students: minors who may not have given their explicit permission to be included in posts.
In our study, we used federal data and an analytical tool provided by Facebook to access posts from schools and school districts. We use the term schools to refer to both schools and school districts in our study. From this collection of 17.9 million posts by around 16,000 schools from 2005 to 2020, we randomly sampled 100 publicly accessible posts and coded them. We determined whether students were named in the post with their first and last name and whether their faces were clearly depicted in a photo. If both of these elements were present, we considered a student to be identified by name and school.
For example, a student in a Facebook post whose photo includes a name in the caption, such as Jane Doe, would be deemed identified.
We determined that 9.3 million of the 17.9 million posts we analyzed contained images. Within those 9.3 million posts, we estimated that around 467,000 students were identified. In other words, we found nearly half a million students on schools' publicly accessible Facebook pages who are pictured and identified by first and last name and the location of their school.
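The half-million figure follows from extrapolating a hand-coded random sample to the full collection: the rate of identified students observed in the sample is multiplied by the total number of image posts. A minimal sketch of that arithmetic, with an illustrative sample tally (the study's actual per-post coding counts are not given in this article):

```python
# Back-of-the-envelope extrapolation from a coded random sample.
sample_size = 100               # posts randomly sampled and hand-coded
identified_in_sample = 5        # illustrative tally, not the study's figure

image_posts = 9_300_000         # posts containing images, per the study

rate = identified_in_sample / sample_size   # identified students per post
estimate = rate * image_posts
print(f"Estimated identified students: {estimate:,.0f}")
```

With a sample of only 100 posts, the sampling error on such an estimate is substantial, which is why the result is reported as an approximation rather than an exact count.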
While many of us already post photos of ourselves, friends and family and sometimes our children on social media, the posts of schools are different in one important sense. As individuals, we can control who can see our posts. If we want to limit it to just friends and family, we can change our own privacy settings. But people do not necessarily control how schools share their posts and images, and all of the posts we analyzed were strictly publicly accessible. Anyone in the world can access them.
Even if one considers the potential harm of this situation to be minimal, there are small steps schools can take that could make a notable difference:
Not posting students' full names would make it much more difficult for individual students to be targeted and for students' data to be sold and linked with other data sources by companies.
Making school pages private means that data mining approaches similar to our own would be much more difficult, if not impossible, to carry out. This single step would drastically reduce risks to students' privacy.
Opt-in media release policies require parents to explicitly agree to have photos of their child shared via communications and media platforms. These may be more informative to parents, especially if they mention that those platforms include social media, and more protective of students' privacy than opt-out policies, which require parents to contact their child's school if they do not want their child's photo or information to be shared.
In sum, schools' Facebook pages are different from our personal social media accounts, and posts on these pages may threaten the privacy of students. But using social media doesn't have to be an either-or proposition for schools. That is to say, it doesn't necessarily come down to a choice between using social media without considering privacy threats or not using social media at all. Rather, our research suggests that educators can and should take small steps to protect students' privacy when posting from school accounts.
Shares of Intrusion Inc (NASDAQ: INTZ) are trading lower after the company announced worse-than-expected preliminary Q2 sales results and said it has engaged an investment banking firm to evaluate various funding sources.
Additionally, the company announced the departure of CEO and President Jack B. Blount. CFO Franklin Byrd and CTO Joe Head will assume operating responsibilities until a successor is named.
Intrusion Inc develops, markets, and supports a group of high-speed data mining, cybercrime, and advanced persistent threat detection products. The company's product line includes TraceCop, a big data tool with IP intelligence that supports forensic investigations, and Savant, network data mining and advanced persistent threat detection software used for data privacy protection.
At the time of publication, shares of Intrusion Inc were trading 51.7% lower at $4.11. The stock has a 52-week low of $4.04 and a 52-week high of $29.90.
2021 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.
The U.S. Environmental Protection Agency has attempted to clean up the Coeur d'Alene Basin since labeling the Bunker Hill Mining Complex a superfund site in 1983. Almost 40 years later, that effort is ongoing with no end in sight.
EPA Remedial Project Manager Ed Moreen said the agency has "cleaned up" approximately 7,200 residential properties with elevated lead in the soil since acid mine drainage treatment began in 1995.
"This area has some of the highest blood lead levels in the nation," Moreen said in a presentation with the National Academies of Sciences Committee on Monday, entitled "The Future of Water Quality in Coeur d'Alene Lake."
There were, and still are, multiple uncontrolled sources of lead throughout the upper Coeur d'Alene Basin, Moreen said. Without control of those sources, he said heavy metal fluxes continue to impact the Coeur d'Alene River.
Moreen explained that most of those metals in the upper basin originate from mining operations surrounding the Bunker Hill Mining Complex. North Idaho mining operations began in the 1800s and continue today around the Silver Valley.
Those operations contributed to more than 100 million tons of mine waste in the basin, he noted, including 2.4 billion pounds of lead dispersed over thousands of acres.
"The Coeur d'Alene Basin was impacted by over 100 years of mining," Moreen said. "Until 1968, roughly 2,200 tons of mine waste was discharged (by mine sites) in the South Fork of the Coeur d'Alene River per day."
Historical mining and milling practices that disposed of waste in the river spread contaminants throughout the flood plain, EPA data shows. Smelter operations, which gave Smelterville its name, also contributed to the soil, sediment, groundwater, and surface water contamination with heavy metals like lead.
"Many of the waste piles were located [in] or adjacent to streams which released into surface water," Moreen explained.
At one time, the Silver Valley was one of the most productive mining districts in the United States, Moreen said.
From Cataldo into Lake Coeur d'Alene, the lower basin or "main stem" of the river continued to see ongoing contamination from the upper basin, EPA Remedial Project Manager Kim Prestbo said.
"There are significant sources of lead that remain downstream in the riverbed banks," she said. "It may require many years for us to get a significant data set in which we can indicate trends."
Cleaning up the "mega-mining site" follows a two-pronged approach that centers on:
Protecting receptors, such as human health, environmental quality, and susceptible wildlife.
Controlling the source of the harmful waste through consolidation and stabilization efforts.
Given the extent of contamination, the EPA has focused on providing "clean oases to protect human health" and wetland rehabilitation, Prestbo said. Current projects follow the 2016 Recreational Sites Implementation Plan developed with the Coeur d'Alene Tribe, Panhandle Health District, and the Basin Environmental Improvement Project Commission.
"Through this work, EPA has identified over 65 private and public recreational sites of concern in the lower basin," Prestbo said.
Cleanup of the basin includes removing and replacing surface soil in people's yards, public playgrounds, parks, commercial properties, and mine waste at nonoperating mine and mill sites in the Upper Basin. EPA also identified several habitat remediation projects like Grays Meadows and Lane Marsh for further research.
Bunker Hill was the first area the EPA took action due to high blood lead levels. Next, and the process presently ongoing, Moreen said, is working on a "top-down approach" concentrated on basin tributaries and areas producing large amounts of contaminants.
Historically, EPA efforts have focused on addressing heavy metals instead of phosphorus levels. However, the agency has begun monitoring phosphate as part of the basin surface water and wetland remediation programs, Prestbo said. There is not yet enough data to gauge the interactions between lead and phosphate, she noted.
Over the next 10 years, the EPA intends to focus on:
Addressing the discrete Mine & Mill sites in Nine Mile and Canyon Creek Basins
Prioritizing sites loading the most dissolved metals
Monitoring remedial action effects on the groundwater collection system at the Bunker Hill site
Reducing lead contamination in humans by remediating public recreation areas
"The Coeur d'Alene River is a conveyor belt of contaminated sediment transported to the lake that will occur for years to come," Prestbo said.
"Given the broad scope of work that we need to accomplish, remediation and restoration must always be in sync to most effectively protect people, wildlife, and aquatic resources," Prestbo later added.
After three decades of the cleanup in the Coeur d'Alene Basin, NAS committee chairman Samuel Luoma asked if the EPA anticipates continued support for the project financially and otherwise.
"We expect to see a continued level of effort, if not an increase in effort as other source sets are addressed in the upper basin," Moreen said. "I think, if anything, we expect to be very busy in the coming decade and beyond."
In their closing remarks, the EPA representatives said it would help them if the NAS identifies:
Spatial/temporal trends and biogeochemical processes that impact metal/metalloid and nutrient levels and cycling within Lake Coeur d'Alene, and
Pathways and processes impacting nutrient levels in Lake Coeur d'Alene to guide local protection management options