
Creating databases to help cure diseases worldwide – University of Georgia

Jessica Kissinger never set out to make databases. From the time she was a little girl, she wanted to be a biologist.

Today, the University of Georgia professor not only studies deadly pathogens like malaria and Cryptosporidium (a waterborne parasite), but also is a driving force behind worldwide, groundbreaking collaborations on novel databases. During her time at UGA, she has received nearly $40 million in federal and private grants and contracts.

These databases can crunch vast amounts of biological information at warp speed and reveal important patterns that pave the way for new approaches to scourges such as Leishmania (common in the tropics, subtropics, and southern Europe), toxoplasmosis (a systemic disease caused by one of the world's most common parasites), and Valley Fever (a fungus borne on the wind that can cause lung and systemic infections). Novel drug and vaccine targets can be developed, as well as fresh insights on life-threatening pathogens.

Fighting infections and developing new drug and vaccine targets requires detailed knowledge of a pathogen and how it functions, explained Kissinger, a Distinguished Research Professor in UGA's Department of Genetics, Institute of Bioinformatics and Center for Tropical and Emerging Global Diseases.

And, like internet searches, the databases are all free. Kissinger said it's likely that pharmaceutical companies are mining some of the information in their quest to discover new therapeutic targets.

"They don't tell us what they're working on," she said. "A database itself doesn't produce a cure. A database can, however, remove most barriers to analysis of existing data."

It once took an entire decade to sequence a single genome, and the cost was many millions of dollars. Today, researchers can sequence a genome in a single afternoon for a few thousand dollars, transforming the field of genomics. Similar astounding advances have reshaped other omics specialties, such as proteomics (the study of proteins), metabolomics (the study of metabolism), transcriptomics (the study of RNA), and epigenomics (the influence of the environment on gene function). These advances mark the Big Data era in biology.

"The power that is unleashed by big data is phenomenal," said Kissinger, and it's a very exciting time in history, with major funders and visionaries across the world forming consortia to create a kind of ideal data universe. Like explorers trekking into a new world, they will make discoveries we might only imagine right now.

Kissinger's innovations began over 23 years ago, while she was a postdoctoral researcher at the University of Pennsylvania studying a single-celled parasite called Toxoplasma gondii. The parasite shares some important features with the malaria pathogen, whose genome was then being sequenced.

"I rounded up genome data from all over the world on Plasmodium (the causative agent of malaria), ran analyses and put it on a website, so I could study the genes it might share with Toxoplasma," she recalled. "It turns out nobody had made the Plasmodium data available for searching before."

Soon she and her adviser, David Roos, had a million-dollar grant to formally establish a malaria database, PlasmoDB. Since its launch in 1999, it has grown to include additional pathogens and has received continual funding from the NIH, most recently up to $38.4 million to maintain what has become the Eukaryotic Pathogen, Vector and Host Informatics Resources knowledgebase (VEuPathDB), covering 14 different pathogens as well as host responses to infections. This comprehensive database is an integrated, centralized resource for data mining on over 500 organisms.

The databases collectively contain over nine terabytes (9,000 gigabytes) of data and have been compared to a Wikipedia for molecular parasitology by the British Society for Parasitology, which noted back in 2006: "We don't know what we would do without it!"

Each month, VEuPathDB receives over 11 million hits from an average of 36,000 unique visitors in more than 100 countries, including India, Brazil and Kenya. A related database on vectors of disease (such as ticks that carry Lyme disease) was recently merged into VEuPathDB. The merger expanded each resource and enables researchers to better explore data on vectors such as ticks and mosquitoes and the pathogens they transmit.

The databases are not just strings of numbers or words; they allow visualizations and graphic interfaces. Already, research is emerging that can help direct vaccine and drug development away from proteins that hosts and pathogens share, in order to protect host cells. Scientists using the databases have discovered proteins that reduce severe malaria and other proteins that protect malaria parasites from the human fever response. They have also found proteins that help Toxoplasma penetrate host cells.

In a single year, an average of 200 publications a month cite VEuPathDB, and to date there have been 24,000 citations in total. Next up: cloud-ready applications and improved integration with still other databases. "These databases have become essential data mining and access platforms for fungal and parasite genomics research," said microbiologist and plant pathologist Jason Stajich of the University of California at Riverside.

"Without powerful, user-friendly tools to analyze it, Big Data is more a curse than a blessing," explained John Boothroyd, an immunologist and microbiologist at Stanford University School of Medicine. "VEuPathDB is just such a tool, and we owe Jessica Kissinger and her colleagues an enormous thank you for their tireless and selfless efforts to first conceive and then continuously improve this absolutely essential resource."

Grants for related projects have come from a wide array of organizations, among them the Bill & Melinda Gates Foundation, the Sloan Foundation, and the World Health Organization. One of those projects, called ClinEpiDB, is home to a multicenter study that contains data from over 22,000 children from seven different sites in South Asia and Africa. This study is the largest ever to investigate the causes of diarrhea in children in lower- to middle-income countries. Other uses of ClinEpiDB include new data on hidden signs of malaria transmission in areas where incidence is declining, and on how breastfeeding protects infants from common infections.

The VEuPathDB database would be enough to secure Kissinger's reputation in the biological sciences, but she has not stopped there. At the University of Georgia, she was a founding member of the Institute of Bioinformatics and served as its director from 2011 to 2019. The Institute's mission is to facilitate cutting-edge interdisciplinary research in computational biology, and the program offers both master's and doctoral degrees. She is a key researcher helping to build a national hub for infectious disease research in partnership with Emory University in Atlanta. The two institutions have grants totaling over $45 million to work on everything from tuberculosis to HIV to malaria.

"These databases are a success beyond my wildest dreams," said Kissinger. "They are made by biologists for other biologists and address a real-life need."


GEM-TRX The Best Cloud Mining Tron (TRX) Services in 2022 – Analytics Insight

GEM-TRX has emerged as the industrys favorite cloud mining service in 2022

Cryptocurrency cloud mining was developed as a way to mine crypto using rented cloud computing power, without the technical know-how needed to install or run any hardware yourself. People can easily participate in crypto mining by opening an account with a cloud mining service and renting hash power for a minimal cost. As such, cloud mining companies have made mining much more accessible and profitable for a wide group of people.

Mining can be a tedious process, both time-consuming and expensive for an individual. Cloud mining services like GEM-TRX make it easy for users to break into the industry without having to deal with all the technical issues that come with setting up their own mining farm. Cloud mining companies provide a dashboard that makes the mining process extremely easy, taking only a few clicks.

GEM-TRX was launched in 2018 and is one of the leading TRX cloud mining service providers. The company's stated purpose is to maximize the interests of each user, so that they can build substantial wealth from relatively little capital; "invest once, enjoy forever" is the principle behind all of its plans. GEM-TRX says it pursues long-term strategic cooperative relationships with its users, which is why it has grown to over 1 million members. Built on the Tron network, the service aims to provide users with a safe, convenient, and efficient TRX cloud mining experience, and users can visit the official GEM-TRX website and register for an account with ease.

The company enables users to mine the TRX cryptocurrency with a low-cost cloud service. Whether it's quantitative trading or DeFi technology, GEM-TRX makes it easy to participate in the blockchain revolution with its cloud mining services. To start cloud mining with GEM-TRX, users can visit the website, register with an email, and deposit TRX; the service will automatically start mining TRX for them.

GEM-TRX features an affiliate program that rewards users with extra TRX. You can invite users by clicking the Share button on the platform, copying your invitation link, and sharing it via social media.

If users invite friends who deposit funds into their accounts, they will receive rebates. Below is the complete breakdown of the various levels and rebates users can accumulate:

Registration Rebate:

If you invite a level 1 user who completes registration, you'll get 30 TRX.

If a level 1 user invites a level 2 user who completes registration, you'll get 20 TRX.

If a level 2 user invites a level 3 user who completes registration, you'll get 10 TRX.

Deposit Rebate: based on your down-line's deposit amount each time.

If a level 1 user deposits 1,000 TRX, you'll get 120 TRX (12%).

If a level 2 user deposits 1,000 TRX, you'll get 20 TRX (2%).

If a level 3 user deposits 1,000 TRX, you'll get 10 TRX (1%).

Trading Rebate: based on your down-line's mining income each time.

If a level 1 user mines 1,000 TRX and earns 50 TRX, you'll get 5 TRX (10%).

If a level 2 user mines 1,000 TRX and earns 50 TRX, you'll get 2.5 TRX (5%).

If a level 3 user mines 1,000 TRX and earns 50 TRX, you'll get 1.5 TRX (3%).
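Taken together, the schedule above amounts to three flat registration bonuses plus percentage rebates on down-line deposits and mining payouts. A minimal Python sketch of that schedule follows; the rates come from the figures above, while the constant and function names are purely illustrative and are not part of any GEM-TRX API.

```python
# Referral rebate schedule as described in the article.
# All names here are illustrative, not an actual GEM-TRX interface.

REGISTRATION_BONUS = {1: 30, 2: 20, 3: 10}   # flat TRX per completed registration
DEPOSIT_RATE = {1: 0.12, 2: 0.02, 3: 0.01}   # share of each down-line deposit
TRADING_RATE = {1: 0.10, 2: 0.05, 3: 0.03}   # share of each down-line mining payout

def deposit_rebate(level: int, deposit_trx: float) -> float:
    """TRX earned when a down-line user at `level` makes a deposit."""
    return deposit_trx * DEPOSIT_RATE[level]

def trading_rebate(level: int, mining_income_trx: float) -> float:
    """TRX earned on a down-line user's mining payout (not the amount mined)."""
    return mining_income_trx * TRADING_RATE[level]

# The article's examples: a level 1 deposit of 1,000 TRX pays a 12% rebate,
# and a level 2 mining payout of 50 TRX pays a 5% rebate.
print(deposit_rebate(1, 1000), trading_rebate(2, 50))
```

Note that the trading rebate applies to the 50 TRX mining income, not the 1,000 TRX mined, which is why the level 1 figure is 5 TRX rather than 100 TRX.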

Launched in 2018, Tron is one of the leading blockchains on the market. It currently has a market cap of $7.6 billion, ranking it as the 25th-largest blockchain on the market.

Tron's native token is TRX, which powers the Tron blockchain and enables users to pay transaction fees and interact with its ecosystem.

Tron also has a highly popular token standard called TRC-20. Tron is often the network of choice for transactions, since fees on the Tron network are minuscule compared to those on the Ethereum blockchain.

Official website: https://gem-trx.com/share?code=337336

Official Facebook page: https://www.facebook.com/gemtrxofficial

Official Twitter: https://twitter.com/MINEGEMTRX

Telegram Group: https://t.me/GemTrx

Whitepaper: https://tron.network/static/doc/white_paper_v_2_0.pdf

https://t.me/gemtrxofficial

https://t.me/gemtrxofficial2


Analytics Insight is an influential platform dedicated to insights, trends, and opinions from the world of data-driven technologies. It monitors developments, recognition, and achievements made by Artificial Intelligence, Big Data and Analytics companies across the globe.


Intel Launches Energy-Efficient Bitcoin Mining Chip: Here’s What You Need To Know – Benzinga

The first shipments of Intel Corporation's (INTC) new energy-efficient Bitcoin (BTC/USD) mining microchip are set to begin in the third quarter.

What Happened: In an announcement on Monday, Intel unveiled details about its newly launched mining chip, the Blockscale ASIC.

The mining chip will reportedly have a hashrate of 580 Giga Hashes per second (GH/s) with an energy efficiency of 26 joules per Tera Hash (J/TH).

A hashrate of 580 GH/s equates to 0.58 TH/s per chip. This is considerably lower than machines from existing ASIC manufacturers, which offer 112 TH/s.

The infrastructure is designed to support a maximum of 256 integrated circuits per chain, meaning that the chips can be merged into a single mining unit.

A report from Cointelegraph estimated that a mining unit with 256 Intel Blockscale ASICs would consume between 1,228 and 5,811 watts of power, with a total hash rate of 148.5 TH/s and roughly the same energy efficiency as leading Bitcoin mining machines on the market.
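The unit-level estimate follows directly from the per-chip numbers. A quick back-of-the-envelope check in Python, using only the hashrate, efficiency, and chip count stated in the article:

```python
# Sanity check of the Cointelegraph estimate using the article's figures.
GH_PER_CHIP = 580        # per-chip hashrate, GH/s
J_PER_TH = 26            # stated energy efficiency, joules per terahash
CHIPS_PER_UNIT = 256     # maximum integrated circuits per chain

unit_th_s = CHIPS_PER_UNIT * GH_PER_CHIP / 1000   # convert GH/s to TH/s
unit_watts = unit_th_s * J_PER_TH                 # (J/TH) * (TH/s) = J/s = watts

print(f"{unit_th_s:.2f} TH/s, ~{unit_watts:.0f} W at rated efficiency")
```

This yields 148.48 TH/s, matching the reported ~148.5 TH/s, and roughly 3,860 W at the rated efficiency, which sits inside the quoted 1,228-5,811 W range (the spread presumably reflects operating modes that this simple product does not capture).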

Argo Blockchain ADR (ARBK), Block Inc. (SQ), Hive Blockchain Technologies and GRIID Infrastructure will be the first customers to receive their Blockscale ASIC shipments in the third quarter of 2022.

"The Intel Blockscale ASIC is going to play a major role in helping bitcoin mining companies achieve both sustainability and hash rate scaling objectives in the years ahead," said Jose Rios, general manager of Blockchain and Business Solutions in the Accelerated Computing Systems and Graphics Group at Intel.

Price Action: According to data from Benzinga Pro, Intel shares closed 2.27% higher on Monday at $49.20.


State Grid Hangzhou Power Supply Company: Communities Have Equipped Eyes of the Electric Data, Cloud Services are Guarding the Elderly Living Alone -…

HANGZHOU, China--(BUSINESS WIRE)--In recent years, State Grid Hangzhou Power Supply Company has been committed to applying electric power big data and proposes building a win-win energy internet ecosystem across society. Since 2019, it has used electric power data analysis technology to provide a "smart nanny" for the elderly living alone, and the innovative service has since been spreading across China.

Population aging has become a common concern in both developed and developing countries, and traditional family-based elder care is facing challenges. A study shows Hangzhou is one of the cities with the highest levels of aging in China. Since 2019, State Grid Hangzhou Power Supply Company has been empowering social elder care services through digital reform, launching the action "the Elderly Living Alone Should Be Loved". The service uses existing smart meters and acquisition devices to build analytical models with big data mining, machine learning and deep learning techniques, acting as a "smart nanny" for the elderly living alone. It now benefits over 2,000 homes of elderly people living alone in Hangzhou and has spread to other provinces in China.

"Nowadays, there is a smart meter in every household, and the data has high volume, accuracy, and real-time characteristics," said Ma Di, an expert in electric metering and acquisition at State Grid Hangzhou Power Supply Company and one of the founders of the action "the Elderly Living Alone Should Be Loved". "During community visits, we found that as the value of data becomes more and more important, smart meters are very effective for dynamic monitoring of abnormal fluctuations in the electricity consumption of the elderly, and this work has been recognized by community staff. Therefore, we took the lead in building the power cloud computing service module for the elderly living alone. Especially during the epidemic prevention and control period, it can act as a smart nanny for the elderly."

"In the long run, the integrated analysis and panoramic application of massive energy data will build a new industrial ecology, optimize services for society and people's livelihoods, and support smart city management." Looking to that prospect, Ma Di is full of confidence.


How standardisation of business resilience and interoperability benefit mines – Mining Technology

Standardisation provides greater consistency in business resilience and decision-making for mining companies operating across multiple regions, enabling more efficient coordination of activities.

However, standardising a resilience program in the mining industry is fraught with challenges. These may include assets spread far apart across the globe and a workforce with varying approaches and cultures.

Furthermore, operations at other locations may use different communications and data exchange platforms. Then there is the potential language barrier. Ultimately, this all contributes to a lack of consistency and greater complexity throughout operations, leading to a fragmented approach to incident management and inefficiencies. And during a crisis, any inconsistencies or inaccuracies will be exposed.

Yet if everyone has access to the same operational data from a shared resilience platform, it increases clarity. Any trends can be assessed through organisation-wide data analysis, identifying when there is a need for an intervention. Centralisation enables targeted interventions and continuous improvements, maximising investment and minimising the cost of incidents to an organisation.

Achieving standardisation can depend on the leadership style and business culture within an organisation. A hierarchical organisation that imposes commands from the top will likely be worse off than one that fosters collaboration and takes constructive feedback on board to drive continuous improvement, which connects back to resilience.

"An important part of operational resilience is continuous improvement. So, the standardisation, and then a continuous improvement culture post-exercise and post-event, where they're doing the reviews, is all allowing them to improve the system each time," explains Jarrod Wilson, CEO of Dynamiq.

"If they have standardisation, it allows that to spread across all the operations. Rather than just that lesson being learned in a localised environment, they can improve the whole system."

Dynamiq's EMQnet platform is a centralised digital resilience platform that gives everyone within an organisation access to a consistent set of operational data. Users can log in on any device anywhere in the world, further increasing accessibility. The platform facilitates post-event reviews and identifies areas of risk for further improvement.

"EMQnet allows everyone to come together with a common operating picture. Even if they are operating differently, they can still at least agree on what the facts are," adds Wilson.

In an interconnected world, leveraging technology for interoperability has become even more important in improving the speed and accuracy of communications.

A standardised business resilience platform allows data to be targeted to the people who need it, without sending out alerts to everyone for everything happening across all operations. Personnel receive the exact information they need to do their jobs without being distracted by multiple notifications.

"You don't want to send a deluge of information to people; you just want to keep it specific, and to the point of exactly what they need," adds Wilson.

Often with mining businesses, the corporate office may be in a completely different country from the impacted operations. This can lead to gaps in understanding of national regulatory requirements. Having common operating terminology and reliable communication channels removes any potential ambiguity.

A shared digital resilience platform enhances communication channels. EMQnet enables the secure sharing of vital documentation with approved and trusted third parties and official bodies. For example, if regulators need to approve a re-entry plan for a mine, they can view this on EMQnet. This increases the speed of approval processes, minimises downtime, and allows operations to resume as quickly as possible.

In addition, with global operations, the language barrier can obstruct operations. There can be confusion between management and the workforce due to the lack of clarity in communications.

"That's a significant barrier in some organisations. We have clients that may be headquartered in the US and have several mines in remote areas of South America where the dominant language is Spanish. So, having that ability to translate accurately is a substantial benefit," says Lucas Saunders, Head of Advisory at Dynamiq.

EMQnet allows communication in different languages, removing barriers and increasing operational efficiency.

Standardisation in business resilience means organisations can operate a leaner model during an incident, thanks to flexibility and agility in responding to threats. Resources can be leveraged both internally and externally according to what is happening on the ground. This avoids overstaffing or understaffing at particular sites. Operators can have the best people for the job, regardless of location, without compromising operational confidentiality.

"You can also bring in external subject matter experts and limit what they can see within the platform," adds Saunders.

"For example, if you need a geotech expert for a high wall failure, or there is a dam failure and they need a dam engineer, they can bring those experts into their platform and give them access to a workspace only."

Operators can limit the information external parties see through access control. They can also attach documents, plans or inundation mapping, bringing these into the workspace feature. This gives the experts the relevant information to make key decisions at the appropriate time.

Regular training is also essential for an effective business resilience platform. Through routine crisis management training, personnel become familiar with the platform, enabling them to understand what to do during a serious event. Furthermore, a centrally managed capability avoids exposing inexperienced staff to duties and locations they are ill-equipped for.

EMQnet allows the management of risks and is fully configurable around existing management systems. The resilience platform supports a just-in-time resource model for both training and real response. "The simpler you can keep the approach, the simpler you can keep their processes and procedures when they're working under significant pressure (you're talking about people's lives, you're talking about their organisations), the more effective the resilience program will be," adds Saunders.


April 2022: Extramural Papers of the Month – Environmental Factor Newsletter

Extramural | By Megan Avakian

Nearly all children have nicotine on their hands, even those living in smoke-free environments, according to NIEHS-funded research. The children were exposed via thirdhand smoke (THS), or the residue that lingers on surfaces and in dust where tobacco was used.

The study included 504 children under the age of 12 from the Cincinnati area. The researchers analyzed child handwipe samples for nicotine, a marker of THS exposure. A child was considered protected from exposure if no household member smoked or vaped, smoking and vaping were banned in homes and cars, and there was no contact with tobacco users within the previous week.

Nicotine was detected on 98% of children living in exposed conditions and, surprisingly, on 95% of children in protected spaces. On average, exposures of about 3 nanograms per wipe were observed among children from protected households and 22 nanograms per wipe among children in exposed households.

Among children considered to be protected, children of Black parents had higher exposure than those with white or multiracial parents. Children from the lowest income families had about five-fold higher nicotine exposure compared to children from families with incomes greater than $30,000. The association between income and exposure points to a potential role of income-related disparities, such as housing type and quality.

According to the authors, results suggest that decades of permissive smoking policies have created significant THS reservoirs in many indoor environments. Smoking bans, exposure screening, and THS remediation are needed to help protect children.

Citation: Matt GE, Merianos AL, Quintana PJE, Hoh E, Dodder NG, Mahabee-Gittens EM. 2022. Prevalence and income-related disparities in thirdhand smoke exposure to children. JAMA Netw Open 5(2):e2147184.

NIEHS-funded researchers showed that pine needles can be used as a tool to monitor the presence and distribution of per- and polyfluoroalkyl substances (PFAS) over time. The pine needles' waxy coating traps PFAS and other airborne pollutants, providing a record of contamination.

The study included 60 pine needle samples from six North Carolina counties. For historical comparison, the researchers examined 15 archived samples, dating back to the 1960s, collected from the same counties. The team analyzed each sample using a non-targeted, multidimensional approach that allowed them to distinguish between PFAS based on molecular structure.

More than 70 different PFAS were identified in the pine needles. The types of PFAS detected in samples correlated with known changes in PFAS use over time. For example, samples from the past three decades had an increasing number of newer PFAS, such as GenX, compared with samples collected before the newer substances had emerged. The pine needles, taken at varying distances from contamination sources such as airports, firefighter training sites, and chemical plants, revealed where specific PFAS were being used. For example, samples collected near a contamination source had extremely elevated levels of a type of PFAS commonly used in firefighting foams compared with samples collected further away.

According to the researchers, study results showed that using pine needles in combination with non-targeted multidimensional analyses is a viable method for monitoring the distribution of diverse PFAS.

Citation: Kirkwood KI, Fleming J, Nguyen H, Reif DM, Baker ES, Belcher SM. 2022. Utilizing pine needles to temporally and spatially profile per- and polyfluoroalkyl substances (PFAS). Environ Sci Technol 56(6):3441-3451.

Long-term exposure to polychlorinated biphenyl (PCB) mixtures in school air may affect the nervous and immune systems, according to an NIEHS-funded study in rats. Although PCBs were banned in the U.S. in the late 1970s, air in older schools may be contaminated with PCBs released from building materials. This study contributes to knowledge of the health effects of long-term PCB inhalation, which are poorly understood.

For 13 weeks, the researchers exposed female rats to a PCB mixture and dose representative of air in a Chicago school built in 1968. They used a combination of transcriptomics, metabolomics, and neurobehavioral tests to examine the effects of PCB exposure on the nervous, immune, reproductive, and endocrine systems.

PCBs were detected in the brain, liver, lung, serum, and adipose tissue. PCB exposure impaired memory, induced anxiety-like behavior, substantially reduced white blood cell counts, and disrupted plasma metabolite composition. Exposed rats had altered expression of genes in the brain that are important in neurotransmitter signaling, cognitive function, vascular function, and immune response.

Although the exposure level used in this study was higher than what is found in most new schools, it is similar to concentrations reported for some older schools. According to the authors, results indicate that this exposure level (45.5 micrograms per cubic meter) may be close to the lowest dose at which airborne PCB exposure induces adverse health effects.

Citation: Wang H, Adamcakova-Dodd A, Lehmler HJ, Hornbuckle KC, Thorne PS. 2022. Toxicity assessment of 91-day repeated inhalation exposure to an indoor school air mixture of PCBs. Environ Sci Technol 56(3):1780-1790.

NIEHS-funded researchers used a data mining approach to identify a diverse set of chemicals that may contribute to disparities in preterm birth among different populations. In the U.S., preterm birth occurs disproportionately in the Black population. Black women also face disproportionate exposure to chemicals in personal care products and from racial disparities in siting of polluting facilities.

The study included 19 chemicals observed at higher levels in the blood or urine of Black women compared with white women. The researchers obtained chemical-gene interactions from the Comparative Toxicogenomics Database and a list of genes involved in preterm birth from the Preterm Birth Database. They examined chemicals for enrichment with preterm birth genes and identified biological pathways affected by these genes.

All 19 chemicals were associated with enriched expression of genes involved in preterm birth. Exposure levels for several chemicals were at least 1.5-fold higher in Black women compared with white women. These chemicals, which included methyl mercury, methylparaben, propylparaben, diethyl phthalate, DDE, and bisphenol S, also had a higher degree of enrichment with preterm birth genes. The chemicals affected genes involved in pathways relevant to preterm birth, such as inflammation, aging, and estradiol response. Most chemicals impacted genes involved in all three pathways.

Study results suggest that exposure to a diverse array of chemicals contributes to racial disparities in preterm birth and that multiple chemicals drive these effects. According to the authors, results may help to prioritize chemicals for further study and exposure reduction in the fight against preterm birth disparities.

Citation: Harris SM, Colacino J, Buxton M, Croxton L, Nguyen V, Loch-Caruso R, Bakulski KM. 2022. A data mining approach reveals chemicals detected at higher levels in non-Hispanic Black women target preterm birth genes and pathways. Reprod Sci; doi:10.1007/s43032-022-00870-w [Online 2 Feb 2022].

(Megan Avakian is a science communication specialist for MDB Inc., a contractor for the NIEHS Division of Extramural Research and Training.)


Does this look like the face of a bully? | Editorial – NJ.com

Local officials in Irvington Township said they were being bullied and annoyed by an 82-year-old retired schoolteacher, so they went to court to make her stop.

Elouise McDaniel said her only recent info request from the town clerk was about the Rent Leveling Board: who sits on it, what they make, when they meet, etc. It's her right to ask for such records under the Open Public Records Act.

The town responded by filing a lawsuit that said the woman's sole purpose was to harass, abuse, and harm people who worked there, including the mayor. The suit charged (J'accuse!) that she had committed harassment and defamation and malicious abuse of process.

Actually, Ms. McDaniel made exactly 75 OPRA requests in three years. That comes out to one every two weeks. That doesn't seem excessive. And it certainly doesn't sound malicious, unless the clerk has a torturous papercut that prevents him from opening cabinet drawers or tapping the print button on his keyboard.

Fortunately, after two days of bad press and a pile of national shame that comes with this kind of officious resistance, Irvington dropped its suit late Thursday.

But it might help to review the OPRA ground rules again for other municipalities and public entities that might get nettled when grandma comes around asking for a few documents during Sunshine Week.

Rule 1: These records belong to the people. "They are public records, and the clerk is the statutory custodian of our records, so this isn't extra work," said attorney CJ Griffin, who represented McDaniel. "It's a core duty of the clerk that's just as important as any other duty."

Yes, on occasion a requestor asks for too much info. It's usually a commercial entity that is data mining, such as a pet food company looking for the addresses of a town's dog owners for a mail campaign. It is not usually a private citizen.

But even the nuisance requestor probably knows the rules. Reporters know them better than anyone.


Under our OPRA law, anyone can request documents (budgets, labor contracts, salary info, etc.), and the custodian must provide that data within seven days or request more time to comply. If the delay is unreasonable, the record-keeper can be sued and be liable for the plaintiff's legal expenses.

Other rules are commonly known. If a request involves an extraordinary expenditure of time and effort to accommodate it, the agency may apply a reasonable service charge. If a request would substantially disrupt agency operations, the custodian may deny access after attempting to reach a reasonable solution.

But a government office should not file a lawsuit against a citizen looking for records. In fact, more than 20 states have laws to prevent that, though New Jersey isn't one of them.

"Volume is not an OPRA problem, OK? An OPRA problem is when information is being misused or abused, and I wasn't seeing that at all in this case," said attorney Walter Luers, the president of the NJ Foundation For Open Government. "If the volume is too much, you just ask for extensions, reasonable accommodations, or request special service charges. But you don't sue them to make them stop."

Luers knows the folly of such suits: He represented Elie Jones, who made 300 OPRA requests in just two months from the borough of Teaneck. The borough sued in 2017, but the judge sided with Jones, and ruled that Teaneck had to pay his $20,000 in legal fees.

The message was clear: While some requests are excessive, the court must not drop a blanket ban on gadflies, and requests should be analyzed on a case-by-case basis.

Thankfully, the courts have been strong in upholding OPRA in recent years. Unlike other branches of government, they know that it is a cornerstone to our democracy that cannot be weakened.


Read this article:

Does this look like the face of a bully? | Editorial - NJ.com

Read More..

What Is Web3 and How Will It Change Your Digital Life – Make Tech Easier

The Internet has gone through massive shifts since the 90s, as certain innovations were adopted by enough of its users and developers that they eventually became standards. Sites like YouTube, Facebook, Twitter and Reddit all represent what we now call Web 2.0.

The definitions can get a little fuzzy, but looking at the broad strokes helps us understand what kicked off the transition from Web 1.0 to the next iteration. This will also help us understand how the emergence of Web3 is likely to play out.

Also read:How to Delete Your Personal Data from the Internet

When the Internet was just starting to enter homes around the world, servers and bandwidth were expensive commodities. Getting a site up and running that could handle large amounts of traffic required a large upfront investment. One way to mitigate this was to minimize the amount of assets you displayed to the visitor. That's why sites from the 90s have a reputation for being quirky and aesthetically unremarkable. There are still relics of this era around today.

Around the early to mid 2000s, the market for bandwidth and storage started loosening a bit. Startups that came with the Dot-com Bubble and survived its devastating collapse kept adopting new ideas for how visitors could interact with their sites, transforming those visitors into creators. This is how sites like YouTube and MySpace got their starts. The latter eventually collapsed, but the idea was picked up by Facebook. Behold, the Web 2.0 era.

Web 2.0 was defined by two crucial things: cheaper infrastructure, and users who produce content rather than merely consume it.

Whereas the majority of people were consumers in Web 1.0, the next iteration saw websites that encouraged people to produce their own content and share it with the world.

As the 2010s approached, the websites we use to consume our media and socialize today began to take off and usher in this new era of the Internet.

Also read:How to Securely Send Sensitive Information Over the Internet

One of the biggest problems with the Web 2.0 model is that it allowed for a significant consolidation of the Internet's infrastructure. Facebook, YouTube, and Google became quasi-monopolies, controlling a huge chunk of all Internet traffic. Because of this, both developers and their users have been put in an awkward position, as the former engage in activities that have led to widespread accusations of censorship.

Since about 2015 (though it's very difficult to pinpoint an exact year), some people have been thinking of decentralizing services on the Internet to solve this issue.

Put simply, the Web3 principle is focused entirely on using something known as a blockchain to decentralize certain aspects of the Web.

Also read:Cryptocurrency vs. Blockchain Whats the Difference?

We have already written a detailed explanation about it, but in short, blockchains are just like databases, except that you can only use them to store and record, not delete. They're generally immutable (you cannot delete something once it's created) and redundant (a large number of machines distributed around the world voluntarily hold the contents).

Digital currencies like Bitcoin use blockchains because they provide a perfect platform with which to make an immutable ledger that cannot be seized. (You're going to have a tough time seizing thousands or millions of personal machines around the world.)
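The core trick behind that immutability is that each block stores the cryptographic hash of the block before it, so altering any old record breaks every later link. A minimal sketch in Python (the `Ledger` class and its methods are illustrative names, not any real blockchain's API):

```python
import hashlib
import json

class Ledger:
    """A toy append-only, hash-chained ledger."""

    def __init__(self):
        # The first ("genesis") block anchors the chain.
        self.blocks = [{"index": 0, "data": "genesis", "prev_hash": "0" * 64}]

    @staticmethod
    def block_hash(block):
        # Hash the block's canonical JSON form so the digest is reproducible.
        raw = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(raw).hexdigest()

    def append(self, data):
        # Records can only be added, never edited or deleted.
        prev = self.blocks[-1]
        self.blocks.append({
            "index": prev["index"] + 1,
            "data": data,
            "prev_hash": self.block_hash(prev),
        })

    def is_valid(self):
        # Tampering with any earlier block breaks every later hash link.
        return all(
            block["prev_hash"] == self.block_hash(self.blocks[i])
            for i, block in enumerate(self.blocks[1:])
        )
```

After `ledger.append("alice pays bob 5")`, editing that block's `data` in place makes `ledger.is_valid()` return `False`, because the next block's stored `prev_hash` no longer matches. A real blockchain adds consensus across many machines on top of this, which is what makes the history practically unseizable.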

Also read:The Differences Between Permissioned and Permissionless Blockchains

Blockchains usually come in two flavors: permissioned and permissionless.

Because blockchains are capable of being decentralized and hosted on numerous systems at the same time, they're also incredibly resilient. At this moment, the technology is getting a bad reputation because of all the scams in the cryptocurrency and digital token world. However, as more mature implementations appear and it stops being a wild west, we'll likely see this become an integral part of Internet services, in the same way Facebook and Google became integral parts of Web 2.0.

We're already seeing examples of blockchain implementations growing into decently healthy projects like Odysee and DTube. Both are video-sharing sites that take different approaches to how they use their blockchains. While Odysee hosts everything entirely on the blockchain platform, DTube uses the chain to pull videos from other sites, storing only the comments associated with those videos.

The former has a hybrid permissioned blockchain, while the latter uses a fully permissionless implementation.

At this moment, Odysee pulls in millions of viewers from all over the world, demonstrating that this model is actually viable for the future of the Web.

You may think that a decentralized Web for platforms is a silly pipe dream, but the concept of decentralization is actually rather old. In fact, things like BitTorrent (a decentralized file-sharing protocol that makes use of trackers and other discovery layers) have been around since 2001!

The point is that decentralization was wildly successful with file sharing in the past and theres no reason blockchains cant help reinforce this in other areas, like social media and search engines.

Before we hop on the hype train, it's important to reassess exactly what we're getting into with this shift in the Internet's arterial structure:

Pros: censorship resistance, since no single party can remove content from the chain, and resilience, since the data is replicated across many machines with no single point of failure.

Cons: immutability cuts both ways, since harmful content cannot be deleted either, and the space is still riddled with scams.

It's very clear that the only real challenge Web 3.0 providers will have to overcome is the issue of immutability. Yes, censorship resistance is great and all, but what about when it's dealing with something truly criminal or vile in nature? This is where blockchain technology becomes a double-edged sword, and so far the discussion around it has been too small. It's probably time we start to examine how we are going to be able to operate in this new paradigm.

Also read:How to Earn Cryptocurrency by Browsing the Web

For people who make content, nothing will really change on the surface. But since blockchains are immutable, nothing can truly be censored. Although websites can still ignore a particular block containing your content, there's nothing stopping someone else from making a site that doesn't do that.

There's no need to reinvent the wheel. This new hypothetical site can use the exact same ledger and display it completely uncensored if it wants to.

Web3 will not necessarily bring an end to censorship, but it will make censorship uncompetitive, because so little investment is required to stand up an uncensored version of a blockchain that already contains all the material needed to populate its content.

The question surrounding data harvesting/mining on the Internet can't be answered by technology. It is in fact the infiltration of new tech into our lives that created this issue in the first place.

In theory, the same blockchain created by someone who collects data on its front end can be used to create another front end that doesn't. It's possible that the very existence of a public blockchain would put competitive pressure on sites to stop data mining.

The painful reality of the situation is that the only real pressure that can conceivably change anything has to come from the users themselves through their refusal to share data. Consent laws like GDPR and newer standards adopted by websites to let visitors manage how their data is collected have made some progress in mending this issue, but ultimately, the only real solution is to educate people until they become more conscious about how they use the Web.

The hardest part of Web 3.0 is the development effort required to build the backbone itself. However, there are many open-source blockchains people can simply fork to make derivatives. A huge number of blockchain projects available today practically copy/paste the code from other projects. The uniqueness is in what data they store.

In the end, with the sheer amount of effort being put into making excellent open-source implementations of blockchain technology, it isn't inconceivable for Web 3.0 projects that don't focus on cryptocurrencies (still the flavor of the year at the time of writing) to pop up everywhere like daisies at some point.

Image credit: Pete Linforth Pixabay


See the original post:

What Is Web3 and How Will It Change Your Digital Life - Make Tech Easier

Read More..

Cybersecurity hiring levels in the mining industry rose to a year-high in February 2022 – Mining Technology

The proportion of mining industry operations and technologies companies hiring for cybersecurity-related positions rose significantly in February 2022 compared with the equivalent month last year, with 39.1% of the companies included in our analysis recruiting for at least one such position.

This latest figure was higher than the 26.8% of companies that were hiring for cybersecurity-related jobs a year earlier, and an increase on the 35.7% recorded in January 2022.

When it came to the share of all job openings linked to cybersecurity, related postings also rose in February 2022, with 2.5% of newly posted job advertisements being linked to the topic. This was an increase on the 2.1% of newly advertised jobs linked to cybersecurity in the equivalent month a year earlier.

Cybersecurity is one of the topics that GlobalData, from which our data for this article is taken, has identified as being a key disruptive force facing companies in the coming years. Companies that excel and invest in these areas now are thought to be better prepared for the future business landscape and better equipped to survive unforeseen challenges.

Our analysis of the data shows that mining industry operations and technologies companies are currently hiring for cybersecurity jobs at a rate lower than the average for all companies within GlobalData's job analytics database. The average among all companies stood at 3.1% in February 2022.

GlobalData's job analytics database tracks the daily hiring patterns of thousands of companies across the world, drawing in jobs as they're posted and tagging them with additional layers of data on everything from the seniority of each position to whether a job is linked to wider industry trends.


Excerpt from:

Cybersecurity hiring levels in the mining industry rose to a year-high in February 2022 - Mining Technology

Read More..

Jordan Peterson breaks down in tears describing Antifa: They’re ‘worse than animals’ – TheBlaze

Popular psychologist and author Dr. Jordan Peterson recently got emotional while discussing the psychology of Antifa and other violent leftist rioters, calling them "worse than animals."

The poignant remarks came during a recent episode of Peterson's podcast, during which the former University of Toronto professor interviewed independent journalist Andy Ngo about his up close and personal experiences with the mob.

As part of the discussion, Ngo recalled a time when a mob of rioters in Minnesota kicked a man's teeth in gleefully. To his confusion, he said, the violent act elicited enthusiasm from the crowd.

Later, Ngo asked Peterson his perspective on the motivations behind rioters' actions.

"I wanted to ask you, based on your knowledge, your background, your clinical experience: what is the psychology of this mob violence? When I see it, I don't even recognize some of these... it seems animalistic, is what I mean."

"They're worse than animals," Peterson replied. "They're worse than animals because animals, they just kill to eat, you know. Human beings, they have a twist in them that makes them far worse than animals once they really get going."

(Video: "ANTIFA: The Rise of the Violent Left | Andy Ngo & Jordan Peterson," via youtu.be)

"You really want to know what I think?" Peterson continued. "I think it's revenge against God for the crime of being. That's really what I think. It's Cain in [the story of] Cain and Abel.

"Like, 'Oh, Abel's your guy? What if I take him out into a field and beat him to death. How do you feel about that?'" he said.

"All my sacrifices went unrewarded. Yeah," a teary-eyed Peterson added, his voice starting to crack. "Yeah, that's what it is at the bottom of the hell of things."

Moments earlier, Peterson pushed back against the idea that rioters celebrate violent actions because of some inner, though twisted, sense of justice such as "a fascist or racist [getting] the violence against them they deserve," as suggested by Ngo.

"I don't believe that, because I don't think they're that good," Peterson said.

"I think they're celebrating watching some poor son of a b***h get hurt, and that satisfies something unbelievably dark in their souls: the desire to burn, the desire to burn down buildings, the desire to melt cars, the desire for the whole goddamn thing to go up in flames, because they're resentful and bitter," he explained.

(H/T: Mediaite)

Read more:
Jordan Peterson breaks down in tears describing Antifa: They're 'worse than animals' - TheBlaze

Read More..