
If banks are the problem, cryptocurrency probably isn’t the answer – Stuff

Jenny Nicholls is a Waiheke-based writer, specialising in science commentary.

OPINION: I have a good friend, a cyber-security expert, who mines cryptocurrency in his laundry and his mother's basement. This kind of mining doesn't use picks and shovels, but towers of stacked computers linked with fat cables, which look like licorice straps. The air feels warm, hot even, just like inside a real mine.

In winter Patrick's mining rigs generate so much heat that they keep two floors of his mother's four-bedroom house warm; his electricity bill at the address is at least $400 a month.

So far, Patrick says, even though some cryptocurrencies have soared (and plummeted) in value, his hobby hasn't made him rich; he has ended up with a lot of "shit-coins".

"If I'd mined Ethereum and banked it I'd probably be a few hundred thousand bucks up. But I traded it for shit-coins, hoping those would 'moon' (go up in value), when I should have kept the Ethereum."


I'll pause here to acknowledge the boyish tang of crypto jargon, in phrases like "crypto bros", "cryptojacking", "lambos" (Lamborghinis), "bags" and "bag holders", "ATH" and "altcoins", "DeFi" and "getting rekt", or preferably not getting rekt, as this is exactly what it sounds like, just spelled differently.

Patrick mines Ethereum because he loves the mathematical workings of the blockchain, a digital ledger linked together using cryptography, a technique used in cybersecurity "in the presence of adversarial behaviour", as Wikipedia puts it.
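To make "linked together using cryptography" concrete, here is a minimal illustrative sketch in Python (a toy, not Ethereum's actual data structures): each block records the hash of its predecessor, so altering any past entry breaks every later link.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents deterministically with SHA-256.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Toy chain with invented transactions: each block points at its predecessor.
genesis = {"index": 0, "prev_hash": "0" * 64, "txs": ["alice pays bob 10"]}
block1 = {"index": 1, "prev_hash": block_hash(genesis), "txs": ["bob pays carol 4"]}

def verify(chain: list) -> bool:
    # Recompute every link; any tampering with an earlier block breaks it.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(verify([genesis, block1]))   # True
genesis["txs"][0] = "alice pays mallory 10"
print(verify([genesis, block1]))   # False: the edit is detected
```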

He also likes the way crypto's DeFi, or decentralised finance system, is managed by a network of users rather than a central entity like the head office of a bank. Patrick thinks banks are bloated and greedy (hard to argue with) and he sees DeFi as a way to fairly redistribute the wealth on such full display in bankers' corporate car parks.

(Photo: Unsplash) So far, even though some cryptocurrencies have soared (and plummeted) in value, Patrick's hobby hasn't made him rich, Jenny Nicholls writes.

He admits, though, that DeFi is still in its infancy, and thinks cryptocurrency needs more regulation to protect users and guard against criminal activity.

Banks may have been bringing creditors and debtors together since the Banco dei Medici improved ledgers in the 15th century, but Patrick thinks it is time to move on. Many others share this view, and not all of them are crypto bros.

In 2016, the British journalist John Lanchester wrote a long think piece in the London Review of Books called "When Bitcoin Grows Up". "The simplest and biggest possibilities [for radical change to financial systems]," wrote Lanchester, "concern connectivity. We are more connected in more ways to more people than we ever have been at any point in human history. This is changing everything, and it would be deeply strange if it didn't change money too."

He points to the billions of people in the developing world who own a phone, but have no bank account.

(Photo: Ina Fassbender | AFP | Getty Images) "If I'd mined Ethereum and banked it I'd probably be a few hundred thousand bucks up," says Jenny Nicholls' friend Patrick.

"If your phone can give you access to the things you would need from a bank," says Lanchester, "well, you've just disinvented the need for banks, and fundamentally changed the operation of the money system, across whole swathes of the world."

Cryptocurrency, though, is probably not the answer.

M-Pesa, a Kenyan mobile phone-based money-transfer service, was created at about the same time as Bitcoin. Unlike Bitcoin, M-Pesa is now popular across much of Africa, handling the kind of mass daily transactions crypto-as-currency maximalists can only dream about.

Sceptics often use M-Pesa to demonstrate crypto's shortcomings: Bitcoin's decentralised blockchain, they say, makes mass transactions slow and horribly inefficient, and its volatility means you never know how much it will be worth tomorrow.

(Photo: Supplied) Jenny Nicholls: "But even using renewables for crypto mining sucks power away from electric cars and houses and hospitals. Is it all worth it?"

Nicholas Weaver, a US cryptocurrency expert who has become one of its most vocal critics, put it this way: "M-Pesa is huge. Because it just basically attaches a balance to your phone account. And you can text to somebody else to transfer money that way. And so even with the most basic dumb phone you have easy-to-use electronic money. And this has taken over multiple countries and become a huge primary payment system. [Whereas] the cryptocurrency doesn't work."
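The core of what Weaver describes can be sketched in a few lines; the phone numbers and balances below are invented, and real M-Pesa is of course far more elaborate. The point is that a single central ledger just debits one account and credits another:

```python
# Invented accounts keyed by phone number; units are arbitrary.
balances = {"+254700000001": 1500, "+254700000002": 200}

def transfer(sender: str, receiver: str, amount: int) -> bool:
    # Debit one phone account and credit another; reject overdrafts.
    if balances.get(sender, 0) < amount:
        return False
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount
    return True

transfer("+254700000001", "+254700000002", 500)
print(balances)  # {'+254700000001': 1000, '+254700000002': 700}
```

No mining, no consensus, no volatility: the operator simply keeps the books.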

Ironically, Patrick, no slouch in the brains department, has designed an app that does much the same thing as M-Pesa. New Zealand banks were not interested.

Forbes magazine recently published a comparison of digital transaction rates. Visa, for instance, can handle around 1700 transactions per second (TPS), compared with Bitcoin's 4 TPS. And Visa uses much less energy.

"Bitcoin consumes electricity," noted Forbes, "at an annual rate exceeding the entire annual electricity consumption of Norway." In fact, Bitcoin uses 707 kilowatt-hours (kWh) of electricity per transaction, which is 11 times that of Ethereum.
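A quick back-of-the-envelope check, using only the figures quoted above:

```python
btc_kwh_per_tx = 707                      # Forbes figure quoted above
eth_kwh_per_tx = btc_kwh_per_tx / 11      # "11 times that of Ethereum"
visa_tps, btc_tps = 1700, 4               # throughput figures quoted above

print(f"Ethereum: ~{eth_kwh_per_tx:.0f} kWh per transaction")   # ~64 kWh
print(f"Visa throughput is ~{visa_tps // btc_tps}x Bitcoin's")  # ~425x
```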

For greenies like me, this makes Bitcoin untenable.

Crypto's defenders would dismiss these criticisms, and the ones about rampant money-laundering, as FUD: Fear, Uncertainty and Doubt. FUD is crypto's version of "fake news". Ethereum's planned new algorithm, they say, will drastically improve its carbon footprint.

Most of Patrick's mining rig is in Auckland, which gets its power from renewable hydro, wind and geothermal sources. But even using renewables for crypto mining sucks power away from electric cars and houses and hospitals. Is it all worth it?

Journalist Nathan Robinson concluded his interview with arch-sceptic Weaver with this summary of his argument: "There is no problem that cryptocurrency solves, and to the extent that it is functional, it does things worse than we can already do them with existing electronic payment systems. To the extent it has advantages, the advantage is doing crimes."

The title of the magazine interview was "Why This Computer Scientist Says All Cryptocurrency Should Die in a Fire".


Persystic Token (PSYS): The social cryptocurrency – AMBCrypto News

Persystic Token (PSYS) is the cryptocurrency used on the Persystic platform, a community-driven decentralized social media network that is the first of its type. Persystic was created to resolve the many issues affecting social media today, including unfair monetization policies, privacy and security concerns, and more. Persystic employs advanced security measures to ensure that user data remains safe from bad actors such as hackers.

Persystic Token (PSYS) vs Traditional Social Media

Social media platforms are integral to how we live our lives today; however, the existing market is dominated by a relatively small number of platforms whose creators and owners have become very powerful.

This has led to issues such as a lack of democracy and transparency in how the platforms are run, with a disproportionate amount of benefits going to advertisers and partners rather than the community of users.

For example, social media networks like Facebook have come under fire for a lack of fairness and transparency in their monetization policies. Creators do not own their content on these platforms; rather, they sign away their rights to the content in return for access to the platform and its community of users. In addition, creators cannot monetize their content on many of these platforms.

On the Persystic social media platform, creators have full ownership over their content. This includes the right to monetization, meaning that users can earn from the content they produce. This mechanism is also used to incentivize the creation and generation of authentic content with Persystic Token (PSYS) rewards.

Privacy has been a rising concern with traditional social media networks. Many argue that the measures in place for protecting the privacy of users are insufficient and that these platforms exploit users' data, which is then sold to and shared with third parties such as advertisers. Often users accept the mandatory terms and conditions required for accessing these platforms without fully reading or understanding the policies.

There is also a lack of "right to erasure" mechanisms, meaning that content published on these platforms is stored on company servers indefinitely and users have no course of action for removing it.

On Persystic, user privacy is of the utmost importance and the platform has been created in a way that does not steal or share user data with unwanted third parties. Additionally, users can request that their content is permanently removed from the platform in line with EU right-to-be-forgotten laws.

What is Persystic Token (PSYS) used for?

Persystic Token (PSYS) is a BEP20 cryptocurrency built on the Binance Smart Chain, serving as the utility token of the Persystic ecosystem. Its primary uses include sending and receiving cryptocurrency transactions between users, trading with users for other cryptocurrencies and fiat currencies, rewarding and incentivizing content creators, and purchasing content licenses.
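Because BEP20 tokens follow the same interface as ERC-20, a PSYS balance could in principle be read with a few lines of web3.py. The sketch below is an illustration only: the RPC endpoint is a public BSC node, and both addresses are placeholders, not the real PSYS contract or any actual holder.

```python
from web3 import Web3  # web3.py v6 API assumed

# Public BSC RPC endpoint; availability is not guaranteed.
w3 = Web3(Web3.HTTPProvider("https://bsc-dataseed.binance.org/"))

# Placeholders: substitute the real PSYS contract and a holder address.
TOKEN = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")
HOLDER = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")

# Minimal BEP20/ERC-20 ABI: just the two read-only calls used here.
ABI = [
    {"name": "balanceOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "decimals", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint8"}]},
]

token = w3.eth.contract(address=TOKEN, abi=ABI)
raw = token.functions.balanceOf(HOLDER).call()
print(raw / 10 ** token.functions.decimals().call())
```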

In total, 3 billion Persystic Token (PSYS) will be minted.

Token distribution model:

Disclaimer: This is a paid post and should not be treated as news/advice.


RIT Certified and Foundry collaborate on cryptocurrency course | RIT – Rochester Institute of Technology

Underserved students from the city of Rochester with a strong interest in cryptocurrency and blockchain technology recently participated in an immersive, weeklong course at RIT to learn the latest about digital currency.

A mystery to many people, cryptocurrency is a type of currency that's stored exclusively in a digital format, not issued or maintained by a central authority like a government or bank. It's issued with cryptography, distributed consensus mechanisms, and economic incentive alignment, according to Jonathan S. Weissman, a senior lecturer in computing security in RIT's Golisano College of Computing and Information Sciences and teacher of the class.

"Digital currency is interesting to me because it's fun learning about how it was created and all aspects related to it," said Teresa Spivey, a Rochester, N.Y., resident who participated in Weissman's class inside Eastman Hall. "My future plans are to learn everything I can and really find my passion for what I want to do forever. I will figure that out by learning and experiencing other classes and opportunities like this one."

That's exactly the mission of RIT Certified, which launched in June and aims to provide a wide range of alternative education courses, certificate programs, and skill-based learning experiences targeting people beginning their careers, changing roles, maintaining their existing job, or advancing in the workplace. RIT Certified offered the class collaboratively with Foundry, a Rochester, N.Y.-headquartered and wholly owned subsidiary of Digital Currency Group (DCG) focused on digital asset mining and staking.

The class was the outcome of discussions between RIT Certified, Foundry, and University Advancement on how they might partner in supporting alternative pathways to training in the space of cryptocurrency and mining. The resulting Foundry Scholars program, a gift by the company, supports seven underserved students this summer and 13 more next year in weeklong, industry-focused classes on cryptocurrency and blockchain.

Both Foundry and RIT Certified are actively invested in career and technical education, not only for K-12 students but also for traditionally underserved populations, at whom the scholarships are targeted.

"We believe that employer-driven educational experiences for high school students will only improve their ability to be successful," said Dennis Di Lorenzo, chief business officer for RIT Certified. "A program like this brings students from communities with limited opportunities to a college campus, provides them with a college experience and industry exposure. It's about changing their perspective on the future of work."

Foundry Academy Executive Director Craig Ross '06 (telecommunications engineering technology) said the class fit Foundry CEO Mike Colyer's vision of Western New York becoming the center for innovation in blockchain technology. Ross heads up the company's new initiative to train and develop top technicians for the fast-growing bitcoin mining industry.

"Considering RIT's reputation of academic excellence and prominence in Western New York, Foundry and RIT are a natural partnership," Ross said. "The goal of this course was to provide an overview of bitcoin mining and cryptocurrencies to historically marginalized groups in Rochester, ultimately working to break down barriers to employment in the industry."

"We believe that the Bitcoin and cryptocurrency industry is a hotbed for innovation, just like the Internet and mobile revolutions," he added. "It is Foundry's goal to ensure that all interested members of our community, no matter their socioeconomic status, can capitalize on this exciting technology and be contributing members of the blockchain revolution."

Weissman said his course's objectives included students walking away with a granular understanding of cryptocurrencies and blockchains and the ability to reason about newfangled technologies.

"I wanted to provide them the familiarity with trends and notable projects in the field and industry," he noted, "along with the confidence to pursue opportunities for participation and contribution in the future."

During the weeklong class, Weissman asked his students questions such as: What problems do cryptocurrencies and decentralized applications help solve? Where do they see themselves in this industry? How are blockchain entrepreneurs disrupting industries now, and how might they do so in the future?

"Blockchains can be used in many different ways besides cryptocurrencies," Weissman said. "Blockchain is often listed as one of the leading cutting-edge technologies of the future."

Adrian Hale, director of Economic and Community Development at Foundry, said he hopes that programming like RIT's "provides the necessary groundwork for people interested in our industry to become familiar with the fundamentals that will enable them to grow into fully functional contributing members of a work team and the broader blockchain community."

About RIT Certified

RIT Certified provides alternative education-to-employment pathways, offering applied training which serves both individuals in and out of the workforce and working professionals. Committed to promoting economic mobility and sustainability for individuals from all sectors of the workforce across the region, nation, and globe, RIT Certified is a partner to employers, helping organizations develop potential, fill core and specialized skills gaps, provide outcomes-based training and development to nurture and promote talent, and improve the models by which employers evaluate and assess talent. RIT Certified will begin offering a diverse portfolio of workforce development and professional training courses and certificate programs late this fall.

About Foundry LLC

A subsidiary of DCG, Foundry LLC was created to meet the institutional demand for better capital access, efficiency, and transparency in the digital asset mining and staking industry. Headquartered in Rochester, N.Y., Foundry leverages its institutional expertise, capital, and market intelligence to empower participants within the crypto ecosystem by providing the tools they need to build tomorrow's decentralized infrastructure. For more information, go to Foundry's website.


Learn Everything About Data Science at Your Own Pace With DataCamp – Tech Times

(Photo: Christopher Gower / Unsplash)

Everywhere you go, you see data science and analytics increasingly transforming lives. After all, we are still in the information age, gathering more information than ever. However, not everyone can adequately analyze all that data to derive valuable insights.

We all recognize that data has tremendous power, and DataCamp has a solution to bring this power into our hands. Check out all the key features that make DataCamp the best place to analyze, interact with, and manipulate data in all forms.

Today, data is at the heart of every business operation. There is little doubt that there is significant demand for these positions, and the pay is attractive. However, what makes this a successful job is the ability to translate any data around the globe into meaningful insights. With this much power, you can tackle practically any problem your business has.

Many would say that while being a data scientist is rewarding, it can also be challenging. There are many online courses available, but DataCamp is one of the very few data science and analytics learning applications to provide better opportunities for learning.

DataCamp can get you started on the right track with the help of professional courses for data science and other methods to manage and control data. They provide a range of courses handpicked by industry professionals to assist you in developing your data abilities and taking the next major step in your career.

With this data science learning software, you can learn Python, Tableau, SQL, Power BI, R programming, and the other skills you'll need for constructing applications, managing data, and performing sophisticated data analysis. There is no need for prior coding knowledge at all. These courses will teach you how to handle data, write fast code, and work with complex data.

Here's the best part! You can save up to 60% on a full year of learning data science (Impact ID# 1422828). This is a limited-time offer available till August 12th.

You can also acquire and expand your expertise working with important libraries through practical activities. This should help you execute core programming jobs like website development, data analysis, and task automation.

Through DataCamp, you can be a:

Data Scientist who can manipulate data and explore machine learning.

Data Analyst with skills ranging from exploratory data analysis with dplyr to data visualization with ggplot2; Python data analyst skills, including mastery of popular libraries like NumPy; Power BI skills to prepare, model, and visualize data for the PL-300 data analyst certification; and SQL data analyst capabilities.

Statistician who can analyze and interpret corporate data, recognize patterns, and make informed judgments.

SQL Server Developer to develop, debug, and optimize SQL server queries by learning SQL.

R and Python Programmer, so you can build career-building skills with no prior experience needed.

Quantitative Analyst who can guarantee that portfolios are risk-balanced, assist in the discovery of fresh trading opportunities, and use quantitative models to analyze asset prices.

Aside from that, you can also become a Machine Learning Analyst, experimenting with various methods for creating machine learning models. Each of the abilities you will learn requires a different number of courses and amount of time. Each will begin with basic information so that you can gradually increase your knowledge over time. The benefit of DataCamp is that you may study at your own pace while still gaining hands-on experience to develop your data science abilities.

If you're new to data science and don't know where to start, you may select from various courses with DataCamp, beginning with Python.

Python

You may begin with Introduction to Python or Introduction to Data Science in Python, which are excellent starting points. You'll start with the fundamentals of Python and how it pertains to data science before progressing to crucial techniques and tools for collecting, processing, manipulating, and visualizing data. This will provide you with a good foundation for further study.
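For a flavor of what such a first exercise looks like, here is a tiny hypothetical example (invented data, not an actual DataCamp exercise) of loading, grouping, and summarizing tabular data with pandas:

```python
import pandas as pd

# Invented sales data standing in for a typical first dataset.
df = pd.DataFrame({
    "city": ["Austin", "Austin", "Boston", "Boston"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "sales": [120, 135, 90, 110],
})

# Group, aggregate, and inspect: the bread and butter of data analysis.
print(df.groupby("city")["sales"].mean())
```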

R Programming

R programming is a talent in high demand among data analysts and scientists, making it an ideal career venture. DataCamp teaches you how to program in R and utilize it for a range of positions in the data business.

SQL

Because business is becoming more data-driven, SQL abilities are in great demand. You can learn how to apply SQL to your data difficulties with a course given by real-world specialists through DataCamp's course portfolio.

Power BI

Power BI, one of the most interesting and crucial additions to the data visualization scene, may lead to new and profitable employment. Luckily, DataCamp lets you learn how to turn your data into attractive, effective graphics by taking a Power BI course.

Tableau

You can join Tableau courses to understand how to utilize this business intelligence powerhouse to convert data into dynamic, complex representations. Learn how to construct dynamic, colorful narratives from your facts and advance your presenting abilities.

Other Courses

Spreadsheet and Data Analysis courses can help you gain a much more fundamental understanding of data science. On the other hand, DataCamp provides advanced courses in Data Visualization, Machine Learning, and Data Engineering.

Data keeps growing and has been incorporated into numerous corporate fields, so we should arm ourselves with data science skills. DataCamp is a fantastic resource for upskilling our data science and programming abilities, warming us up for in-demand careers, and preparing us to become one with data.

Overall, DataCamp is unquestionably a good place to start when it comes to learning data. You can take a course to learn about each topic, practice with DataCamp Projects, create a portfolio with Workspace, apply for employment, and get certified all in one platform. There's also a ton of extra information on the blog, lessons, podcasts, and cheat sheets for additional knowledge.

Now, for the most exciting part. DataCamp offers an XP Learner Challenge, valid between August 15 and August 31 (Impact Ad ID: 1421093). Contestants will gain XP by taking R, SQL, Python, and other courses. Each day, a unique learner can earn $500 for gaining the most XP that day. The ultimate Bonus Prize of $1,000 will be awarded to the learner with the most single-day XP for the duration.

One individual learner will receive a total of $1,500, and this could be you! Take the XP Learner Challenge to gain knowledge, gain points, and win rewards!



Analytics and Data Science News for the Week of August 12; Updates from Anaconda, insightsoftware, Verta, and More – Solutions Review

The editors at Solutions Review have curated this list of the most noteworthy analytics and data science news items for the week of August 12, 2022.

Keeping tabs on all the most relevant analytics and data science news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last month in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy analytics and data science news items.

The Anaconda and Oracle Cloud Infrastructure Partnership will offer open-source Python and R tools and packages by embedding and enabling Anaconda's repository across OCI Artificial Intelligence and Machine Learning Services. Customers have access to Anaconda services directly from within OCI without a separate enterprise license.

Read on for more.

The annual Industry Excellence Awards acknowledge vendors who have achieved a leadership position in the company's 2022 Wisdom of Crowds Analytical Data Infrastructure (ADI), Business Intelligence (BI), and/or Enterprise Performance Management (EPM) Flagship Market Studies. The reports are based on data collected from end-users and provide a broad assessment of each market.

Read on for more.

insightsoftware plans to leverage the Dundas platform to enhance its existing Logi solutions, notably adding a strong extract, transform, and load (ETL) engine and pixel-perfect reporting. The Dundas solution is a flexible, end-to-end BI platform that offers software providers the ability to customize dashboards, reports, and visualizations. It was designed to operate as a one-stop shop for self-service analytics, with integration into multiple data sources.

Read on for more.

Adding BigSquare to the Litera solutions portfolio will empower law firms to make better financial decisions with fast access to financial data and insights. BigSquare's BI software retrieves, analyzes, and visualizes financial data onto a configurable financial intelligence dashboard that is easy to digest and understand. Lawyers and management teams can leverage the practical insights themselves, reducing the need to hire or rely on specialists to interpret the data.

Read on for more.

The report is based on a survey of 500 data and technology leaders, across a variety of industries, who are managing active data workloads of 150 terabytes or more. The purpose of the survey and report is to uncover key trends around how organizations are managing the shift from big-data volumes toward ingesting, storing, and analyzing hyperscale data sets, which include trillions of data records, and the expected technical requirements and business results from that shift.

Read on for more.

The Net Emotional Footprint (NEF) of each software provider is a result of aggregated emotional response ratings across the areas of service, negotiation, product impact, conflict resolution, strategy, and innovation. The NEF is a powerful indicator of overall user sentiment toward the provider and its product from the software user's point of view.

Read on for more.

The new updates include additions to Verta's native integration ecosystem and subsequent capabilities around enterprise security, privacy and access controls, model risk management, and the pursuit of responsible AI. Verta was recently recognized as a 2022 Gartner Cool Vendor in AI Core Technologies.

Read on for more.

For consideration in future analytics and data science news roundups, send your announcements to the editor: tking@solutionsreview.com.

Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.


Lucy Family Institute for Data & Society funds A&L faculty project proposals // Department of Political Science // University of Notre Dame – nd.edu

Faculty in the College of Arts and Letters are participating in interdisciplinary projects funded by the Lucy Family Institute for Data & Society (the Institute) that inspire novel research and scholarship, enhance stakeholder engagement, foster collaboration, and address wicked problems.

The Institute solicited proposals from affiliates and fellows to discover and define thematic goals of interest to a broader coalition of faculty on campus, as part of its strategic planning initiatives for the next three to five years.

Submitted proposals were within four funding tracks: Convening, Research Accelerator, Infrastructure & Services, and Partnerships. The Institute, after a substantial review process led by the members of the steering committee, awarded the following 13 projects that involve collaboration among all colleges and schools and are intended to generate translational value for societal benefit:

To learn more about these grants and other funding opportunities, visit https://lucyinstitute.nd.edu/about/funding-opportunities/.

Originally published by Alissa Doroh at lucyinstitute.nd.edu on August 08, 2022.


The World is Moving Beyond Big Data, According to Ocient Survey of 500 Data and Technology Leaders – insideBIGDATA

Ocient, a leading hyperscale data analytics solutions company serving organizations that derive value from analyzing trillions of data records in interactive time, released a report, "Beyond Big Data: The Rise of Hyperscale." The report is based on a survey of 500 data and technology leaders, across a variety of industries, who are managing active data workloads of 150 terabytes or more. The purpose of the survey and report is to uncover key trends around how organizations are managing the shift from big-data volumes toward ingesting, storing and analyzing hyperscale data sets, which include trillions of data records, and the expected technical requirements and business results from that shift.

The survey was conducted in May 2022 by Propeller Insights. Respondents include partners, owners, presidents, C-level executives, vice presidents and directors in many industries, including technology, manufacturing, financial services, retail, and government. Their organizations' annual revenue ranges from $50 million to $5 billion. Approximately 50% of respondents represent companies with annual revenue greater than $500 million.

Key findings of the survey include:

Extraordinary data growth

Data is growing at an extraordinary rate. According to John Rydning, research vice president of the IDC Global DataSphere, a measure of how much new data is created, captured, replicated, and consumed each year: "The Global DataSphere is expected to more than double in size from 2022 to 2026. The Enterprise DataSphere will grow more than twice as fast as the Consumer DataSphere over the next five years, putting even more pressure on enterprise organizations to manage and protect the world's data while creating opportunities to activate data for business and societal benefits."

IDC Global DataSphere research also documented that in 2020, 64.2 zettabytes of data was created or replicated, and forecasted that global data creation and replication will experience a compound annual growth rate (CAGR) of 23% over the 2020-2025 forecast period. At that rate, more than 180 zettabytes (that's 180 billion terabytes) will be created in 2025.
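Those two figures are consistent with each other, as a quick compounding check shows:

```python
base_zb, cagr, years = 64.2, 0.23, 5      # 2020 volume and forecast CAGR
projected = base_zb * (1 + cagr) ** years
print(f"{projected:.1f} ZB in 2025")      # ~180.7 ZB, matching the >180 ZB forecast
```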

The survey respondents reflect forecasts of such exponential data growth. When asked how fast the volume of data managed by their organization will grow over the next one to five years, 97% of respondents answered "fast" to "very fast," with 72% of C-level executives expecting the volume to grow "very fast" over the next five years.

Barriers to supporting data growth and hyperscale analytics

To support such tremendous data growth, 98% of respondents agreed it's somewhat or very important to increase the amount of data analyzed by their organizations in the next one to three years. However, respondents are experiencing barriers to harnessing the full capacity of their data and cited these top three limiting factors:

When asked about their biggest data analysis pain points today, security and risk ranked first among C-level respondents (68%), with metadata and governance (41%) and slow data ingestion (31%) being two other top concerns. When scaling data management and analysis within their organization, 63% said maintaining security and compliance as data volume and needs grow was a challenge they are currently facing.

Survey respondents also indicated legacy systems are another source of pain and a barrier to supporting data growth and hyperscale analytics. When asked if they plan to switch data warehousing solutions, more than 59% of respondents answered yes, with 46% citing a legacy system as their motivation to switch. When ranking their most important considerations in choosing a new data warehouse technology, "modernizing our IT infrastructure" was ranked number one.

Faster data analytics improve decisions, revenue and success

The survey respondents believe hyperscale data analytics is crucial to their success. Sixty-four percent of respondents indicate hyperscale data analytics provides important insights used to make better business decisions, and 62% said it is essential for planning and strategy.

The survey respondents also indicated there is a strong relationship between implementing faster data analytics and growing the company's bottom line. When asked about this relationship, an overwhelming 78% of respondents agreed there is a definite relationship. Among the C-level audience, more than 85% cited the relationship.

"Data analysis is no longer a nice-to-have for organizations. Hyperscale data intelligence has become a mission-critical component for modern enterprises and government agencies looking to drive more impact and grow their bottom line. With the rapid pace of growth, it's imperative for enterprises and government agencies to enhance their ability to ingest, store, and analyze fast-growing data sets in a way that is secure and cost-effective," said Chris Gladwin, co-founder and CEO, Ocient. "The ability to migrate from legacy systems and buy or build new data analysis capabilities for rapidly growing workloads will enable enterprises and government organizations to drive new levels of agility and growth that were previously only imaginable."



OU researchers awarded two NSF pandemic prediction and prevention projects – EurekAlert

(Image: the University of Oklahoma's Norman campus. Credit: University of Oklahoma.)

Two groups of researchers at the University of Oklahoma have each received a nearly $1 million grant from the National Science Foundation as part of its Predictive Intelligence for Pandemic Prevention initiative, which focuses on the fundamental research and capabilities needed to tackle grand challenges in infectious disease pandemics through prediction and prevention.

To date, researchers from 20 institutions nationwide have been selected to receive an NSF PIPP Award. OU is the only university to receive two of the grants.

"The next pandemic isn't a question of if, but when," said OU Vice President for Research and Partnerships Tomás Díaz de la Rubia. "Research at the University of Oklahoma is going to help society be better prepared and responsive to future health challenges."

Next-Generation Surveillance

David Ebert, Ph.D., professor of computer science and electrical and computer engineering in the Gallogly College of Engineering, is the principal investigator on one of the projects, which explores new ways of sharing, integrating and analyzing data using new and traditional data sources. Ebert is also the director of the Data Institute for Societal Challenges at OU, which applies OU expertise in data science, artificial intelligence, machine learning and data-enabled research to solving societal challenges.

While emerging pathogens can circulate among wild or domestic animals before crossing over to humans, the delayed response to the COVID-19 pandemic has highlighted the need for new early detection methods, more effective data management, and integration and information sharing between officials in both public and animal health.

Ebert's team, composed of experts in data science, computer engineering, public health, veterinary sciences, microbiology and other areas, will examine data from multiple sources, such as veterinarians, agriculture, wastewater, health departments, and outpatient and inpatient clinics, to potentially build algorithms to detect the spread of signals from one source to another. The team will develop a comprehensive animal and public health surveillance, planning and response roadmap that can be tailored to the unique needs of communities.
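One simple illustration of how a signal in one surveillance stream can foreshadow another (invented data and a deliberately basic method, not the team's actual algorithms): find the lag at which an early stream, such as a wastewater signal, best correlates with a later one, such as clinic case counts.

```python
import numpy as np

rng = np.random.default_rng(0)
wastewater = rng.poisson(5, 60).astype(float)          # daily viral signal (toy)
cases = np.roll(wastewater, 7) + rng.normal(0, 1, 60)  # cases trail by ~7 days

def best_lag(leading: np.ndarray, trailing: np.ndarray, max_lag: int = 14) -> int:
    # Correlate the leading series against the trailing one at each candidate lag.
    corrs = [np.corrcoef(leading[:-lag], trailing[lag:])[0, 1]
             for lag in range(1, max_lag + 1)]
    return int(np.argmax(corrs)) + 1

print(best_lag(wastewater, cases))  # ~7: wastewater leads cases by about a week
```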

"Integrating and developing new sources of data with existing data sources, combined with new tools for detection, localization and response planning using a One Health approach, could enable local and state public health partners to respond more quickly and effectively to reduce illness and death," Ebert said. "This planning grant will develop proof-of-concept techniques and systems in partnership with local, state and regional public health officials and create a multistate partner network and design for a center to prevent the next pandemic."

The Centers for Disease Control and Prevention describes One Health as an approach that bridges the interconnections between people, animals, plants and their shared environment to achieve optimal health outcomes.

Co-principal investigators on the project include Michael Wimberly, Ph.D., professor in the College of Atmospheric and Geographic Sciences; Jason Vogel, Ph.D., director of the Oklahoma Water Survey and professor in the Gallogly College of Engineering School of Civil Engineering and Environmental Science; Thirumalai Venkatesan, director of the Center for Quantum Research and Technology in the Dodge Family College of Arts and Sciences; and Aaron Wendelboe, Ph.D., professor in the Hudson College of Public Health at the OU Health Sciences Center.

Predicting and Preventing the Next Avian Influenza Pandemic

Several countries have experienced deadly outbreaks of avian influenza, commonly known as bird flu, that have resulted in the loss of billions of poultry, thousands of wild waterfowl and hundreds of humans. Researchers at the University of Oklahoma are taking a unique approach to predicting and preventing the next avian influenza pandemic.

Xiangming Xiao, Ph.D., professor in the Department of Microbiology and Plant Biology and director of the Center for Earth Observation and Modeling in the Dodge Family College of Arts and Sciences, is leading a project to assemble a multi-institutional team that will explore pathways for establishing an International Center for Avian Influenza Pandemic Prediction and Prevention.

The goal of the project is to incorporate and understand the status and major challenges of data, models and decision support tools for preventing pandemics. Researchers hope to identify future possible research and pathways that will help to strengthen and improve the capability and capacity to predict and prevent avian influenza pandemics.

"This grant is a milestone in our long-term effort for interdisciplinary and convergent research in the areas of One Health (human-animal-environment health) and big data science," Xiao said. "This is an international project with geographical coverage from North America, Europe and Asia; thus, it will enable OU faculty and students to develop greater ability, capability, capacity and leadership in the prediction and prevention of global avian influenza pandemics."

Other researchers on Xiao's project include co-principal investigators A. Townsend Peterson, Ph.D., professor at the University of Kansas; Diann Prosser, Ph.D., research wildlife ecologist for the U.S. Geological Survey; and Richard Webby, Ph.D., director of the World Health Organization Collaborating Centre for Studies on the Ecology of Influenza in Animals and Birds at St. Jude Children's Research Hospital. Wayne Marcus Getz, professor at the University of California, Berkeley, is also assisting on the project.

The National Science Foundation grant for Ebert's research is set to end Jan. 31, 2024, while Xiao's grant will end Dec. 31, 2023.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


Building the Best Zeolite – University of Houston

Natural zeolite mineral originating from Croft Quarry in Leicester, England

Jeffrey Rimer, Abraham E. Dukler Professor of chemical and biomolecular engineering at the University of Houston, has summarized methods of making zeolites in the lab and examined how the emergence of data analytics and machine learning are aiding zeolite design.

If science and nature were to have a baby, it would surely be the zeolite. This special rock, with its porous structure that traps water inside, also traps atoms and molecules that can cause chemical reactions. That's why zeolites are important as catalysts, or substances that speed up chemical reactions without harming themselves. Zeolites work their magic in the drug and energy industries and a slew of others. With petrochemicals, they break large hydrocarbon molecules into gasoline and further into all kinds of petroleum byproducts. Applications like fluid catalytic cracking and hydrocracking rely heavily on zeolites.

So important is the use of zeolites that decades ago scientists began making synthetic ones in the lab, with the total number of crystal structures now exceeding 250.

Now, an undisputed bedrock in the global zeolite research community, Jeffrey Rimer, Abraham E. Dukler Professor of chemical and biomolecular engineering at the University of Houston, has published a review in the journal Nature Synthesis summarizing methods used over the past decade to prepare state-of-the-art zeolites with nano-sized dimensions and hierarchical structures.

The findings emphasize that smaller is better and structure is critical.

"These features are critical to their performance in a wide range of industrial applications. Notably, the small pores of zeolites impose diffusion limitations for processes involving catalysis or separations where small molecules must access pores without obstruction from the accumulation of residual materials like coke, which is a carbonaceous deposit that blocks pores," reports Rimer. "This calls for new methods to prepare zeolites with smaller sizes and higher surface area, which is a challenging task because few zeolites can be prepared with sizes less than 100 nanometers."
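The geometric intuition behind the push for smaller crystals is simple. Idealizing a crystal as a cube of edge length L (illustrative arithmetic only), surface area scales as L² while volume scales as L³, so external surface per unit volume grows as 1/L as crystals shrink:

```python
# Surface-to-volume ratio of an idealized cubic crystal: 6*L^2 / L^3 = 6/L.
for edge_nm in (1000, 100, 20):
    ratio = 6 / edge_nm   # in nm^-1; grows tenfold when the edge shrinks tenfold
    print(f"{edge_nm:>5} nm crystal -> surface/volume = {ratio:.3f} per nm")
```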

The review article summarizes advanced methods to accomplish this goal, including work from Rimer's own group on finned zeolites, which he invented. Zeolites with fins are an entirely new class of porous catalysts using unique nano-sized features to speed up the chemistry by allowing molecules to skip the hurdles that limit the reaction.

Rimer also examines how the emergence of data analytics and machine learning is aiding zeolite design, and he provides future perspectives on this growing area of research. Such approaches are among the new methods Rimer deems imperative, with major advantages coming from infusing computational and big-data analyses to transition zeolite synthesis away from trial-and-error methodologies.

Beyond that, speeding up the process of crystallizing zeolites, and speeding up the reactions of the zeolites themselves, will bring many socioeconomic advantages, according to Rimer.

"Improved zeolite design includes the development of improved catalysts for energy applications (including advancements in alternative energy), new technologies for regulating emissions that impact the environment, and separations to improve industrial processes, with impact on petroleum refining, production of chemicals and water purification," he said.


MLOps | Is the Enterprise Repeating the Same DIY Mistakes? – insideBIGDATA

There is a reason the enterprise doesn't build its own cloud computing infrastructure. Last decade, IT infrastructure teams sought to build their own private clouds because they thought they could do it cheaper and better suited to their business versus public cloud. Instead, they ended up taking longer and costing more than expected to build, requiring more resources to maintain, and having fewer of the latest capabilities in security and scaling than what was provided by the public clouds. Instead of investing in core business capabilities, these enterprises ended up investing significant time and headcount in infrastructure that couldn't match expanded business needs.

Many enterprises are now repeating that same do-it-yourself approach to most things MLOps by creating custom solutions cobbled together from various open source tools like Apache Spark.

These often result in model deployments taking weeks or even months per model and in inefficient runtimes (as measured by inferences run over compute and time required), and they especially lack the observability needed to test and monitor the ongoing accuracy of models over time. These approaches are too bespoke to provide scalable, repeatable processes for multiple use cases in different parts of the enterprise.
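The observability gap is conceptually simple to state, even if production systems are not. A minimal sketch of the idea, assuming labeled outcomes eventually arrive and standing in for no particular vendor's tooling, tracks a model's rolling accuracy and flags drift:

```python
from collections import deque

class AccuracyMonitor:
    """Track rolling accuracy of a deployed model and flag degradation."""

    def __init__(self, window: int = 500, threshold: float = 0.85):
        self.outcomes = deque(maxlen=window)  # 1 = correct, 0 = wrong
        self.threshold = threshold

    def record(self, predicted, actual) -> None:
        self.outcomes.append(1 if predicted == actual else 0)

    def drifting(self) -> bool:
        # Only alert once the window is full, to avoid noisy early readings.
        if len(self.outcomes) < (self.outcomes.maxlen or 0):
            return False
        return sum(self.outcomes) / len(self.outcomes) < self.threshold

monitor = AccuracyMonitor(window=100, threshold=0.90)
```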

The case of the misdiagnosed problem

In addition, conversations with line-of-business leaders and chief data and analytics officers have taught us that organizations keep hiring more data scientists but aren't seeing the return. As we delved deeper, however, and started asking questions to identify the blockers to their AI, they quickly realized their bottleneck was actually at the last mile: deploying the models to use against live data, running them efficiently so the compute costs didn't outweigh the gains, and then measuring their performance.

Data scientists excel at turning data into models that help solve business problems and make business decisions. But the expertise and skills required to build great models aren't the same skills needed to push those models into the real world with production-ready code, and then monitor and update them on an ongoing basis.

This is where ML engineers come in. ML engineers are responsible for integrating tools and frameworks together to ensure the data, data pipelines, and key infrastructure are working cohesively to productionize ML models at scale (see our more in-depth breakdown comparing the roles of data scientists versus ML engineers available here).

So now what? Hire more ML engineers?

But even with the best ML engineers, enterprises face two major problems in scaling AI:

How to get the most value from AI

Enterprises have poured billions of dollars into AI based on promises around increased automation, personalizing the customer experience at scale, or delivering more accurate and granular predictions. But so far there has been a massive gap between AI promises and outcomes, with only about 10% of AI investments yielding significant ROI.

In the end, to solve the MLOps problem, chief data and analytics officers need to build the capabilities around data science that are core to the business, but invest in technologies that automate the rest of MLOps. Yes, this is the common build-vs-buy dilemma, but this time the right way to measure isn't solely OpEx costs but how quickly and effectively your AI investments are permeating throughout the enterprise, whether generating new revenues through better products and customer segments or cutting costs through greater automation and decreased waste.

About the Author

Aaron Friedman is VP of Operations at Wallaroo.ai. He has a dynamic background in scaling companies and divisions, including IT Outsourcing at Verizon, Head of Operations for Lowes.com and JetBlue, Head of Global Business Development at Qubole, and growing and selling two system integration companies.

