Category Archives: Data Science

Scalability: The Key to High, Long-Term Analytics ROI – RTInsights

To deliver strong ROI, analytics capabilities need to be able to scale. But enabling scalability across highly tailored use cases isn't always easy.

Across the modern enterprise, every team wants something slightly different from analytics. They have their own goals, own data, and own KPIs, leading many teams to create and deploy their own analytics solutions with available resources.

Over time, that's created a highly fragmented analytics landscape across many organizations. Siloed solutions stagnate within individual teams and lines of business, creating an environment where:

The missing piece in environments and organizations like these is scalability. When teams push ahead with their own siloed analytics projects, the solutions they create can't scale, making it far harder to realize high ROI from them.

Unfortunately, there's one big reason why that all-important piece is still missing across many enterprise analytics landscapes: it's tough to enable if you don't have the right strategy and support.

Three main challenges limit organizations' ability to scale their analytics solutions and investments today:

#1) Disparate and inconsistent data

When an individual team builds its own analytics model, it builds around its data, designing models that make the most of the available data sets, whatever their type or quality may be. Models created for and driven by those data sets become incredibly tough to use in different contexts where the same type or quality of data isn't available. There's no interoperability, so the models can't be scaled elsewhere.
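One lightweight way to soften this problem (a minimal illustrative sketch, not something described in the article) is to publish an explicit input contract alongside a model, so another team can quickly check whether its own data meets the model's assumptions before trying to reuse it. The column names and types below are hypothetical.

```python
from typing import Mapping

# Hypothetical input contract for a shared model: column name -> expected type.
EXPECTED_SCHEMA: Mapping[str, type] = {
    "customer_id": str,
    "monthly_spend": float,
    "region": str,
}

def validate_input(record: Mapping[str, object]) -> list[str]:
    """Return a list of problems that would prevent the model from scoring this record."""
    problems = []
    for column, expected_type in EXPECTED_SCHEMA.items():
        if column not in record:
            problems.append(f"missing column: {column}")
        elif not isinstance(record[column], expected_type):
            problems.append(
                f"{column}: expected {expected_type.__name__}, "
                f"got {type(record[column]).__name__}"
            )
    return problems

# Another team can run this check against its own data before reusing the model.
issues = validate_input({"customer_id": "C-42", "monthly_spend": "not-a-number"})
if issues:
    print("Data not compatible with this model:", issues)
```

Making the data requirements explicit in this way is one small step toward the interoperability the article says siloed models lack.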

#2) Low visibility and understanding across silos

If one team doesn't know about an adjacent team's existing analytics investments, they can't leverage and customize them for their own use. Siloed creation and management of analytics capabilities create cultures where people simply aren't aware of where and how the organization has already invested in analytics, leading to significant duplication of effort and increased costs for the enterprise.

#3) Scalability is hard to build in retroactively

When a team identifies a new internal use case for analytics, they rarely stop to ask, "How could other teams or markets benefit from what we're creating?" As a result, solutions are built with a single purpose in mind, making it difficult for other teams to utilize them across slightly different use cases. Instead of building a widely usable foundation, then customizing it for each team, solutions are designed for a single team at a core level, making them tough to repurpose or apply elsewhere.

See also: Moving to the Cloud for Better Data Analytics and Business Insights

Organizations need to fundamentally change how they think about, design, and manage analytics capabilities to overcome those challenges and unlock the full ROI of highly scalable analytics models and solutions.

Here are four practices helping organizations do that effectively:

Start with a standardized foundation

Each team across your organization needs bespoke, tailored capabilities to get the most from analytics. But that doesn't mean they have to build their own solutions from the ground up.

By having a centralized team create a customizable, standardized foundation for analytics, teams can create exactly what they need in a consistent way that enables interoperability and sharing of models and insights across the enterprise.

With an analytics center of excellence (CoE), for example, a centralized team can create everything each team needs for its unique use cases and add further value by including insights and capabilities that adjacent teams have already found valuable.

Bring data science and data engineering closer together

Even today, many still view analytics as the exclusive domain of data scientists. But, if you want to enable the scalability of analytics models, data engineers need to be involved in the conversation and decision-making.

Data scientists may build the models and algorithms that generate analytical insights, but data engineers ensure a consistent, interoperable data foundation to power those models. By working closely together, they can align their decisions and design choices to help scale analytical capabilities across the business.

Zoom out and get some external perspective on what's possible

If you want analytics investments to deliver broad value across your organization, your projects should start with a broad view of what analytics could help you achieve across multiple use cases. Practically, that means:

Continuously learn and improve

Analytics always requires some degree of experimentation, and you can't realistically expect every single use case to deliver high long-term value. But even if they're unsuccessful, organizations should take steps to learn from each of them.

Within an enterprise, someone needs to take responsibility for learning from each use case explored. That person or team can then apply those lessons across new use cases and use them to develop assets and modules that can be reused across geographies and domains, extending and increasing the value they deliver to the business.

California Students Are Struggling in Math. Will Reforms Make the Problem Worse? – The New Yorker

The California Mathematics Framework (C.M.F.) is an arguably obscure but extremely consequential document informing the educations of millions of children, from transitional kindergarten to twelfth grade, which is revised every eight years. The C.M.F. doesn't decide what math concepts to teach; that is decided by the common-core standards. Instead, it makes recommendations to public-school teachers about how and when to teach what elements of math.

The United States has been ranked thirty-seventh of seventy-nine industrialized countries in math achievement among fifteen-year-olds. Among the states, California is considerably below average, and sometimes in the lowest quartile, in a number of national assessments of math proficiency. There is also a pronounced achievement gap, with women, minorities, and the poor falling behind. Given that math is a necessary step toward a STEM career, in demand and high-paying, these numbers are more than simply lamentable. So revisiting our approach to teaching math, in California and beyond, is urgent.

The current draft of the C.M.F., which will be voted on by the California State Board of Education in 2023, is intended to make math education more equitable. Chapter 1 (of fourteen) is titled "Mathematics for All"; Chapter 2 is called "Teaching for Equity and Engagement." The goals were chosen through a series of focus groups of teachers and others; then three math-education experts, a mathematician, and a retired math teacher were tasked with developing a research-driven plan for achieving the goals.

The C.M.F. details some arguments against students taking higher-level math courses in middle school. It also suggests that math be de-tracked, that kids not be sorted into higher- and lower-level courses in their early schooling, and certainly not before high school, on the theory that slowing down math education can lead to deeper understanding. (Algebra I is now commonly offered as an option in the eighth grade.) "We know we're not doing kids any favors cramming high-school-level math courses into middle school," Jo Boaler, one of the writers of the C.M.F. and a professor at Stanford's Graduate School of Education, told me. And students who are not in higher-level math courses might be damaged by the message that they're not good at math, which becomes a self-fulfilling prophecy. Brian Lindaman, a professor at California State University, Chico, who chaired the C.M.F. committee, told me, "We let school boards make their own choice, but we wanted to make sure students and parents had the most information about how acceleration has worked, and also how it hasn't worked. We wanted to point out the benefits of a classroom of diverse learners, of people at different ability levels, rather than have a group that just struggles all the time."

The C.M.F. also suggested adding a data-science track, so that students could choose to take data science in place of the more familiar sequence of Algebra II, precalculus, and calculus. Data science is often seen as the math of the present and future, and Boaler herself has helped to design a data-science course, through youcubed, a company she co-founded. (Boaler has attracted criticism for the lucrative consulting work she has done for public-school districts. She disputes the characterization of her earnings.) The course is organized around a different set of pedagogical values than most current math classes: project-oriented group work, no tests, and a focus on real-world applications.

When the first draft of the C.M.F. was released for public comment, phrases such as "equity" and "social justice" attracted negative attention from the right-leaning press. The headline of an op-ed in the Wall Street Journal read "California Leftists Try to Cancel Math Class." The C.M.F. has been called "woke math."

But more fierce and fine-grained criticism of the C.M.F. came, perhaps surprisingly, from people who have done a lot of work promoting diversity and equity in math and other STEM fields. "Everything I've read about this proposal is going to make matters worse," Adrian Mims said, of the initial draft of the C.M.F. Mims is the founder of the Calculus Project, a program that has been remarkably successful at getting more students of color to take and succeed in advanced math courses. "Modifying curriculum that way will not bring equity," he said. "It will just bring in a lower track." The lower track he refers to is the data-science track, which he argued would not prepare students for a possible future career in data science, let alone in engineering, physics, economics, or computer science. "And we all know who ends up in that track: Black, Hispanic, and low-income students."

Brian Conrad, a professor of mathematics at Stanford, told me that, during the Trump era, he had been avoiding the news, so "I had no idea about this campaign to replace Algebra II with data science." He then began to follow movements in California's K-12 math curricula more closely. When he read the nine-hundred-page C.M.F. document, he was concerned. "I encountered a lot of assertions that were hard to believe and were justified via citations to other papers. So I read those other papers," Conrad wrote on a Web site he started for the purpose of sharing his findings. (He is not on Twitter.) Conrad is a graduate of public schools; his father was a public-school math teacher. "To my astonishment, in essentially all cases, the papers were seriously misrepresented," he wrote.

One relevant large-scale experiment on public-school math education has already happened in California. In 2014, the San Francisco public-school system stopped offering an Algebra I course to eighth graders. The change was intended to emphasize depth in teaching, and to delay separating students into different levels and kinds of math. Everyone would be in the same math class. Except, of course, not everyone: students at private schools still had an accelerated track. (As of 2015, nearly thirty per cent of school children in San Francisco attended private school, one of the highest rates in the nation.) And public-school kids whose parents could pay for outside classes also had that option. "I don't know why it seemed like you could solve a problem by widening the privilege gap. What is solved by eliminating content?" Maya Keshavan, an engineer and a mother of two, said. "I have a son who was old enough to have the old curriculum; then for my daughter I paid for her to take an outside class. Not everyone can do what I did."

In order for typical students to be ready for calculus by their senior year, something that is close to essential for a future STEM major, they would either need to double up on math courses or take a compression course in junior year that combined Algebra II and precalculus. In news releases and in presentations given around the country by the board of the San Francisco Unified School District, leaders touted the success of the program. They have claimed that the repeat rate for Algebra I had dropped from forty per cent to seven per cent. Keshavan filed a request for the data, through the California Public Records Act. She found that the earlier repeat rate was in fact four per cent. (The district maintains the forty-per-cent figure, saying that this figure was based on students repeating Algebra I at any point during high school.) "What I think was truly educational malpractice was the compression course," Keshavan said. "It made students think they were prepared when they weren't." To be eligible for admission to the University of California system, students need to have completed three years of college-preparatory mathematics, and whether compression courses qualify depends on the courses' content. A group of professors from Stanford, Berkeley, U.C.L.A., and Harvard, including Nelson and Mims, wrote in a joint letter that STEM degrees are some of the best paths for social mobility and that the compression course was antithetical to responsible preparation.

School-board leaders also claimed that diversity had increased in advanced math courses. One presentation slide showed that the number of Black students passing an advanced math class had increased from eleven per cent to forty-two per cent, an extraordinary change. But those percentages referred to a shift from three out of twenty-seven students to five out of twelve students, in the entire San Francisco public-school system. Those numbers, any way you look at them, are awful, and too small to reveal any trend.
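As a quick check of the arithmetic behind those figures, using only the counts reported above:

\[
\frac{3}{27} \approx 0.111 \approx 11\%, \qquad \frac{5}{12} \approx 0.417 \approx 42\%
\]

With cohorts of twenty-seven and twelve students, moving a single student shifts the percentage by roughly four to eight points, which is why the percentages alone reveal so little.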

Keshavan's kids are now both out of high school. Her son is an electrical engineer, and her daughter is a junior in college, majoring in a STEM field, but Keshavan has continued her activism. "I don't know why I can't let it go, but it's just so wrong," she said. "It's just so upsetting to me that we're putting all these kids at a disadvantage. And why? Because some grownups can't admit that they made a mistake." San Francisco's curricular changes remain in place. In fact, the first draft of the C.M.F. cited the statistics from the school-board presentations.

Jelani Nelson, a professor of electrical engineering and computer science at the University of California, Berkeley, is the son of an Ethiopian mother and an African American father. For the past eleven years, he has devoted time and energy to AddisCoder, a free summer program he founded while finishing his Ph.D. at M.I.T., that teaches computer science to high-school students in Ethiopia; he has also launched a program called JamCoders, in Jamaica, and is a co-founder of the David Harold Blackwell Summer Research Institute, which aims to help more Black students pursue Ph.D.s in mathematics. "I'm extremely worried," he said, "that the C.M.F. is implicitly advocating for certain groups of people to be pushed away from rigorous math courses into essentially a lower track, setting back progress in improving diversity in STEM."

Critics of the data-science track, such as Conrad and Nelson, are not opposed to data science as a subject. Conrad said that some of these courses are fine, but he argued that data literacy could be taught in any number of high-school courses, such as biology, political science, or a regular math class. Nelson, while teaching at Harvard, was part of a team tasked with integrating quantitative reasoning with data into the undergraduate curriculum. Among the fields that came to include courses with Q.R.D. were biology, government, economics, and sociology. Recently, both California State University and the University of California system wrote formal letters expressing concerns, following a review of the C.M.F. The California State letter suggested that the data-science pathway could potentially diminish equity.

Here’s how cloud computing will drive future of data analytics – ETCIO

By Amit Gupta

In our daily lives, the number of devices connected via IoT is rapidly increasing. Businesses today rely greatly on cloud data analytics to tackle the challenge of keeping up with the pace at which data is being generated, used, and stored. Cloud data analytics offers businesses an effective way to make better decisions by tracking data patterns and eliminating the need for assumptions. As per a recent study by Statista, by 2030 about 50 million IoT-connected devices will be used across the world, generating huge volumes of data. The majority of this data will be stored and analyzed in the cloud.

Cloud offers access to various services such as servers, data analytics, AI, and more. Cloud data analytics helps attract new customers and recognize patterns in data, enabling businesses to serve their existing customers better while keeping the whole process cost-effective.

Cloud computing came as a lifesaver, helping organizations avoid the hassle of managing physical servers.

Cloud computing consists of a set of hardware and software that can be accessed remotely through any web browser. It allows you to access data from any web browser rather than being restricted to a particular hard drive. When you apply analytics to the data held in cloud storage, it is termed cloud-based analytics.

Cloud analytics stores, processes, and analyzes business data using cloud technologies. A hybrid cloud analytics model, used by some organizations, allows some functions to be performed in cloud-hosted environments while others run on on-premises servers. However, most organizations move entirely to the cloud as they scale up their analytics programs.

Capabilities of the Cloud

The cloud removes the need for organizations to manage their own traffic, as it comes with ready-made infrastructure. It lets organizations increase or decrease cloud storage depending on whether the business is scaling up or down. This makes it more suitable because it cuts the cost of installing new hardware and enables businesses to respond quickly to market demands, keeping customers happy by meeting their requirements on time.

Put simply, cloud analytics allows for a more flexible budget than in-house analytics solutions by eliminating the need for hardware, equipment, data centers, and regular upgrades. More and more organizations are now benefiting from enhanced data quality by combining data analytics with machine learning technologies, and they are investing heavily in developing cloud capabilities to gain maximum benefit. In the hands of skilled cloud professionals, the cloud can play a significant role in improving the performance of organizations and providing greater insight through detailed data analytics.

Creating value with the Cloud

Here are the top benefits organizations have found from embracing the cloud for data analytics and machine learning:

Track better insights

Cloud analytics makes it possible to record and process data simultaneously, irrespective of the location of local servers. It allows companies to track crucial insights, such as sales of an item in a specific location, helping them adjust their production accordingly. Collecting and analyzing data helps businesses understand customer behavior. The cloud thus provides the right insights for companies to make better decisions and function more effectively.

Scalability: Scaling analytics becomes easier with the cloud, as it allows companies to add storage as the organization changes. Having data analytics synced with the cloud allows organizations to be more responsive to new business demands, adjust rapidly to meet customer requirements, and make the most of opportunities as they open up.

Social Media: Processing activity across different social media platforms isn't an easy task. With cloud storage, however, data from social media sites can be analyzed simultaneously, allowing for faster quantification of results and a more targeted approach. Analyzing the insights gathered from different platforms becomes easier when they are integrated with the cloud, shedding light on areas where the business can improve and showing, in detail and in real time, how the business is performing on each platform.

Governance & Security: Cloud offers superior control and security to its users. Over 50% of companies prefer to store their confidential data in the cloud, the main reason being that data stored in the cloud is encrypted, preventing access by hackers. Data analytics via the cloud offers better auditing capabilities, gives better control over access, and acts as a single source of truth for understanding an organization's data. Cloud-based analytics is thus also useful during emergencies or natural disasters, since the data is well safeguarded.

Collaboration & Easy Access: With cloud analytics, different individuals can work together through file-sharing applications. This collaboration proves beneficial, especially for global organizations with employees operating across different regions. Changes can be seen in real time, cutting down on turnaround time between teams and making collaboration easy.

Cost Savings: One of the advantages of cloud computing services is that they don't need hardware, data centers, or equipment, which makes them highly cost-efficient. Organizations can easily scale their cloud server usage up or down and pay based on changes in their ML or data science workload. The cloud allows organizations to follow a subscription-based model, which enables more flexible budget planning as demands change.

The Future

Taking all of the cloud's capabilities, as well as the potential risks that come with them, into consideration, organizations are increasingly choosing the cloud for its many benefits, with data being one of the most important deciding factors. The goal is to ensure that data can be processed and analyzed successfully, and faster, with the help of cloud experts.

The enormous possibilities of the cloud, combined with its security, reliability, and affordability, point to an expansion of data analytics in the cloud. For a business to achieve peak performance, the cloud and the data insights it provides will become more crucial than ever before. The cloud is now touted as a necessity rather than a luxury for businesses to thrive.

The author is the Founder and CEO at Rapyder Cloud Solutions

Disclaimer: The views expressed are solely those of the author, and ETCIO.com does not necessarily subscribe to them. ETCIO.com shall not be responsible for any damage caused to any person/organization directly or indirectly.

Carta Healthcare Releases Semaphore, an Integrated Development Environment for Healthcare Organizations to Develop Their Own Data Science and…

SAN FRANCISCO--(BUSINESS WIRE)--Carta Healthcare, provider of solutions for common healthcare data challenges through a combination of people, processes, and technology, today announced the release of Semaphore, an integrated development environment (IDE) built for healthcare organizations' internal data science teams to design and deploy custom applications for analysis and visualization of their data. Powered by Cartographer, Carta Healthcare's AI platform, Semaphore has instant access to standardized datasets derived from across the organization's health records, significantly decreasing the time spent collecting data up front.

"Healthcare organizations need to be able to use the data they have today, today, to improve patient care quality, enhance operational efficiencies, support research initiatives, and reduce costs," said Matt Hollingsworth, CEO and Co-founder of Carta Healthcare. "Siloed systems, data security requirements to meet patient privacy regulations, and internal approval processes often result in lengthy delays for new analytics packages to become available to the teams who need them. Semaphore directly addresses this challenge."

Semaphore was built in-house by Carta Healthcare's own data scientists to address the shortcomings of existing off-the-shelf IDEs when used within the healthcare environment. Semaphore offers all the benefits of a traditional IDE, in addition to:

Semaphore is powered by Cartographer, an AI-driven analytics tool that powers Carta Healthcare's broader ecosystem of products. By searching, analyzing, and standardizing data from all data sources, Cartographer provides organizations with high-quality, immediately usable data.

The standardized data sets created by Cartographer are also used by Atlas, Carta Healthcare's AI-assisted data abstraction solution for submitting to clinical registries, and Navigator, a complete analytics tool for solving common healthcare operational challenges such as anesthesia and nursing coverage and operating room scheduling. Together, Atlas, Semaphore, Navigator, and Cartographer allow healthcare organizations to collect, analyze, and act on their data.

To learn more about the innovative solutions that Carta Healthcare's interdisciplinary team of experts creates to address common healthcare challenges, visit http://www.carta.healthcare.

About Carta Healthcare

Founded in 2017, Carta Healthcare's mission is to improve patient care by harnessing the value of clinical data. Through its combination of industry-leading, AI-driven technology and a multidisciplinary team of experts, Carta Healthcare has transformed the traditional process for abstracting and analyzing clinical and operational data. The company's agile, innovative approach to expertise+technology allows healthcare organizations to collect, analyze, and act on their data in a fraction of the time. The result is high-quality, accurate, trustworthy datasets for use across a healthcare organization's initiatives to operate more efficiently, optimize care delivery, improve patient outcomes, and allow clinicians to practice at the top of their license. For more information, visit http://www.carta.healthcare or contact us at hello@carta.healthcare.

Geolog and Petro.ai Announce a New Strategic Partnership to Deliver Data Science Products and Services to the Global Energy Industry – OILMAN Magazine

Geolog International BV (GEOLOG), the largest independent global provider of wellsite surface logging solutions to the Oil, Gas & Geothermal Industries, and Petro.ai, a global leader in subsurface data analytics, machine learning, and AI, have today announced a new strategic partnership to deliver Machine Learning and AI-based data science and predictive products and services to the global Energy Industry.

Richard Calleri, CEO and owner of GEOLOG, said, "Our clients have a ferocious appetite for receiving quality, structured, analytics-ready data and predictive products. Partnering with Petro.ai allows us to deliver additional value to our customers by accelerating and significantly expanding our portfolio of Machine Learning and AI solutions that are designed to improve drilling efficiency and reservoir prediction."

Dr. Troy Ruths, Founder and CEO of Petro.ai, added, "This partnership will allow Petro.ai to deliver a new set of analytics and accurate predictions to Geolog's global customer base. Our unique geoscience expertise that merges both structured and unstructured data is a natural fit for the high-quality and differentiated data streams that Geolog collects. We couldn't be more excited to launch this strategic partnership."

Geolog International BV (www.geolog.com) is the world's largest independent surface logging services provider to the Energy industry, with global operations in over 50 countries and a workforce in excess of 1,600.

Petro.ai (www.petro.ai) combines the latest geoscience research with cutting edge data science, delivering the most accurate predictions of economic outcomes in the energy industry. In the last 12 months, these predictions have advised $1.9 billion of capital investment with better than 94% accuracy.

Reaping the Benefits of Having a Data Backup and Recovery Plan – Data Science Central

In today's data-driven economy, any business needs to make sure that its data is secure and easily recoverable in an emergency. The National Archives and Records Administration suggests that close to 93% of organizations that experience downtime and data loss for ten or more days file for bankruptcy within a year. No wonder businesses need to give data backup the importance it deserves.

When there is a disaster, averting crucial data losses and minimizing downtime must be the priority for any organization. An effective data backup system and a disaster recovery plan are crucial to meeting those priorities and setting up essential safeguards for securing your business and its data.

A robust disaster recovery plan ensures that your business can get back to work as soon as possible after a massive data loss. It acts as a detailed plan that supports business continuity when a disaster wholly or partially destroys business resources such as data records, IT devices, databases, and corporate networks.

Simply put, data backup is the process of storing a copy of your data on a storage medium other than the primary one, so that company data can be recovered if the primary hardware is damaged or fails. In other words, it is the method of duplicating information so that it can be retrieved after a data loss. As an example, check out the data recovery services from Computerbilities, Inc.
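As a rough illustration of that idea (a minimal sketch, not taken from the article or from any particular vendor's product), the snippet below copies a file to a secondary location with a timestamped name and verifies the duplicate with a checksum; the paths shown are hypothetical.

```python
import hashlib
import shutil
from datetime import datetime
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_file(source: Path, backup_dir: Path) -> Path:
    """Copy `source` into `backup_dir` with a timestamped name and verify the copy."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = backup_dir / f"{source.stem}-{stamp}{source.suffix}"
    shutil.copy2(source, target)          # copy the data and its metadata
    if sha256(source) != sha256(target):  # verify the duplicate is intact
        raise IOError(f"Backup verification failed for {target}")
    return target

# Hypothetical usage: back up a database export to a second drive.
# backup_file(Path("exports/customers.db"), Path("/mnt/backup_drive/nightly"))
```

Real backup tooling adds rotation, encryption, and off-site copies, but the core idea is the same: keep a verified duplicate somewhere a failure of the primary storage cannot reach.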

These are some essential advantages of data backup and recovery solutions that will help you secure your mission-critical information.

Casting for data on microplastics in the ocean – Virginia Tech Daily

The new endeavor is possible because of the generosity of Virginia Tech alumni Bill and Carol Seale, who have committed a $2 million gift to the project. Their generous support has enabled Weiss and his team to develop a comprehensive plan to begin ocean monitoring. "The Coastal Zone Observatory's work is a critical step forward to help us become better stewards of the world's oceans, which are arguably our most critical resource on Earth," said Bill Seale.

According to Weiss, our knowledge of the marine environment, apart from the presence of microplastics, is full of holes. That's in part because the ocean itself is a fickle source of data.

"We know the resolution of the surface of Mars better than the surface of our ocean floor," Weiss said. "But that's topography just on the seafloor. Now imagine how little we know about how conditions are when the water in the ocean is constantly moving. How can we describe a condition in a certain area if it's constantly changing? If the moment you measure it, it's gone?"

Researchers at the Coastal Zone Observatory will collect ocean data such as temperature and turbidity (the ability of sunlight to travel into depth) in a way that adapts to the ocean's transience. They'll use sensor-equipped swarms of underwater robots developed by a team of engineers led by Dan Stilwell, an electrical engineering professor in the College of Engineering and director of the Virginia Tech Center for Marine Autonomy and Robotics.

"It is energizing to watch the Seale Coastal Zone Observatory rapidly take shape," said Kevin Pitts, dean of the College of Science. "The pollution of our oceans is worsening by the day, and I'm excited to see researchers in the Colleges of Science, Engineering, and veterinary medicine collaborate to learn more about these issues and find ways to help mitigate a global problem."

"The approach rethinks the way we take ocean data," Weiss said. He believes it can help the team establish a data set that reflects the marine environment as it's shaped by climate change over time. Autonomous vehicles give researchers a much more dynamic method for measuring environmental conditions, with the ability to move through ocean depths and with currents to follow the data. Eventually, the team can then operate those vehicles to collect microplastics concentrations and learn how they're affected by the ocean conditions in flux around them.

"Let's say, in the future, we have a sensor that would allow us to determine in situ, very quickly, the concentration of microplastics," said Weiss, who is director of the Academy of Integrated Science, also part of the College of Science. "We can follow the value of concentrations in the ocean, and by the motion of the vehicle, we can determine how these concentrations evolve over time. So that gives us a much more comprehensive, and full, data set to understand how microplastics move in the ocean. What conditions, like temperature, are they dependent on?"

As researchers gather data on the marine environment, others at the Coastal Zone Observatory will study the impact of microplastics on marine life as that impact extends from individuals to species and moves up the food chain. In the Chesapeake Bay area, biologists from the College of Science and veterinarians from the Virginia-Maryland College of Veterinary Medicine will study the effects of ingesting microplastics on fish used for seafood. Others will collaborate with biologists from Radford University and Connecticut's Fairfield University to study microplastics consumption by tilapia and Magellanic penguins and learn which types of microplastics affect coastal organisms the most.

The Center for Coastal Studies is part of the Fralin Life Sciences Institute. Weiss launched the center in 2020 to coordinate research, teaching, and outreach aimed at ensuring a more sustainable coexistence of humankind and nature within coastal communities.

In State of University Address, President Hemphill Announces Plans for School of Data Science, Expanded Baseball Stadium – Old Dominion University

Old Dominion University President Brian O. Hemphill, Ph.D., announced plans for a School of Data Science and an expanded baseball facility during his first State of the University address, which he delivered Friday at Chartway Arena.

The University is in the final stages of submitting a proposal to the State Council of Higher Education for Virginia regarding the School of Data Science, he said.

Through partnerships with Jefferson Lab and NASA Langley, researchers from these national labs will have ODU faculty status. Also, ODU faculty and students will have special access to these national lab collaborations and facilities.

"This is truly a win-win for ODU and our partners," President Hemphill said.

The baseball stadium project exists thanks to the generosity of several donors, including Priority Automotive President and CEO Dennis Ellmer, who gave $2.5 million.

"ODU is honored that the expanded facility will be known as the Ellmer Family Baseball Complex," President Hemphill said. (See related story.)

In a further announcement, the ODU leader said the University was in the first phase of a multiyear effort to increase stipend levels for graduate assistants. Master's students with teaching assistant duties will receive $15,000 - a boost of $5,000 - to support the institution's mission of teaching, as well as research.

Stipends for doctoral students will increase from $15,000 to $20,000, effective immediately, President Hemphill said.

Among other achievements he highlighted from the past year was ODU's designation in December 2021 as an R1 research university by the Carnegie Classification of Institutions of Higher Education. This placed ODU among the nation's top research institutions.

"This is a truly significant accomplishment and will forever change the future possibilities for our institution and our students," President Hemphill said.

But his vision is grander still.

"We are truly committed to moving up among our R1 counterparts with a bold and aggressive agenda across all research areas, with a special emphasis on maritime, coastal resiliency, offshore wind energy, data science, cybersecurity, autonomous systems and health care."

On the health care front, President Hemphill cited the University's expanded work with Eastern Virginia Medical School and a deepening collaboration with Sentara Healthcare and other partners. Since the summer of 2021, the entities have been meeting to explore the establishment of an academic health sciences center to address health disparities facing the region and its people, President Hemphill said.

"A strategic integration between EVMS and ODU would result in an academic health sciences center that offers the highest number of academic programs and the largest enrollment in health sciences in the Commonwealth of Virginia," he said.

The proposed integration would strengthen and increase the workforce pipeline in Hampton Roads, he said, creating an economic impact of $4.9 billion for Virginia.

Highlighting other initiatives in progress, President Hemphill updated the audience on construction of what he called "state-of-the-art facilities that create a learning environment that is second to none."

The Health Sciences Building, a capital project of more than $76 million, is taking shape at 41st Street and Monarch Way and is scheduled to be finished in summer 2023. The three-story building will house the School of Dental Hygiene, the School of Rehabilitation Sciences and the School of Medical Diagnostic and Translational Sciences.

During the most recent legislative session, the University secured $188 million in project funding for a new Biology Building. With more than 160,000 square feet across five floors, it is scheduled to be ready in 2026, replacing the Mills Godwin Life Sciences Building.

Regarding athletics, President Hemphill spoke about the "new era" Old Dominion entered in July of this year when it joined the Sun Belt Conference.

"For ODU, the conference movement was always about providing the very best experience for our student-athletes as well as our fans," he said. "We are honored to join the Sun Belt and look forward to a great deal of collaboration and competition in our inaugural season and beyond."

President Hemphill also:

NASA Oceo Pre-Submission Webinar: Appendix N: MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP) – Space Ref

Press Release

NASA

September 2, 2022

Engagement Opportunities in NASA STEM 2022 (EONS2022), Appendix N: MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP)

Pre-Submission Webinar is scheduled for Wednesday, September 14, 2022, at 4:00 pm to 5:00 pm Eastern Time

Pre-Registration Link: https://nasaenterprise.webex.com/nasaenterprise/j.php?RGID=r618b325e3a1b4c80ee1380b7f88d6a08

Pre-Registration is required (a link to participate will be sent to your email after pre-registration is completed and approved).

During this session, the MUREP DEAP team will give an in-depth overview of the opportunity and highlight information contained in the EONS 2022 document regarding proposal preparation and requirements. Please visit the MUREP DEAP website for details on how to join the call. Any changes to this session will be posted there as well. Proposers are strongly advised to check for updates prior to the call.

For more information regarding this opportunity, please visit the ENGAGEMENT OPPORTUNITIES IN NASA STEM (EONS 2022) on the NASA Solicitation and Proposal Integrated Review and Evaluation System (NSPIRES) website and click on List of Open Program Elements.

Please submit your questions to MUREPDEAP@nasaprs.com. A recording of this webinar will be made available after the session.

Should you or your colleagues be interested in this opportunity, please apply at the MUREP DEAP website.

Analytics and Data Science News for the Week of September 2; Updates from Alteryx, AWS, Teradata, and More – Solutions Review

The editors at Solutions Review have curated this list of the most noteworthy analytics and data science news items for the week of September 2, 2022.

Keeping tabs on all the most relevant analytics and data science news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last month in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy analytics and data science news items.

The launch of Server-FIPS is monumental for the use of analytics in public sector environments that require enhanced data encryption. Alteryx Server-FIPS is a FIPS-capable and scalable server-based product for scheduling, sharing, and running apps and models created in Alteryx Designer-FIPS for others in the organization to leverage.

Read on for more.

With this new feature, Amazon QuickSight customers can easily provide insights to end users where they need them most by embedding individual visualizations from Amazon QuickSight dashboards into high-traffic webpages and applications without the need for server or software setup or infrastructure management.

Read on for more.

The 9th annual report examines data science and machine learning end user deployment and trends, covering analytical features and functions, neural networks, data preparation, tool usability, model operations, scalability, and open source and big data.

Read on for more.

The pairing of CXO's purpose-built, web-based Enterprise Performance Management (EPM) reporting and Longview's modular suite of integrated products for tax, transfer pricing, close, plan, and consolidation will allow organizations to deliver a richer narrative behind their data with built-in commentary and dynamic dashboards.

Read on for more.

With these new capabilities, Vantage customers can now take advantage of the most in-database analytic functions anywhere in the market and critical artificial intelligence/machine learning (AI/ML) model management tools. This new functionality, in combination with the launch of Teradata VantageCloud Lake, is intended to provide customers the ability to activate massive amounts of data and solve complex business challenges.

Read on for more.

For consideration in future analytics and data science news roundups, send your announcements to the editor: tking@solutionsreview.com.

Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.
